I want to convert SuperPoint to run on Google Coral.
I have the model in ONNX format. I removed NMS from the model architecture, since MaxPool that returns indices is not supported on Coral, so the resulting model is essentially a plain feature-extractor architecture.
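For reference, cutting the graph before the NMS part can be done with something like `onnx.utils.extract_model`. The tensor names below are placeholders; the actual pre-NMS output names depend on how the model was exported and can be checked e.g. in Netron:

```python
import onnx
import onnx.utils

# Sketch: keep only the feature-extractor part of the graph, dropping the
# NMS / MaxPool-with-indices tail. Tensor names here are placeholders.
onnx.utils.extract_model(
    "superpoint.onnx",           # original exported model
    "superpoint_no_nms.onnx",    # trimmed model
    input_names=["image"],
    output_names=["scores_dense", "descriptors"],  # pre-NMS heatmap + descriptors
)
```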
Then I converted the ONNX model to a TensorFlow SavedModel and used TFLiteConverter.from_saved_model to convert it to .tflite. For quantization I used post-training quantization, with random data as the representative dataset just to check that the whole pipeline works. The conversion was successful and I got a fully int8-quantized .tflite model.
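For completeness, the quantization step looked roughly like this (the SavedModel path and the 1x480x640x1 grayscale input shape are placeholders for my setup; with random calibration data the quantization ranges are of course not meaningful, it only verifies the pipeline):

```python
import numpy as np
import tensorflow as tf

# Placeholder: SuperPoint-style grayscale input, adjust to the real shape.
INPUT_SHAPE = (1, 480, 640, 1)

def representative_dataset():
    # Random data only exercises the converter; real calibration images
    # are needed for a usable quantized model.
    for _ in range(100):
        yield [np.random.rand(*INPUT_SHAPE).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("superpoint_saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full-integer quantization so the Edge TPU compiler can map the ops.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("superpoint_int8.tflite", "wb") as f:
    f.write(converter.convert())
```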
When I compiled the quantized model with the Edge TPU compiler and ran it on the Coral device, I ran into a problem: inference took about 15 seconds, and it turned out that not all of the layers had been mapped to the Edge TPU.

Is there any way to convert this model properly, or at least to figure out what the problem with these layers is? The compiler output is in the table below:
Operator | Count | Status |
---|---|---|
MAX_POOL_2D | 3 | More than one subgraph is not supported |
PAD | 8 | More than one subgraph is not supported |
PAD | 1 | Operation is otherwise supported, but not mapped due to some unspecified limitation |
TRANSPOSE | 19 | Operation is otherwise supported, but not mapped due to some unspecified limitation |
RESHAPE | 1 | Operation is otherwise supported, but not mapped due to some unspecified limitation |
CONV_2D | 1 | Mapped to Edge TPU |
CONV_2D | 9 | More than one subgraph is not supported |
SOFTMAX | 1 | More than one subgraph is not supported |
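For reference, inference time can be measured on the device with a minimal script like this (assuming the tflite_runtime package and the Edge TPU runtime are installed; the model filename is a placeholder):

```python
import time
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

# Load the compiled model with the Edge TPU delegate.
interpreter = Interpreter(
    model_path="superpoint_int8_edgetpu.tflite",
    experimental_delegates=[load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Random int8 input just to time the forward pass.
frame = np.random.randint(-128, 128, size=inp["shape"], dtype=np.int8)
interpreter.set_tensor(inp["index"], frame)

start = time.perf_counter()
interpreter.invoke()
print(f"inference: {time.perf_counter() - start:.2f} s")
print("output shape:", interpreter.get_tensor(out["index"]).shape)
```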