user3428154

Reputation: 1364

NN model format conversion tflite -> onnx

I'd like to convert the TensorFlow Lite hosted models (mainly the MobileNets) into ONNX format, so that I can try the quantized versions of those hosted models and run them with onnx-runtime.

What would be the right procedure for converting those models to be consumed by onnx-runtime?

Upvotes: 1

Views: 2800

Answers (2)

Hitesh Kumar

Reputation: 313

There is a tflite2onnx converter. I have tried it and it worked for me. I'm not sure about the TensorFlow Lite hosted models, but you can give it a try; in my case I used this converter on models I developed myself and it worked fine.

import tflite2onnx

tflite_path = "path/to/the/tflitemodel"  # the .tflite model to convert
onnx_path = "path/where/you/want/to/save/your/model"  # e.g. modelname.onnx
tflite2onnx.convert(tflite_path, onnx_path)
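Once the conversion succeeds, the resulting file can be loaded with onnxruntime. Here's a minimal sketch, assuming the model was saved as modelname.onnx; the dummy input shape and dtype (1x224x224x3 uint8, typical for a quantized MobileNet) are assumptions, so query the session for your model's actual input metadata.

import numpy as np
import onnxruntime as ort

# Load the converted model (the onnx_path from above).
session = ort.InferenceSession("modelname.onnx")

# Read the actual input name/shape/type instead of hard-coding them.
inp = session.get_inputs()[0]
print(inp.name, inp.shape, inp.type)

# Dummy input; a quantized MobileNet typically expects 1x224x224x3 uint8.
dummy = np.zeros([1, 224, 224, 3], dtype=np.uint8)
outputs = session.run(None, {inp.name: dummy})
print(outputs[0].shape)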

Upvotes: 0

yyoon

Reputation: 3855

I'm pretty new to ONNX, but according to their official tutorial page, there isn't a TFLite to ONNX converter.

You could still use the TensorFlow -> ONNX conversion path. When you download one of the TFLite hosted models, you'll get an archive that contains the original TensorFlow frozen graph used for conversion (xxx_frozen.pb) as well as the converted .tflite file. You could take the frozen graph and feed it into the TensorFlow-ONNX converter, as instructed here (a sketch of the command follows the link):

https://github.com/onnx/tensorflow-onnx#getting-started
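For example, a minimal sketch of that conversion for the MobileNet V1 archive: the frozen-graph file name and the input/output node names below are assumptions, so verify them against your own graph (e.g., by opening the .pb file in Netron) before running tf2onnx.

pip install tf2onnx
# Node names are assumptions for MobileNet V1; check your frozen graph.
python -m tf2onnx.convert \
    --graphdef mobilenet_v1_1.0_224_frozen.pb \
    --inputs input:0 \
    --outputs MobilenetV1/Predictions/Reshape_1:0 \
    --output mobilenet_v1.onnx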

Upvotes: 0
