Joel

Reputation: 2361

Converting ONNX model to TensorFlow Lite

I've got some models from the ONNX Model Zoo. I'd like to use models from here in a TensorFlow Lite (Android) application, and I'm running into problems figuring out how to get the models converted.

From what I've read, the process I need to follow is to convert the ONNX model to a TensorFlow model, then convert that TensorFlow model to a TensorFlow Lite model.

import onnx
from onnx_tf.backend import prepare

# load the ONNX model and convert it to a TensorFlow representation
onnx_model = onnx.load('./some-model.onnx')
tf_rep = prepare(onnx_model)

# export the TensorFlow representation as a frozen graph (.pb)
tf_rep.export_graph("some-model.pb")

After the above executes, I have the file some-model.pb, which I believe contains a TensorFlow frozen graph. From here I'm not sure where to go. When I search, I find a lot of answers that are for TensorFlow 1.x (which I only realize after the samples I find fail to execute). I'm trying to use TensorFlow 2.x.
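
One thing that at least lets me see what's inside the exported file is listing the graph's nodes; a minimal sketch, assuming the .pb really is a frozen GraphDef (this uses the TF1 protobuf class that still ships in TF 2.x under tf.compat.v1):

import tensorflow as tf

# parse the frozen GraphDef and list its nodes to find the
# input/output tensor names a converter would need
graph_def = tf.compat.v1.GraphDef()
with open('some-model.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

for node in graph_def.node:
    print(node.name, node.op)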

If it matters, the specific model I'm starting off with is here.

Per the ReadMe.md, the shape of the input is (1x3x416x416) and the output shape is (1x125x13x13).
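
For what it's worth, those shapes (and the tensor names) can also be read straight off the ONNX model itself; a minimal sketch using the onnx package's graph protobuf, assuming the model declares static shapes:

import onnx

model = onnx.load('./some-model.onnx')

# print each declared input/output with its name and static shape
for inp in model.graph.input:
    print('input:', inp.name,
          [d.dim_value for d in inp.type.tensor_type.shape.dim])
for out in model.graph.output:
    print('output:', out.name,
          [d.dim_value for d in out.type.tensor_type.shape.dim])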

Upvotes: 4

Views: 3719

Answers (1)

Joel

Reputation: 2361

I got my answer. I was able to use the code below to complete the conversion.

import tensorflow as tf

converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    'model.pb',                # path to the TensorFlow frozen graph
    input_arrays=['input.1'],  # name of the input tensor
    output_arrays=['218'],     # name of the output tensor
)
# fall back to TensorFlow kernels for ops with no TFLite builtin
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                                       tf.lite.OpsSet.SELECT_TF_OPS]
# tell the converter which type of optimization techniques to use
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tf_lite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tf_lite_model)
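
To sanity-check the conversion, a minimal sketch along these lines runs the .tflite file through the TFLite interpreter with a random dummy input (the expected shapes are the ones from the ReadMe.md; note that a model converted with SELECT_TF_OPS needs the Flex delegate, which the full TensorFlow pip package provides):

import numpy as np
import tensorflow as tf

# load the converted model and allocate its tensors
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# feed a dummy (1, 3, 416, 416) float input and run inference
dummy = np.random.random_sample(input_details[0]['shape']).astype(np.float32)
interpreter.set_tensor(input_details[0]['index'], dummy)
interpreter.invoke()

print(interpreter.get_tensor(output_details[0]['index']).shape)
# expected: (1, 125, 13, 13)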

Upvotes: 4
