Aashutosh Rathi

Reputation: 785

How to convert a retrained model to tflite format?

I have retrained an image classifier model on MobileNet, and I have the files generated by the retraining.

Further, I used toco to compress the retrained model and convert it to .lite format, but I need it in .tflite format. Is there any way I can get to the .tflite format from the existing files?

Upvotes: 1

Views: 4066

Answers (3)

MrKhan

Reputation: 154

Here is a simple Python script which you can use to convert a .pb frozen graph into the .tflite format.

import tensorflow as tf

graph_def_file = "output_graph.pb"   # your frozen graph
input_arrays = ["input"]             # input node
output_arrays = ["final_result"]     # output node

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays)
tflite_model = converter.convert()

with open("converted_model.tflite", "wb") as f:
    f.write(tflite_model)

Upvotes: 2

Nupur Garg

Reputation: 524

In order to convert TensorFlow checkpoints and GraphDef to a TensorFlow Lite FlatBuffer:

  1. Freeze the checkpoints and graph using freeze_graph.py
  2. Convert the frozen graph to a TensorFlow Lite FlatBuffer using TOCO.

Your freeze_graph.py command will look similar to the following:

freeze_graph \
  --input_graph=output_graph.pb \
  --input_binary=true \
  --input_checkpoint=checkpoint \
  --output_graph=frozen_graph.pb \
  --output_node_names=MobilenetV1/Predictions/Softmax

You can use either TocoConverter (Python API) or tflite_convert (command-line tool) with your model. TocoConverter accepts a tf.Session, a frozen GraphDef, a SavedModel directory, or a Keras model file; tflite_convert accepts the latter three formats.

When using TOCO, specify the output_file parameter with a .tflite extension.
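
For the command-line route, a minimal tflite_convert invocation would look roughly like the following; the graph file name and node names are assumptions carried over from the freeze_graph example above, so substitute your own.

tflite_convert \
  --output_file=converted_model.tflite \
  --graph_def_file=frozen_graph.pb \
  --input_arrays=input \
  --output_arrays=MobilenetV1/Predictions/Softmax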

Upvotes: 1

harshithdwivedi

Reputation: 1421

You can rename the .lite model to .tflite and it should work just fine. Alternatively, with toco, you can give the output a .tflite extension as it is created:

toco \
  --input_file=tf_files/retrained_graph.pb \
  --output_file=tf_files/optimized_graph.tflite \
  --input_format=TENSORFLOW_GRAPHDEF \
  --output_format=TFLITE \
  --input_shape=1,224,224,3 \
  --input_array=input \
  --output_array=final_result \
  --inference_type=FLOAT \
  --input_data_type=FLOAT
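
If you already have a .lite file from an earlier run, a plain rename also works; the path below is just an illustration, so substitute your own file name.

mv tf_files/optimized_graph.lite tf_files/optimized_graph.tflite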

Upvotes: 1
