Reputation: 697
I tried to convert my already existing frozen graph, which is saved in a .pb file, with the following code (tf_lite_converter.py):
#!/usr/bin/env python
import sys
import tensorflow as tf
from tf.contrib.lite import convert_savedmodel

convert_savedmodel.convert(
    saved_model_dir="/frozen_inference_graph.pb",
    output_tflite="/TF_Lite_Model")
When running the code with
python tf_lite_converter.py
in my anaconda environment, it gives me the error:
ImportError: No module named tf.contrib.lite
My goal is to get a TensorFlow Lite model out of my .pb graph so I can use it in an Android application. I already tried to build tflite with toco through Bazel, but maybe (or most likely) I did something wrong, or it was not the right way to fix the problem.
Referencing this video from the TensorFlow developers: https://youtu.be/FAMfy7izB6A?t=11m49s
Upvotes: 1
Views: 6143
Reputation: 16317
None of the above worked for me.
I downgraded to TensorFlow 1.7 and converted the .pb model to a .tflite model using toco:
$ pip install --upgrade "tensorflow==1.7.*"
$ toco \
--input_file=tf_files/retrained_graph.pb \
--output_file=tf_files/optimized_graph.lite \
--input_format=TENSORFLOW_GRAPHDEF \
--output_format=TFLITE \
--input_shape=1,${IMAGE_SIZE},${IMAGE_SIZE},3 \
--input_array=input \
--output_array=final_result \
--inference_type=FLOAT \
--input_data_type=FLOAT
Ref:
https://codelabs.developers.google.com/codelabs/tensorflow-for-poets-2-tflite/#2
https://github.com/googlecodelabs/tensorflow-for-poets-2/issues/52
https://medium.com/@rdeep/tensorflow-lite-tutorial-easy-implementation-in-android-145443ec3775
Upvotes: 1
Reputation: 2924
The code in that video is probably from an internal development version.
convert_savedmodel has been renamed to convert_saved_model in https://github.com/tensorflow/tensorflow/commit/db076ca01f12368c9476fa4db9d87756f22f9670
The following seems to work for TensorFlow 1.8:
from tensorflow.contrib.lite.python import convert_saved_model

convert_saved_model.convert(
    saved_model_dir="/frozen_inference_graph.pb",
    output_tflite="/TF_Lite_Model")
The following is for TensorFlow built from the current master (the method and a parameter have been renamed):
from tensorflow.contrib.lite.python import convert_saved_model

convert_saved_model.tflite_from_saved_model(
    saved_model_dir="/frozen_inference_graph.pb",
    output_file="/TF_Lite_Model")
Upvotes: 1