Reputation: 14315
I'm not able to convert a .pb to .tflite.
Here is the command I'm executing to generate the .pb; it generates successfully.
IMAGE_SIZE=224
ARCHITECTURE="mobilenet_1_1.0_${IMAGE_SIZE}"
python retrain.py \
--bottleneck_dir=tf_files/bottlenecks \
--how_many_training_steps=500 \
--model_dir=tf_files/models/ \
--summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
--output_graph=tf_files/retrained_graph.pb \
--output_labels=tf_files/retrained_labels.txt \
--architecture="${ARCHITECTURE}" \
--image_dir=tf_files/flower_photos
When I try to convert that .pb to .tflite, it fails with the error "ValueError: Invalid tensors 'input' were found."
tflite_convert \
--output_file=foo.tflite \
--graph_def_file=retrained_graph.pb \
--input_arrays=input \
--output_arrays=MobilenetV1/Predictions/Reshape_1
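For reference, this is how the tensor names actually present in the frozen graph could be inspected, to check whether "input" really exists. This is only a rough sketch, assuming TensorFlow 1.x and the retrained_graph.pb produced by the retrain command above:
# Sketch (TensorFlow 1.x assumed): list Placeholder nodes in the frozen graph,
# which are the candidates for --input_arrays, and print the last few nodes,
# where the output (e.g. the final softmax) usually sits.
import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile("retrained_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

for node in graph_def.node:
    if node.op == "Placeholder":
        print("input candidate:", node.name)

for node in graph_def.node[-5:]:
    print("tail node:", node.name, node.op)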
Upvotes: 2
Views: 4139
Reputation: 14315
I just followed this Google codelab:
https://codelabs.developers.google.com/codelabs/tensorflow-for-poets/#0
It works fine.
IMAGE_SIZE=224
ARCHITECTURE="mobilenet_1.0_${IMAGE_SIZE}"
python -m scripts.retrain \
--bottleneck_dir=tf_files/bottlenecks \
--how_many_training_steps=500 \
--model_dir=tf_files/models/ \
--summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
--output_graph=tf_files/retrained_graph.pb \
--output_labels=tf_files/retrained_labels.txt \
--architecture="${ARCHITECTURE}" \
--image_dir=tf_files/flower_photos
tflite_convert \
--graph_def_file=tf_files/retrained_graph.pb \
--output_file=tf_files/optimized_graph.tflite \
--input_format=TENSORFLOW_GRAPHDEF \
--output_format=TFLITE \
--input_shape=1,224,224,3 \
--input_array=input \
--output_array=final_result \
--inference_type=FLOAT \
--input_data_type=FLOAT
The only change I made was to the MobileNet architecture string.
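If you prefer the Python API over the tflite_convert CLI, a minimal sketch of the equivalent conversion could look like this (assuming TensorFlow 1.13+, where tf.lite.TFLiteConverter is available; paths, shapes, and tensor names are taken from the command above):
# Sketch: same conversion through the Python API (TensorFlow 1.13+ assumed).
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="tf_files/retrained_graph.pb",
    input_arrays=["input"],
    output_arrays=["final_result"],
    input_shapes={"input": [1, 224, 224, 3]})
tflite_model = converter.convert()

with open("tf_files/optimized_graph.tflite", "wb") as f:
    f.write(tflite_model)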
Upvotes: 2
Reputation: 193
I got the same error as you with the TFLite converter Python API.
It is caused by the value passed to input_arrays: input_arrays needs the tensor name defined in tf.placeholder(name="input"), not the proto map key string defined in build_signature_def(inputs={"input": tensor_info_proto}, outputs=...).
Here is a simple example.
x = tf.placeholder(tf.float32, [None], name="input_x")
...
builder = tf.saved_model.builder.SavedModelBuilder(saved_model_path)
input_tensor_info = {"input": tf.saved_model.build_tensor_info(x)}
output_tensor_info = ...
signature_def = tf.saved_model.build_signature_def(inputs=input_tensor_info,
                                                   outputs=...,
                                                   method_name=...)
builder.add_meta_graph_and_variables(...)
builder.save()

# Convert the SavedModel to tflite format.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_path,
                                                     input_arrays=["input"],
                                                     ...)
...
...
Running code like this raises the error "ValueError: Invalid tensors 'input' were found."
If we make a small change as below, it succeeds.
# A small change when converting
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_path,
                                                     input_arrays=["input_x"],
                                                     ...)
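To see the real tensor names before calling the converter, one option is to load the SavedModel and print its signature. This is just a sketch assuming TensorFlow 1.x; the tag and the signature keys depend on what was passed to add_meta_graph_and_variables and signature_def_map when saving:
# Sketch (TensorFlow 1.x assumed): print the mapping from signature keys
# (e.g. "input") to tensor names (e.g. "input_x:0"). The name before ":0"
# is what input_arrays expects. The SERVING tag below is an assumption;
# use whichever tags the model was saved with.
import tensorflow as tf

with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], saved_model_path)
    for sig_key, sig in meta_graph.signature_def.items():
        for key, tensor_info in sig.inputs.items():
            print(sig_key, "input key:", key, "-> tensor:", tensor_info.name)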
Upvotes: 0