Reputation: 31
I'm new to all this, and I need some help with running inference using a custom tflite yolov3 tiny model. The error I am getting is:
File "/usr/local/lib/python3.6/dist-packages/tensorflow/lite/python/interpreter.py", line 524, in invoke
self._interpreter.Invoke()
RuntimeError: tensorflow/lite/kernels/reshape.cc:55 stretch_dim != -1 (0 != -1)Node number 35 (RESHAPE) failed to prepare.
How I got here:
After the model was trained, I tested the SavedModel by running inference and it worked. I then converted the SavedModel to tflite, ran inference on it using the following code, and got the error above:
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="converted_model.tflite")
interpreter.allocate_tensors()

# Feed random data of the expected input shape and run inference
input_details = interpreter.get_input_details()
input_shape = input_details[0]['shape']
input_data = np.array(np.random.random_sample(input_shape), dtype=np.float32)
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()
(this code comes from the TensorFlow Lite guide: https://www.tensorflow.org/lite/guide/inference#load_and_run_a_model_in_python )
Data for node 35:
type: Reshape
location: 35
inputs
data: name: functional_1/tf_op_layer_Tile_3/Tile_3;StatefulPartitionedCall/functional_1/tf_op_layer_Tile_3/Tile_3
shape: name: functional_1/tf_op_layer_strided_slice_6/strided_slice_6;StatefulPartitionedCall/functional_1/tf_op_layer_strided_slice_6/strided_slice_6
outputs
reshaped: name: functional_1/tf_op_layer_strided_slice_16/strided_slice_16;StatefulPartitionedCall/functional_1/tf_op_layer_strided_slice_16/strided_slice_16
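In case it helps with debugging, the tensor names and shapes inside the .tflite file can be dumped with the interpreter's get_tensor_details() (rough sketch, assuming the same converted_model.tflite as above):

import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="converted_model.tflite")
# allocate_tensors() is skipped here on purpose, since preparing the graph
# is exactly what fails; get_tensor_details() still lists the stored tensors.
for t in interpreter.get_tensor_details():
    print(t['index'], t['name'], t['shape'])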
Please help. I am out of ideas.
Upvotes: 3
Views: 1954
Reputation: 29
I was able to solve the exact same problem by following this similar post: https://stackoverflow.com/a/62552677/11334316
In essence you would have to do the following when converting your model:
import tensorflow as tf

batch_size = 1

# Load the Keras model and pin the batch dimension to a fixed size
model = tf.keras.models.load_model('./yolo_model')
input_shape = model.inputs[0].shape.as_list()
input_shape[0] = batch_size

# Build a concrete function with the fixed input shape and convert from it
func = tf.function(model).get_concrete_function(
    tf.TensorSpec(input_shape, model.inputs[0].dtype))
model_converter = tf.lite.TFLiteConverter.from_concrete_functions([func])
model_lite = model_converter.convert()

with open("./yolo_model.tflite", "wb") as f:
    f.write(model_lite)
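After converting this way, the interpreter code from the question should run, since the batch dimension is no longer dynamic. A quick sanity check (sketch, assuming the yolo_model.tflite file written above):

import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="./yolo_model.tflite")
interpreter.allocate_tensors()  # RESHAPE should no longer fail to prepare

input_details = interpreter.get_input_details()
print(input_details[0]['shape'])  # batch dimension should now be 1

dummy = np.random.random_sample(input_details[0]['shape']).astype(np.float32)
interpreter.set_tensor(input_details[0]['index'], dummy)
interpreter.invoke()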
Upvotes: 1