Reputation: 1887
I am using TFLite for semantic segmentation. I have a model, trained with DeepLab, that segments objects from the background.
I converted this model (a frozen inference graph) to TFLite format with the following command:
tflite_convert \
--output_file=test.lite \
--graph_def_file=frozen_inference_graph.pb \
--input_arrays=ImageTensor \
--output_arrays=SemanticPredictions \
--input_shapes=1,600,450,3 \
--inference_input_type=QUANTIZED_UINT8 \
--inference_type=FLOAT \
--mean_values=128 \
--std_dev_values=128
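For reference, `--mean_values` and `--std_dev_values` tell the converter how a quantized uint8 input maps back to the float values the model was trained on, via `real_value = (quantized_value - mean) / std_dev`. A minimal sketch of that mapping (the function name `dequantize` is just for illustration):

```python
import numpy as np

def dequantize(pixels, mean=128.0, std_dev=128.0):
    """Map quantized uint8 pixel values to floats, as TOCO's
    --mean_values / --std_dev_values flags specify:
        real_value = (quantized_value - mean) / std_dev
    """
    return (np.asarray(pixels, dtype=np.float32) - mean) / std_dev

# With mean = std_dev = 128, uint8 pixels in [0, 255] land in roughly [-1, 1):
img = np.array([0, 128, 255], dtype=np.uint8)
print(dequantize(img))  # [-1.0, 0.0, 0.9921875]
```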
The model loads on Android, but when I try to run inference it throws this error:
Caused by: java.lang.IllegalStateException: Internal error: Unexpected failure when preparing tensor allocations: third_party/tensorflow/lite/kernels/unpack.cc:54 NumDimensions(input)
1 was not true. Node number 4 (UNPACK) failed to prepare.
How do I resolve this error?
Upvotes: 0
Views: 2470
Reputation: 1887
It worked with the command below:
bazel-bin/tensorflow/lite/toco/toco \
--input_file=deeplabv3_mnv2_pascal_tain.pb \
--output_file=test.tflite \
--inference_input_type=QUANTIZED_UINT8 \
--inference_type=FLOAT \
--input_arrays=ImageTensor \
--output_arrays=SemanticPredictions \
--input_shapes=1,513,513,3 \
--mean_values=128 \
--std_dev_values=128
The conversion worked after I installed TensorFlow from source, which I did by following (link)
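Before deploying a converted model to Android, it can help to confirm the tensor shapes the interpreter actually expects. A minimal self-contained sketch, assuming TensorFlow 2.x; a toy Keras model stands in for the real graph here, but with a real file you would pass `tf.lite.Interpreter(model_path="test.tflite")` instead:

```python
import numpy as np
import tensorflow as tf

# Toy stand-in model with the same 513x513x3 input as the DeepLab export.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(513, 513, 3)),
    tf.keras.layers.Conv2D(1, 3),
])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the flatbuffer and inspect input/output tensor metadata.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
print(inp["shape"], inp["dtype"])
print(out["shape"], out["dtype"])
```

If the reported input shape or dtype does not match what your Android code feeds the interpreter, allocation-time failures like the UNPACK error above are a likely symptom.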
Upvotes: 0