PriyanshuG

Reputation: 145

Cannot convert between a TensorFlowLite buffer with 307200 bytes and a Java Buffer with 270000 bytes

I am trying to run a pre-trained object detection TensorFlow Lite model from the TensorFlow detection model zoo. I used the ssd_mobilenet_v3_small_coco model listed on that site under the Mobile Models heading. Following the instructions under Running our model on Android, I commented out the model download script in the build.gradle file (// apply from:'download_model.gradle') so that my assets would not be overwritten, and replaced the detect.tflite and labelmap.txt files in the assets directory. The build succeeded without errors and the app was installed on my Android device, but it crashed as soon as it launched, and logcat showed:

E/AndroidRuntime: FATAL EXCEPTION: inference
Process: org.tensorflow.lite.examples.detection, PID: 16960
java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite buffer with 307200 bytes and a Java Buffer with 270000 bytes.
    at org.tensorflow.lite.Tensor.throwIfShapeIsIncompatible(Tensor.java:425)
    at org.tensorflow.lite.Tensor.throwIfDataIsIncompatible(Tensor.java:392)
    at org.tensorflow.lite.Tensor.setTo(Tensor.java:188)
    at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:150)
    at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:314)
    at org.tensorflow.lite.examples.detection.tflite.TFLiteObjectDetectionAPIModel.recognizeImage(TFLiteObjectDetectionAPIModel.java:196)
    at org.tensorflow.lite.examples.detection.DetectorActivity$2.run(DetectorActivity.java:185)
    at android.os.Handler.handleCallback(Handler.java:873)
    at android.os.Handler.dispatchMessage(Handler.java:99)
    at android.os.Looper.loop(Looper.java:201)
    at android.os.HandlerThread.run(HandlerThread.java:65)

I have searched through the TensorFlow Lite documentation but found nothing related to this error. I also found some Stack Overflow questions with the same error message, but they were about custom-trained models, so they did not help; the same error keeps appearing even with a custom-trained model. What should I do to eliminate this error?

Upvotes: 1

Views: 2058

Answers (1)

Mahsa Hassankashi

Reputation: 2139

You should resize your input tensors so that the interpreter accepts data of the size you feed it, whether that is the image dimensions or the batch size.
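In this particular error the numbers already hint at the mismatch: 307200 bytes corresponds to a 320 × 320 × 3 uint8 input, while 270000 bytes is a 300 × 300 × 3 buffer, so the app appears to be feeding 300 × 300 images to a model that expects 320 × 320. Below is a minimal sketch, assuming the standard TensorFlow Lite Java API, of reading the input size the loaded model actually expects so the app's hard-coded size can be matched to it (in the detection example that size usually lives in a constant such as TF_OD_API_INPUT_SIZE in DetectorActivity.java, though the name may differ between versions):

  import org.tensorflow.lite.Interpreter;
  import org.tensorflow.lite.Tensor;

  import java.util.Arrays;

  // Hypothetical helper: report the input shape and byte size the loaded model
  // actually expects, so the size hard-coded in the app can be set to match it.
  public final class InputShapeCheck {
    public static void logExpectedInput(Interpreter tflite) {
      Tensor input = tflite.getInputTensor(0);
      int[] shape = input.shape();          // e.g. [1, 320, 320, 3] for ssd_mobilenet_v3
      int expectedBytes = input.numBytes(); // e.g. 320 * 320 * 3 = 307200 for a uint8 input
      System.out.println("Model expects input shape " + Arrays.toString(shape)
          + " (" + expectedBytes + " bytes)");
    }
  }

If the model reports [1, 320, 320, 3], raising the detector's input size from 300 to 320 makes the Java-side buffer 320 * 320 * 3 = 307200 bytes, which is exactly what the interpreter is asking for.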

The code further below is from an image classification example, while yours is object detection: in the detection app, TFLiteObjectDetectionAPIModel is the class responsible for the input size, so try adjusting the size somewhere in TFLiteObjectDetectionAPIModel. The same resizeInput approach applies.


The number of labels needs to match the output tensor length for your trained model.
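For a classification model, one way to check this is to compare the number of entries in the label file against the class dimension of the output tensor. A rough sketch, assuming a plain-text label file with one label per line (the stream handling here is illustrative, not taken from the example app):

  import org.tensorflow.lite.Interpreter;

  import java.io.BufferedReader;
  import java.io.IOException;
  import java.io.InputStream;
  import java.io.InputStreamReader;
  import java.nio.charset.StandardCharsets;
  import java.util.ArrayList;
  import java.util.List;

  // Hypothetical check: the label count should equal the class dimension of the
  // classifier's output tensor (e.g. [1, numClasses]).
  public final class LabelCheck {
    public static void verify(Interpreter tflite, InputStream labelStream) throws IOException {
      List<String> labels = new ArrayList<>();
      try (BufferedReader reader =
          new BufferedReader(new InputStreamReader(labelStream, StandardCharsets.UTF_8))) {
        String line;
        while ((line = reader.readLine()) != null) {
          labels.add(line);
        }
      }
      int[] outShape = tflite.getOutputTensor(0).shape(); // e.g. [1, 1001]
      int numClasses = outShape[outShape.length - 1];
      if (labels.size() != numClasses) {
        throw new IllegalStateException("Label file has " + labels.size()
            + " entries but the model outputs " + numClasses + " classes");
      }
    }
  }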

  // c is the classifier object that holds the TFLite Interpreter (c.tfLite)
  int[] dimensions = new int[4];
  dimensions[0] = 1;   // Batch size (number of images per inference)
  dimensions[1] = 224; // Image height required by the model
  dimensions[2] = 224; // Image width required by the model
  dimensions[3] = 3;   // Number of color channels (RGB)
  Tensor tensor = c.tfLite.getInputTensor(0);   // input tensor shape before resizing
  c.tfLite.resizeInput(0, dimensions);          // resize input 0 to [1, 224, 224, 3]
  Tensor tensor1 = c.tfLite.getInputTensor(0);  // re-read to confirm the new shape
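Note that resizeInput only changes the declared shape; the interpreter applies it when tensors are next allocated, so you can call c.tfLite.allocateTensors() (or simply run inference) and then re-read getInputTensor(0).shape(), as tensor1 does above, to confirm the change. The ByteBuffer you pass to run() must then be sized for the new shape (width × height × channels × bytes per element), otherwise the same IllegalArgumentException reappears.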

Change input size

Upvotes: 2
