Haroon S.

Reputation: 2613

Inferencing from tflite model in Java

I have exported a tflite model, and using the Python code at this link I am able to run inference with it. Now, however, I am trying to run inference in an Android app using Java. I have been following the official documentation here, but I am unable to get it to work. Can someone guide me through it? All I need are the following steps.

  1. Read the tflite model.
  2. Generate a dummy record of shape [1,200,3].
  3. Run inference with the tflite model and print the result.

I have been reading the tflite demos but still could not work it out. To load the model, I use

Interpreter interpreter = new Interpreter(file_of_a_tensorflowlite_model)

from the official documentation, and get the following error:

error: no suitable constructor found for Interpreter(String)
constructor Interpreter.Interpreter(File) is not applicable
  (argument mismatch; String cannot be converted to File)

I am unable to resolve it. How do I do this simple task?

Upvotes: 1

Views: 2158

Answers (2)

Shubham Panchal

Reputation: 4289

You can place the TFLite model in your app's assets folder and then use this code to load it as a MappedByteBuffer (here, context is your Activity or application Context):

import android.content.res.AssetFileDescriptor;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

private MappedByteBuffer loadModelFile() throws IOException {
    // Open the model from the app's assets folder.
    String MODEL_ASSETS_PATH = "recog_model.tflite";
    AssetFileDescriptor assetFileDescriptor = context.getAssets().openFd(MODEL_ASSETS_PATH);
    FileInputStream fileInputStream = new FileInputStream(assetFileDescriptor.getFileDescriptor());
    FileChannel fileChannel = fileInputStream.getChannel();
    // Map only the model's region of the asset file into memory.
    long startOffset = assetFileDescriptor.getStartOffset();
    long declaredLength = assetFileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}

Then pass it to the constructor.

Interpreter interpreter = new Interpreter(loadModelFile());
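With the interpreter loaded, the question's dummy record of shape [1,200,3] can be built with plain Java arrays and handed to interpreter.run(). This is only a sketch: the output shape [1,5] is an assumption (check your model with interpreter.getOutputTensor(0).shape()), and the run() call is left as a comment since it needs the org.tensorflow.lite dependency.

```java
import java.util.Arrays;
import java.util.Random;

public class Main {
    // Build the dummy record of shape [1, 200, 3] from the question.
    static float[][][] makeDummyInput() {
        float[][][] input = new float[1][200][3];
        Random random = new Random();
        for (int t = 0; t < 200; t++) {
            for (int c = 0; c < 3; c++) {
                input[0][t][c] = random.nextFloat();
            }
        }
        return input;
    }

    public static void main(String[] args) {
        float[][][] input = makeDummyInput();
        // Output shape [1, 5] is an assumption -- check your model's real
        // shape with interpreter.getOutputTensor(0).shape().
        float[][] output = new float[1][5];
        // interpreter.run(input, output);  // needs the org.tensorflow.lite dependency
        System.out.println(Arrays.toString(output[0]));
    }
}
```

The interpreter fills the output array in place, so after run() returns you can print or post-process output[0] directly.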

Upvotes: 3

Haroon S.
Haroon S.

Reputation: 2613

I have found the solution. The problem is that new Interpreter(file_of_a_tensorflowlite_model) does not take a string file name as input. You have to turn the file into a MappedByteBuffer and pass that to Interpreter.

new Interpreter(my_byte_buffer_method("abc.tflite"));

After that it works fine. Just posting in case someone else faces the same issue.
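For a model stored as a regular file on disk (rather than in Android assets), the same memory-mapping can be done with plain java.nio. A minimal sketch: the temp-file setup below only stands in for a real .tflite file, and the Interpreter call is left as a comment since it needs the org.tensorflow.lite dependency.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class Main {
    // Map a model file into a read-only MappedByteBuffer, one of the
    // types the Interpreter constructor actually accepts.
    static MappedByteBuffer mapModelFile(String path) throws Exception {
        try (RandomAccessFile raf = new RandomAccessFile(path, "r");
             FileChannel channel = raf.getChannel()) {
            return channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
        }
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for a real .tflite file on disk.
        File tmp = File.createTempFile("model", ".tflite");
        try (FileOutputStream out = new FileOutputStream(tmp)) {
            out.write(new byte[]{1, 2, 3, 4});
        }
        MappedByteBuffer buffer = mapModelFile(tmp.getAbsolutePath());
        System.out.println(buffer.capacity());
        // Interpreter interpreter = new Interpreter(buffer);  // org.tensorflow.lite
    }
}
```

The mapping stays valid after the channel is closed, so the buffer can be handed straight to the Interpreter constructor.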

Upvotes: 2
