Reputation: 1150
I am trying to load converted_tflite.tflite
from the assets directory.
It's giving me this error:

java.lang.IllegalArgumentException: Contents of /file:/android_asset/converted_model.tflite does not encode a valid TensorFlowLite model: Could not open '/file:/android_asset/converted_model.tflite'. The model is not a valid Flatbuffer file
File file = new File("file:///android_asset/converted_model.tflite");
try (Interpreter interpreter = new Interpreter(file)) {
    interpreter.run(inputData, output);
    Log.d("TF LOG", output);
} catch (Exception e) {
    e.printStackTrace();
}
What I tried based on the Stack Overflow answers:
aaptOptions {
    noCompress "tflite"
}
The tensorflow-lite nightly version:
implementation 'org.tensorflow:tensorflow-lite:0.1.2-nightly'
Upvotes: 0
Views: 5854
Reputation: 341
Adding to the existing answers: if you have created the tflite model with the latest TensorFlow version (2.4.0) and are facing a similar issue, add the line below to the dependencies block of your build.gradle file
implementation 'org.tensorflow:tensorflow-lite:2.4.0'
and use the function provided by @LalitSharma for loading the model from the 'assets' directory.
You can find the most recent release here: https://bintray.com/google/tensorflow/tensorflow-lite
Upvotes: 0
Reputation: 1150
I used the tensorflow-lite nightly Gradle build, version 0.1.2:
implementation 'org.tensorflow:tensorflow-lite:0.1.2-nightly'
To load the model:
import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

/** Memory-map the model file in Assets. */
private static MappedByteBuffer loadModelFile(AssetManager assets, String modelFilename)
        throws IOException {
    AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename);
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}
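The memory-mapping step above can be exercised outside Android too. This is a minimal sketch of the same `FileChannel.map` technique using a plain on-disk file instead of `AssetManager` (the class name `MapModel` and the temp-file setup are illustrative, not from the answer):

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;

public class MapModel {
    /** Memory-map a file read-only; same idea as the assets-based version above. */
    static MappedByteBuffer mapFile(Path path) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(path.toFile(), "r");
             FileChannel channel = raf.getChannel()) {
            // The mapping stays valid after the channel is closed.
            return channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
        }
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for a real .tflite file: four arbitrary bytes.
        Path tmp = Files.createTempFile("model", ".tflite");
        Files.write(tmp, new byte[] {1, 2, 3, 4});
        MappedByteBuffer buf = mapFile(tmp);
        System.out.println(buf.capacity()); // prints 4, the file size
    }
}
```

The resulting `MappedByteBuffer` is exactly the type the `Interpreter(ByteBuffer)` constructor accepts.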
Upvotes: 2
Reputation: 1253
As the error indicates, the model is not a valid Flatbuffer file. In your implementation the model is passed as a File; it should instead be converted to a flatbuffer (a MappedByteBuffer), as implemented below:
FileInputStream f_input_stream = new FileInputStream(new File("file:///android_asset/converted_model.tflite"));
FileChannel f_channel = f_input_stream.getChannel();
MappedByteBuffer tflite_model = f_channel.map(FileChannel.MapMode.READ_ONLY, 0, f_channel.size());
And then you can use this tflite_model to create a tflite interpreter, as new Interpreter(tflite_model).
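One extra sanity check worth doing before handing the buffer to the Interpreter: a valid TFLite FlatBuffer carries the file identifier "TFL3" at byte offset 4. The sketch below (the helper name `looksLikeTflite` and the fake header bytes are illustrative, not from the answer) shows how to inspect a mapped buffer for it; if the check fails, the asset was likely compressed or the path never resolved to the real file:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class CheckTfliteHeader {
    /** TFLite FlatBuffer files store the identifier "TFL3" at bytes 4..7. */
    static boolean looksLikeTflite(ByteBuffer buf) {
        if (buf.capacity() < 8) return false;
        byte[] id = new byte[4];
        for (int i = 0; i < 4; i++) id[i] = buf.get(4 + i);
        return new String(id, StandardCharsets.US_ASCII).equals("TFL3");
    }

    public static void main(String[] args) {
        // Fake header: 4-byte FlatBuffer root offset, then the "TFL3" identifier.
        ByteBuffer ok = ByteBuffer.wrap(new byte[] {0x1C, 0, 0, 0, 'T', 'F', 'L', '3'});
        ByteBuffer bad = ByteBuffer.wrap("not a model".getBytes(StandardCharsets.US_ASCII));
        System.out.println(looksLikeTflite(ok) + " " + looksLikeTflite(bad));
    }
}
```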
Upvotes: 1