Yahya Nik

Reputation: 85

Load TFLite model in android project using android studio

I am in the middle of importing a TFLite model (SsdMobilenetV2320x320Coco17Tpu8 from the TF detection zoo, converted to TFLite with the TFLite converter) into Android Studio for an app. I used the New > Other > TensorFlow Lite Model approach for importing the model and added the dependencies. There is however a "context" that I have no idea about.

SsdMobilenetV2320x320Coco17Tpu8 model = SsdMobilenetV2320x320Coco17Tpu8.newInstance(context);

The help documentation from TF (https://www.tensorflow.org/lite/guide/android) does not give any indication of what this "context" is!

The other solutions I found are for the Classifier class, which I believe would not be of use to me since this is an object detection network.

Here is the code sample given in the .ml.SsdMobilenetV2320x320Coco17Tpu8 file:

try {
    SsdMobilenetV2320x320Coco17Tpu8 model = SsdMobilenetV2320x320Coco17Tpu8.newInstance(context);

    // Creates inputs for reference.
    TensorBuffer inputFeature0 = TensorBuffer.createFixedSize(new int[]{1, 1, 1, 3}, DataType.UINT8);
    inputFeature0.loadBuffer(byteBuffer);

    // Runs model inference and gets result.
    SsdMobilenetV2320x320Coco17Tpu8.Outputs outputs = model.process(inputFeature0);
    TensorBuffer outputFeature0 = outputs.getOutputFeature0AsTensorBuffer();
    TensorBuffer outputFeature1 = outputs.getOutputFeature1AsTensorBuffer();
    TensorBuffer outputFeature2 = outputs.getOutputFeature2AsTensorBuffer();
    TensorBuffer outputFeature3 = outputs.getOutputFeature3AsTensorBuffer();
    TensorBuffer outputFeature4 = outputs.getOutputFeature4AsTensorBuffer();
    TensorBuffer outputFeature5 = outputs.getOutputFeature5AsTensorBuffer();
    TensorBuffer outputFeature6 = outputs.getOutputFeature6AsTensorBuffer();
    TensorBuffer outputFeature7 = outputs.getOutputFeature7AsTensorBuffer();

    // Releases model resources if no longer used.
    model.close();
} catch (IOException e) {
    // TODO Handle the exception
}

Any help regarding inputFeature0.loadBuffer(byteBuffer); is also appreciated. I am guessing that is to load data from memory after the instance is created.

Upvotes: 0

Views: 2192

Answers (1)

Alex K.

Reputation: 861

There is however a "context" that I have no idea about.

I have not tried this new Android Studio feature, but Android provides a Context via the getApplicationContext() method (and any Activity is itself a Context). I would give that a try.
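As a sketch of what that could look like from inside an Activity (MainActivity and the layout name here are placeholders, not from the generated code):

```java
// Placeholder Activity; any Activity is a Context, so "this" works,
// as does getApplicationContext().
public class MainActivity extends AppCompatActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        try {
            // Pass the Activity (or getApplicationContext()) as the Context.
            SsdMobilenetV2320x320Coco17Tpu8 model =
                    SsdMobilenetV2320x320Coco17Tpu8.newInstance(this);
            // ... build inputs, run model.process(...), read outputs ...
            model.close();
        } catch (IOException e) {
            // TODO Handle the exception
        }
    }
}
```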

Any help regarding inputFeature0.loadBuffer(byteBuffer); is also appreciated. I am guessing that is to load data from memory after the instance is created.

TensorBuffer inputFeature0 is your input; take a look at the possible loading methods (it consumes byte buffers as well as ordinary 1D arrays). Also specify your input size/type before loading by changing TensorBuffer inputFeature0 = TensorBuffer.createFixedSize(new int[]{1, 1, 1, 3}, DataType.UINT8); See the details; it comes in handy when pre/post-processing is required.
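For example, assuming you use the TFLite Support Library (org.tensorflow.lite.support.*) and start from a Bitmap (e.g. from the camera), filling the input buffer could look roughly like this untested sketch:

```java
// Sketch: convert a Bitmap into the model's input buffer using the
// TFLite Support Library. "bitmap" is assumed to come from the camera
// or an image file.
ImageProcessor imageProcessor = new ImageProcessor.Builder()
        .add(new ResizeOp(320, 320, ResizeOp.ResizeMethod.BILINEAR))
        .build();

TensorImage tensorImage = new TensorImage(DataType.UINT8);
tensorImage.load(bitmap);
tensorImage = imageProcessor.process(tensorImage);

// Match the buffer shape to the model's real input (1 x 320 x 320 x 3),
// not the {1, 1, 1, 3} placeholder from the generated stub.
TensorBuffer inputFeature0 =
        TensorBuffer.createFixedSize(new int[]{1, 320, 320, 3}, DataType.UINT8);
inputFeature0.loadBuffer(tensorImage.getBuffer());
```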

Upvotes: 1
