ZookKep

Reputation: 501

How to use onnxruntime with .ort model in Android Studio

I'm trying to create an Android app that incorporates a machine learning model. I have an onnx model, along with a Python script, two JSON files with the label names, and some numpy data for mel spectrogram computation.

I tried to go with onnxruntime and followed these instructions. I have now created the model.ort file from the onnx model and completed "A minimal build for Android with NNAPI support", so I have the built onnxruntime package.

Since I'm completely new at this, how do I continue from here? How do I "inference on device"?

And also, will I have to convert my Python script that runs the model to Java? Thank you!

Upvotes: 4

Views: 4537

Answers (1)

Rachel Guo

Reputation: 21

In order to use onnxruntime in an Android app, you need to build an onnxruntime AAR (Android Archive) package. This AAR package can be imported directly into Android Studio; you can find the instructions on how to build an AAR package from source in the above link.
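As a sketch of the import step: once you have the AAR file from your build, one common way to pull it into an Android Studio project is a local file dependency in the app module's Gradle file (the `libs/` path and AAR filename below are placeholders for wherever you copied your build output):

```groovy
// app/build.gradle -- a minimal sketch, assuming the built AAR
// was copied into the app module's libs/ directory
dependencies {
    // filename is a placeholder; use the name of the AAR your build produced
    implementation files('libs/onnxruntime-release.aar')
}
```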

We now have an end-to-end example: a sample ORT Mobile image classification application using MobileNetV2.ort. A pre-built AAR package is provided with this example. You can download the sample Android app, import the onnxruntime AAR package into Android Studio, and run it on your Android device to get a sense of how inference works.
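To answer the second part of the question: yes, the model-running part of the Python script would be rewritten against the Java API that ships in the AAR. A minimal inference sketch is below -- the model filename, input shape, and NNAPI flag are assumptions for illustration, not taken from your project:

```java
// Minimal on-device inference sketch using the ONNX Runtime Java API
// (package ai.onnxruntime, bundled in the Android AAR).
// "model.ort" and the 1x1x128x128 input shape are placeholders.
import ai.onnxruntime.OnnxTensor;
import ai.onnxruntime.OrtEnvironment;
import ai.onnxruntime.OrtException;
import ai.onnxruntime.OrtSession;
import java.util.Collections;

public class OrtInference {
    public static void main(String[] args) throws OrtException {
        OrtEnvironment env = OrtEnvironment.getEnvironment();
        OrtSession.SessionOptions opts = new OrtSession.SessionOptions();
        opts.addNnapi(); // enable the NNAPI execution provider from the minimal build

        // In an Android app you would typically copy the .ort file out of
        // assets to a real path first; the path here is a placeholder.
        try (OrtSession session = env.createSession("model.ort", opts)) {
            String inputName = session.getInputNames().iterator().next();

            // Placeholder input: one 1x128x128 mel spectrogram
            float[][][][] input = new float[1][1][128][128];

            try (OnnxTensor tensor = OnnxTensor.createTensor(env, input);
                 OrtSession.Result result =
                         session.run(Collections.singletonMap(inputName, tensor))) {
                float[][] scores = (float[][]) result.get(0).getValue();
                // Map the argmax over scores[0] back to your JSON label names.
            }
        }
    }
}
```

Note that any preprocessing the Python script does (e.g. computing the mel spectrogram) also has to happen on the Java/Kotlin side before the tensor is created, since the .ort model only contains the graph itself.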

Upvotes: 2
