Reputation: 4168
I'm looking at the TF Lite Android app, which can be found on GitHub: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/lite/java/demo
How can I compile the TensorFlow Lite framework to use optimizations for the "atom" CPU type?
Is it possible to compile it on macOS with the CPU optimizations for the "atom" CPU?
I want to run the app on an Android device (SDK 22) with an Intel Atom processor. When I run the application without any changes through Android Studio, the rate was about 1200ms per frame. The same APK installed on my Galaxy S9 (ARM Snapdragon processor) ran at about 30ms per frame.
In the "build.gradle" there is this section:
dependencies {
    ...
    compile 'org.tensorflow:tensorflow-lite:0.0.0-nightly'
    ...
}
So it seems that it's downloading the framework.
How can I compile it locally with the CPU optimizations and set the app to use it instead of downloading the non-optimized nightly version?
I tried following the "Installing TensorFlow from Sources" tutorial with the CPU flags, but I'm not sure how it helps with the Android scenario.
Upvotes: 1
Views: 1257
Reputation: 13473
Assuming that your Atom device is x86, use the --fat_apk_cpu flag to specify the x86 ABI:
$ bazel build -c opt --cxxopt='--std=c++11' \
--fat_apk_cpu=x86 \
//tensorflow/contrib/lite/java/demo/app/src/main:TfLiteCameraDemo
Switch x86 with x86_64 if you're building for a 64-bit device.
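If you're not sure which ABI your device reports, you can query it over adb:

$ adb shell getprop ro.product.cpu.abi

This prints the device's primary ABI (e.g. x86 or x86_64), which tells you which value to pass to --fat_apk_cpu.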
The built APK, available at bazel-bin/tensorflow/contrib/lite/java/demo/app/src/main/TfLiteCameraDemo.apk, will contain the x86 .so file:
$ zipinfo bazel-bin/tensorflow/contrib/lite/java/demo/app/src/main/TfLiteCameraDemo.apk | grep lib
-rw---- 2.0 fat 1434712 b- defN 80-Jan-01 00:00 lib/x86/libtensorflowlite_jni.so
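You can then install that APK on a connected device with adb (the -r flag replaces any existing install):

$ adb install -r bazel-bin/tensorflow/contrib/lite/java/demo/app/src/main/TfLiteCameraDemo.apk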
If your device is connected, you can use bazel mobile-install instead of bazel build to directly install the app:
$ bazel mobile-install -c opt --cxxopt='--std=c++11' \
--fat_apk_cpu=x86 \
--start_app \
//tensorflow/contrib/lite/java/demo/app/src/main:TfLiteCameraDemo
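If you'd rather keep the demo's Gradle build and only swap the dependency, you can point Gradle at a locally built AAR instead of the nightly Maven artifact. A minimal sketch, assuming you've produced a tensorflow-lite.aar (the exact bazel target for building the AAR varies by version; this name is an assumption) and copied it into the app module's libs/ directory:

repositories {
    flatDir {
        // look for local .aar files in the app module's libs/ directory
        dirs 'libs'
    }
}

dependencies {
    // replace the nightly Maven artifact with the locally built AAR
    compile(name: 'tensorflow-lite', ext: 'aar')
}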
Upvotes: 3