Shahrear Bin Amin

Reputation: 1125

How to run a neural network model on Android taking advantage of the GPU?

Has anyone tried to run an object detection or CRNN model on Android? I tried to run a CRNN model (serialized PyTorch) and it takes 1 s on a Huawei P30 lite and 5 s on a Samsung J4 Core.

Huawei P30 lite
    CPU: octa-core processor
    GPU: Mali-G51 MP4

Samsung J4
    CPU: quad-core processor
    GPU: Adreno 308

GPUs in Android devices differ from dedicated GPUs in that they have no VRAM and no separate power management; the CPU and GPU share the same RAM. Before running a model on a PC with a GPU, we explicitly place the computation on the GPU, like this:

model = MyModel()
model.cuda()

But when I run the model on Android, does it take advantage of this built-in GPU? Or is computation faster on my Huawei because of its octa-core processor? The Huawei obviously has a better GPU than my Samsung device.
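For context, here is a minimal self-contained sketch of the device-placement pattern above. `MyModel` is a hypothetical stand-in for the CRNN; the point is that `.cuda()` only works where a CUDA device exists, which is not the case on Android phones (Mali and Adreno are not CUDA GPUs), so device-agnostic code falls back to CPU there:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the CRNN in the question
class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 2)

    def forward(self, x):
        return self.fc(x)

# Device-agnostic placement: uses CUDA if available, else CPU.
# On an Android device there is no CUDA, so this resolves to "cpu".
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = MyModel().to(device)
out = model(torch.randn(1, 8).to(device))
print(out.shape)
```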

Upvotes: 3

Views: 1382

Answers (2)

Ben Clark

Reputation: 487

TFLite works well if you can move away from PyTorch.

Specifically, for your Mali GPU, as per https://developer.arm.com/ip-products/processors/machine-learning/arm-nn — Arm NN will farm work out appropriately to the CPU/GPU, working beneath the Android NNAPI that interfaces with TFLite / Caffe2.

NNAPI info: https://developer.android.com/ndk/guides/neuralnetworks
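To illustrate the TFLite path this answer suggests, here is a minimal Python sketch that converts a tiny stand-in Keras model and runs it through the TFLite interpreter (the CRNN itself would first have to be ported or converted out of PyTorch, which this sketch does not cover). Note that on Android the GPU/NNAPI acceleration is enabled through the Java/Kotlin `Interpreter.Options` delegates, not through this desktop Python API:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model (hypothetical; your real network would go here)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
])

# Convert the Keras model to a TFLite flatbuffer
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Run the converted model with the TFLite interpreter
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
outp = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.ones((1, 8), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(outp["index"])
print(result.shape)  # (1, 4)
```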

Upvotes: 1

sebi0920

Reputation: 300

At the moment it is not possible to run PyTorch on an ARM GPU:

Github Issue

PyTorch Forum

I think the differences in speed result from the different CPUs!

Upvotes: 1
