N Ruiz

Reputation: 9

Android app crashes after loading trained TensorFlow protobuf model

I am trying to load a model that I trained myself in the TensorFlow Android demo app. I trained the model in Caffe and then converted it to TensorFlow, but I am sure the conversion is not the problem, because I tested the converted model with classify.py and it works.

I then serialized the model into a .pb file and replaced tensorflow_inception_graph.pb with mine (giving it the same name). I can build the app with Bazel, but when I install it on the phone and run it, it crashes instantly. I think the culprit is the following error:

F/native  (26026): tensorflow_jni.cc:309 Error during inference: Invalid argument: No OpKernel was registered to support Op 'FIFOQueue' with these attrs
F/native  (26026):   [[Node: processed_queue = FIFOQueue[capacity=1, component_types=[DT_INT32, DT_FLOAT], container="", shapes=[[], [224,224,3]], shared_name=""]()]]

But I don't really know how to fix it.

Also, the APK does contain the entire protobuf file.

Thanks for the help.

Upvotes: 1

Views: 306

Answers (1)

Pete Warden

Reputation: 2878

Unfortunately you're hitting a limitation of the mobile build target for TensorFlow. By default, we only include the operations that are typically needed for inference, which doesn't include FIFOQueue.
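
If you want to confirm which ops your converted graph actually references (and whether FIFOQueue is the only offender), you can list the distinct op types in the GraphDef. A minimal sketch in Python, assuming the file is named tensorflow_inception_graph.pb as in your question:

    from tensorflow.core.framework import graph_pb2

    # Load the serialized GraphDef and print the distinct op types it uses,
    # so you can compare them against what the mobile build registers.
    graph_def = graph_pb2.GraphDef()
    with open('tensorflow_inception_graph.pb', 'rb') as f:
        graph_def.ParseFromString(f.read())
    print(sorted({node.op for node in graph_def.node}))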

Assuming that you don't actually need to run that op during your inference pass on mobile, you can look at using the strip_unused.py script and passing in the input and output op names. That will remove everything except the ops that are directly required to produce the output starting at the given input nodes, and so should strip FIFOQueue and other training ops.
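
Here is a minimal sketch of that step, using the library form of the same tool (strip_unused_lib, which backs the strip_unused.py script). The node names 'input' and 'output' are hypothetical placeholders; replace them with the real input and output op names from your converted graph:

    from tensorflow.core.framework import graph_pb2
    from tensorflow.python.framework import dtypes
    from tensorflow.python.tools import strip_unused_lib

    # Read the full graph, which still contains FIFOQueue and other
    # training/preprocessing ops.
    input_graph_def = graph_pb2.GraphDef()
    with open('tensorflow_inception_graph.pb', 'rb') as f:
        input_graph_def.ParseFromString(f.read())

    # Keep only the ops required to compute 'output' from 'input'; the
    # input node is replaced by a float placeholder.
    stripped_graph_def = strip_unused_lib.strip_unused(
        input_graph_def=input_graph_def,
        input_node_names=['input'],    # hypothetical name
        output_node_names=['output'],  # hypothetical name
        placeholder_type_enum=dtypes.float32.as_datatype_enum)

    with open('stripped_graph.pb', 'wb') as f:
        f.write(stripped_graph_def.SerializeToString())

You can get the same result from the command line by running the strip_unused.py script with the --input_graph, --output_graph, --input_node_names, and --output_node_names flags. Either way, rebuild the APK with the stripped .pb in place of the original.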

Upvotes: 1
