Reputation: 31
I am trying to integrate on-device machine learning into a React Native app. I have converted a transformers model from Hugging Face to a TensorFlow Lite file, and the model runs successfully on the Android side. When I try to do the same for iOS, I get the following error:
TensorFlow Lite Error: Select TensorFlow op(s), included in the given model is(are) not supported by this interpreter. Make sure you apply/link Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency.
TensorFlow Lite Error: Node number 95 (FlexErf) failed to prepare.
I had a similar error on the Android side and solved it using the guide here: https://www.tensorflow.org/lite/guide/ops_select
I have followed the steps in that guide for iOS as well, yet I am still getting this error.
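For reference, this is roughly how the model would have been converted so that unsupported ops fall back to TensorFlow (a minimal sketch, not the asker's actual script — the tiny `tf.math.erf` model here is only an assumed stand-in that exercises the same `Erf` op behind the `FlexErf` node in the error):

```python
import tensorflow as tf

class ErfModel(tf.Module):
    """Toy model using tf.math.erf, an op with no TFLite builtin kernel."""

    @tf.function(input_signature=[tf.TensorSpec([1, 4], tf.float32)])
    def __call__(self, x):
        # erf is converted as a Flex (Select TF) op, hence "FlexErf"
        return tf.math.erf(x)

model = ErfModel()
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [model.__call__.get_concrete_function()], model)
# Allow the converter to keep TF ops that have no TFLite builtin equivalent.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # regular TFLite ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TF ops such as Erf
]
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

A model converted this way then requires the Flex delegate to be linked into the app at runtime, which is what the error message is about.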
How can I get past this error?
Upvotes: 3
Views: 1641
Reputation: 1
Are you following the instructions here? And are you building for armv7 or arm64? armv7 is no longer supported in the latest nightly builds, and those nightlies now distribute xcframeworks. So when you `-force_load` the framework, you may need to point at the framework inside the xcframework instead: -force_load $(SRCROOT)/Pods/TensorFlowLiteSelectTfOps/Frameworks/TensorFlowLiteSelectTfOps.xcframework/ios-arm64/TensorFlowLiteSelectTfOps.framework/TensorFlowLiteSelectTfOps
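For context, the Podfile setup would look something like this (a sketch — the target name and version constraint are assumptions; check what `pod install` actually places under `Pods/` before adjusting the linker flag):

```ruby
# Podfile — add the Select TF ops pod alongside the core runtime
target 'YourApp' do
  pod 'TensorFlowLiteSwift', '~> 0.0.1-nightly'        # or TensorFlowLiteObjC
  pod 'TensorFlowLiteSelectTfOps', '~> 0.0.1-nightly'  # Flex delegate kernels
end

# Then, in Build Settings -> Other Linker Flags, force-load the framework so
# the Flex ops are not stripped by the linker, e.g.:
#   -force_load $(SRCROOT)/Pods/TensorFlowLiteSelectTfOps/Frameworks/TensorFlowLiteSelectTfOps.xcframework/ios-arm64/TensorFlowLiteSelectTfOps.framework/TensorFlowLiteSelectTfOps
```

After editing the Podfile, run `pod install` again and rebuild; without the `-force_load` flag the Select TF ops library links but its kernels never register, which produces exactly the "apply/link Flex delegate" error above.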
Upvotes: 0