Huy Nguyễn

Reputation: 41

Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference

I have a problem running a TFLite model in my Flutter image classification app.

I converted my CNN model to a TFLite model with the code below:

import tensorflow as tf
converter = tf.lite.TFLiteConverter.from_saved_model('/content/drive/MyDrive/CNN_Model/10.07.CNNmodel.100.92')
# converter.optimizations = [tf.lite.Optimize.OPTIMIZE_FOR_LATENCY]
converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite ops.
  tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
]
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)

The output looked correct, so I believe the model was converted successfully. I then added it to the Dart side of my project with this code:

Future<void> loadModel() async {
  await Tflite.loadModel(
    model: 'assets/tflite/converted_model.tflite',
    labels: 'assets/tflite/labels.txt',
  );
}
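Taking a picture then runs the plugin's image classifier; that call looks roughly like the following (the parameter values are illustrative, not my exact ones):

Future<void> classifyImage(String path) async {
  // Runs the tflite plugin's classifier on the captured picture;
  // this is the call that ends up in RunModelOnImage in the trace below.
  var recognitions = await Tflite.runModelOnImage(
    path: path,
    numResults: 2,      // cat vs. dog
    threshold: 0.5,
    imageMean: 127.5,   // illustrative preprocessing constants
    imageStd: 127.5,
  );
  print(recognitions);
}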

After I ran main.dart and took a picture to test, I got these errors:

E/AndroidRuntime( 6189): FATAL EXCEPTION: AsyncTask #1
E/AndroidRuntime( 6189): Process: com.hajaj_projects.cat_vs_dog, PID: 6189
E/AndroidRuntime( 6189): java.lang.RuntimeException: An error occurred while executing doInBackground()
E/AndroidRuntime( 6189):    at android.os.AsyncTask$4.done(AsyncTask.java:415)
E/AndroidRuntime( 6189):    at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:383)
E/AndroidRuntime( 6189):    at java.util.concurrent.FutureTask.setException(FutureTask.java:252)
E/AndroidRuntime( 6189):    at java.util.concurrent.FutureTask.run(FutureTask.java:271)
E/AndroidRuntime( 6189):    at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:305)
E/AndroidRuntime( 6189):    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
E/AndroidRuntime( 6189):    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
E/AndroidRuntime( 6189):    at java.lang.Thread.run(Thread.java:923)
E/AndroidRuntime( 6189): Caused by: java.lang.IllegalArgumentException: Internal error: Failed to run on the given Interpreter: Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.
E/AndroidRuntime( 6189): Node number 21 (FlexAddV2) failed to prepare.
E/AndroidRuntime( 6189): 
E/AndroidRuntime( 6189):    at org.tensorflow.lite.NativeInterpreterWrapper.run(Native Method)
E/AndroidRuntime( 6189):    at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:204)
E/AndroidRuntime( 6189):    at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:374)
E/AndroidRuntime( 6189):    at org.tensorflow.lite.Interpreter.run(Interpreter.java:332)
E/AndroidRuntime( 6189):    at sq.flutter.tflite.TflitePlugin$RunModelOnImage.runTflite(TflitePlugin.java:504)
E/AndroidRuntime( 6189):    at sq.flutter.tflite.TflitePlugin$TfliteTask.doInBackground(TflitePlugin.java:471)
E/AndroidRuntime( 6189):    at sq.flutter.tflite.TflitePlugin$TfliteTask.doInBackground(TflitePlugin.java:445)
E/AndroidRuntime( 6189):    at android.os.AsyncTask$3.call(AsyncTask.java:394)
E/AndroidRuntime( 6189):    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
E/AndroidRuntime( 6189):    ... 4 more
I/Process ( 6189): Sending signal. PID: 6189 SIG: 9
Lost connection to device.

When I removed the Normalization layer from my CNN model and converted it again, the app ran smoothly. But removing the Normalization layer significantly decreased my model's accuracy.
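For context, this is the Keras preprocessing Normalization layer, adapted to my training data, set up roughly like this (a simplified sketch, not my exact architecture; the data here is dummy):

import numpy as np
import tensorflow as tf

# Dummy stand-in for my real training images.
train_images = np.random.rand(8, 128, 128, 3).astype("float32")

# The adapted Normalization layer stores a learned mean/variance; its
# arithmetic seems to be what lowers to TF ops such as FlexAddV2
# instead of TFLite builtins.
norm = tf.keras.layers.Normalization()
norm.adapt(train_images)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 128, 3)),
    norm,
    tf.keras.layers.Conv2D(32, 3, activation='relu'),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(2, activation='softmax'),
])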

I'm new to Flutter. Has anybody encountered and resolved this before? Thanks for your time.

Upvotes: 4

Views: 4346

Answers (1)

Jae sung Chung

Reputation: 903
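Add the Select TF ops runtime next to the TensorFlow Lite dependency in your Android app's build.gradle (android/app/build.gradle in a Flutter project):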

dependencies {
    implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly-SNAPSHOT'
    // This dependency adds the necessary TF op support.
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly-SNAPSHOT'
}

The converted model contains Select TF ops, so it requires the Select TF ops runtime dependency at inference time.
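If you want to confirm which ops ended up in the converted model, newer TensorFlow releases include a model analyzer (availability depends on your TF version):

import tensorflow as tf

# Lists every op in the .tflite file; ops prefixed with "Flex" need the
# Select TF ops (Flex delegate) runtime on the device.
with open("converted_model.tflite", "rb") as f:
    tf.lite.experimental.Analyzer.analyze(model_content=f.read())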

If the above dependency adds too much binary size, consider creating a custom AAR through a selective build: https://www.tensorflow.org/lite/guide/reduce_binary_size

See also https://www.tensorflow.org/lite/guide/ops_select.

Upvotes: 2
