Cannot enable dropout at inference after converting a model to TFLite format, in order to do Monte Carlo dropout

I am trying to use the Monte Carlo dropout technique in Keras on a TensorFlow Lite model, in order to get uncertainty metrics for an image classification model (InceptionV3 CNN + custom classification head with MC dropout).

I have enabled dropout at inference time by calling the dropout layer with `training=True`.
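Concretely, this is roughly how I build the model (a minimal sketch with a toy head standing in for the real InceptionV3 backbone; layer sizes are placeholders):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Toy stand-in for pooled InceptionV3 features feeding a small head.
# The key point: Dropout is *called* with training=True, so it stays
# active even during plain inference (model.predict / model(x)).
inputs = tf.keras.Input(shape=(2048,))           # e.g. pooled CNN features
x = layers.Dense(128, activation="relu")(inputs)
x = layers.Dropout(0.5)(x, training=True)        # dropout always on
outputs = layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
```

With this, two forward passes on the same input give different softmax vectors, which is exactly what MC dropout needs.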

MC dropout works perfectly when doing inference on the PC: dropout is applied at inference, so I can sample different softmax outputs and compute the metrics.
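The sampling loop I use looks roughly like this (`mc_predict` is a name I made up; predictive entropy is one of the uncertainty metrics I compute):

```python
import numpy as np

def mc_predict(model, x, T=30):
    """Run T stochastic forward passes and summarize them."""
    # Each pass is stochastic because dropout is active at inference.
    samples = np.stack([model(x, training=True).numpy() for _ in range(T)])
    mean = samples.mean(axis=0)  # MC-averaged softmax
    # Predictive entropy of the averaged distribution as an uncertainty score.
    entropy = -np.sum(mean * np.log(mean + 1e-12), axis=-1)
    return mean, entropy
```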

But after converting the model to TFLite (with post-training quantization), the dropout layers no longer seem to be active when doing inference with the .tflite model, even before compiling for the Edge TPU.
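For reference, this is roughly the conversion I am running (a sketch with a toy model and a random representative dataset standing in for my real classifier and calibration images):

```python
import numpy as np
import tensorflow as tf

# Toy model standing in for the InceptionV3-based classifier.
inputs = tf.keras.Input(shape=(4,))
x = tf.keras.layers.Dropout(0.5)(inputs)
outputs = tf.keras.layers.Dense(3, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

def representative_dataset():
    # Placeholder calibration data for post-training quantization.
    for _ in range(10):
        yield [np.random.rand(1, 4).astype("float32")]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
tflite_model = converter.convert()  # dropout appears to be folded away here
```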

I'm using TensorFlow 2.7.4, which seems to be the last version supported by Coral's edgetpu_compiler; TFLite models produced with more recent TensorFlow versions are rejected by the compiler.

Is there any way to keep dropout active at inference time in a TFLite model, and after Edge TPU compilation?

I tried these converter options: `converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]`

I also tried `converter.experimental_new_converter = True`.

I also tried hard-coding my own dropout function, without success either (the TFLite converter crashes).
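What I mean by hard-coding my own dropout is roughly the following (a sketch; `AlwaysDropout` is a name I made up): a custom layer that always applies a random mask, with no `training` flag for the converter to fold away.

```python
import tensorflow as tf

class AlwaysDropout(tf.keras.layers.Layer):
    """Dropout that is unconditionally active, even at inference."""

    def __init__(self, rate=0.5, **kwargs):
        super().__init__(**kwargs)
        self.rate = rate

    def call(self, x):
        keep = 1.0 - self.rate
        # Random binary mask drawn on every call.
        mask = tf.cast(tf.random.uniform(tf.shape(x)) < keep, x.dtype)
        return x * mask / keep  # inverted-dropout scaling
```

It behaves as expected in TensorFlow, but in my setup the conversion to TFLite crashes when the model contains this layer.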
