Souvik Sankar Mitra

Reputation: 41

CameraX ML Kit giving error "java.lang.IllegalStateException: Image is already closed"

I want to make a real-time image classifier using Google ML Kit and the CameraX API. I am using the Preview and ImageAnalysis use cases of CameraX. It is giving the following error:

    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err: Caused by: java.lang.IllegalStateException: Image is already closed
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at android.media.Image.throwISEIfImageIsInvalid(Image.java:68)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at android.media.ImageReader$SurfaceImage$SurfacePlane.getBuffer(ImageReader.java:832)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.common.internal.ImageConvertUtils.zza(com.google.mlkit:vision-common@@16.0.0:139)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.common.internal.ImageConvertUtils.convertToUpRightBitmap(com.google.mlkit:vision-common@@16.0.0:89)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.common.internal.ImageConvertUtils.getUpRightBitmap(com.google.mlkit:vision-common@@16.0.0:10)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.label.automl.internal.zzo.zza(com.google.mlkit:image-labeling-automl@@16.0.0:16)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.label.automl.internal.zzo.run(com.google.mlkit:image-labeling-automl@@16.0.0:60)
    2020-07-27 01:17:18.145 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.common.internal.MobileVisionBase.zza(com.google.mlkit:vision-common@@16.0.0:23)
    2020-07-27 01:17:18.146 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.vision.common.internal.zzb.call(com.google.mlkit:vision-common@@16.0.0)
    2020-07-27 01:17:18.146 11009-11009/com.example.camerax_automl W/System.err:     at com.google.mlkit.common.sdkinternal.ModelResource.zza(com.google.mlkit:common@@16.0.0:26)
    2020-07-27 01:17:18.146 11009-11009/com.example.camerax_automl W/System.err:    ... 9 more

Here I have used a TextureView for the preview and a TextView to show the classification result. I have also put the .tflite model in the assets folder and added the required dependencies. My code is given below:

public class MainActivity extends AppCompatActivity {

    private int REQUEST_CODE_PERMISSIONS = 101;

    private final String[] REQUIRED_PERMISSIONS = new String[]{"android.permission.CAMERA"};

    TextureView textureView;
    ImageButton imgbutton;
    //LinearLayout linear1;
    TextView text1;



    //AutoML objects

    AutoMLImageLabelerLocalModel localModel =
            new AutoMLImageLabelerLocalModel.Builder()
                    .setAssetFilePath("model/manifest.json")
                    // or .setAbsoluteFilePath(absolute file path to manifest file)
                    .build();


    AutoMLImageLabelerOptions autoMLImageLabelerOptions =
            new AutoMLImageLabelerOptions.Builder(localModel)
                    .setConfidenceThreshold(0.0f)  // Evaluate your model in the Firebase console
                    // to determine an appropriate value.
                    .build();
    ImageLabeler labeler = ImageLabeling.getClient(autoMLImageLabelerOptions);


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        textureView = findViewById(R.id.view_finder);
        imgbutton = findViewById(R.id.imgCapture);
        text1 = findViewById(R.id.textView2);

        if(allPermissionsGranted())
        {
            startCamera();

        }
        else
        {

            ActivityCompat.requestPermissions(this, REQUIRED_PERMISSIONS, REQUEST_CODE_PERMISSIONS);
        }
    }


    private void startCamera() {

        CameraX.unbindAll();

        Rational aspectRatio = new Rational (textureView.getWidth(), textureView.getHeight());
        Size screen = new Size(textureView.getWidth(), textureView.getHeight()); //size of the screen

        PreviewConfig pConfig = new PreviewConfig.Builder()
                .setTargetAspectRatio(aspectRatio)
                .setTargetResolution(screen)
                .build();

        Preview preview = new Preview(pConfig);

        preview.setOnPreviewOutputUpdateListener(new Preview.OnPreviewOutputUpdateListener() {
            @Override
            public void onUpdated(Preview.PreviewOutput output) {
                ViewGroup parent = (ViewGroup) textureView.getParent();
                parent.removeView(textureView);
                parent.addView(textureView, 0);

                textureView.setSurfaceTexture(output.getSurfaceTexture());
                updateTransform();

            }
        });

        ImageAnalysisConfig imconfig = new ImageAnalysisConfig.Builder().setTargetAspectRatio(aspectRatio)
                .setTargetResolution(screen)
                .setImageReaderMode(ImageAnalysis.ImageReaderMode.ACQUIRE_LATEST_IMAGE).build();

        final ImageAnalysis analysis = new ImageAnalysis(imconfig);

        analysis.setAnalyzer(new ImageAnalysis.Analyzer() {
            @Override
            public void analyze(ImageProxy image, int rotationDegrees) {

                Image img = image.getImage();
                if (image.getImage() == null) {
                    Log.d("Null", "Image is Null");
                } else {
                    InputImage img1 = InputImage.fromMediaImage(img, rotationDegrees);

                    labeler.process(img1)
                            .addOnSuccessListener(new OnSuccessListener<List<ImageLabel>>() {
                                @Override
                                public void onSuccess(List<ImageLabel> labels) {
                                    // Task completed successfully
                                    for (ImageLabel label : labels) {
                                        String text = label.getText();
                                        float confidence = label.getConfidence();
                                        int index = label.getIndex();

                                        text1.setText(text + " " + confidence);
                                    }
                                }
                            })
                            .addOnFailureListener(new OnFailureListener() {
                                @Override
                                public void onFailure(@NonNull Exception e) {
                                    // Task failed with an exception
                                    // ...
                                    e.printStackTrace();
                                }
                            });
                }

                image.close();
            }
        });

        CameraX.bindToLifecycle((LifecycleOwner) this, analysis, preview);


    }

    private void updateTransform() {

        Matrix mx = new Matrix();
        float w = textureView.getMeasuredWidth();
        float h = textureView.getMeasuredHeight();

        float cX = w / 2f;
        float cY = h / 2f;

        int rotationDgr;
        int rotation = (int)textureView.getRotation();

        switch(rotation){
            case Surface.ROTATION_0:
                rotationDgr = 0;
                break;
            case Surface.ROTATION_90:
                rotationDgr = 90;
                break;
            case Surface.ROTATION_180:
                rotationDgr = 180;
                break;
            case Surface.ROTATION_270:
                rotationDgr = 270;
                break;
            default:
                return;
        }

        mx.postRotate((float)rotationDgr, cX, cY);
        textureView.setTransform(mx);
    }

    private boolean allPermissionsGranted(){

        for(String permission:REQUIRED_PERMISSIONS) {
            if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
                return false;
            }
        }

        return true;
    }



}

What am I doing wrong here?

Upvotes: 3

Views: 3217

Answers (3)

Braian Coronel

Reputation: 22867

Each use case should be set on a different thread to avoid interference in processing.

For example, for these use cases:

  • Barcode Scanning
  • Text Recognition

private fun setsAnalyzersAsUseCase(): ImageAnalysis {
    val analysisUseCase = ImageAnalysis.Builder()
        .build()

    if (BARCODE_SCANNING_ENABLED) {
        analysisUseCase.setAnalyzer(
            Executors.newSingleThreadExecutor()
        ) { imageProxy ->
            processImageWithBarcodeScanner(imageProxy = imageProxy)
        }
    }


    if (TEXT_RECOGNITION_ENABLED) {
        analysisUseCase.setAnalyzer(
            Executors.newSingleThreadExecutor()
        ) { imageProxy ->
            processImageWithTextRecognition(imageProxy = imageProxy)
        }
    }

    return analysisUseCase
}
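
For reference, here is a minimal sketch of what such a processImageWithBarcodeScanner helper might look like (written in Java to match the question's code; the helper body below is an assumption, not part of the original answer). The important detail is that the ImageProxy is only closed once ML Kit's task completes:

import androidx.annotation.NonNull;
import androidx.camera.core.ImageProxy;
import com.google.mlkit.vision.barcode.BarcodeScanner;
import com.google.mlkit.vision.barcode.BarcodeScanning;
import com.google.mlkit.vision.common.InputImage;

// Hypothetical helper; on newer CameraX versions, imageProxy.getImage()
// may require the @ExperimentalGetImage opt-in annotation.
void processImageWithBarcodeScanner(@NonNull ImageProxy imageProxy) {
    if (imageProxy.getImage() == null) {
        imageProxy.close(); // nothing to analyze, release the frame
        return;
    }
    InputImage input = InputImage.fromMediaImage(
            imageProxy.getImage(),
            imageProxy.getImageInfo().getRotationDegrees());
    BarcodeScanner scanner = BarcodeScanning.getClient();
    scanner.process(input)
            .addOnSuccessListener(barcodes -> { /* handle barcodes */ })
            .addOnFailureListener(Throwable::printStackTrace)
            .addOnCompleteListener(task -> imageProxy.close()); // close only when the task is done
}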

GL

Upvotes: 0

topher217

Reputation: 1347

As an alternative to repeating image.close() in two separate listeners, you can use an OnCompleteListener, which is called after all task processing has completed, regardless of success or failure. I have also added img.close() for good housekeeping, to avoid potential errors of a similar nature.

Errors


labeler.process(image)
    .addOnSuccessListener(new OnSuccessListener<List<ImageLabel>>() {
        @Override
        public void onSuccess(List<ImageLabel> labels) {
            // Task completed successfully
            Log.i(TAG, "labeler task successful");
            // Do something with the labels
            // ...
        }
    }).addOnFailureListener(new OnFailureListener() {
        @Override
        public void onFailure(@NonNull Exception e) {
            // Task failed with an exception
            Log.i(TAG, "labeler task failed with Error:" + e);
        }
    });
img.close();
image.close();

No Errors

labeler.process(image)
    .addOnSuccessListener(new OnSuccessListener<List<ImageLabel>>() {
        @Override
        public void onSuccess(List<ImageLabel> labels) {
            // Task completed successfully
            Log.i(TAG, "labeler task successful");
            // Do something with the labels
            // ...
        }
    }).addOnFailureListener(new OnFailureListener() {
        @Override
        public void onFailure(@NonNull Exception e) {
            // Task failed with an exception
            Log.i(TAG, "labeler task failed with Error:" + e);
        }
    }).addOnCompleteListener(new OnCompleteListener<List<ImageLabel>>() {
        @Override
        public void onComplete(@NonNull Task<List<ImageLabel>> task) {
            img.close();
            image.close();
        }
    });

Upvotes: 2

Jenea Vranceanu

Reputation: 4694

You must not close the image before it has been processed. Closing the image signals the camera to deliver another frame for your app to process.

Processing takes time. It is not immediate.

labeler.process(img1)
        .addOnSuccessListener(new OnSuccessListener<List<ImageLabel>>() {
            @Override
            public void onSuccess(List<ImageLabel> labels) {
                // Close the image
                image.close();
                ...
            }
        })
        .addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(@NonNull Exception e) {
                // Close the image
                image.close();
                ...
            }
        });
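
Applied to the question's analyzer, the fix might look roughly like this (the question's own code, with the trailing image.close() removed and the close calls moved into the listeners):

analysis.setAnalyzer(new ImageAnalysis.Analyzer() {
    @Override
    public void analyze(ImageProxy image, int rotationDegrees) {
        Image img = image.getImage();
        if (img == null) {
            Log.d("Null", "Image is Null");
            image.close(); // nothing to process, release this frame
            return;
        }
        InputImage img1 = InputImage.fromMediaImage(img, rotationDegrees);
        labeler.process(img1)
                .addOnSuccessListener(new OnSuccessListener<List<ImageLabel>>() {
                    @Override
                    public void onSuccess(List<ImageLabel> labels) {
                        for (ImageLabel label : labels) {
                            text1.setText(label.getText() + " " + label.getConfidence());
                        }
                        image.close(); // release the frame so the next one is delivered
                    }
                })
                .addOnFailureListener(new OnFailureListener() {
                    @Override
                    public void onFailure(@NonNull Exception e) {
                        e.printStackTrace();
                        image.close(); // release the frame on failure as well
                    }
                });
    }
});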

Upvotes: 6
