fytao

Reputation: 331

Real-time image process and display using Android Camera2 api and ANativeWindow

I need to do some real-time image processing on the camera preview data, such as face detection (via a C++ library), and then display the processed preview, with faces labeled, on screen.

I have read http://nezarobot.blogspot.com/2016/03/android-surfacetexture-camera2-opencv.html and Eddy Talvala's answer to Android camera2 API - Display processed frame in real time. Following those two pages, I managed to build the app (without calling the face-detection library, only trying to display the preview using ANativeWindow), but every time I run it on a Google Pixel - 7.1.0 - API 25 image in Genymotion, the app crashes with the following log:

08-28 14:23:09.598 2099-2127/tau.camera2demo A/libc: Fatal signal 11 (SIGSEGV), code 2, fault addr 0xd3a96000 in tid 2127 (CAMERA2)
                  [ 08-28 14:23:09.599   117:  117 W/         ]
                  debuggerd: handling request: pid=2099 uid=10067 gid=10067 tid=2127

I googled this but found no answer.

The whole project is on GitHub: https://github.com/Fung-yuantao/android-camera2demo

Here is the key code (I think).

Code in Camera2Demo.java:

private void startPreview(CameraDevice camera) throws CameraAccessException {
    SurfaceTexture texture = mPreviewView.getSurfaceTexture();

    // to set PREVIEW size
    texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
    surface = new Surface(texture);
    try {
        // to set request for PREVIEW
        mPreviewBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }

    mImageReader = ImageReader.newInstance(mImageWidth, mImageHeight, ImageFormat.YUV_420_888, 2);

    mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mHandler);

    mPreviewBuilder.addTarget(mImageReader.getSurface());

    //output Surface
    List<Surface> outputSurfaces = new ArrayList<>();
    outputSurfaces.add(mImageReader.getSurface());

    /*camera.createCaptureSession(
            Arrays.asList(surface, mImageReader.getSurface()),
            mSessionStateCallback, mHandler);
            */
    camera.createCaptureSession(outputSurfaces, mSessionStateCallback, mHandler);
}


private CameraCaptureSession.StateCallback mSessionStateCallback = new CameraCaptureSession.StateCallback() {

    @Override
    public void onConfigured(CameraCaptureSession session) {
        try {
            updatePreview(session);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onConfigureFailed(CameraCaptureSession session) {

    }
};

private void updatePreview(CameraCaptureSession session)
        throws CameraAccessException {
    mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);

    session.setRepeatingRequest(mPreviewBuilder.build(), null, mHandler);
}


private ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {

    @Override
    public void onImageAvailable(ImageReader reader) {
        // get the newest frame
        Image image = reader.acquireNextImage();

        if (image == null) {
            return;
        }

        // print image format
        int format = reader.getImageFormat();
        Log.d(TAG, "the format of captured frame: " + format);

        // HERE to call jni methods
        JNIUtils.display(image.getWidth(), image.getHeight(), image.getPlanes()[0].getBuffer(), surface);


        //ByteBuffer buffer = image.getPlanes()[0].getBuffer();
        //byte[] bytes = new byte[buffer.remaining()];


        image.close();
    }
};

Code in JNIUtils.java:

import android.media.Image;
import android.view.Surface;

import java.nio.ByteBuffer;


public class JNIUtils {
    // TAG for JNIUtils class
    private static final String TAG = "JNIUtils";

    // Load native library.
    static {
        System.loadLibrary("native-lib");
    }

    public static native void display(int srcWidth, int srcHeight, ByteBuffer srcBuffer, Surface surface);
}

Code in native-lib.cpp:

#include <jni.h>
#include <string>
#include <cstring> // for memcpy
#include <android/log.h>
//#include <android/bitmap.h>
#include <android/native_window_jni.h>

#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR, "Camera2Demo", __VA_ARGS__)

extern "C" {
JNIEXPORT jstring JNICALL Java_tau_camera2demo_JNIUtils_display(
        JNIEnv *env,
        jobject obj,
        jint srcWidth,
        jint srcHeight,
        jobject srcBuffer,
        jobject surface) {
    /*
    uint8_t *srcLumaPtr = reinterpret_cast<uint8_t *>(env->GetDirectBufferAddress(srcBuffer));

    if (srcLumaPtr == nullptr) {
        LOGE("srcLumaPtr null ERROR!");
        return NULL;
    }
    */

    ANativeWindow * window = ANativeWindow_fromSurface(env, surface);
    ANativeWindow_acquire(window);

    ANativeWindow_Buffer buffer;

    ANativeWindow_setBuffersGeometry(window, srcWidth, srcHeight, 0/* format unchanged */);

    if (int32_t err = ANativeWindow_lock(window, &buffer, NULL)) {
        LOGE("ANativeWindow_lock failed with error code: %d\n", err);
        ANativeWindow_release(window);
        return NULL;
    }

    memcpy(buffer.bits, srcBuffer,  srcWidth * srcHeight * 4);


    ANativeWindow_unlockAndPost(window);
    ANativeWindow_release(window);

    return NULL;
}
}

After I commented out the memcpy, the app no longer crashes but displays nothing. So I guess the problem now comes down to how to correctly memcpy the captured/processed buffer into buffer.bits.

Update:

I changed

memcpy(buffer.bits, srcBuffer, srcWidth * srcHeight * 4);

to

memcpy(buffer.bits, srcLumaPtr, srcWidth * srcHeight * 4);

and now the app no longer crashes and starts to display, but it shows something strange.

Upvotes: 3

Views: 7065

Answers (2)

Eddy Talvala

Reputation: 18117

As mentioned by yakobom, you're trying to copy a YUV_420_888 image directly into an RGBA_8888 destination (that's the default, if you haven't changed it). That won't work with just a memcpy.

You need to actually convert the data, and you need to ensure you don't copy too much - the sample code you have copies width*height*4 bytes, while a YUV_420_888 image takes up only stride*height*1.5 bytes (roughly). So when you copied, you were reading far past the end of the buffer.

You also have to account for the stride provided at the Java level to correctly index into the buffer. This link from Microsoft has a useful diagram.

If you just care about the luminance (so grayscale output is enough), just duplicate the luminance channel into the R, G, and B channels. The pseudocode would be roughly:

uint8_t *outPtr = static_cast<uint8_t *>(buffer.bits);
for (size_t y = 0; y < height; y++) {
    uint8_t *rowPtr = srcLumaPtr + y * srcLumaStride;
    for (size_t x = 0; x < width; x++) {
        *(outPtr++) = *rowPtr; // R
        *(outPtr++) = *rowPtr; // G
        *(outPtr++) = *rowPtr; // B
        *(outPtr++) = 255;     // alpha for RGBA_8888
        ++rowPtr;
    }
}

You'll need to read srcLumaStride from the Image object (the row stride of the first Plane) and pass it down via JNI as well.
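Putting the pieces together, a fleshed-out version of that loop might look like the following. This is a sketch, not the project's actual code: the function name and parameters are made up for illustration, and it additionally honours the output window's own row stride (ANativeWindow_Buffer::stride, which is measured in pixels for RGBA_8888), something the pseudocode above glosses over:

```cpp
#include <cstdint>
#include <cstddef>

// Hypothetical helper: copy a Y (luma) plane into an RGBA_8888 buffer as
// grayscale, respecting both the source row stride (bytes) and the
// destination window stride (pixels).
void lumaToRgbaGray(const uint8_t *srcLuma, size_t srcLumaStride,
                    uint8_t *dst, size_t dstStridePixels,
                    size_t width, size_t height) {
    for (size_t y = 0; y < height; ++y) {
        const uint8_t *rowPtr = srcLuma + y * srcLumaStride;
        uint8_t *outPtr = dst + y * dstStridePixels * 4; // 4 bytes per RGBA pixel
        for (size_t x = 0; x < width; ++x) {
            uint8_t luma = rowPtr[x];
            *(outPtr++) = luma; // R
            *(outPtr++) = luma; // G
            *(outPtr++) = luma; // B
            *(outPtr++) = 255;  // A (fully opaque)
        }
    }
}
```

Between ANativeWindow_lock and ANativeWindow_unlockAndPost it could be invoked as something like `lumaToRgbaGray(srcLumaPtr, srcLumaStride, static_cast<uint8_t *>(buffer.bits), buffer.stride, srcWidth, srcHeight);`.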

Upvotes: 2

yakobom

Reputation: 2711

Just to put it as an answer, to avoid a long chain of comments - such a crash may be due to an improper number of bytes being copied by the memcpy (UPDATE following other comments: in this case it was due to a forbidden direct copy).

If you are now getting a weird image, it is probably another issue - I would suspect the image format; try modifying that.

Upvotes: 2
