uelordi

Reputation: 2209

How can I distinguish between NV21 and YV12 encoding in the ImageReader of the camera2 API?

I am developing a custom camera2 app, and I notice that the capture format conversion differs between devices when I use the ImageReader callback.

For example, on a Nexus 4 it doesn't work correctly, while on a Nexus 5X it looks OK; here is the output.


I initialize the ImageReader like this:

mImageReader = ImageReader.newInstance(320, 240, ImageFormat.YUV_420_888, 2);

And my callback is a plain ImageReader.OnImageAvailableListener:

mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {

    @Override
    public void onImageAvailable(ImageReader reader) {
        try {
            mBackgroundHandler.post(
                new ImageController(reader.acquireNextImage())
            );
        } catch (Exception e) {
            // exception
        }
    }
};

In the case of the Nexus 4, I get this error:

D/qdgralloc: gralloc_lock_ycbcr: Invalid format passed: 0x32315659

When I write the raw frame to a file on both devices, I get two different images. From this I understand that the Nexus 5X image uses NV21 encoding and the Nexus 4 uses YV12.

I found the image format specification and tried to query the format from the ImageReader. There are YV12 and NV21 constants, but, as expected, I get YUV_420_888 when I ask for the format:

int test = mImageReader.getImageFormat();

So is there any way to get the camera's native input format (NV21 or YV12) so I can distinguish these encodings in my camera class? CameraCharacteristics, maybe?

Thanks in advance.

Unai. P.S.: I use OpenGL for displaying RGB images, and I use OpenCV to perform the conversions to YUV_420_888.

Upvotes: 9

Views: 2960

Answers (2)

Alex Cohn

Reputation: 57163

YUV_420_888 is a wrapper format that can host (among others) both NV21 and YV12 images. You must use the planes and strides to access the individual colors:

Image.Plane Y = image.getPlanes()[0];
Image.Plane U = image.getPlanes()[1];
Image.Plane V = image.getPlanes()[2];

If the underlying pixels are in NV21 format, the pixelStride of the chroma planes will be 2, and

int getU(Image image, int col, int row) {
    return getPixel(image.getPlanes()[1], col / 2, row / 2);
}

int getPixel(Image.Plane plane, int col, int row) {
    // Mask with 0xFF to read the byte as an unsigned value (0..255).
    return plane.getBuffer().get(col * plane.getPixelStride() + row * plane.getRowStride()) & 0xFF;
}

We take half the column and half the row because that is how the U and V (chroma) planes are subsampled in a 420 image.

This code is for illustration only and is very inefficient; you probably want to access the pixels in bulk using get(byte[], int, int), via a fragment shader, or via the JNI function GetDirectBufferAddress in native code. What you cannot use is plane.array(), because the planes are guaranteed to be direct byte buffers.
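In practice, a runtime check of the chroma plane's pixelStride is enough to tell the two layouts apart: the YUV_420_888 contract means interleaved chroma gives stride 2 and fully planar chroma gives stride 1. The classification itself is plain arithmetic, so the sketch below keeps it as a pure-Java helper (the class and method names are illustrative, not part of any API; on Android you would feed it image.getPlanes()[1].getPixelStride()):

```java
public class ChromaLayout {

    // Classify the chroma layout of a YUV_420_888 frame from the pixel
    // stride of its U (or V) plane: stride 2 means the V and U samples are
    // interleaved (NV21/NV12-style), stride 1 means separate chroma planes
    // (YV12/I420-style).
    public static String classify(int chromaPixelStride) {
        if (chromaPixelStride == 2) {
            return "semi-planar (NV21/NV12-style, interleaved chroma)";
        }
        if (chromaPixelStride == 1) {
            return "planar (YV12/I420-style, separate U and V planes)";
        }
        return "unknown";
    }

    public static void main(String[] args) {
        // On-device, the argument would come from:
        //   image.getPlanes()[1].getPixelStride()
        System.out.println(classify(2));
        System.out.println(classify(1));
    }
}
```

Note that this only distinguishes the two layout families; telling NV21 from NV12 additionally requires comparing the buffer addresses (or first-byte positions) of the U and V planes.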

Upvotes: 3

kostyabakay

Reputation: 1689

Here is a useful method which converts YV12 to NV21:

public static byte[] fromYV12toNV21(@NonNull final byte[] yv12,
                                    final int width,
                                    final int height) {
    byte[] nv21 = new byte[yv12.length];
    final int size = width * height;
    final int quarter = size / 4;
    final int vPosition = size; // This is where V starts
    final int uPosition = size + quarter; // This is where U starts

    System.arraycopy(yv12, 0, nv21, 0, size); // Y is same

    for (int i = 0; i < quarter; i++) {
        nv21[size + i * 2] = yv12[vPosition + i]; // For NV21, V first
        nv21[size + i * 2 + 1] = yv12[uPosition + i]; // For NV21, U second
    }
    return nv21;
}
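For reference, a tiny worked example on a hypothetical 4×2 frame (values chosen arbitrarily) shows the V,U interleaving this method produces. The conversion function is repeated verbatim so the snippet is self-contained:

```java
import java.util.Arrays;

public class Yv12Demo {

    // YV12 layout: full Y plane, then a quarter-size V plane, then a
    // quarter-size U plane. NV21 layout: full Y plane, then interleaved
    // V,U byte pairs.
    public static byte[] fromYV12toNV21(final byte[] yv12, int width, int height) {
        byte[] nv21 = new byte[yv12.length];
        final int size = width * height;
        final int quarter = size / 4;
        final int vPosition = size;
        final int uPosition = size + quarter;
        System.arraycopy(yv12, 0, nv21, 0, size);
        for (int i = 0; i < quarter; i++) {
            nv21[size + i * 2] = yv12[vPosition + i];
            nv21[size + i * 2 + 1] = yv12[uPosition + i];
        }
        return nv21;
    }

    public static void main(String[] args) {
        // 4x2 frame: 8 luma bytes, then V = {9, 10}, then U = {11, 12}.
        byte[] yv12 = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12};
        byte[] nv21 = fromYV12toNV21(yv12, 4, 2);
        // Chroma comes out interleaved V,U.
        System.out.println(Arrays.toString(nv21));
        // -> [1, 2, 3, 4, 5, 6, 7, 8, 9, 11, 10, 12]
    }
}
```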

Upvotes: 0
