Reputation: 47
I am trying to save YUV_420_888 preview frames obtained from the Android camera2 API as JPEG. The only way I have found to do it is to convert the YUV_420_888 data to NV21, construct a YuvImage, and then use its compressToJpeg method to get the JPEG. To convert from YUV_420_888 to NV21 I am using the logic below:
Image.Plane Y = img.getPlanes()[0];
Image.Plane U = img.getPlanes()[2];
Image.Plane V = img.getPlanes()[1];

int Yb = Y.getBuffer().remaining();
int Ub = U.getBuffer().remaining();
int Vb = V.getBuffer().remaining();

byte[] data = new byte[Yb + Ub + Vb];

Y.getBuffer().get(data, 0, Yb);
U.getBuffer().get(data, Yb, Ub);
V.getBuffer().get(data, Yb + Ub, Vb);

YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21,
        mPreviewSize.getWidth(), mPreviewSize.getHeight(), null);
ByteArrayOutputStream out = new ByteArrayOutputStream();
yuvImage.compressToJpeg(new Rect(0, 0,
        mPreviewSize.getWidth(), mPreviewSize.getHeight()),
        100, out);
However, this results in green images for certain resolutions: 144x176, 176x144, 352x288, 480x360 and 1280x960. Is the logic for converting to NV21 correct? What other way can I use to convert from YUV_420_888 to JPEG? Is there a Java/Android API for this?
Upvotes: 0
Views: 1811
Reputation: 401
JFYR, here's the Java implementation from the Android CameraX project:
/** {@link android.media.Image} to NV21 byte array. */
@NonNull
public static byte[] yuv_420_888toNv21(@NonNull Image image) {
    Image.Plane yPlane = image.getPlanes()[0];
    Image.Plane uPlane = image.getPlanes()[1];
    Image.Plane vPlane = image.getPlanes()[2];

    ByteBuffer yBuffer = yPlane.getBuffer();
    ByteBuffer uBuffer = uPlane.getBuffer();
    ByteBuffer vBuffer = vPlane.getBuffer();
    yBuffer.rewind();
    uBuffer.rewind();
    vBuffer.rewind();

    int ySize = yBuffer.remaining();

    int position = 0;
    // TODO(b/115743986): Pull these bytes from a pool instead of allocating for every image.
    byte[] nv21 = new byte[ySize + (image.getWidth() * image.getHeight() / 2)];

    // Add the full Y buffer to the array. If rowStride > width, the row padding is skipped.
    for (int row = 0; row < image.getHeight(); row++) {
        yBuffer.get(nv21, position, image.getWidth());
        position += image.getWidth();
        yBuffer.position(
                Math.min(ySize, yBuffer.position() - image.getWidth() + yPlane.getRowStride()));
    }

    int chromaHeight = image.getHeight() / 2;
    int chromaWidth = image.getWidth() / 2;
    int vRowStride = vPlane.getRowStride();
    int uRowStride = uPlane.getRowStride();
    int vPixelStride = vPlane.getPixelStride();
    int uPixelStride = uPlane.getPixelStride();

    // Interleave the U and V frames, filling up the rest of the buffer. Use two line buffers to
    // perform faster bulk gets from the byte buffers.
    byte[] vLineBuffer = new byte[vRowStride];
    byte[] uLineBuffer = new byte[uRowStride];
    for (int row = 0; row < chromaHeight; row++) {
        vBuffer.get(vLineBuffer, 0, Math.min(vRowStride, vBuffer.remaining()));
        uBuffer.get(uLineBuffer, 0, Math.min(uRowStride, uBuffer.remaining()));
        int vLineBufferPosition = 0;
        int uLineBufferPosition = 0;
        for (int col = 0; col < chromaWidth; col++) {
            nv21[position++] = vLineBuffer[vLineBufferPosition];
            nv21[position++] = uLineBuffer[uLineBufferPosition];
            vLineBufferPosition += vPixelStride;
            uLineBufferPosition += uPixelStride;
        }
    }

    return nv21;
}
Once you get NV21 from the YUV_420_888 Image, you can use YuvImage to compress it to JPEG. (You can also find that compression step in the same CameraX file.)
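For illustration, a minimal sketch of that compression step (the helper name nv21ToJpeg is my own, not from the CameraX file; width and height must match the Image the NV21 data came from):

```java
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;

import java.io.ByteArrayOutputStream;

// Sketch: compress NV21 bytes (e.g. from yuv_420_888toNv21 above) to JPEG.
public static byte[] nv21ToJpeg(byte[] nv21, int width, int height, int quality) {
    // Passing null for strides tells YuvImage to assume tightly packed rows,
    // which is exactly what yuv_420_888toNv21 produces.
    YuvImage yuvImage = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuvImage.compressToJpeg(new Rect(0, 0, width, height), quality, out);
    return out.toByteArray();
}
```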
Upvotes: 0
Reputation: 18097
No, this isn't correct - you're not paying attention to the Plane's row stride or pixel stride.
You have to parse those, and make sure that your output buffer actually matches the input expectations of YuvImage's NV21 input, which assumes row stride = width, and interleaved V/U planes.
The code you have will only work if the input Image's U/V planes are actually interleaved (in which case you're adding twice the UV data you need, but the first copy happens to be the right layout...), and if width == row stride. Whether width == row stride depends on the resolution; usually the stride has to be a multiple of 16 pixels or similar due to hardware restrictions, so for resolutions that aren't a multiple of 16, for example, your code won't work.
Please fix both issues by paying attention to row and pixel stride; otherwise you might get it working on your device by accident and still have it broken on devices with different stride parameters.
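To illustrate what "paying attention to stride" means, here is a minimal, stride-aware sketch in plain Java (PlaneCopy and packPlane are names of my own, and the ByteBuffer here simulates a camera plane; a real Image.Plane supplies the buffer and strides via getBuffer(), getRowStride() and getPixelStride()):

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class PlaneCopy {
    /**
     * Copies a width x height plane out of a buffer whose rows start rowStride
     * bytes apart and whose samples sit pixelStride bytes apart, producing a
     * tightly packed output (row stride == width, pixel stride == 1) — the
     * layout YuvImage expects.
     */
    static byte[] packPlane(ByteBuffer buffer, int width, int height,
                            int rowStride, int pixelStride) {
        byte[] out = new byte[width * height];
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                // Absolute get: skip both row padding and inter-pixel gaps.
                out[row * width + col] = buffer.get(row * rowStride + col * pixelStride);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Simulated 2x2 plane with rowStride 4 and pixelStride 1:
        // row 0: 1 2 _ _   row 1: 3 4 _ _   (underscores are padding bytes)
        ByteBuffer plane = ByteBuffer.wrap(new byte[] {1, 2, 0, 0, 3, 4, 0, 0});
        byte[] packed = packPlane(plane, 2, 2, /* rowStride= */ 4, /* pixelStride= */ 1);
        System.out.println(Arrays.toString(packed)); // [1, 2, 3, 4]
    }
}
```

A naive bulk copy of the same buffer would drag the padding bytes into the output, which shifts every following row — that is exactly the corruption you see at resolutions where width != row stride.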
Edit:
Some sample C++ code that does this kind of conversion can be found in the Android AOSP camera service code: CallbackProcessor::convertFromFlexibleYuv.
Upvotes: 1