Reputation: 25
I am working with camera2, and in ImageReader I have the YUV_420_888 format. I looked around and found some formulas for converting it to RGB, but I have a problem with some colors. Here is the code that converts it to RGB:
ByteBuffer buffer0 = image.getPlanes()[0].getBuffer();
byte[] Y1 = new byte[buffer0.remaining()];
buffer0.get(Y1);

ByteBuffer buffer1 = image.getPlanes()[1].getBuffer();
byte[] U1 = new byte[buffer1.remaining()];
buffer1.get(U1);

ByteBuffer buffer2 = image.getPlanes()[2].getBuffer();
byte[] V1 = new byte[buffer2.remaining()];
buffer2.get(V1);

int Width = image.getWidth();
int Height = image.getHeight();
byte[] ImageRGB = new byte[Height * Width * 4];

for (int i = 0; i < Height - 1; i++) {
    for (int j = 0; j < Width; j++) {
        int Y = Y1[i * Width + j] & 0xFF;
        int U = U1[(i / 2) * (Width / 2) + j / 2] & 0xFF;
        int V = V1[(i / 2) * (Width / 2) + j / 2] & 0xFF;
        U = U - 128;
        V = V - 128;

        int R, G, B;
        R = (int) (Y + 1.140 * V);
        G = (int) (Y - 0.395 * U - 0.581 * V);
        B = (int) (Y + 2.032 * U);

        if (R > 255) {
            R = 255;
        } else if (R < 0) {
            R = 0;
        }
        if (G > 255) {
            G = 255;
        } else if (G < 0) {
            G = 0;
        }
        if (B > 255) {
            R = 255;
        } else if (B < 0) {
            B = 0;
        }

        ImageRGB[i * 4 * Width + j * 4] = (byte) R;
        ImageRGB[i * 4 * Width + j * 4 + 1] = (byte) G;
        ImageRGB[i * 4 * Width + j * 4 + 2] = (byte) B;
        ImageRGB[i * 4 * Width + j * 4 + 3] = -1;
    }
}
And when I point the camera towards some colors, this happens. Any idea why, and how I can fix it?
EDIT: Here is the code I used for posting to the SurfaceView, but I think it's correct:
Bitmap bm = Bitmap.createBitmap(image.getWidth(), image.getHeight(),
        Bitmap.Config.ARGB_8888);
bm.copyPixelsFromBuffer(ByteBuffer.wrap(ImageRGB));
Bitmap scaled = Bitmap.createScaledBitmap(bm, surfaceView.getWidth(), surfaceView.getHeight(), true);
Canvas c = surfaceHolder.lockCanvas();
c.drawBitmap(scaled, 0, 0, null);
surfaceHolder.unlockCanvasAndPost(c);
image.close();
Upvotes: 1
Views: 4391
Reputation: 97
I found a working solution in Minhaz's blog post, How to use YUV (YUV_420_888) Image in Android, and modified it slightly for better performance. If this was helpful to you, please send your thanks to @Minhaz.
package YourPackage;

import android.graphics.Bitmap;
import android.graphics.ImageFormat;
import android.media.Image;

import java.nio.ByteBuffer;

public class Convert
{
    // Based on: https://blog.minhazav.dev/how-to-convert-yuv-420-sp-android.media.Image-to-Bitmap-or-jpeg/
    //
    // With additions from:
    // - https://stackoverflow.com/q/40885602/
    // - https://stackoverflow.com/a/8394202/
    static public Bitmap YUV_420_888_to_ARGB_8888(Image image)
    {
        if (image.getFormat() != ImageFormat.YUV_420_888)
        {
            throw new IllegalArgumentException("Invalid image format (must be YUV_420_888)");
        }

        final int width = image.getWidth();
        final int height = image.getHeight();

        // RGBA array, needed to construct the Bitmap from it
        byte[] ImageRGBA = new byte[width * height * 4];

        // ---------------------------------------------------------------------
        /*
            A YUV image can be implemented with a 'planar' or a 'semi-planar'
            layout.

            A 'planar' YUV image has the following structure:

                YYYYYYYYYYYYYYYY
                ................
                UUUUUUUU
                ........
                VVVVVVVV
                ........

            while a 'semi-planar' YUV image is laid out like this:

                YYYYYYYYYYYYYYYY
                ................
                UVUVUVUVUVUVUVUV  <-- interleaved UV channel
                ................

            This is defined by the row strides and pixel strides of the planes
            of the image. You can check image.getPlanes()[1].getPixelStride():
            if it is 2, the image layout is 'semi-planar'.
        */
        // ---------------------------------------------------------------------
        /*
            Extract the Y/U/V plane bytes (via: https://stackoverflow.com/a/28744228/).

            For performance reasons, we copy each plane's bytes into a separate
            byte[] array. The ByteBuffer get() method is too slow to use in a
            loop (possibly due to bounds checking):

                `byte ByteBuffer::get(int index)`     --> Y_buffer.get(Y_index) --> slow
                `byte byte[] operator [] (int index)` --> Y_bytes[Y_index]      --> fast

            Plane #0 is always Y;
            Plane #1 is always U (Cb);
            Plane #2 is always V (Cr).

            Reference: https://developer.android.com/reference/android/graphics/ImageFormat#YUV_420_888
        */
        ByteBuffer Y_buffer = image.getPlanes()[0].getBuffer();
        byte[] Y_bytes = new byte[Y_buffer.remaining()];
        Y_buffer.get(Y_bytes);

        ByteBuffer U_buffer = image.getPlanes()[1].getBuffer();
        byte[] U_bytes = new byte[U_buffer.remaining()];
        U_buffer.get(U_bytes);

        ByteBuffer V_buffer = image.getPlanes()[2].getBuffer();
        byte[] V_bytes = new byte[V_buffer.remaining()];
        V_buffer.get(V_bytes);

        // ---------------------------------------------------------------------
        // The Y plane is guaranteed not to be interleaved with the U/V planes
        // (in particular, its pixel stride is always 1).
        final int Y_RowStride = image.getPlanes()[0].getRowStride();
        final int Y_PixelStride = image.getPlanes()[0].getPixelStride();

        // The U/V planes are guaranteed to have the same row stride and pixel
        // stride.
        final int UV_RowStride = image.getPlanes()[1].getRowStride();
        final int UV_PixelStride = image.getPlanes()[1].getPixelStride();

        // ---------------------------------------------------------------------
        // Reusable variables, declared here to avoid constructing them in the loop.
        int Y_value = 0, U_value = 0, V_value = 0;
        int R = 0, G = 0, B = 0;
        int Y_index = 0;
        int UV_x = 0, UV_y = 0, UV_index = 0;
        int pixel_index = 0;

        // ---------------------------------------------------------------------
        for (int y = 0; y < height; ++y)
        {
            for (int x = 0; x < width; ++x)
            {
                Y_index = (y * Y_RowStride) + (x * Y_PixelStride);

                // The Y plane holds positive values in [0...255].
                Y_value = (Y_bytes[Y_index] & 0xff);

                UV_x = x / 2;
                UV_y = y / 2;

                // U/V values are subsampled, i.e. each sample in the U/V channel
                // of a YUV_420 image acts as the chroma value for 4 neighbouring
                // pixels.
                UV_index = (UV_y * UV_RowStride) + (UV_x * UV_PixelStride);

                // U/V values ideally fall in the [-0.5, 0.5] range. To fit them
                // into [0, 255] they are scaled up and centered at 128.
                // The operation below brings the U/V values back to [-128, 127].
                U_value = (U_bytes[UV_index] & 0xff) - 128;
                V_value = (V_bytes[UV_index] & 0xff) - 128;

                // Compute RGB values from YUV.
                R = (int) (Y_value + 1.370705f * V_value);
                G = (int) (Y_value - (0.698001f * V_value) - (0.337633f * U_value));
                B = (int) (Y_value + 1.732446f * U_value);

                // Clamp R/G/B to [0, 255], similar to 'clamp(r, 0, 255)'.
                R = R < 0 ? 0 : (R > 255 ? 255 : R);
                G = G < 0 ? 0 : (G > 255 ? 255 : G);
                B = B < 0 ? 0 : (B > 255 ? 255 : B);

                pixel_index = (x * 4) + ((y * 4) * width);
                ImageRGBA[pixel_index + 0] = (byte) R;
                ImageRGBA[pixel_index + 1] = (byte) G;
                ImageRGBA[pixel_index + 2] = (byte) B;
                ImageRGBA[pixel_index + 3] = (byte) 255; // A
            }
        }

        Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        bitmap.copyPixelsFromBuffer(ByteBuffer.wrap(ImageRGBA));
        return bitmap;
    }
}
Upvotes: 0
Reputation: 51
The U and V planes have dimensions (width, height/2), so try:

int offset = (i / 2) * Width + j;
int U = U1[offset] & 0xFF;
int V = V1[offset + 1] & 0xFF;
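More generally, the robust way to express this chroma lookup is with the row stride and pixel stride that each `Image.Plane` actually reports, rather than hard-coding one layout. A minimal sketch (the class and method names, and the example stride values, are illustrative assumptions, not from this answer):

```java
public class UvIndexDemo {
    // Index of the chroma sample covering pixel (col, row) in a U or V plane,
    // for 2x2-subsampled YUV_420_888 data.
    static int uvIndex(int row, int col, int rowStride, int pixelStride) {
        return (row / 2) * rowStride + (col / 2) * pixelStride;
    }

    public static void main(String[] args) {
        // Semi-planar plane (interleaved UV): pixelStride == 2,
        // rowStride == image width (e.g. 640).
        System.out.println(uvIndex(3, 5, 640, 2)); // (3/2)*640 + (5/2)*2 = 644

        // Planar plane: pixelStride == 1, rowStride == width / 2 (e.g. 320).
        System.out.println(uvIndex(3, 5, 320, 1)); // (3/2)*320 + (5/2)*1 = 322
    }
}
```

With pixelStride 2, consecutive chroma samples sit two bytes apart, which is exactly what the `U1[offset]` / `V1[offset + 1]` trick above exploits.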
Upvotes: 0
Reputation: 261
There is an error in your code:

if (B > 255) {
    B = 255; // was R = 255;
} else if (B < 0) {
    B = 0;
}
And try both of these variants of the conversion:

R = Y + 1.402 * V
G = Y - 0.34414 * U - 0.71414 * V
B = Y + 1.772 * U

R = Y + 1.370705 * V
G = Y - 0.698001 * V - 0.337633 * U
B = Y + 1.732446 * U
Upvotes: 0
Reputation: 18117
That doesn't look like the correct YUV->RGB transform. The camera2 API color space for YUV_420_888 from the camera device is the JFIF YUV color space (the same one used inside JPEG files). Unfortunately, this is not clearly documented at the moment.
The JFIF YUV->RGB transform is defined to be as follows in the JPEG JFIF specification:
R = Y + 1.402 (Cr-128)
G = Y - 0.34414 (Cb-128) - 0.71414 (Cr-128)
B = Y + 1.772 (Cb-128)
So try that to start with. And for full clarity: Cb = U, Cr = V.
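A minimal sketch of that transform as a per-pixel Java function (the class and method names and the clamping helper are mine; the coefficients are the JFIF ones quoted above, with Cb and Cr biased by 128):

```java
public class JfifYuv {
    // Clamp a computed channel value into the valid [0, 255] byte range.
    static int clamp(double v) {
        return v < 0 ? 0 : (v > 255 ? 255 : (int) v);
    }

    // JFIF YUV -> RGB for one pixel; returns {R, G, B}, each in [0, 255].
    static int[] toRgb(int y, int cb, int cr) {
        int r = clamp(y + 1.402 * (cr - 128));
        int g = clamp(y - 0.34414 * (cb - 128) - 0.71414 * (cr - 128));
        int b = clamp(y + 1.772 * (cb - 128));
        return new int[] { r, g, b };
    }

    public static void main(String[] args) {
        // Neutral chroma (Cb = Cr = 128) must reproduce the luma as gray.
        int[] gray = toRgb(128, 128, 128);
        System.out.println(gray[0] + " " + gray[1] + " " + gray[2]); // 128 128 128
    }
}
```

A quick sanity check for a wrong-matrix bug: with Cb = Cr = 128 every pixel should come out as neutral gray; if grays come out tinted, the transform (or the plane indexing) is off.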
Upvotes: 2