YUV_420_888 to RGB conversion

Question (0 votes, 4 answers)

I am using camera2, and in the ImageReader I get the YUV_420_888 format. I looked around and found some formulas to convert it to RGB, but I have a problem with some colors. Here is the code that converts it to RGB:

ByteBuffer buffer0 = image.getPlanes()[0].getBuffer();
byte[] Y1 = new byte[buffer0.remaining()];
buffer0.get(Y1);
ByteBuffer buffer1 = image.getPlanes()[1].getBuffer();
byte[] U1 = new byte[buffer1.remaining()];
buffer1.get(U1);
ByteBuffer buffer2 = image.getPlanes()[2].getBuffer();
byte[] V1 = new byte[buffer2.remaining()];
buffer2.get(V1);
int Width = image.getWidth();
int Heigh = image.getHeight();
byte[] ImageRGB = new byte[image.getHeight()*image.getWidth()*4];

for(int i = 0; i<Heigh-1; i++){
    for (int j = 0; j<Width; j++){
        int Y = Y1[i*Width+j]&0xFF;
        int U = U1[(i/2)*(Width/2)+j/2]&0xFF;
        int V = V1[(i/2)*(Width/2)+j/2]&0xFF;
        U = U-128;
        V = V-128;
        int R,G,B;
        R = (int)(Y + 1.140*V);
        G = (int)(Y - 0.395*U - 0.581*V);
        B = (int)(Y + 2.032*U);
        if (R>255) {
            R = 255;
        } else if (R<0) {
            R = 0;
        }
        if (G>255) {
            G = 255;
        } else if (G<0) {
            G = 0;
        }
        if (B>255) {
            R = 255;
        } else if (B<0) {
            B = 0;
        }
        ImageRGB[i*4*Width+j*4] = (byte)R;
        ImageRGB[i*4*Width+j*4+1] = (byte)G;
        ImageRGB[i*4*Width+j*4+2] = (byte)B;
        ImageRGB[i*4*Width+j*4+3] = -1;
    }
}

This is what happens when I point the camera at certain colors. Any idea why this happens and how to fix it?

EDIT: This is the code where I post it to the SurfaceView, but I think it is correct:

Bitmap bm = Bitmap.createBitmap(image.getWidth(), image.getHeight(), Bitmap.Config.ARGB_8888);
bm.copyPixelsFromBuffer(ByteBuffer.wrap(ImageRGB));
Bitmap scaled = Bitmap.createScaledBitmap(bm, surfaceView.getWidth(), surfaceView.getHeight(), true);
Canvas c;
c = surfaceHolder.lockCanvas();
c.drawBitmap(scaled, 0, 0, null);
surfaceHolder.unlockCanvasAndPost(c);
image.close();
    
java image computer-vision rgb yuv
4 Answers
2 votes
This does not look like the right YUV->RGB transform. The color space for YUV_420_888 from camera devices in the camera2 API is the JFIF YUV color space (the same one used inside JPEG files). Unfortunately, this is not clearly documented at the moment.

The JFIF YUV->RGB transform is defined in the JPEG JFIF specification as follows:

R = Y + 1.402 (Cr - 128)
G = Y - 0.34414 (Cb - 128) - 0.71414 (Cr - 128)
B = Y + 1.772 (Cb - 128)

So try that first. To be completely clear: Cb = U and Cr = V.
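For illustration, a minimal sketch of that transform applied to a single pixel; the helper name yuvJfifToArgb is purely illustrative, and u, v are the raw chroma samples still in the 0..255 range:

// Sketch only: JFIF YUV -> ARGB for one pixel, with u = Cb and v = Cr still
// in the 0..255 range (the 128 offset is subtracted here).
static int yuvJfifToArgb(int y, int u, int v) {
    int r = (int) (y + 1.402 * (v - 128));
    int g = (int) (y - 0.34414 * (u - 128) - 0.71414 * (v - 128));
    int b = (int) (y + 1.772 * (u - 128));
    r = Math.max(0, Math.min(255, r));
    g = Math.max(0, Math.min(255, g));
    b = Math.max(0, Math.min(255, b));
    return 0xFF000000 | (r << 16) | (g << 8) | b; // packed ARGB
}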


0 votes
Your code has an error:

if (B > 255) {
    B = 255; // was R = 255;
} else if (B < 0) {
    B = 0;
}

And try one of these two variants:

R = Y + 1.402 * V
G = Y - 0.34414 * U - 0.71414 * V
B = Y + 1.772 * U

Or, from here:

R = yValue + (1.370705 * V);
G = yValue - (0.698001 * V) - (0.337633 * U);
B = yValue + (1.732446 * U);
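Putting both fixes together, a rough sketch of the question's inner-loop body with the first set of coefficients and an equivalent clamp (variable names follow the question's code; U and V are assumed to already have 128 subtracted, as in the question):

// Sketch: question's loop body with the first variant's coefficients.
R = (int) (Y + 1.402 * V);
G = (int) (Y - 0.34414 * U - 0.71414 * V);
B = (int) (Y + 1.772 * U);
// Equivalent clamp to [0, 255]; note the corrected B branch.
R = Math.max(0, Math.min(255, R));
G = Math.max(0, Math.min(255, G));
B = Math.max(0, Math.min(255, B));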
    

0 votes
The U and V planes have size (x, y/2), so try:

int offset = (i/2)*Width + j;
int U = U1[offset]&0xFF;
int V = V1[offset+1]&0xFF;
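More generally, a hedged sketch reusing the question's U1/V1 arrays and loop indices i/j: the chroma index can be derived from the strides the Image itself reports (Image.Plane.getRowStride()/getPixelStride()), which covers both the planar and the interleaved layout:

// Sketch: index the chroma planes via their reported strides, so both the
// planar (pixel stride 1) and semi-planar (pixel stride 2) layouts work.
int uvRowStride = image.getPlanes()[1].getRowStride();
int uvPixelStride = image.getPlanes()[1].getPixelStride();
int uvIndex = (i / 2) * uvRowStride + (j / 2) * uvPixelStride;
int U = U1[uvIndex] & 0xFF;
int V = V1[uvIndex] & 0xFF;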
    

0 votes
I found a working solution in Minhaz's blog post How to use YUV (YUV_420_888) Image in Android and modified it slightly for better performance.

If this helped you, please send your thanks to @Minhaz.

package YourPackage;

import android.graphics.Bitmap;
import android.graphics.ImageFormat;
import android.media.Image;

import java.nio.ByteBuffer;

public class Convert {
    // Based on: https://blog.minhazav.dev/how-to-convert-yuv-420-sp-android.media.Image-to-Bitmap-or-jpeg/
    //
    // With additions from:
    // - https://stackoverflow.com/q/40885602/
    // - https://stackoverflow.com/a/8394202/
    static public Bitmap YUV_420_888_to_ARGB_8888(Image image) {
        if (image.getFormat() != ImageFormat.YUV_420_888) {
            throw new IllegalArgumentException("Invalid image format (must be YUV_420_888)");
        }

        final int width = image.getWidth();
        final int height = image.getHeight();

        // RGBA array, needed to construct Bitmap from it
        byte[] ImageRGBA = new byte[width * height * 4];

        // ---------------------------------------------------------------------

        /* A YUV Image could be implemented with 'planar' or 'semi-planar' layout.

           A 'planar' YUV image would have following structure:

               YYYYYYYYYYYYYYYY
               ................
               UUUUUUUU
               ........
               VVVVVVVV
               ........

           While a 'semi-planar' YUV image would have layout like this:

               YYYYYYYYYYYYYYYY
               ................
               UVUVUVUVUVUVUVUV   <-- Interleaved UV channel
               ................

           This is defined by row stride and pixel strides in the planes of the
           image. You can find by: image.getPlanes()[1].getPixelStride().
           If it's 2, the image format is 'semi-planar'.
        */

        // ---------------------------------------------------------------------

        /* Extract Y/U/V planes bytes (via: https://stackoverflow.com/a/28744228/)

           For performance reason, we copy each plane bytes into related byte[]
           array. ByteBuffer `get()` method is too slow to use in a loop
           (possibly due to bounds checking):

             - `byte ByteBuffer::get(int index)`     --> Y_buffer.get(Y_index) --> slow
             - `byte byte[] operator [] (int index)` --> Y_bytes[Y_index]      --> fast

           Plane #0 is always Y; Plane #1 is always U (Cb); Plane #2 is always V (Cr);
           Reference: https://developer.android.com/reference/android/graphics/ImageFormat#YUV_420_888
        */

        ByteBuffer Y_buffer = image.getPlanes()[0].getBuffer();
        byte[] Y_bytes = new byte[Y_buffer.remaining()];
        Y_buffer.get(Y_bytes);

        ByteBuffer U_buffer = image.getPlanes()[1].getBuffer();
        byte[] U_bytes = new byte[U_buffer.remaining()];
        U_buffer.get(U_bytes);

        ByteBuffer V_buffer = image.getPlanes()[2].getBuffer();
        byte[] V_bytes = new byte[V_buffer.remaining()];
        V_buffer.get(V_bytes);

        // ---------------------------------------------------------------------

        // The Y-plane is guaranteed not to be interleaved with the U/V planes
        // (in particular, pixel stride is always 1).
        final int Y_RowStride = image.getPlanes()[0].getRowStride();
        final int Y_PixelStride = image.getPlanes()[0].getPixelStride();

        // The U/V planes are guaranteed to have the same row stride and pixel
        // stride.
        final int UV_RowStride = image.getPlanes()[1].getRowStride();
        final int UV_PixelStride = image.getPlanes()[1].getPixelStride();

        // ---------------------------------------------------------------------

        // Reusable variables, stored here to not construct them in the loop.
        int Y_value = 0, U_value = 0, V_value = 0;
        int R = 0, G = 0, B = 0;
        int Y_index = 0;
        int UV_x = 0, UV_y = 0, UV_index = 0;
        int pixel_index = 0;

        // ---------------------------------------------------------------------

        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                Y_index = (y * Y_RowStride) + (x * Y_PixelStride);

                // Y plane should have positive values belonging to [0...255]
                Y_value = (Y_bytes[Y_index] & 0xff);

                UV_x = x / 2;
                UV_y = y / 2;

                // U/V Values are subsampled i.e. each pixel in U/V chanel in a
                // YUV_420 image act as chroma value for 4 neighbouring pixels
                UV_index = (UV_y * UV_RowStride) + (UV_x * UV_PixelStride);

                // U/V values ideally fall under [-0.5, 0.5] range. To fit them
                // into [0, 255] range they are scaled up and centered to 128.
                // Operation below brings U/V values to [-128, 127].
                U_value = (U_bytes[UV_index] & 0xff) - 128;
                V_value = (V_bytes[UV_index] & 0xff) - 128;

                // Compute RGB values from YUV.
                R = (int) (Y_value + 1.370705f * V_value);
                G = (int) (Y_value - (0.698001f * V_value) - (0.337633f * U_value));
                B = (int) (Y_value + 1.732446f * U_value);

                // Clamp R/G/B. Similar to: 'clamp(r, 0, 255)'
                R = R < 0 ? 0 : (R > 255 ? 255 : R);
                G = G < 0 ? 0 : (G > 255 ? 255 : G);
                B = B < 0 ? 0 : (B > 255 ? 255 : B);

                pixel_index = (x * 4) + ((y * 4) * width);
                ImageRGBA[pixel_index + 0] = (byte) R;
                ImageRGBA[pixel_index + 1] = (byte) G;
                ImageRGBA[pixel_index + 2] = (byte) B;
                ImageRGBA[pixel_index + 3] = (byte) 255; // A
            }
        }

        Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        bitmap.copyPixelsFromBuffer(ByteBuffer.wrap(ImageRGBA));
        return bitmap;
    }
}
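A possible usage sketch, assuming an ImageReader configured with ImageFormat.YUV_420_888 (the listener wiring below is illustrative, not part of the class above):

// Illustrative only: convert each incoming frame to a Bitmap.
// Assumes `reader` was created with ImageFormat.YUV_420_888.
ImageReader.OnImageAvailableListener listener = reader -> {
    Image image = reader.acquireLatestImage();
    if (image == null) {
        return;
    }
    try {
        Bitmap bitmap = Convert.YUV_420_888_to_ARGB_8888(image);
        // ... draw `bitmap`, e.g. on a SurfaceView canvas ...
    } finally {
        image.close();
    }
};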
    