kCVPixelFormatType_420YpCbCr8BiPlanarFullRange frame to UIImage conversion

Question (8 votes, 3 answers)

I have an app that captures live video in the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format in order to process the Y channel. According to Apple's documentation:

kCVPixelFormatType_420YpCbCr8BiPlanarFullRange Bi-Planar Component Y'CbCr 8-bit 4:2:0, full-range (luma=[0,255] chroma=[1,255]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct.
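
For context, a video data output is asked for this format via its videoSettings dictionary, along these lines (a minimal sketch of the setup, not from the original code):

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                               @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };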

I would like to present some of these frames in a UIViewController. Is there any API that converts to the kCVPixelFormatType_32BGRA format? Could you give some hints on adapting this method provided by Apple?

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer  {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

Thanks!

iphone objective-c ios avfoundation
3 Answers

15 votes

I'm not aware of any accessible built-in way to convert a bi-planar Y/CbCr image to RGB in iOS. However, you should be able to perform the conversion yourself in software, e.g.:

uint8_t clamp(int16_t input)
{
    // clamp negative numbers to 0; assumes signed shifts
    // (a valid assumption on iOS): after integer promotion,
    // input >> 16 is -1 for negative inputs and 0 otherwise
    input &= ~(input >> 16);

    // clamp numbers greater than 255 to 255; the accumulation
    // of the mask looks odd but is an attempt to avoid
    // pipeline stalls (e.g. for input = 300, input >> 8 is 1 and
    // the three ORs smear that bit into a 0xff mask; the trick is
    // safe for inputs below 512, which this conversion stays under)
    uint8_t saturationMask = input >> 8;
    saturationMask |= saturationMask << 4;
    saturationMask |= saturationMask << 2;
    saturationMask |= saturationMask << 1;
    input |= saturationMask;

    return input & 0xff;
}

...

CVPixelBufferLockBaseAddress(imageBuffer, 0);

size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);

uint8_t *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
CVPlanarPixelBufferInfo_YCbCrBiPlanar *bufferInfo = (CVPlanarPixelBufferInfo_YCbCrBiPlanar *)baseAddress;

NSUInteger yOffset = EndianU32_BtoN(bufferInfo->componentInfoY.offset);
NSUInteger yPitch = EndianU32_BtoN(bufferInfo->componentInfoY.rowBytes);

NSUInteger cbCrOffset = EndianU32_BtoN(bufferInfo->componentInfoCbCr.offset);
NSUInteger cbCrPitch = EndianU32_BtoN(bufferInfo->componentInfoCbCr.rowBytes);

uint8_t *rgbBuffer = malloc(width * height * 3);
uint8_t *yBuffer = baseAddress + yOffset;
uint8_t *cbCrBuffer = baseAddress + cbCrOffset;

for(int y = 0; y < height; y++)
{
    uint8_t *rgbBufferLine = &rgbBuffer[y * width * 3];
    uint8_t *yBufferLine = &yBuffer[y * yPitch];
    uint8_t *cbCrBufferLine = &cbCrBuffer[(y >> 1) * cbCrPitch];

    for(int x = 0; x < width; x++)
    {
        // int16_t rather than uint8_t: these values go negative
        // once the bias is removed
        int16_t yp = yBufferLine[x] - 16;
        int16_t cb = cbCrBufferLine[x & ~1] - 128;
        int16_t cr = cbCrBufferLine[x | 1] - 128;

        uint8_t *rgbOutput = &rgbBufferLine[x*3];

        // from ITU-R BT.601, rounded to integers; +128 before the
        // shift rounds to nearest
        rgbOutput[0] = clamp((298 * yp + 409 * cr + 128) >> 8);
        rgbOutput[1] = clamp((298 * yp - 100 * cb - 208 * cr + 128) >> 8);
        rgbOutput[2] = clamp((298 * yp + 516 * cb + 128) >> 8);
    }

}
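
(For reference, those integer coefficients are the fixed-point form of the video-range BT.601 equations, e.g. R = 1.164(Y - 16) + 1.596(Cr - 128), scaled by 256 and rounded: 1.164 * 256 ≈ 298 and 1.596 * 256 ≈ 409, with +128 added before the shift so truncation rounds to nearest. Strictly speaking the question's format is full-range, so these video-range equations will clip slightly near black and white; the next answer uses full-range coefficients instead.)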

This was written straight into this box and is untested, but I believe the cb/cr extraction is correct. You can then use rgbBuffer with CGBitmapContextCreate to create a CGImage, and from that a UIImage.
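
For that last step, something like the following should work (a minimal sketch, untested; note that iOS bitmap contexts don't accept 24-bit-per-pixel RGB, so the buffer is wrapped with CGImageCreate and a data provider rather than CGBitmapContextCreate):

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, rgbBuffer,
                                                          width * height * 3, NULL);
CGImageRef cgImage = CGImageCreate(width, height, 8, 24, width * 3,
                                   colorSpace, kCGImageAlphaNone, provider,
                                   NULL, NO, kCGRenderingIntentDefault);
UIImage *image = [UIImage imageWithCGImage:cgImage];

// the provider does not copy rgbBuffer: keep the buffer alive for the
// lifetime of the image, or pass a release callback instead of NULL above
CGImageRelease(cgImage);
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorSpace);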


15 votes

Most implementations I've found (including the previous answer here) won't work if you change the videoOrientation on the AVCaptureConnection (for some reason I don't fully understand, the CVPlanarPixelBufferInfo_YCbCrBiPlanar struct will be empty in that case), so I wrote one that does (most of the code was based on this answer). My version also adds an empty alpha channel to the RGB buffer and creates the CGBitmapContext with the kCGImageAlphaNoneSkipLast flag (there's no alpha data, but iOS seems to require 4 bytes per pixel). Here it is:

#define clamp(a) ((a) > 255 ? 255 : ((a) < 0 ? 0 : (a)))

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer,0);

    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    uint8_t *yBuffer = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t yPitch = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);
    uint8_t *cbCrBuffer = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 1);
    size_t cbCrPitch = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 1);

    int bytesPerPixel = 4;
    uint8_t *rgbBuffer = malloc(width * height * bytesPerPixel);

    for(int y = 0; y < height; y++) {
        uint8_t *rgbBufferLine = &rgbBuffer[y * width * bytesPerPixel];
        uint8_t *yBufferLine = &yBuffer[y * yPitch];
        uint8_t *cbCrBufferLine = &cbCrBuffer[(y >> 1) * cbCrPitch];

        for(int x = 0; x < width; x++) {
            // full-range data, so no -16 bias on the luma here
            int16_t yp = yBufferLine[x];
            int16_t cb = cbCrBufferLine[x & ~1] - 128;
            int16_t cr = cbCrBufferLine[x | 1] - 128;

            uint8_t *rgbOutput = &rgbBufferLine[x*bytesPerPixel];

            // approximate full-range BT.601 coefficients
            int16_t r = (int16_t)roundf( yp + cr *  1.4 );
            int16_t g = (int16_t)roundf( yp + cb * -0.343 + cr * -0.711 );
            int16_t b = (int16_t)roundf( yp + cb *  1.765 );

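            // with kCGBitmapByteOrder32Little and kCGImageAlphaNoneSkipLast
            // the in-memory byte order per pixel is (padding, B, G, R)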
            rgbOutput[0] = 0xff;
            rgbOutput[1] = clamp(b);
            rgbOutput[2] = clamp(g);
            rgbOutput[3] = clamp(r);
        }
    }

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(rgbBuffer, width, height, 8, width * bytesPerPixel, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipLast);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CGImageRelease(quartzImage);
    free(rgbBuffer);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    return image;
}
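
In case it's useful, this is roughly how such a method is driven from the capture callback (a sketch; it assumes your class adopts AVCaptureVideoDataOutputSampleBufferDelegate, and imageView is a hypothetical UIImageView property):

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image; // hypothetical outlet
    });
}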

0 votes

These other answers, with their bit shifting and magic variables, are wild. Here's an alternative approach using the Accelerate framework in Swift 5. It takes a frame from a buffer whose pixel format is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange (Bi-Planar Component Y'CbCr 8-bit 4:2:0), converts it to ARGB8888, and returns a UIImage. But you can modify it to handle any input/output format:

import Accelerate
import CoreGraphics
import CoreMedia
import Foundation
import QuartzCore
import UIKit

func createImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
      return nil
    }

    // pixel format is Bi-Planar Component Y'CbCr 8-bit 4:2:0, full-range (luma=[0,255] chroma=[1,255]).
    // baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct.
    //
    guard CVPixelBufferGetPixelFormatType(imageBuffer) == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange else {
        return nil
    }


    guard CVPixelBufferLockBaseAddress(imageBuffer, .readOnly) == kCVReturnSuccess else {
        return nil
    }

    defer {
        // be sure to unlock the base address before returning
        CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly)
    }

    // 1st plane is luminance, 2nd plane is chrominance
    guard CVPixelBufferGetPlaneCount(imageBuffer) == 2 else {
        return nil
    }

    // 1st plane
    guard let lumaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0) else {
        return nil
    }

    let lumaWidth = CVPixelBufferGetWidthOfPlane(imageBuffer, 0)
    let lumaHeight = CVPixelBufferGetHeightOfPlane(imageBuffer, 0)
    let lumaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0)
    var lumaBuffer = vImage_Buffer(
        data: lumaBaseAddress,
        height: vImagePixelCount(lumaHeight),
        width: vImagePixelCount(lumaWidth),
        rowBytes: lumaBytesPerRow
    )

    // 2nd plane
    guard let chromaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 1) else {
        return nil
    }

    let chromaWidth = CVPixelBufferGetWidthOfPlane(imageBuffer, 1)
    let chromaHeight = CVPixelBufferGetHeightOfPlane(imageBuffer, 1)
    let chromaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 1)
    var chromaBuffer = vImage_Buffer(
        data: chromaBaseAddress,
        height: vImagePixelCount(chromaHeight),
        width: vImagePixelCount(chromaWidth),
        rowBytes: chromaBytesPerRow
    )

    var argbBuffer = vImage_Buffer()

    defer {
        // we are responsible for freeing the buffer data
        free(argbBuffer.data)
    }

    // initialize the empty buffer
    guard vImageBuffer_Init(
        &argbBuffer,
        lumaBuffer.height,
        lumaBuffer.width,
        32,
        vImage_Flags(kvImageNoFlags)
        ) == kvImageNoError else {
            return nil
    }

    // full range 8-bit, clamped to full range, is necessary for correct color reproduction
    var pixelRange = vImage_YpCbCrPixelRange(
        Yp_bias: 0,
        CbCr_bias: 128,
        YpRangeMax: 255,
        CbCrRangeMax: 255,
        YpMax: 255,
        YpMin: 1,
        CbCrMax: 255,
        CbCrMin: 0
    )

    var conversionInfo = vImage_YpCbCrToARGB()

    // initialize the conversion info
    guard vImageConvert_YpCbCrToARGB_GenerateConversion(
        kvImage_YpCbCrToARGBMatrix_ITU_R_601_4, // Y'CbCr-to-RGB conversion matrix for ITU Recommendation BT.601-4.
        &pixelRange,
        &conversionInfo,
        kvImage420Yp8_CbCr8, // converting from
        kvImageARGB8888, // converting to
        vImage_Flags(kvImageNoFlags)
        ) == kvImageNoError else {
            return nil
    }

    // do the conversion
    guard vImageConvert_420Yp8_CbCr8ToARGB8888(
        &lumaBuffer, // in
        &chromaBuffer, // in
        &argbBuffer, // out
        &conversionInfo,
        nil, // permuteMap: nil keeps the default ARGB channel order
        255, // alpha value for the output's A channel (fully opaque)
        vImage_Flags(kvImageNoFlags)
        ) == kvImageNoError else {
            return nil
    }

    // Core Foundation objects are automatically memory-managed in Swift; no need to call CGContextRelease() or CGColorSpaceRelease()
    guard let context = CGContext(
        data: argbBuffer.data,
        width: Int(argbBuffer.width),
        height: Int(argbBuffer.height),
        bitsPerComponent: 8,
        bytesPerRow: argbBuffer.rowBytes,
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
        ) else {
            return nil
    }

    guard let cgImage = context.makeImage() else {
        return nil
    }

    return UIImage(cgImage: cgImage) 
}
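
A note on the vImage_YpCbCrPixelRange above: those are the values for full-range 8-bit content (Yp_bias 0, CbCr_bias 128, both ranges spanning the full byte). For video-range buffers (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) you would instead use, as far as I can tell from Apple's vImage documentation, Yp_bias 16 with YpRangeMax 235, CbCrRangeMax 240, and the matching min/max clamps.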