How to draw on a CVPixelBufferRef that is planar / YCbCr / 420f / YUV / NV12 / not RGB?

I get a CMSampleBufferRef from a system API that contains CVPixelBufferRefs that are not RGBA (linear pixels). The buffer contains planar pixels (such as 420f, aka kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, aka YCbCr, aka YUV).

I would like to do some manipulation of this video data before sending it to VideoToolbox for encoding to h264 (drawing text, overlaying a logo, rotating the image, etc.), but I'd like it to be efficient and real-time. Buuuut planar image data looks suuuper messy to work with: there is a chroma plane and a luma plane, they are different sizes, and... dealing with this at the byte level seems like a lot of work.

I could probably use a CGContextRef and just paint right on top of the pixels, but from what I can gather it only supports RGBA pixels. Any advice on how I can do this with as little data copying as possible, yet also in as few lines of code as possible?
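Just for illustration, this is roughly the kind of byte-level work I'm trying to avoid (a rough sketch only, assuming the usual bi-planar layout with full-resolution luma in plane 0 and half-resolution interleaved CbCr in plane 1):

CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// Plane 0: 8-bit luma (Y), full resolution
uint8_t *yPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
size_t yBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);

// Plane 1: interleaved CbCr, half resolution in both dimensions,
// so its size and stride differ from plane 0
uint8_t *cbcrPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
size_t cbcrBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
size_t cbcrHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1);

// ...any drawing would have to touch both planes separately, in YUV...

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);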



1 answer


CGBitmapContextRef can only paint into something like 32ARGB; that much is correct. This means that you will want to create ARGB (or RGBA) buffers, and then find a way to transfer YUV pixels onto that ARGB surface very quickly. The recipe involves using CoreImage, a home-made pool of CVPixelBufferRefs, a CGBitmapContextRef referencing your home-made pixel buffers, and then recreating a CMSampleBufferRef that resembles your input buffer but references your output pixels. In other words:

  • Fetch the incoming pixels into a CIImage.
  • Build a CVPixelBufferPool matching the pixel dimensions and output format of the buffers you will create. You don't want to create CVPixelBuffers without a pool in real time: you will run out of memory if your producer is too fast, you will fragment your RAM because you won't be reusing buffers, and it is a waste of cycles.
  • Build a CIContext with the default constructor and share it between buffers. It holds no external state, but the documentation says that recreating it on every frame is very expensive.
  • On each incoming frame, create a new pixel buffer from the pool. Remember to use an allocation threshold so you don't get runaway RAM usage.
  • Lock the pixel buffer.
  • Create a bitmap context referencing the bytes in the pixel buffer.
  • Use the CIContext to render the planar image data into the linear buffer.
  • Perform your app-specific drawing in the CGContext!
  • Unlock the pixel buffer.
  • Fetch the timing info of the original sample buffer.
  • Create a CMVideoFormatDescriptionRef by asking the pixel buffer for its exact format.
  • Create a sample buffer for the pixel buffer. Done!


Here's an example implementation, where I chose 32ARGB as the image format to work with, because it is something that both CGBitmapContext and CoreVideo can handle on iOS:

{
    CVPixelBufferPoolRef _pool;
    CGSize _poolBufferDimensions;
    CIContext *_imageContext; // shared CIContext (step 3), created once and reused
}
- (void)_processSampleBuffer:(CMSampleBufferRef)inputBuffer
{
    // 1. Input data
    CVPixelBufferRef inputPixels = CMSampleBufferGetImageBuffer(inputBuffer);
    CIImage *inputImage = [CIImage imageWithCVPixelBuffer:inputPixels];

    // 2. Create a new pool if the old pool doesn't have the right format.
    CGSize bufferDimensions = {CVPixelBufferGetWidth(inputPixels), CVPixelBufferGetHeight(inputPixels)};
    if(!_pool || !CGSizeEqualToSize(bufferDimensions, _poolBufferDimensions)) {
        if(_pool) {
            CFRelease(_pool);
        }
        OSStatus ok0 = CVPixelBufferPoolCreate(NULL,
            NULL, // pool attrs
            (__bridge CFDictionaryRef)(@{
                (id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32ARGB),
                (id)kCVPixelBufferWidthKey: @(bufferDimensions.width),
                (id)kCVPixelBufferHeightKey: @(bufferDimensions.height),
            }), // buffer attrs
            &_pool
        );
        _poolBufferDimensions = bufferDimensions;
        assert(ok0 == noErr);
    }
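
    // 3. The shared CIContext (_imageContext) is created once elsewhere and
    //    reused for every frame (see the note after this listing).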

    // 4. Create pixel buffer
    CVPixelBufferRef outputPixels;
    OSStatus ok1 = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(NULL,
        _pool,
        (__bridge CFDictionaryRef)@{
            // Opt to fail buffer creation in case of slow buffer consumption
            // rather than to exhaust all memory.
            (__bridge id)kCVPixelBufferPoolAllocationThresholdKey: @20
        }, // aux attributes
        &outputPixels
    );
    if(ok1 == kCVReturnWouldExceedAllocationThreshold) {
        // Dropping frame because consumer is too slow
        return;
    }
    assert(ok1 == noErr);

    // 5, 6. Graphics context to draw in
    CGColorSpaceRef deviceColors = CGColorSpaceCreateDeviceRGB();
    OSStatus ok2 = CVPixelBufferLockBaseAddress(outputPixels, 0);
    assert(ok2 == noErr);
    CGContextRef cg = CGBitmapContextCreate(
        CVPixelBufferGetBaseAddress(outputPixels), // bytes
        CVPixelBufferGetWidth(inputPixels), CVPixelBufferGetHeight(inputPixels), // dimensions
        8, // bits per component
        CVPixelBufferGetBytesPerRow(outputPixels), // bytes per row
        deviceColors, // color space
        kCGImageAlphaPremultipliedFirst // bitmap info
    );
    CFRelease(deviceColors);
    assert(cg != NULL);

    // 7
    [_imageContext render:inputImage toCVPixelBuffer:outputPixels];

    // 8. DRAW
    CGContextSetRGBFillColor(cg, 0.5, 0, 0, 1);
    CGContextSetTextDrawingMode(cg, kCGTextFill);
    NSAttributedString *text = [[NSAttributedString alloc] initWithString:@"Hello world" attributes:NULL];
    CTLineRef line = CTLineCreateWithAttributedString((__bridge CFAttributedStringRef)text);
    CTLineDraw(line, cg);
    CFRelease(line);

    // 9. Unlock and stop drawing
    CFRelease(cg);
    CVPixelBufferUnlockBaseAddress(outputPixels, 0);

    // 10. Timings
    CMSampleTimingInfo timingInfo;
    OSStatus ok4 = CMSampleBufferGetSampleTimingInfo(inputBuffer, 0, &timingInfo);
    assert(ok4 == noErr);

    // 11. Video format
    CMVideoFormatDescriptionRef videoFormat;
    OSStatus ok5 = CMVideoFormatDescriptionCreateForImageBuffer(NULL, outputPixels, &videoFormat);
    assert(ok5 == noErr);

    // 12. Output sample buffer
    CMSampleBufferRef outputBuffer;
    OSStatus ok3 = CMSampleBufferCreateForImageBuffer(NULL, // allocator
        outputPixels, // image buffer 
        YES, // data ready
        NULL, // make ready callback
        NULL, // make ready refcon
        videoFormat,
        &timingInfo, // timing info
        &outputBuffer // out
    );
    assert(ok3 == noErr);

    [_consumer consumeSampleBuffer:outputBuffer];
    CFRelease(outputPixels);
    CFRelease(videoFormat);
    CFRelease(outputBuffer);
}
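
The example above assumes an _imageContext ivar (the shared CIContext from step 3) that already exists; its creation isn't shown. A minimal sketch, assuming the default Core Image options are acceptable, would be to create it once, for example in init:

- (instancetype)init
{
    if((self = [super init])) {
        // Step 3: create the shared CIContext once; recreating it per frame is expensive.
        // contextWithOptions:nil uses the default renderer (GPU-backed where available).
        _imageContext = [CIContext contextWithOptions:nil];
    }
    return self;
}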

      
