Convert an H.264-compressed CMSampleBuffer to an FFmpeg AVPacket

I am trying to convert the CMSampleBufferRef that VTCompressionSession delivers into an FFmpeg AVPacket. I saw that FFmpeg provides a function, av_read_frame(),

to read a packet from an encoded file. Using it, I can successfully get AVPackets from an H.264-encoded file (written, for example, with AVAssetWriter). So I looked into the av_read_frame() implementation, saw that it reads the image buffer out of the sample buffer, and tried the same approach in my VTCompressionSession output callback:

void VTCompressionOutputCallback(
    void *outputCallbackRefCon,
    void *sourceFrameRefCon,
    OSStatus status,
    VTEncodeInfoFlags infoFlags,
    CMSampleBufferRef sampleBuffer) {
    NSLog(@"VTCompressionSession Callback");
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
}


But it turns out that pixelBuffer is NULL here, and only blockBuffer is a valid output. So how do I get the encoded frame out of blockBuffer? More specifically, how can I build an AVPacket from blockBuffer here?







