CIImage back to CMSampleBuffer

I am recording video (an .mp4 file) using AVAssetWriter with CMSampleBuffer data from the video and audio inputs.

While recording, I want to process the frames, so I convert each CMSampleBuffer to a CIImage and process it.

But how do I update the CMSampleBuffer with the newly rendered image buffer from the CIImage?

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if output == videoOutput {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: imageBuffer)
        // ... my code to process the CIImage (for example, add augmented reality)
        // But how do I convert it back to a CMSampleBuffer?
        // AVAssetWriterInput needs a CMSampleBuffer to encode the video/audio into the file.
    }
    // ...
}


1 answer


You need to render your CIImage into a CVPixelBuffer using the CIContext method render(_:to:bounds:colorSpace:).

Then you can create a CMSampleBuffer from the CVPixelBuffer using, for example, CMSampleBufferCreateReadyWithImageBuffer(_:_:_:_:_:), which also needs a video format description and the timing information of the original sample.
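As a rough sketch of those two steps, something like the following should work inside the capture callback. The `ciContext` instance and the `pixelBuffer` parameter (which you would normally obtain from a pool, see below) are assumptions, not part of any fixed API:

```swift
import AVFoundation
import CoreImage

// Reuse one CIContext; creating it per frame is expensive.
let ciContext = CIContext()

/// Renders `image` into `pixelBuffer` and wraps it in a new CMSampleBuffer
/// that reuses the timing of the original sample buffer.
func makeSampleBuffer(from image: CIImage,
                      copyingTimingOf original: CMSampleBuffer,
                      into pixelBuffer: CVPixelBuffer) -> CMSampleBuffer? {
    // 1. Render the processed CIImage into the pixel buffer.
    ciContext.render(image, to: pixelBuffer)

    // 2. Build a format description for the pixel buffer.
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    // 3. Copy the presentation timing from the original sample buffer.
    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(original, at: 0, timingInfoOut: &timing)

    // 4. Create a ready-to-use CMSampleBuffer around the pixel buffer.
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer,
                                             formatDescription: format,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}
```

The returned CMSampleBuffer can then be passed to your AVAssetWriterInput's append(_:) as usual.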



You may want to use a CVPixelBuffer pool for efficiency reasons, as shown in Apple's AVCamPhotoFilter sample code. In particular, see the RosyCIRenderer class.
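A minimal sketch of such a pool, assuming a 1920x1080 BGRA camera format (match these attributes to your actual session output):

```swift
import CoreVideo

// Pool configuration: keep a few buffers ready for reuse.
let poolAttributes: [String: Any] = [
    kCVPixelBufferPoolMinimumBufferCountKey as String: 3
]
// Attributes of the buffers the pool hands out; the size and pixel
// format here are assumptions and should match your capture format.
let pixelBufferAttributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
    kCVPixelBufferWidthKey as String: 1920,
    kCVPixelBufferHeightKey as String: 1080,
    kCVPixelBufferIOSurfacePropertiesKey as String: [:]
]

var pool: CVPixelBufferPool?
CVPixelBufferPoolCreate(kCFAllocatorDefault,
                        poolAttributes as CFDictionary,
                        pixelBufferAttributes as CFDictionary,
                        &pool)

// Per frame: take a buffer from the pool instead of allocating a new one.
var pixelBuffer: CVPixelBuffer?
if let pool = pool {
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer)
}
```

Rendering into pooled buffers avoids a fresh allocation on every frame, which matters at 30 or 60 fps.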

Also see this answer, which might help you: Applying CIFilter to a video file and saving it.
