iPhone real-time camera

I downloaded the ffmpeg library and built it for armv7, and I have successfully added the ffmpeg library files to my project. I can receive the live camera frames using AVFoundation.

Now the problem is: how do I convert the output of the iPhone camera stream into input that ffmpeg can decode? Here is my code:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMBlockBufferRef bufferData = CMSampleBufferGetDataBuffer(sampleBuffer);
    size_t lengthAtOffset;
    size_t totalLength;
    char *data;

    if (CMBlockBufferGetDataPointer(bufferData, 0, &lengthAtOffset, &totalLength, &data) != noErr) {
        NSLog(@"error!");
    }
}

Please suggest directly which ffmpeg library function is used for decoding and how I can pass the CMBlockBufferRef to it as input.

Thanks.
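For context, a minimal sketch of the path that is usually taken from here: the raw camera frame is pulled out of the sample buffer as a CVPixelBufferRef, wrapped in an AVFrame, and handed to libavcodec. Note that raw camera frames normally go to an encoder (a decoder works in the other direction). In the sketch, codecCtx is an assumed, already-configured encoder context, the capture session is assumed to deliver kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange (NV12), and the send/receive calls need ffmpeg 3.1 or newer; older builds would use avcodec_encode_video2 instead.

// At the top of the .m file:
#import <AVFoundation/AVFoundation.h>
#include <libavcodec/avcodec.h>
#include <libavutil/frame.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Raw camera pixels live in the sample buffer's image buffer, not its block buffer.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    AVFrame *frame = av_frame_alloc();
    frame->format = AV_PIX_FMT_NV12; // matches kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    frame->width  = (int)CVPixelBufferGetWidth(pixelBuffer);
    frame->height = (int)CVPixelBufferGetHeight(pixelBuffer);
    // Set frame->pts from CMSampleBufferGetPresentationTimeStamp() in the codec's time base.

    // Point the AVFrame at the two planes of the bi-planar pixel buffer.
    frame->data[0]     = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0); // Y
    frame->data[1]     = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1); // interleaved CbCr
    frame->linesize[0] = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    frame->linesize[1] = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);

    // Hand the frame to libavcodec (send/receive API, ffmpeg 3.1+); error handling trimmed.
    if (avcodec_send_frame(codecCtx, frame) == 0) {
        AVPacket *pkt = av_packet_alloc();
        while (avcodec_receive_packet(codecCtx, pkt) == 0) {
            // pkt->data / pkt->size now hold the encoded bitstream; write or stream it here.
            av_packet_unref(pkt);
        }
        av_packet_free(&pkt);
    }

    av_frame_free(&frame);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}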





1 answer


You can use OpenTok to stream live video:



https://github.com/opentok/opentok-ios-sdk
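For reference, a rough sketch of connecting to a session and publishing the camera with the OpenTok iOS SDK. The API key, session ID, and token placeholders are illustrative values you get from your OpenTok account, and the exact class and method names should be checked against the SDK version you actually link; the remaining required OTSessionDelegate / OTPublisherDelegate callbacks are omitted here.

#import <OpenTok/OpenTok.h> // header path may differ between SDK versions

static NSString *const kApiKey    = @"YOUR_API_KEY";    // placeholder
static NSString *const kSessionId = @"YOUR_SESSION_ID"; // placeholder
static NSString *const kToken     = @"YOUR_TOKEN";      // placeholder

@interface StreamController : NSObject <OTSessionDelegate, OTPublisherDelegate>
@end

@implementation StreamController {
    OTSession   *_session;
    OTPublisher *_publisher;
}

- (void)startStreaming {
    // Create the session and connect with the token generated on your server.
    _session = [[OTSession alloc] initWithApiKey:kApiKey
                                       sessionId:kSessionId
                                        delegate:self];
    OTError *error = nil;
    [_session connectWithToken:kToken error:&error];
}

// OTSessionDelegate: once connected, publish the device camera to the session.
- (void)sessionDidConnect:(OTSession *)session {
    _publisher = [[OTPublisher alloc] initWithDelegate:self];
    OTError *error = nil;
    [_session publish:_publisher error:&error];
}

@end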









