AVAssetWriter / AVAssetWriterInputPixelBufferAdaptor - Black frames and frame rate

I capture the camera feed and record it to a movie file. The problem I'm running into is that after exporting, the movie starts with a couple of seconds of black frames (relative to when recording actually started).

I think it has something to do with [self.assetWriter startSessionAtSourceTime:kCMTimeZero];

I had a semi-working solution in which a frameStart variable is simply incremented in the sample buffer delegate method.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    frameStart++;
    if (self.startRecording == YES) {

        static int64_t frameNumber = 0;
        if(self.assetWriterInput.readyForMoreMediaData) {
            [self.pixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:CMTimeMake(frameNumber, 25)];
        }
        frameNumber++;
    }
}


and then called this method when the user tapped the button:

[self.assetWriter startSessionAtSourceTime:CMTimeMake(frameStart,25)];


It works, but only once: if I record a second movie, the black frames come back.

Also, when I look at the rendered movie, the frame rate is 25 frames per second, as I want it to be, but the video looks sped up, as if there isn't enough time between frames. The movie plays back about twice as fast.

NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:640], AVVideoWidthKey, [NSNumber numberWithInt:480], AVVideoHeightKey, AVVideoCodecH264, AVVideoCodecKey, nil];

self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
self.assetWriterInput.expectsMediaDataInRealTime = YES;






2 answers


You don't need to calculate frame timestamps yourself. You can get the timestamp of the current sample buffer with:

CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);




However, it looks like you are just passing the pixel buffer to the adaptor unchanged. Wouldn't it be simpler to append the sample buffer directly to the assetWriterInput, like this:

[self.assetWriterInput appendSampleBuffer:sampleBuffer];
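Putting both points together, the delegate method could be reduced to something like this. This is an untested sketch; `sessionStarted` is a hypothetical BOOL property (not in the original code) used to start the writer session lazily at the timestamp of the first captured frame:

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {

    if (!self.startRecording) {
        return;
    }

    // Use the capture pipeline's own timestamp instead of a frame counter.
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    // Start the session at the first real frame's timestamp, so no black
    // gap is written between time zero and the first appended frame.
    if (!self.sessionStarted) {
        [self.assetWriter startSessionAtSourceTime:timestamp];
        self.sessionStarted = YES;
    }

    // Append the sample buffer directly; the sample buffer already carries
    // its presentation time, so no pixel buffer adaptor is needed.
    if (self.assetWriterInput.readyForMoreMediaData) {
        [self.assetWriterInput appendSampleBuffer:sampleBuffer];
    }
}
```

Because the timestamps now come from the capture device, playback speed matches the actual capture rate, and resetting `sessionStarted` to NO after finishing a recording lets the next recording start cleanly.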






First of all, why are you incrementing two separate counters (frameStart and frameNumber) on every frame? Keep one and remove the other; the mismatch between the session's source time and the frames' presentation times is what distorts the playback speed.

Second, do you reset frameNumber to 0 when you finish recording? If not, that is likely why the second recording fails; if you do, I'd need more detail about what's going on here.
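For instance, if frameNumber is promoted from a static local variable (which cannot be reset from outside the delegate method) to an instance property, a stop method could reset it between recordings. This is a sketch under that assumption; `stopRecording` and the `frameNumber` property are hypothetical names:

```objc
- (void)stopRecording {
    self.startRecording = NO;
    [self.assetWriterInput markAsFinished];
    [self.assetWriter finishWritingWithCompletionHandler:^{
        // Reset the counter so the next recording's timestamps
        // start from zero again instead of where the last one ended.
        self.frameNumber = 0;
    }];
}
```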



Regards









