Adding a custom first frame to a video with AVFoundation

I am trying to control how videos generated by my app are displayed in the Photos app on iOS. Every video I create starts on a black frame, then content fades in and out, etc. When a video is saved to Photos, iOS takes that first frame (a black square) and uses it as the thumbnail. I would like to change this so I can provide my own thumbnail that makes the video easy to recognize.

Since I cannot find a built-in API for this, I am trying to hack around it by prepending a thumbnail I generate as the first frame of the video. I am using AVFoundation for this but am running into problems.

My code throws the following error:

[AVAssetReaderTrackOutput copyNextSampleBuffer] cannot copy next sample buffer before adding this output to an instance of AVAssetReader (using -addOutput:) and calling -startReading on that asset reader

even though I do call both -addOutput: and -startReading.

Here is my code:

// Source asset and the thumbnail image to prepend as the first frame
AVAsset *asset = [[AVURLAsset alloc] initWithURL:fileUrl options:nil];
UIImage *frame = [self generateThumbnail:asset];

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:640], AVVideoWidthKey,
                               [NSNumber numberWithInt:360], AVVideoHeightKey,
                               nil];

// Reader that supplies the original samples
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:asset error:nil];
AVAssetReaderOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[asset.tracks firstObject]
                                                                               outputSettings:nil];
[assetReader addOutput:readerOutput];

// Writer that produces the new file (thumbnail frame + original content)
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:path
                                                       fileType:AVFileTypeMPEG4
                                                          error:nil];
NSParameterAssert(videoWriter);

AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                     outputSettings:videoSettings];

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                 assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                 sourcePixelBufferAttributes:nil];

NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);

[videoWriter addInput:writerInput];

[assetReader startReading];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];

// Convert the thumbnail to a pixel buffer and append it as the very first frame (time zero)
CVPixelBufferRef buffer = [self pixelBufferFromCGImage:frame.CGImage andSize:frame.size];

BOOL append_ok = NO;
while (!append_ok) {
    if (adaptor.assetWriterInput.readyForMoreMediaData) {
        append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
        CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
        NSParameterAssert(bufferPool != NULL);

        [NSThread sleepForTimeInterval:0.05];
    } else {
        [NSThread sleepForTimeInterval:0.1];
    }
}
CVBufferRelease(buffer);

// Then copy the original samples across on a background queue
dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
[writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{
    CMSampleBufferRef nextBuffer;
    while (writerInput.readyForMoreMediaData) {
        nextBuffer = [readerOutput copyNextSampleBuffer];
        if(nextBuffer) {
            NSLog(@"Wrote: %zu bytes", CMSampleBufferGetTotalSampleSize(nextBuffer));
            [writerInput appendSampleBuffer:nextBuffer];
        } else {
            [writerInput markAsFinished];
            [videoWriter finishWritingWithCompletionHandler:^{
                //int res = videoWriter.status;
            }];
            break;
        }
    }
}];
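
For completeness, generateThumbnail: and pixelBufferFromCGImage:andSize: are roughly the usual boilerplate, along these lines (a sketch; the exact implementations may differ):

- (UIImage *)generateThumbnail:(AVAsset *)asset {
    // Grab a frame from the asset to use as the thumbnail image
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;
    CGImageRef cgImage = [generator copyCGImageAtTime:CMTimeMake(1, 1) actualTime:NULL error:NULL];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image andSize:(CGSize)size {
    // Render the CGImage into a new pixel buffer so it can be appended via the adaptor
    NSDictionary *options = @{ (id)kCVPixelBufferCGImageCompatibilityKey : @YES,
                               (id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES };
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                        kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options, &pixelBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                 size.width, size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 colorSpace, (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return pixelBuffer;
}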

      

I have tried several variations of this, to no avail. I suspected the file format might be the issue: I'm using an mp4 file (I'm not sure whether its codec is supported as input), but I couldn't get it to work even with an uncompressed .mov file (recorded with Photo Booth on a Mac).
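
For what it's worth, one way to inspect the source codec seems to be the video track's format description; this is just a diagnostic sketch and may not be the right approach:

AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
CMFormatDescriptionRef formatDesc =
    (__bridge CMFormatDescriptionRef)[videoTrack.formatDescriptions firstObject];
FourCharCode codec = CMFormatDescriptionGetMediaSubType(formatDesc); // e.g. 'avc1' for H.264
NSLog(@"codec: %c%c%c%c", (int)((codec >> 24) & 0xFF), (int)((codec >> 16) & 0xFF),
                          (int)((codec >> 8) & 0xFF), (int)(codec & 0xFF));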

Any ideas what I am doing wrong?

1 answer


I had the same problem.

Your asset reader is a local variable, so ARC releases it as soon as the method returns, while the block is still trying to copy sample buffers from readerOutput.

Once the reader has been deallocated, readerOutput is no longer attached to it, which is why you get the error telling you the output must be added to an AVAssetReader (and -startReading called on it).

The fix is to make sure the asset reader is not released too early, for example by storing it in a strong property.
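
A minimal sketch of what I mean (the class name is just a placeholder):

// Keep strong references so ARC does not deallocate the reader (and writer)
// while the requestMediaDataWhenReadyOnQueue: block is still running.
@interface VideoExporter ()   // placeholder class name
@property (nonatomic, strong) AVAssetReader *assetReader;
@property (nonatomic, strong) AVAssetWriter *videoWriter;
@end

// ...then assign to the properties instead of plain locals:
self.assetReader = [AVAssetReader assetReaderWithAsset:asset error:nil];
self.videoWriter = [[AVAssetWriter alloc] initWithURL:path fileType:AVFileTypeMPEG4 error:nil];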
