iOS: audio is missing in exported video

I am trying to export a recorded video, and that part succeeds, but the audio is missing from the final exported video. So I searched around and added the code below to handle the audio:

if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0)
{
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
}

But after adding the code above I can no longer save the video. I get this error:

"session.status 4 error Error Domain = AVFoundationErrorDomain Code = -11841" Operation stopped "UserInfo = 0x17027e140 {NSLocalizedDescription = Operation stopped, NSLocalizedFailureReason = Video could not be compiled.}"

- (void)exportDidFinish:(AVAssetExportSession *)session {
    NSLog(@"session.status %ld error %@", session.status, session.error);
}

Here is the code I use to export the video. Does anyone have any ideas on how I can export the video with the audio included? Thank you!

- (void)getVideoOutput {
exportInProgress=YES;
NSLog(@"videoOutputFileUrl %@",videoOutputFileUrl);
AVAsset *videoAsset = [AVAsset assetWithURL:videoOutputFileUrl];
NSLog(@"videoAsset %@",videoAsset);
// 1 - Early exit if there is no video file selected

NSLog(@"video asset %@",videoAsset);

if (!videoAsset) {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                    message:@"Please Load a Video Asset First"
                                                   delegate:nil
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
    return;
}



// 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.

AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];



// 3 - Video track

AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                    ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                     atTime:kCMTimeZero error:nil];

/* getting an error AVAssetExportSessionStatusFailed
if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0)
{
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
}*/


// 3.1 - Create AVMutableVideoCompositionInstruction

AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];

mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);



// 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.

AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

UIImageOrientation videoAssetOrientation_  = UIImageOrientationUp;

BOOL isVideoAssetPortrait_  = NO;

CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;

if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {

    videoAssetOrientation_ = UIImageOrientationRight;

    isVideoAssetPortrait_ = YES;

}

if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {

    videoAssetOrientation_ =  UIImageOrientationLeft;

    isVideoAssetPortrait_ = YES;

}

if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {

    videoAssetOrientation_ =  UIImageOrientationUp;

}

if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {

    videoAssetOrientation_ = UIImageOrientationDown;

}

[videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];

[videolayerInstruction setOpacity:0.0 atTime:videoAsset.duration];



// 3.3 - Add instructions

mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];



AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];



CGSize naturalSize;

if(isVideoAssetPortrait_){

    naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);

} else {

    naturalSize = videoAssetTrack.naturalSize;

}



float renderWidth, renderHeight;

renderWidth = naturalSize.width;

renderHeight = naturalSize.height;

mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);

mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];

mainCompositionInst.frameDuration = CMTimeMake(1, 30);


int totalSeconds= (int) CMTimeGetSeconds(videoAsset.duration);

[self applyVideoEffectsToComposition:mainCompositionInst size:naturalSize videoDuration:totalSeconds];



// 4 - Get path

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);

NSString *documentsDirectory = [paths objectAtIndex:0];

NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                        [NSString stringWithFormat:@"FinalVideo-%d.mov", arc4random() % 1000]];

NSURL *url = [NSURL fileURLWithPath:myPathDocs];



// 5 - Create exporter

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                  presetName:AVAssetExportPresetHighestQuality];

exporter.outputURL=url;

exporter.outputFileType = AVFileTypeQuickTimeMovie;

exporter.shouldOptimizeForNetworkUse = YES;

exporter.videoComposition = mainCompositionInst;


[exporter exportAsynchronouslyWithCompletionHandler:^{
    //dispatch_async(dispatch_get_main_queue(), ^{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self exportDidFinish:exporter];
    });
}];
}



1 answer


I'm not sure if this helps, but this is how I did it in my project:

  • Prepare the final composition

    AVMutableComposition *composition = [[AVMutableComposition alloc] init];

  • Prepare the video track

    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

  • Prepare an audio track

    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

  • Insert video data from an asset into a video track

    AVAssetTrack *video = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:video atTime:kCMTimeZero error:&error];

  • Insert audio data from asset into audio track

    AVAssetTrack *audio = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:audio atTime:kCMTimeZero error:&error];

  • Then you can add instructions for processing your video and/or audio data (see the consolidated sketch after this list)

  • Finally, you should be able to export using:

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
    [exporter exportAsynchronouslyWithCompletionHandler:^{ /* code when the export is complete */ }];
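
Pulling these steps together, a minimal sketch might look like the following. The names `asset` (the recorded AVAsset) and `outputURL` (a writable destination file URL) are placeholders here, not the exact code from my project; any video composition or effects would be assigned to `exporter.videoComposition` before exporting.

    NSError *error = nil;

    // Composition with separate video and audio tracks
    AVMutableComposition *composition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];

    // Copy the source media into the matching composition tracks
    AVAssetTrack *video = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    AVAssetTrack *audio = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:video atTime:kCMTimeZero error:&error];
    if (audio) {
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:audio atTime:kCMTimeZero error:&error];
    }

    // Export; set exporter.videoComposition here if you apply instructions/effects
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                      presetName:AVAssetExportPresetMediumQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"export status %ld error %@", (long)exporter.status, exporter.error);
    }];

The important difference from the code in the question is that the audio is inserted into its own AVMediaTypeAudio composition track rather than into the video track.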
    
          



Also, check that the audio is actually being recorded.
The first time the app uses the camera, iOS asks for permission to use the microphone. Check in the device Settings whether microphone access is allowed.
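
If you want to check or request that permission from code, AVAudioSession can do it (a small sketch; available on iOS 7 and later):

    #import <AVFoundation/AVFoundation.h>

    // Prompts the user the first time; afterwards it reports the stored decision
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        NSLog(@"Microphone access %@", granted ? @"granted" : @"denied");
    }];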

Another option: you can retrieve the original asset from the device via Window > Devices in Xcode.
Select your device and export the app's data to your computer. Then find the recorded asset and open it with, for example, VLC. Inspect the streams with Cmd+I to see whether both an audio and a video track are present.
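
As an alternative to VLC, you can also inspect the recorded file directly in code (this assumes `videoOutputFileUrl` from the question points at the recording):

    AVAsset *recorded = [AVAsset assetWithURL:videoOutputFileUrl];
    NSLog(@"video tracks: %lu, audio tracks: %lu",
          (unsigned long)[[recorded tracksWithMediaType:AVMediaTypeVideo] count],
          (unsigned long)[[recorded tracksWithMediaType:AVMediaTypeAudio] count]);

If the audio track count is 0, the problem is in the recording itself, not in the export.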
