How do I get AVCaptureStillImageOutput with the same aspect ratio as AVCaptureVideoPreviewLayer?

I am capturing an image using AVFoundation. I use an AVCaptureVideoPreviewLayer to display the camera feed on screen, and the preview layer is sized to the bounds of a UIView with dynamic dimensions:

AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [self.cameraFeedView layer];
[rootLayer setMasksToBounds:YES];
previewLayer.frame = rootLayer.bounds;
[rootLayer insertSublayer:previewLayer atIndex:0];
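
Since cameraFeedView has dynamic dimensions, I also keep the layer's frame in sync with the view. A minimal sketch, assuming the layer is stored in a previewLayer property on the owning view controller:

- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    // Re-apply the bounds whenever the view is laid out
    self.previewLayer.frame = self.cameraFeedView.layer.bounds;
}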

And I use AVCaptureStillImageOutput to capture the image:

AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                              completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                                                  if (imageDataSampleBuffer != NULL) {
                                                      NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                                                      // Full-sensor image, e.g. 1280x960 from the front camera
                                                      UIImage *capturedImage = [UIImage imageWithData:imageData];
                                                  }
                                              }];
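
(For context, videoConnection isn't defined in the snippet above; it would typically be obtained from the output after it is added to the session, along these lines — an assumed setup, not shown in the original:)

// Assumed setup, not shown above:
stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
if ([session canAddOutput:stillImageOutput]) {
    [session addOutput:stillImageOutput];
}
AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];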


My problem is that the captured image comes back at the camera's full resolution (1280x960 from the front camera), but I need it to have the same aspect ratio as the preview layer. For example, if the preview layer is 150x100, I need the captured image to be 960x640. Is there any solution for this?

1 answer


I ran into the same problem. You have to crop or resize the still image yourself, but you should take note of the scale and orientation of the output image.

Set a square preview frame:

CGFloat width = CGRectGetWidth(self.view.bounds);
[self.captureVideoPreviewLayer setFrame:CGRectMake(0, 0, width, width)];
[self.cameraView.layer addSublayer:self.captureVideoPreviewLayer];

Calculate the crop rect for the captured image:

[self.captureStillImageOutput captureStillImageAsynchronouslyFromConnection:captureConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:data];

    // The underlying CGImage is landscape; the UIImage only appears portrait
    // via its orientation flag (always UIImageOrientationRight here). So the
    // crop rect is expressed in the landscape CGImage's coordinate space:
    // a centered square whose side is the image's short edge.
    CGRect cropRect = CGRectMake((image.size.height - image.size.width) / 2, 0, image.size.width, image.size.width);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect);
    // Reapply the original scale and orientation so the crop displays upright
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef scale:image.scale orientation:image.imageOrientation];
    CGImageRelease(imageRef);
}];
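
If the preview layer is not square (as in the question, where the view has dynamic dimensions), a more general approach is to ask the preview layer itself which part of the capture is visible. This is a minimal sketch, assuming previewLayer is the on-screen AVCaptureVideoPreviewLayer; metadataOutputRectOfInterestForRect: (iOS 7+) converts a layer rect into normalized (0..1) capture coordinates, taking the video gravity into account:

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    if (sampleBuffer == NULL) return;
    NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:data];

    // Which part of the capture is visible in the layer, as a unit rect
    // in the sensor's (unrotated) coordinate space
    CGRect unitRect = [previewLayer metadataOutputRectOfInterestForRect:previewLayer.bounds];

    // The raw CGImage lives in that same unrotated space, so scale the
    // unit rect up to pixel coordinates
    CGImageRef cgImage = image.CGImage;
    size_t pixelWidth = CGImageGetWidth(cgImage);
    size_t pixelHeight = CGImageGetHeight(cgImage);
    CGRect cropRect = CGRectMake(unitRect.origin.x * pixelWidth,
                                 unitRect.origin.y * pixelHeight,
                                 unitRect.size.width * pixelWidth,
                                 unitRect.size.height * pixelHeight);

    CGImageRef croppedRef = CGImageCreateWithImageInRect(cgImage, cropRect);
    // Reapply the original scale and orientation flag so the result
    // displays the same way up as the preview did
    UIImage *croppedImage = [UIImage imageWithCGImage:croppedRef scale:image.scale orientation:image.imageOrientation];
    CGImageRelease(croppedRef);
}];

Because the conversion accounts for AVLayerVideoGravityResizeAspectFill, the cropped image ends up with the preview layer's aspect ratio regardless of the layer's size.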
