The captured image is flipped horizontally

Hello, I am using AVCaptureSession in Xcode to build a live camera screen so that I can take photos (similar to Snapchat). The camera is fully functional and I have configured it so I can use either the front or rear camera. The rear camera works fine: I can capture an image and it previews exactly as I need. With the front camera the live preview looks correct, but when the image is captured it comes out mirrored horizontally in the preview, and I cannot see where this is going wrong.

Here is my code for the session:

- (void) initializeCamera {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh;

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    captureVideoPreviewLayer.frame = self.imagePreview.bounds;
    [self.imagePreview.layer addSublayer:captureVideoPreviewLayer];

    UIView *view = [self imagePreview];
    CALayer *viewLayer = [view layer];
    [viewLayer setMasksToBounds:YES];

    CGRect bounds = [view bounds];
    [captureVideoPreviewLayer setFrame:bounds];

    NSArray *devices = [AVCaptureDevice devices];
    AVCaptureDevice *frontCamera = nil;
    AVCaptureDevice *backCamera = nil;

    for (AVCaptureDevice *device in devices) {

        NSLog(@"Device name: %@", [device localizedName]);

        if ([device hasMediaType:AVMediaTypeVideo]) {

            if ([device position] == AVCaptureDevicePositionBack) {
                NSLog(@"Device position : back");
                backCamera = device;
            }
            else {
                NSLog(@"Device position : front");
                frontCamera = device;
            }
        }
    }

    if (!FrontCamera) {
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
        if (!input) {
            NSLog(@"ERROR: trying to open camera: %@", error);
        }
        [session addInput:input];
    }

    if (FrontCamera) {
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
        if (!input) {
            NSLog(@"ERROR: trying to open camera: %@", error);
        }
        [session addInput:input];
    }

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];

    [session addOutput:stillImageOutput];

    [session startRunning];
}
- (void) capImage {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {

        for (AVCaptureInputPort *port in [connection inputPorts]) {

            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }

        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {

        if (imageSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            [self processImage:[UIImage imageWithData:imageData]];
        }
    }];
}

- (void) processImage:(UIImage *)image {
    haveImage = YES;

    if ([UIDevice currentDevice].userInterfaceIdiom == UIUserInterfaceIdiomPad) { // Device is iPad
        UIGraphicsBeginImageContext(CGSizeMake(3072, 4088));
        [image drawInRect: CGRectMake(0, 0, 3072, 4088)];
        UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        CGRect cropRect = CGRectMake(0, 0, 3072, 4088);
        CGImageRef imageRef = CGImageCreateWithImageInRect([smallImage CGImage], cropRect);

        [captureImage setImage:[UIImage imageWithCGImage:imageRef]];

        CGImageRelease(imageRef);

    } else { // Device is iPhone
        UIGraphicsBeginImageContext(CGSizeMake(1280, 2272));
        [image drawInRect: CGRectMake(0, 0, 1280, 2272)];
        UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        UIImage *flippedImage = [UIImage imageWithCGImage:smallImage.CGImage scale:smallImage.scale orientation:UIImageOrientationLeftMirrored];

        smallImage = flippedImage;

        CGRect cropRect = CGRectMake(0, 0, 1280, 2272);
        CGImageRef imageRef = CGImageCreateWithImageInRect([smallImage CGImage], cropRect);

        [captureImage setImage:[UIImage imageWithCGImage:imageRef]];

        CGImageRelease(imageRef);
    }
}

      

I also want to add tap-to-focus and flash, but I don't know where I need to implement the code. This is what I have found so far:

flash -

For the flash, all I can find are examples that toggle the torch on and off with a switch. I can't seem to find a way to make the flash fire on capture the way Apple's Camera app does.
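
Based on the AVCaptureDevice documentation, I think the flash (as opposed to the torch) is enabled roughly like this; the setFlashMode:forDevice: helper is just a name I made up, and I'm not sure where it belongs in my setup:

- (void) setFlashMode:(AVCaptureFlashMode)mode forDevice:(AVCaptureDevice *)device {
    // The front camera usually has no flash, so check support first.
    if (![device hasFlash] || ![device isFlashModeSupported:mode]) {
        NSLog(@"Flash mode not supported on %@", [device localizedName]);
        return;
    }
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.flashMode = mode; // AVCaptureFlashModeAuto, On or Off
        [device unlockForConfiguration];
    }
    else {
        NSLog(@"ERROR: could not lock device for configuration: %@", error);
    }
}

Would calling something like [self setFlashMode:AVCaptureFlashModeAuto forDevice:backCamera] in initializeCamera, before startRunning, be enough for the still capture to fire the flash?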

tap to focus -

iOS AVFoundation tap to focus
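
From what I have read, tap to focus would look roughly like this. self.previewLayer and self.activeDevice are properties I would still have to add (my code above only keeps them as locals), and handleFocusTap: would be the action of a UITapGestureRecognizer on self.imagePreview:

- (void) handleFocusTap:(UITapGestureRecognizer *)gesture {
    CGPoint tapPoint = [gesture locationInView:self.imagePreview];
    // Convert the tapped view coordinate into the 0..1 point-of-interest space.
    CGPoint poi = [self.previewLayer captureDevicePointOfInterestForPoint:tapPoint];

    AVCaptureDevice *device = self.activeDevice;
    if (![device isFocusPointOfInterestSupported] ||
        ![device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        return; // the front camera typically cannot focus on a point of interest
    }
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.focusPointOfInterest = poi;
        device.focusMode = AVCaptureFocusModeAutoFocus;
        [device unlockForConfiguration];
    }
    else {
        NSLog(@"ERROR: could not lock device for configuration: %@", error);
    }
}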

1 answer


I believe this mirroring is the default behavior for the front camera. Try manually flipping the output image horizontally right before displaying it.
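
Something along these lines should work; it redraws the image into a horizontally mirrored context, so you would call it only for front-camera shots, on the upright image inside processImage, before it is assigned to captureImage (the method name is just an example):

- (UIImage *) horizontallyFlippedImage:(UIImage *)image {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Mirror the drawing around the vertical centre line of the image.
    CGContextTranslateCTM(context, image.size.width, 0);
    CGContextScaleCTM(context, -1.0, 1.0);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *flipped = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return flipped;
}

Alternatively, if the still image connection supports it, you can turn mirroring off at the capture level (videoMirrored on AVCaptureConnection), but flipping the final UIImage is the smallest change to the code you posted.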


