Objective-C: An easy way to take a photo without a camera interface. Just take a picture from the camera and save it to a file

I can't find an easy way to take a photo without a camera interface. I just need to get a snapshot from the camera and save it to a file.


1 answer


I used this code to take a photo with the front camera. Not all of the code is mine, but I couldn't find a link to the original source. This code also plays a shutter sound. The image quality is not very good (quite dark), so the code may need a tweak or two.

-(void) takePhoto 
{
    AVCaptureDevice *frontalCamera;

    // Find the front-facing camera among all video-capable devices.
    NSArray *allCameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for ( AVCaptureDevice *camera in allCameras )
    {
        if ( camera.position == AVCaptureDevicePositionFront )
        {
            frontalCamera = camera;
        }
    }

    if ( frontalCamera != nil )
    {
        photoSession = [[AVCaptureSession alloc] init];

        NSError *error = nil;
        AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:frontalCamera error:&error];

        if ( !error && [photoSession canAddInput:input] )
        {
            [photoSession addInput:input];

            // Configure a still-image output that produces JPEG data.
            AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];

            [output setOutputSettings:@{ AVVideoCodecKey : AVVideoCodecJPEG }];

            if ( [photoSession canAddOutput:output] )
            {
                [photoSession addOutput:output];

                // Find the connection that carries video data.
                AVCaptureConnection *videoConnection = nil;

                for (AVCaptureConnection *connection in output.connections)
                {
                    for (AVCaptureInputPort *port in [connection inputPorts])
                    {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo] )
                        {
                            videoConnection = connection;
                            break;
                        }
                    }
                    if (videoConnection) { break; }
                }

                if ( videoConnection )
                {
                    [photoSession startRunning];

                    [output captureStillImageAsynchronouslyFromConnection:videoConnection
                                                        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

                        if (imageDataSampleBuffer != NULL)
                        {
                            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                            UIImage *photo = [[UIImage alloc] initWithData:imageData];
                            [self processImage:photo]; //this is a custom method
                        }
                    }];
                }
            }
        }
    }
}
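Since the original question asks for saving the snapshot to a file, and processImage: is left as a custom method above, here is a minimal sketch of such a method. It assumes you simply want the JPEG written into the app's Documents directory; the file name photo.jpg and the 0.9 compression quality are arbitrary choices:

```objc
- (void)processImage:(UIImage *)photo
{
    // Re-encode the image as JPEG (0.9 is an arbitrary compression quality).
    NSData *jpegData = UIImageJPEGRepresentation(photo, 0.9);

    // Build a path inside the app's Documents directory.
    NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                  NSUserDomainMask, YES) firstObject];
    NSString *filePath = [documentsDir stringByAppendingPathComponent:@"photo.jpg"];

    // Write atomically so a partially written file never appears on disk.
    [jpegData writeToFile:filePath atomically:YES];
}
```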


photoSession is an AVCaptureSession * ivar of the class that contains the takePhoto method.
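For completeness, the containing class might declare that ivar like this (a sketch; the class name CameraSnapshotController is an arbitrary choice):

```objc
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface CameraSnapshotController : UIViewController
{
    // Kept as an ivar so the session outlives the takePhoto call.
    AVCaptureSession *photoSession;
}

- (void)takePhoto;

@end
```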

EDIT: If you change the if ( videoConnection ) block to the code below, you add one second of delay and get a much nicer image.

if ( videoConnection )
{
    [photoSession startRunning];

    dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC);
    dispatch_after(popTime, dispatch_get_main_queue(), ^(void){

        [output captureStillImageAsynchronouslyFromConnection:videoConnection
                                            completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

            if (imageDataSampleBuffer != NULL)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *photo = [[UIImage alloc] initWithData:imageData];
                [self processImage:photo];
            }
        }];
    });
}




If the lag is unacceptable for your application, you can split the code in two: start photoSession in viewDidAppear (or somewhere similar) and just take an immediate snapshot when needed, usually after some user interaction.
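A sketch of that split, assuming the session-setup part of takePhoto has been factored out into a setupPhotoSession method and the capture part into captureSnapshot (both hypothetical names):

```objc
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    // Configure inputs/outputs and start the session ahead of time,
    // so the camera is already warmed up when the user asks for a photo.
    [self setupPhotoSession];
    [photoSession startRunning];
}

- (IBAction)snapshotButtonTapped:(id)sender
{
    // The session is already running, so the capture happens immediately.
    [self captureSnapshot];
}
```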

dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 0.25 * NSEC_PER_SEC);

also gives a good result, so there is no need for a whole second of lag.

Note that this code is written to take a photo with the front camera; I'm sure you will know how to adapt it if you need to use the rear camera.
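For reference, selecting the rear camera is just a matter of checking for the other position constant in the device loop (rearCamera is a hypothetical variable name):

```objc
for ( AVCaptureDevice *camera in allCameras )
{
    // AVCaptureDevicePositionBack selects the rear-facing camera.
    if ( camera.position == AVCaptureDevicePositionBack )
    {
        rearCamera = camera;
    }
}
```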
