Tavis

Reputation: 35

Captured image flipped horizontally

Hello, I am using an AVCaptureSession in Xcode to build a live camera screen so I can take photos (similar to the Snapchat setup). The camera is fully functional and I have set it up so I can use the front or back camera. The back camera works fine and captured images preview as I need them. The front camera previews correctly, but once an image is captured it appears flipped horizontally in the preview, and I can't see where this is occurring.

Here's my code for the session:

- (void) initializeCamera {
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

captureVideoPreviewLayer.frame = self.imagePreview.bounds;
[self.imagePreview.layer addSublayer:captureVideoPreviewLayer];

UIView *view = [self imagePreview];
CALayer *viewLayer = [view layer];
[viewLayer setMasksToBounds:YES];

CGRect bounds = [view bounds];
[captureVideoPreviewLayer setFrame:bounds];

NSArray *devices = [AVCaptureDevice devices];
AVCaptureDevice *frontCamera = nil;
AVCaptureDevice *backCamera = nil;

for (AVCaptureDevice *device in devices) {

    NSLog(@"Device name: %@", [device localizedName]);

    if ([device hasMediaType:AVMediaTypeVideo]) {

        if ([device position] == AVCaptureDevicePositionBack) {
            NSLog(@"Device position : back");
            backCamera = device;
        }
        else {
            NSLog(@"Device position : front");
            frontCamera = device;
        }
    }
}

if (frontCamera) {
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
    if (!input) {
        NSLog(@"ERROR: trying to open front camera: %@", error);
    }
    [session addInput:input];
}
else {
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
    if (!input) {
        NSLog(@"ERROR: trying to open back camera: %@", error);
    }
    [session addInput:input];
}

stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];

[session addOutput:stillImageOutput];

[session startRunning];
}
- (void) capImage {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {

    for (AVCaptureInputPort *port in [connection inputPorts]) {

        if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
            videoConnection = connection;
            break;
        }
    }

    if (videoConnection) {
        break;
    }
}

NSLog(@"about to request a capture from: %@", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {

    if (imageSampleBuffer != NULL) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        [self processImage:[UIImage imageWithData:imageData]];
    }
}];
}

- (void) processImage:(UIImage *)image {
haveImage = YES;

if ([UIDevice currentDevice].userInterfaceIdiom == UIUserInterfaceIdiomPad) { // Device is an iPad
    UIGraphicsBeginImageContext(CGSizeMake(3072, 4088));
    [image drawInRect: CGRectMake(0, 0, 3072, 4088)];
    UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGRect cropRect = CGRectMake(0, 0, 3072, 4088);
    CGImageRef imageRef = CGImageCreateWithImageInRect([smallImage CGImage], cropRect);

    [captureImage setImage:[UIImage imageWithCGImage:imageRef]];

    CGImageRelease(imageRef);

} else { // Device is an iPhone
    UIGraphicsBeginImageContext(CGSizeMake(1280, 2272));
    [image drawInRect: CGRectMake(0, 0, 1280, 2272)];
    UIImage *smallImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    UIImage * flippedImage = [UIImage imageWithCGImage:smallImage.CGImage scale:smallImage.scale orientation:UIImageOrientationLeftMirrored];

    smallImage = flippedImage;

    CGRect cropRect = CGRectMake(0, 0, 1280, 2272);
    CGImageRef imageRef = CGImageCreateWithImageInRect([smallImage CGImage], cropRect);


    [captureImage setImage:[UIImage imageWithCGImage:imageRef]];

    CGImageRelease(imageRef);
}
}

I also want to add tap-to-focus and flash, but I don't know where I have to implement the code. Here is what I have found:

flash -

For the flash, all I can find regarding the torch is a toggle. I can't find a way to make it fire only during capture, like the flash in Apple's Camera app.
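What you want is `flashMode`, not `torchMode`: the torch keeps the LED on continuously, while the flash fires only during a still capture, like Apple's Camera app. A minimal sketch (assuming `device` refers to the active `AVCaptureDevice`, e.g. the `backCamera` found above):

```objc
// Fire the flash only during still capture, not continuously (torch).
// Assumption: `device` is the AVCaptureDevice currently used as input.
AVCaptureDevice *device = backCamera;
if ([device hasFlash]) {
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.flashMode = AVCaptureFlashModeAuto; // or On / Off
        [device unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device for flash configuration: %@", error);
    }
}
```

This can run once in `initializeCamera` after the input is added; the flash then fires automatically when `captureStillImageAsynchronouslyFromConnection:` is called.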

tap to focus -

ios AVFoundation tap to focus
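Building on that link, tap-to-focus is usually a tap gesture recognizer on the preview view whose point is mapped into the device's normalized (0..1) coordinate space. A sketch under those assumptions (`self.activeDevice` is a hypothetical property holding the camera in use; the simple mapping below ignores orientation and `videoGravity` cropping, which a real implementation must account for):

```objc
// Hypothetical handler for a UITapGestureRecognizer added to self.imagePreview.
- (void)handleTapToFocus:(UITapGestureRecognizer *)tap {
    AVCaptureDevice *device = self.activeDevice; // assumption: current camera
    CGPoint viewPoint = [tap locationInView:self.imagePreview];
    // Naive mapping from view coordinates to the device's 0..1 space.
    CGPoint devicePoint = CGPointMake(viewPoint.x / self.imagePreview.bounds.size.width,
                                      viewPoint.y / self.imagePreview.bounds.size.height);
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        if ([device isFocusPointOfInterestSupported]) {
            device.focusPointOfInterest = devicePoint;
            device.focusMode = AVCaptureFocusModeAutoFocus;
        }
        [device unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device for focus configuration: %@", error);
    }
}
```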

Upvotes: 3

Views: 3067

Answers (1)

Abhijay Bhatnagar

Reputation: 79

I believe that is the default behavior for the front-facing camera. Try flipping the output image manually right before it is displayed.
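For example (a sketch, not tested against your exact pipeline): inside `processImage:`, re-wrap the captured `CGImage` with a mirrored orientation before assigning it, which costs no pixel copy. You already do this in the iPhone branch; the same line works in the iPad branch:

```objc
// Re-wrap the captured CGImage with a mirrored orientation so the saved
// photo matches the mirrored front-camera preview.
UIImage *mirrored = [UIImage imageWithCGImage:image.CGImage
                                        scale:image.scale
                                  orientation:UIImageOrientationLeftMirrored];
[captureImage setImage:mirrored];
```

You would only apply this when the front camera is the active input, since the back camera's output is not mirrored.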

Upvotes: 3
