Reputation: 1077
I am trying to use the new AVFoundation framework
for taking still pictures with the iPhone.
With a button press this method is called. I can hear the shutter sound, but I can't see the log output. If I call this method several times, the camera preview freezes.
Is there any tutorial out there on how to use captureStillImageAsynchronouslyFromConnection:?
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:
        [[self stillImageOutput].connections objectAtIndex:0]
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    NSLog(@"inside");
}];
- (void)initCapture {
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;

    dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    self.captureSession = [[AVCaptureSession alloc] init];
    self.captureSession.sessionPreset = AVCaptureSessionPresetLow;
    [self.captureSession addInput:captureInput];
    [self.captureSession addOutput:captureOutput];

    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    [self.prevLayer setOrientation:AVCaptureVideoOrientationLandscapeLeft];
    self.prevLayer.frame = CGRectMake(0.0, 0.0, 480.0, 320.0);
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.prevLayer];

    // Setup the default file outputs
    AVCaptureStillImageOutput *_stillImageOutput = [[[AVCaptureStillImageOutput alloc] init] autorelease];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [_stillImageOutput setOutputSettings:outputSettings];
    [outputSettings release];
    [self setStillImageOutput:_stillImageOutput];

    if ([self.captureSession canAddOutput:stillImageOutput]) {
        [self.captureSession addOutput:stillImageOutput];
    }

    [self.captureSession commitConfiguration];
    [self.captureSession startRunning];
}
Upvotes: 22
Views: 29192
Reputation: 4025
You should use Adam's answer, but if you use Swift (like most of you probably do nowadays), here's a Swift 1.2 port of his code:
import ImageIO
Declare the output as a property:
private var stillImageOutput: AVCaptureStillImageOutput!
Set up stillImageOutput before calling captureSession.startRunning(), like this:
stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
captureSession.addOutput(stillImageOutput)
Then use this code to capture an image:
private func captureImage() {
    var videoConnection: AVCaptureConnection?
    for connection in stillImageOutput.connections as! [AVCaptureConnection] {
        for port in connection.inputPorts {
            if port.mediaType == AVMediaTypeVideo {
                videoConnection = connection
                break
            }
        }
        if videoConnection != nil {
            break
        }
    }

    print("about to request a capture from: \(stillImageOutput)")
    stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageSampleBuffer: CMSampleBuffer!, error: NSError!) -> Void in
        let exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, nil)
        if let attachments = exifAttachments {
            // Do something with the attachments
            print("attachments: \(attachments)")
        } else {
            print("no attachments")
        }

        let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
        let image = UIImage(data: imageData)
        // Do something with the image
    }
}
This all assumes that you already have an AVCaptureSession set up and just need to take a still from it, as I did.
Upvotes: 0
Reputation: 33126
After a lot of trial and error, I worked out how to do this.
Hint: Apple's official docs are - simply - wrong. The code they give you doesn't actually work.
I wrote it up here with step-by-step instructions:
Lots of code on the link, but in summary:
-(void) viewDidAppear:(BOOL)animated
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    CALayer *viewLayer = self.vImagePreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
    [self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    [session startRunning];
}
-(IBAction) captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else
        {
            NSLog(@"no attachments");
        }

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        self.vImage.image = image;
    }];
}
Upvotes: 62
Reputation: 682
This has been a huge help - I was stuck in the weeds for quite a while trying to follow the AVCam example.
Here is a complete working project with my comments that explain what is happening. This illustrates how you can use the capture manager with multiple outputs. In this example there are two outputs.
The first is the still image output of the example above.
The second provides frame by frame access to the video coming out of the camera. You can add more code to do something interesting with the frames if you like. In this example I am just updating a frame counter on the screen from within the delegate callback.
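A per-frame delegate callback along those lines could look like this. This is a minimal sketch, not code from the linked project; frameCount and frameCountLabel are hypothetical properties:

```objectivec
// Sketch: AVCaptureVideoDataOutputSampleBufferDelegate callback that counts frames.
// Assumes a hypothetical NSUInteger frameCount property and UILabel *frameCountLabel.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    self.frameCount++;
    // The delegate runs on the capture queue, so hop to the main queue for UI work.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.frameCountLabel.text = [NSString stringWithFormat:@"%lu", (unsigned long)self.frameCount];
    });
}
```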
Upvotes: 6
Reputation: 4780
Apple has some notes and example code on this:
Technical Q&A QA1702: How to capture video frames from the camera as images using AV Foundation
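The heart of that Q&A is turning the CMSampleBufferRef you get in the delegate callback into a UIImage. A condensed sketch of that conversion (error handling omitted; it assumes the output's videoSettings requested kCVPixelFormatType_32BGRA, as in the question's initCapture):

```objectivec
// Render the BGRA pixel buffer of a sample buffer into a CGImage, then a UIImage.
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return image;
}
```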
Upvotes: 3
Reputation: 33592
We had this problem when 4.0 was still in beta. I tried a fair number of things. Here goes:
I ended up just capturing video frames. The "take picture" button simply sets a flag; in the video frame callback, if the flag is set, it returns the video frame as a UIImage*. This was sufficient for our image-processing needs — "take picture" exists largely so the user can get a negative response (and an option to submit a bug report); we don't actually want 2/3/5 megapixel images, since they take ages to process.
If video frames are not good enough (i.e. you want to capture viewfinder frames between high-res image captures), I'd first see whether they've fixed using multiple AVCapture sessions, since that's the only way you can set both presets.
It's probably worth filing a bug. I filed a bug around the launch of 4.0 GM; Apple asked me for some sample code, but by then I'd decided to use the video frame workaround and had a release to ship.
Additionally, the "low" preset is very low-res (and results in a low-res, low-framerate video preview). I'd go for 640x480 if available, falling back to Medium if not.
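The flag workaround described above can be sketched roughly like this (takePictureFlag and didTakePicture: are illustrative names, not from any shipped code; imageFromSampleBuffer: stands for whatever CMSampleBufferRef-to-UIImage conversion you use, e.g. the QA1702 approach):

```objectivec
// Sketch: "take picture" just sets a flag; the video frame callback honors it.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (self.takePictureFlag) {
        self.takePictureFlag = NO;
        UIImage *still = [self imageFromSampleBuffer:sampleBuffer];
        dispatch_async(dispatch_get_main_queue(), ^{
            [self didTakePicture:still]; // hypothetical delivery method
        });
    }
    // ...normal per-frame processing continues here...
}
```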
Upvotes: 16