Reputation: 615
My goal is to use an AVCaptureSession to programmatically lock focus, capture one image, activate the flash, then capture a second image after some delay. I have managed to get the captures to work using an AVCaptureSession instance and an AVCaptureStillImageOutput. However, the images I get when calling captureStillImageAsynchronouslyFromConnection(_:completionHandler:) are 1920 x 1080, not the full 12 megapixel image my iPhone 6S camera is capable of.
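For context, the sequence I have in mind looks roughly like this. This is only a sketch: captureDevice is a placeholder for the AVCaptureDevice backing my session, and captureImageFromStream is the capture function shown below.

func captureFocusedFlashPair() {
    guard let device = self.captureDevice else { return }

    do {
        try device.lockForConfiguration()
        if device.isFocusModeSupported(.Locked) {
            device.focusMode = .Locked      // hold focus across both captures
        }
        if device.hasFlash {
            device.flashMode = .Off         // first shot without flash
        }
        device.unlockForConfiguration()
    } catch {
        NSLog("Could not configure device: \(error)")
        return
    }

    captureImageFromStream { firstImage in
        // First image captured; turn the flash on for the second shot
        do {
            try device.lockForConfiguration()
            device.flashMode = .On
            device.unlockForConfiguration()
        } catch {
            return
        }

        // Capture the second image after a short delay
        let delay = dispatch_time(DISPATCH_TIME_NOW, Int64(NSEC_PER_SEC))
        dispatch_after(delay, dispatch_get_main_queue()) {
            self.captureImageFromStream { secondImage in
                // ... use firstImage and secondImage here ...
            }
        }
    }
}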
Here is my capture function:
func captureImageFromStream(completion: (result: UIImage) -> Void) {
    if let stillOutput = self.stillImageOutput {
        // Find the video connection on the still image output
        var videoConnection: AVCaptureConnection?
        for connection in stillOutput.connections {
            for port in connection.inputPorts! {
                if port.mediaType == AVMediaTypeVideo {
                    videoConnection = connection as? AVCaptureConnection
                    break
                }
            }
            if videoConnection != nil {
                break
            }
        }
        if videoConnection != nil {
            // Capture a still frame and hand it back as a UIImage
            stillOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageDataSampleBuffer, error) -> Void in
                if error == nil {
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                    if let image = UIImage(data: imageData) {
                        completion(result: image)
                    }
                } else {
                    NSLog("ImageCapture Error: \(error)")
                }
            }
        }
    }
}
What modifications should I make to capture the image I'm looking for? I'm new to Swift, so please excuse any beginner mistakes I've made.
Upvotes: 2
Views: 1336
Reputation: 36072
Before you addOutput the stillImageOutput and startRunning, you need to set your capture session preset to photo:
captureSession.sessionPreset = AVCaptureSessionPresetPhoto
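In context, the setup order looks something like this. This is just a sketch; the local names here are placeholders for whatever properties you use:

let captureSession = AVCaptureSession()
captureSession.sessionPreset = AVCaptureSessionPresetPhoto  // full-resolution stills

let captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
do {
    let input = try AVCaptureDeviceInput(device: captureDevice)
    if captureSession.canAddInput(input) {
        captureSession.addInput(input)
    }
} catch {
    NSLog("Input error: \(error)")
}

let stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
if captureSession.canAddOutput(stillImageOutput) {
    captureSession.addOutput(stillImageOutput)
}
captureSession.startRunning()

With the photo preset in place before the session starts, captureStillImageAsynchronouslyFromConnection returns stills at the sensor's full photo resolution instead of 1920 x 1080 video frames.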
Upvotes: 2