Reputation: 3029
I am currently working on a snippet of code which looks like the following:
if error == nil && (captureSession?.canAddInput(input))! {
    captureSession?.addInput(input)
    stillImageOutput = AVCaptureStillImageOutput()
    //let settings = AVCapturePhotoSettings()
    //settings.availablePreviewPhotoPixelFormatTypes =
    stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
    if (captureSession?.canAddOutput(stillImageOutput))! {
        captureSession?.addOutput(stillImageOutput)
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
        previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
        cameraView.layer.addSublayer(previewLayer!)
        captureSession?.startRunning()
    }
}
I am aware that I should be using AVCapturePhotoOutput() instead of AVCaptureStillImageOutput(), but I am confused about how to transform the rest of this block if I make that change. Specifically, how can I apply the same settings using the commented-out let settings = AVCapturePhotoSettings()?
For reference, I am using this tutorial as a guide.
Thanks
Upvotes: 0
Views: 1748
Reputation: 3488
The Apple documentation explains clearly how to use AVCapturePhotoOutput.
These are the steps to capture a photo.
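First, here is a rough sketch of how the setup block from the question could look once AVCaptureStillImageOutput is replaced by AVCapturePhotoOutput. This assumes the same captureSession, previewLayer and cameraView properties as in the question, Swift 3 / iOS 10 APIs, and a cameraOutput property declared as AVCapturePhotoOutput?:
if error == nil && (captureSession?.canAddInput(input))! {
    captureSession?.addInput(input)
    // AVCapturePhotoOutput takes no outputSettings up front;
    // codec and preview options are passed per capture via AVCapturePhotoSettings.
    cameraOutput = AVCapturePhotoOutput()
    if (captureSession?.canAddOutput(cameraOutput))! {
        captureSession?.addOutput(cameraOutput)
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
        previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
        cameraView.layer.addSublayer(previewLayer!)
        captureSession?.startRunning()
    }
}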
Then put the code below in your clickCapture method, and don't forget to conform to AVCapturePhotoCaptureDelegate and implement its delegate method in your class.
let settings = AVCapturePhotoSettings()
let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                     kCVPixelBufferWidthKey as String: 160,
                     kCVPixelBufferHeightKey as String: 160]
settings.previewPhotoFormat = previewFormat
self.cameraOutput.capturePhoto(with: settings, delegate: self)
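Once the photo is captured, the output calls back into the delegate. Below is a minimal sketch of that callback, assuming the Swift 3 / iOS 10 delegate signature and a hypothetical YourViewController class that owns the capture session:
import UIKit
import AVFoundation

extension YourViewController: AVCapturePhotoCaptureDelegate {
    // Called once the JPEG sample buffer for the capture is ready.
    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        guard error == nil,
              let sampleBuffer = photoSampleBuffer,
              let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(
                  forJPEGSampleBuffer: sampleBuffer,
                  previewPhotoSampleBuffer: previewPhotoSampleBuffer) else {
            return
        }
        let image = UIImage(data: imageData)
        // Use the image, e.g. show it in an image view or save it to the photo library.
    }
}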
If you would like to know about other ways of capturing photos with AVFoundation, check out my previous SO answer.
Upvotes: 2