Reputation: 529
I am simply trying to put a camera view in my view controller. I imported AVFoundation at the top, and my class adopts the UIImagePickerControllerDelegate and UINavigationControllerDelegate protocols.
However, whenever I try to use AVCaptureStillImageOutput, Xcode tells me that it was deprecated in iOS 10 and that I should use AVCapturePhotoOutput. That is completely fine; however, as soon as I switch, stillImageOutput.outputSettings is not available, because AVCapturePhotoOutput has no outputSettings property. Thus, I have to keep using AVCaptureStillImageOutput for the code to work, but then I get multiple warnings because that class was deprecated in iOS 10.
I searched and searched but could not really find a way around this. I would really appreciate your help. I am learning, so any explanation would be great! Code is below.
import UIKit
import AVFoundation

class CameraView: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    var captureSession: AVCaptureSession?
    var stillImageOutput: AVCaptureStillImageOutput?
    var previewLayer: AVCaptureVideoPreviewLayer?

    @IBOutlet var cameraView: UIView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        captureSession = AVCaptureSession()
        captureSession?.sessionPreset = AVCaptureSessionPreset1920x1080

        let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

        do {
            // Use `try` (not `try!`) so a failure lands in the catch block below.
            let input = try AVCaptureDeviceInput(device: backCamera)
            // Compare against `true`: optional chaining yields a Bool?, and
            // `!= nil` would succeed even when canAddInput returns false.
            if captureSession?.canAddInput(input) == true {
                captureSession?.addInput(input)
                stillImageOutput = AVCaptureStillImageOutput()
                stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
                if captureSession?.canAddOutput(stillImageOutput) == true {
                    captureSession?.addOutput(stillImageOutput)
                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                    previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
                    previewLayer?.connection.videoOrientation = .portrait
                    cameraView.layer.addSublayer(previewLayer!)
                    captureSession?.startRunning()
                }
            }
        } catch {
            print("Could not create capture device input: \(error)")
        }
    }
}
Upvotes: 17
Views: 14504
Reputation: 289
I'm posting the Objective-C version of this code, since Aleksey Timoshchenko's answer is correct. Just to help others.
@interface CameraGalleryViewController () <AVCapturePhotoCaptureDelegate, UICollectionViewDataSource, UICollectionViewDelegate>
@property (weak, nonatomic) IBOutlet UIView *viewCamera;
@property (weak, nonatomic) IBOutlet UICollectionView *collectionView;
@property (strong, nonatomic) AVCaptureSession *session;
@property (strong, nonatomic) AVCapturePhotoOutput *cameraOutput;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *previewLayer;
@end
@implementation CameraGalleryViewController

#pragma mark - Lifecycle
// ==================================================================================
// Lifecycle

- (void)viewDidLoad {
    [super viewDidLoad];
    [self.viewModel viewModelDidLoad];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self initVars];
}

- (void)viewWillTransitionToSize:(CGSize)size withTransitionCoordinator:(id<UIViewControllerTransitionCoordinator>)coordinator {
    [super viewWillTransitionToSize:size withTransitionCoordinator:coordinator];
    [coordinator animateAlongsideTransition:^(id<UIViewControllerTransitionCoordinatorContext> _Nonnull context) {
    } completion:^(id<UIViewControllerTransitionCoordinatorContext> _Nonnull context) {
        [self changeOrientation];
    }];
}

#pragma mark - IBActions
// ==================================================================================
// IBActions

- (IBAction)takePhoto:(UIButton *)sender {
    AVCapturePhotoSettings *settings = [[AVCapturePhotoSettings alloc] init];
    NSNumber *previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.firstObject;
    NSString *formatTypeKey = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSString *widthKey = (NSString *)kCVPixelBufferWidthKey;
    NSString *heightKey = (NSString *)kCVPixelBufferHeightKey;
    NSDictionary *previewFormat = @{formatTypeKey: previewPixelType,
                                    widthKey: @1024,
                                    heightKey: @768};
    settings.previewPhotoFormat = previewFormat;
    [self.cameraOutput capturePhotoWithSettings:settings delegate:self];
}

#pragma mark - Public methods
// ==================================================================================
// Public methods

- (void)setupView {
    [self.collectionView reloadData];
}

#pragma mark - Private methods
// ==================================================================================
// Private methods

- (void)initVars {
    [self.collectionView registerNib:[CameraGalleryViewCell cellNib] forCellWithReuseIdentifier:[CameraGalleryViewCell cellId]];
    self.collectionView.dataSource = self;
    self.collectionView.delegate = self;

    self.session = [[AVCaptureSession alloc] init];
    [self.session setSessionPreset:AVCaptureSessionPresetPhoto];
    self.cameraOutput = [[AVCapturePhotoOutput alloc] init];

    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([self.session canAddInput:deviceInput]) {
        [self.session addInput:deviceInput];
        if ([self.session canAddOutput:self.cameraOutput]) {
            [self.session addOutput:self.cameraOutput];
            self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
            [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
            self.previewLayer.frame = CGRectMake(0, 0, self.view.bounds.size.width, self.viewCamera.bounds.size.height);
            [self.viewCamera.layer addSublayer:self.previewLayer];
            [self changeOrientation];
            [self.session startRunning];
        }
    }
}

- (void)changeOrientation {
    UIInterfaceOrientation orientation = [UIApplication sharedApplication].statusBarOrientation;
    CGRect size = [UIScreen mainScreen].bounds;
    if (size.size.height > size.size.width) {
        if (orientation == UIInterfaceOrientationPortrait) {
            self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
        } else {
            self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
        }
    } else {
        if (orientation == UIInterfaceOrientationLandscapeRight) {
            self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
        } else {
            self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
        }
    }
}

#pragma mark - CollectionView delegate
// ==================================================================================
// CollectionView delegate

- (NSInteger)collectionView:(UICollectionView *)collectionView numberOfItemsInSection:(NSInteger)section {
    NSInteger numItems = [self.viewModel imageListCount];
    self.collectionView.hidden = !(numItems > 0);
    return numItems;
}

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    CameraGalleryViewCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:[CameraGalleryViewCell cellId] forIndexPath:indexPath];
    [cell imageForImageView:[self.viewModel imageFromListWithIndex:indexPath.row]];
    return cell;
}

#pragma mark - Camera delegate
// ==================================================================================
// Camera delegate

- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings error:(NSError *)error {
    if (error) {
        return;
    }
    if (photoSampleBuffer && previewPhotoSampleBuffer) {
        NSData *imageData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer previewPhotoSampleBuffer:previewPhotoSampleBuffer];
        [self.viewModel addImageToListAndRefresh:[UIImage imageWithData:imageData]];
    }
}

@end
Upvotes: -2
Reputation: 5273
Here is my full implementation:
import UIKit
import AVFoundation

class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    var captureSession: AVCaptureSession!
    var cameraOutput: AVCapturePhotoOutput!
    var previewLayer: AVCaptureVideoPreviewLayer!

    @IBOutlet weak var capturedImage: UIImageView!
    @IBOutlet weak var previewView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto
        cameraOutput = AVCapturePhotoOutput()

        let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
        if let input = try? AVCaptureDeviceInput(device: device) {
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
                if captureSession.canAddOutput(cameraOutput) {
                    captureSession.addOutput(cameraOutput)
                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                    previewLayer.frame = previewView.bounds
                    previewView.layer.addSublayer(previewLayer)
                    captureSession.startRunning()
                }
            } else {
                print("issue here: captureSession.canAddInput")
            }
        } else {
            print("some problem here")
        }
    }

    // Take picture button
    @IBAction func didPressTakePhoto(_ sender: UIButton) {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [
            kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
            kCVPixelBufferWidthKey as String: 160,
            kCVPixelBufferHeightKey as String: 160
        ]
        settings.previewPhotoFormat = previewFormat
        cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    // Callback from taking a picture
    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
        if let error = error {
            print("error occurred: \(error.localizedDescription)")
        }
        if let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer,
            let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print(UIImage(data: dataImage)?.size as Any)
            let dataProvider = CGDataProvider(data: dataImage as CFData)
            let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)
            self.capturedImage.image = image
        } else {
            print("some error here")
        }
    }

    // Use this method wherever you need to check the camera permission state
    func askPermission() {
        let cameraPermissionStatus = AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo)
        switch cameraPermissionStatus {
        case .authorized:
            print("Already authorized")
        case .denied:
            print("denied")
            let alert = UIAlertController(title: "Sorry :(", message: "But could you please grant permission for camera within device settings", preferredStyle: .alert)
            let action = UIAlertAction(title: "Ok", style: .cancel, handler: nil)
            alert.addAction(action)
            present(alert, animated: true, completion: nil)
        case .restricted:
            print("restricted")
        default:
            AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo) { [weak self] (granted: Bool) -> Void in
                if granted {
                    // User granted
                    print("User granted")
                    DispatchQueue.main.async {
                        // Do whatever you need on the main thread
                    }
                } else {
                    // User rejected
                    print("User rejected")
                    DispatchQueue.main.async {
                        let alert = UIAlertController(title: "WHY?", message: "Camera is the main feature of our application", preferredStyle: .alert)
                        let action = UIAlertAction(title: "Ok", style: .cancel, handler: nil)
                        alert.addAction(action)
                        self?.present(alert, animated: true, completion: nil)
                    }
                }
            }
        }
    }
}
Upvotes: 17
Reputation: 126167
AVCaptureStillImageOutput being deprecated means you can keep using it in iOS 10, but there are reasons to move on: you can do wide color with AVCaptureStillImageOutput, but it's a lot easier to do wide color with AVCapturePhotoOutput. And for RAW capture or Live Photos, AVCapturePhotoOutput is the only game in town. If you're happy proceeding despite the deprecation, your issue isn't that outputSettings has been removed; it's still there.
Something to be aware of for beta 6 and beyond (though it turns out not to be an issue here): APIs that use NSDictionary without explicit key and value types come into Swift 3 as [AnyHashable: Any], and the Foundation or CoreFoundation types you might use in a dictionary are no longer implicitly bridged to Swift types. (Some of the other questions about beta 6 dictionary conversions might point you in the right direction there.)
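To illustrate that bridging point, here is a minimal sketch (not tied to your capture code; the pixel format constant is just an arbitrary example value): annotating the dictionary explicitly keeps it a [String: Any] instead of letting the literal infer [AnyHashable: Any], and CoreFoundation keys are bridged to String by hand.

```swift
import AVFoundation

// Explicitly typed as [String: Any], which matches what outputSettings expects.
let jpegSettings: [String: Any] = [AVVideoCodecKey: AVVideoCodecJPEG]

// CoreFoundation keys like kCVPixelBufferPixelFormatTypeKey are CFStrings;
// as of beta 6 they are no longer implicitly bridged, so cast them yourself.
let previewFormat: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
    kCVPixelBufferWidthKey as String: 160,
    kCVPixelBufferHeightKey as String: 160
]
```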
However, I'm not getting any compilation errors for setting outputSettings
. Whether in your full code or by reducing it to the essential parts for that line:
var stillImageOutput : AVCaptureStillImageOutput?
stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
...the only warnings I see are about the deprecation.
Upvotes: 8