Reputation: 21
When using AVCaptureSession to show the camera image in a small preview layer (AVCaptureVideoPreviewLayer), the image is scaled to fit the layer size.
To fill the entire preview layer instead, this scaling can be configured by setting:
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
The problem is that this only resizes the image to fill the layer, and since the camera image is very large, it ends up scaled down too much.
What I am looking for is a way to have my preview layer show the same level of zoom as the built-in camera app, discarding the rest of the image (imagine the device camera masked so that only a square in the middle of the screen is visible).
I have also tried changing the capture preset, going through several different presets, to no avail:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
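For context, a minimal sketch of the kind of setup described above (view-controller code; the square size and the choice of the default back camera are illustrative assumptions, not part of the original question):

// Assumes AVFoundation is imported and this runs inside a view controller.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input) {
    [session addInput:input];
}

AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

// Small square preview in the middle of the screen (200 pt is an example value).
CGFloat side = 200.0;
previewLayer.frame = CGRectMake((self.view.bounds.size.width - side) / 2.0,
                                (self.view.bounds.size.height - side) / 2.0,
                                side, side);
previewLayer.masksToBounds = YES; // clip the aspect-filled video to the square
[self.view.layer addSublayer:previewLayer];

[session startRunning];

With this setup the square shows the whole camera frame scaled down to fill it, rather than a 1:1 crop of the centre, which is the behaviour the question is trying to change.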
Upvotes: 2
Views: 871
Reputation: 2661
Here's what you do:
You get the input video size:
AVCaptureDeviceInput *videoDeviceInput = // initialised already in your app
// Here you can get the video dimensions:
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(videoDeviceInput.device.activeFormat.formatDescription);
You size the previewLayer to the size of the video feed.
Now you have a video layer at a 1:1 scale.
You then just centre the previewLayer within the window; everything that falls outside the window is simply not drawn, which gives you the cropped, full-zoom preview.
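A minimal sketch of the sizing and centring steps, assuming `previewLayer` and `videoDeviceInput` already exist as above and the interface is in portrait (the width/height swap below is an assumption for that case):

// The sensor reports landscape dimensions, so swap them for a portrait layer.
CMVideoDimensions dimensions =
    CMVideoFormatDescriptionGetDimensions(videoDeviceInput.device.activeFormat.formatDescription);
CGFloat feedWidth  = (CGFloat)dimensions.height;
CGFloat feedHeight = (CGFloat)dimensions.width;

// Size the layer to the raw video feed, giving a 1:1 scale.
previewLayer.bounds = CGRectMake(0.0, 0.0, feedWidth, feedHeight);

// Centre the oversized layer in the window; anything outside the window
// is not visible, which produces the cropped, built-in-camera look.
previewLayer.position = CGPointMake(CGRectGetMidX(self.view.bounds),
                                    CGRectGetMidY(self.view.bounds));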
Upvotes: 1