Reputation: 10226
I've got a UITapGestureRecognizer that calls the function below when the screen is tapped.
The device is in portrait mode and the cameraView takes up a variable height, somewhere between about 50% and 90% of the screen, but that shouldn't matter because I'm getting the tap location within that view.
The Apple docs state that the focal point is a normalized coordinate (x and y both between 0 and 1), with (0, 0) at the top left and (1, 1) at the bottom right in landscape orientation with the home button on the right.
Setting the focus mode to AutoFocus rather than ContinuousAutoFocus made things worse. What am I missing? Other apps have this working just fine.
func focusPhoto(recognizer : UITapGestureRecognizer) {
    if photoCaptureDevice.lockForConfiguration(nil) && photoCaptureDevice.focusPointOfInterestSupported {
        let tapLocation = recognizer.locationInView(cameraView)
        let focalPoint = CGPoint(x: tapLocation.x / cameraView.frame.width, y: tapLocation.y / cameraView.frame.height)
        photoCaptureDevice.focusPointOfInterest = focalPoint
        //photoCaptureDevice.focusMode = AVCaptureFocusMode.AutoFocus
        photoCaptureDevice.unlockForConfiguration()
    }
}
Attempted this as well:
func focusPhoto(recognizer : UITapGestureRecognizer) {
    if photoCaptureDevice.lockForConfiguration(nil) && photoCaptureDevice.focusPointOfInterestSupported {
        let tapLocation = recognizer.locationInView(cameraView)
        photoCaptureDevice.focusPointOfInterest = previewLayer!.captureDevicePointOfInterestForPoint(tapLocation)
        photoCaptureDevice.unlockForConfiguration()
    }
}
Upvotes: 2
Views: 135
Reputation: 23634
Are you using an AVCaptureVideoPreviewLayer?
There is a method on it you can use to convert a tapped point into the correct coordinate space for focusing:
- captureDevicePointOfInterestForPoint:
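Something like this should work (a rough, untested sketch: it assumes previewLayer is the AVCaptureVideoPreviewLayer that fills your cameraView, and it reuses the photoCaptureDevice and cameraView names from your code):

func focusPhoto(recognizer : UITapGestureRecognizer) {
    // Tap location in the view that hosts the preview layer
    // (assumes the layer fills cameraView).
    let tapLocation = recognizer.locationInView(cameraView)
    // Let the preview layer convert that into the device's
    // normalized, landscape-oriented point of interest.
    let devicePoint = previewLayer!.captureDevicePointOfInterestForPoint(tapLocation)

    if photoCaptureDevice.focusPointOfInterestSupported && photoCaptureDevice.lockForConfiguration(nil) {
        photoCaptureDevice.focusPointOfInterest = devicePoint
        if photoCaptureDevice.isFocusModeSupported(AVCaptureFocusMode.AutoFocus) {
            // One-shot refocus at the new point of interest.
            photoCaptureDevice.focusMode = AVCaptureFocusMode.AutoFocus
        }
        photoCaptureDevice.unlockForConfiguration()
    }
}

captureDevicePointOfInterestForPoint: takes the preview layer's frame, videoGravity and orientation into account, which is why simply dividing the tap coordinates by the view's width and height isn't enough in portrait.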
Upvotes: 1