user3411226

Reputation: 183

Detect a face while my cam is open

I need to build an app with just a camera view, and it should detect when the camera is looking at a face. Can anyone point me in the right direction? I have built something that detects a face in an image, but I need it to work with the camera. Here is what I have done so far:

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    NSString *path = [[NSBundle mainBundle] pathForResource:@"picture" ofType:@"JPG"];
    NSURL *url = [NSURL fileURLWithPath:path];

    CIContext *context = [CIContext contextWithOptions:nil];

    CIImage *image = [CIImage imageWithContentsOfURL:url];

    NSDictionary *options = @{CIDetectorAccuracy: CIDetectorAccuracyHigh};

    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace context:context options:options];

    NSArray *features = [detector featuresInImage:image];
    // features now holds a CIFaceFeature for every face found in the still image

}

Then, to work with the camera feed, I have done the following:

-(void)viewWillAppear:(BOOL)animated{
    [super viewWillAppear:animated];
    _session = [[AVCaptureSession alloc] init];
    [_session setSessionPreset:AVCaptureSessionPresetPhoto];

    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];

    if([_session canAddInput:deviceInput]){
        [_session addInput:deviceInput];
    }

    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];

    CGRect frame = self.frameCapture.frame;
    [previewLayer setFrame:frame];

    [rootLayer insertSublayer:previewLayer atIndex:0];
    [_session startRunning];

}

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection{
    for(AVMetadataObject *metadataObject in metadataObjects) {
        if([metadataObject.type isEqualToString:AVMetadataObjectTypeFace]) {

           _faceDetectedLabel.text = @"face detected";
        }
    }
}

But it is still not detecting any faces. Am I doing anything wrong?

Upvotes: 0

Views: 72

Answers (1)

Stas Volskiy

Reputation: 451

You need to add a metadata output to the capture session before you'll get any data:

AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
// create a serial queue to handle the metadata output
dispatch_queue_t metadataQueueOutput = dispatch_queue_create("com.YourAppName.metaDataQueue.OutputQueue", DISPATCH_QUEUE_SERIAL);
[metadataOutput setMetadataObjectsDelegate:self queue:metadataQueueOutput];
if ([_session canAddOutput:metadataOutput]) {
    [_session addOutput:metadataOutput];
    // only ask for face metadata, so you don't have to check the type in the delegate callback
    // (set this after adding the output, otherwise face metadata isn't an available type yet)
    metadataOutput.metadataObjectTypes = @[AVMetadataObjectTypeFace];
}
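Also make sure the view controller declares that it conforms to AVCaptureMetadataOutputObjectsDelegate, otherwise you'll get a compiler warning when passing self as the delegate. A minimal sketch of the interface (the class name ViewController and the outlet names are guesses based on your code, adjust them to your project):

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// declare conformance so setMetadataObjectsDelegate:queue: accepts self without a warning
@interface ViewController : UIViewController <AVCaptureMetadataOutputObjectsDelegate>

@property (nonatomic, strong) AVCaptureSession *session;
@property (weak, nonatomic) IBOutlet UILabel *faceDetectedLabel;
@property (weak, nonatomic) IBOutlet UIView *frameCapture;

@end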

That should work. Let me know if it does.
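One more thing to watch for: the delegate method runs on the serial queue you passed in, not on the main thread, so any UI update (like setting the label text) should be dispatched back to the main queue. A rough sketch of the simplified callback, assuming the faceDetectedLabel outlet from your question (the type check is dropped because metadataObjectTypes already filters for faces):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    // metadataObjects only contains face objects because of metadataObjectTypes above
    if (metadataObjects.count > 0) {
        dispatch_async(dispatch_get_main_queue(), ^{
            // UIKit must only be touched on the main thread
            self.faceDetectedLabel.text = @"face detected";
        });
    }
}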

Upvotes: 1
