Mustafa Shabib

Reputation: 740

Using CIFaceFeature detection, can I determine the Confidence of the Face detected in an Image

I'm just starting to take a look at CIDetector to detect faces in an image but am wondering whether anyone has had any luck determining the confidence level that the detector has when it detected the face.

I know we can essentially set the detector threshold by choosing different detector accuracies, but is there any way to tell how much the detected feature has surpassed the requested accuracy?

CIContext *context = [CIContext contextWithOptions:nil];
NSDictionary *opts = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:context
                                          options:opts];

Essentially, if I have an image that contains two faces, how can I determine which of the two is more likely to be a face, assuming both are detected using the CIDetectorAccuracyHigh option?

Thanks

Mustafa

Upvotes: 1

Views: 338

Answers (1)

Manan

Reputation: 93

According to the documentation in Core Image, the feature at the lower index in the returned NSArray of CIFeature instances is more likely to be a face, because the array is sorted by confidence:

/** Returns an array of CIFeature instances in the given image.
 The array is sorted by confidence, highest confidence first. */
- (NSArray *)featuresInImage:(CIImage *)image __OSX_AVAILABLE_STARTING(__MAC_10_7, __IPHONE_5_0);
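As a minimal sketch of how that ordering could be used to pick the most likely face, assuming a CIImage named image is already available (CIFaceFeature itself does not appear to expose a numeric confidence value in the public API, so the sort order is the only signal):

#import <CoreImage/CoreImage.h>

// Assumes `image` is a CIImage containing the photo to analyze.
CIContext *context = [CIContext contextWithOptions:nil];
NSDictionary *opts = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:context
                                          options:opts];

NSArray *features = [detector featuresInImage:image];
if (features.count > 0) {
    // Per the header comment above, the array is sorted by confidence,
    // highest confidence first, so the first element is the detector's
    // strongest face candidate.
    CIFaceFeature *mostLikelyFace = (CIFaceFeature *)[features firstObject];
    NSLog(@"Most likely face bounds: x=%.0f y=%.0f w=%.0f h=%.0f",
          mostLikelyFace.bounds.origin.x,
          mostLikelyFace.bounds.origin.y,
          mostLikelyFace.bounds.size.width,
          mostLikelyFace.bounds.size.height);
}

So for your two-face example, the face at index 0 is the one the detector is more confident about, even though the exact confidence score is not exposed.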

Upvotes: 0
