Igor Custodio

Reputation: 99

Is there a way to convert a CMSampleBuffer into a CVImageBuffer?

I am using a framework (MoodMe) to detect faces with the iPhone camera, and I need to pass an image or frame to the MoodMe instance.

I have converted the camera output into a UIImage, but the framework does not detect any face (I think the conversion is the problem).

So I want to pass my buffer to the framework instead. It asks for a CVImageBuffer variable, but I don't know how (or whether it is even possible) to convert the CMSampleBuffer I receive from the camera output into a CVImageBuffer. Is there a way to do this?

My code:

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // Get the frame's pixel data from the sample buffer
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    // Copy the sample buffer attachments to use as CIImage options
    let attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate)
    // Wrap the pixel buffer in a CIImage, then a UIImage
    let ciImage = CIImage(cvImageBuffer: pixelBuffer!, options: attachments as? [String : Any])
    let img = UIImage(ciImage: ciImage)

    mdm.processImage(img)

    // this does not work
    if mdm.faceTracked {
        print("Face")
    } else {
        print("Not face")
    }

    // what I would like to call instead, passing a CVImageBuffer:
    // mdm.processImageBuffer(frame: ...)
}

Sorry for any English errors :)

Upvotes: 2

Views: 1803

Answers (1)

adamfowlerphoto

Reputation: 2751

You are already doing this in your code. The pixelBuffer variable is a CVImageBuffer:

let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
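If the MoodMe instance really does accept a CVImageBuffer (as the commented-out processImageBuffer line in your question suggests), a minimal sketch would be to safely unwrap the buffer and pass it straight through. Note that the MoodMe method name and signature below are assumptions taken from your comment, not from the framework's documentation:

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // CMSampleBufferGetImageBuffer returns an optional CVImageBuffer
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    // Hypothetical MoodMe call; check the framework's docs for the real name and signature
    mdm.processImageBuffer(imageBuffer)

    if mdm.faceTracked {
        print("Face")
    } else {
        print("Not face")
    }
}

This skips the CIImage/UIImage round trip entirely, which is also cheaper per frame.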

Upvotes: 6
