I'm working on a project where I capture the user's video input with AVCaptureSession. I need the output in BGRA format so I can process it with OpenCV, but then I need to convert it back to the default 420f format to display it to the user. So, I set the pixel format type in videoSettings to kCVPixelFormatType_32BGRA:
[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
and I want to convert the sample buffer to 420f in captureOutput:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
[self doSomeOpenCV:sampleBuffer]; // the OpenCV part
//convert sampleBuffer to kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
}
Is there a way to do this? Or is there a function that returns a sample buffer in a specified format, given an input buffer?
Why not use an AVSampleBufferDisplayLayer to display your BGRA sample buffers and ditch WebRTC (at least for display purposes)?
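A minimal sketch of that approach, assuming a hypothetical `displayLayer` property on your view controller (the property name and `setupDisplayLayer` method are illustrative, not part of your code):

```objc
#import <AVFoundation/AVFoundation.h>

// Assumed: @property (nonatomic, strong) AVSampleBufferDisplayLayer *displayLayer;
- (void)setupDisplayLayer {
    self.displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
    self.displayLayer.frame = self.view.bounds;
    self.displayLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.view.layer addSublayer:self.displayLayer];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    [self doSomeOpenCV:sampleBuffer]; // the OpenCV part

    // Enqueue the BGRA buffer as-is; the layer displays it directly,
    // so no pixel format conversion is needed.
    if (self.displayLayer.readyForMoreMediaData) {
        [self.displayLayer enqueueSampleBuffer:sampleBuffer];
    }
}
```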
If you don't want to do that, you can of course convert your BGRA buffers to YUV, and you can even do it quite efficiently on the GPU. But the best conversion is no conversion, and besides, who wants to type that much on an iPad screen keyboard?
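If you do need the conversion, one possibility is Accelerate's vImage (CPU SIMD rather than GPU, but still fast). A hedged sketch, assuming `destBuffer` is a pre-created kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange pixel buffer with the same dimensions as the source (the function name and setup here are illustrative):

```objc
#import <Accelerate/Accelerate.h>
#import <CoreVideo/CoreVideo.h>

static void ConvertBGRAToBiPlanar(CVPixelBufferRef srcBuffer, CVPixelBufferRef destBuffer) {
    CVPixelBufferLockBaseAddress(srcBuffer, kCVPixelBufferLock_ReadOnly);
    CVPixelBufferLockBaseAddress(destBuffer, 0);

    vImage_Buffer src = {
        .data     = CVPixelBufferGetBaseAddress(srcBuffer),
        .height   = CVPixelBufferGetHeight(srcBuffer),
        .width    = CVPixelBufferGetWidth(srcBuffer),
        .rowBytes = CVPixelBufferGetBytesPerRow(srcBuffer)
    };
    vImage_Buffer destY = {
        .data     = CVPixelBufferGetBaseAddressOfPlane(destBuffer, 0),
        .height   = CVPixelBufferGetHeightOfPlane(destBuffer, 0),
        .width    = CVPixelBufferGetWidthOfPlane(destBuffer, 0),
        .rowBytes = CVPixelBufferGetBytesPerRowOfPlane(destBuffer, 0)
    };
    vImage_Buffer destCbCr = {
        .data     = CVPixelBufferGetBaseAddressOfPlane(destBuffer, 1),
        .height   = CVPixelBufferGetHeightOfPlane(destBuffer, 1),
        .width    = CVPixelBufferGetWidthOfPlane(destBuffer, 1),
        .rowBytes = CVPixelBufferGetBytesPerRowOfPlane(destBuffer, 1)
    };

    // Build the RGB -> YpCbCr conversion once (BT.601, video range
    // to match kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange).
    static vImage_ARGBToYpCbCr conversionInfo;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        vImage_YpCbCrPixelRange pixelRange = { 16, 128, 235, 240, 235, 16, 240, 16 };
        vImageConvert_ARGBToYpCbCr_GenerateConversion(
            kvImage_ARGBToYpCbCrMatrix_ITU_R_601_4, &pixelRange,
            &conversionInfo, kvImageARGB8888, kvImage420Yp8_CbCr8, kvImageNoFlags);
    });

    // The converter expects ARGB channel order; this permute map
    // reorders the BGRA source channels on the fly.
    uint8_t permuteMap[4] = { 3, 2, 1, 0 };
    vImageConvert_ARGB8888To420Yp8_CbCr8(&src, &destY, &destCbCr,
                                         &conversionInfo, permuteMap,
                                         kvImageNoFlags);

    CVPixelBufferUnlockBaseAddress(destBuffer, 0);
    CVPixelBufferUnlockBaseAddress(srcBuffer, kCVPixelBufferLock_ReadOnly);
}
```

If you want 420f (full range) instead, swap in the full-range pixel range values and target pixel format.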