Reputation: 1
I'm developing an app which reads a video file and plays it while processing each frame (e.g. color conversion). Since I'm a newbie at Android and media processing, I can't figure out how to do this.
Currently, I'm trying to retrieve frames (images) using the MediaCodec API and process each frame using OpenCV. But I'm stuck on converting the output buffer (ByteBuffer) into an OpenCV Mat.
How can I do this? Or, if you know a better way to build this type of app, please give me advice. Even a small piece of advice would be appreciated.
Upvotes: 0
Views: 111
Reputation: 10621
Images (video frames) that MediaCodec outputs are typically represented as buffers of planar or semi-planar YCbCr (often referred to as YUV) data.
AFAIK OpenCV does not support these formats directly, so you need to convert the data into another format (e.g. RGBA or grayscale) first. You can look at the implementation of JavaCamera2Frame as a starting point (keeping in mind that it assumes a camera image format).
Note that you need to check the actual format of the image (e.g. YUV_420_888, YUV_422_888, YUV_444_888) and apply the appropriate color conversion.
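To make the layout and conversion concrete, here is a minimal standalone sketch (the class name and helpers are made up for illustration). It assumes the decoder produces NV12-style semi-planar 4:2:0 data and uses full-range BT.601 coefficients for the per-pixel math; in a real app you would instead wrap the bytes in a one-channel Mat of height * 3 / 2 rows and call Imgproc.cvtColor with COLOR_YUV2RGBA_NV12 (or COLOR_YUV2RGBA_NV21, depending on the chroma order your device actually emits).

```java
import java.nio.ByteBuffer;

public class Nv12ToRgba {
    // A YUV 4:2:0 frame is a full-resolution Y plane followed by
    // quarter-resolution chroma, so it occupies width * height * 3/2 bytes.
    public static int yuv420Size(int width, int height) {
        return width * height * 3 / 2;
    }

    // Manual NV12 -> RGBA conversion (full-range BT.601), shown only to
    // illustrate the semi-planar layout; prefer Imgproc.cvtColor in practice.
    public static byte[] nv12ToRgba(ByteBuffer buf, int width, int height) {
        byte[] yuv = new byte[yuv420Size(width, height)];
        buf.get(yuv);
        byte[] rgba = new byte[width * height * 4];
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int y = yuv[row * width + col] & 0xFF;
                // Interleaved Cb/Cr pairs follow the Y plane,
                // one pair per 2x2 block of luma samples.
                int uvBase = width * height + (row / 2) * width + (col / 2) * 2;
                int u = (yuv[uvBase] & 0xFF) - 128;
                int v = (yuv[uvBase + 1] & 0xFF) - 128;
                int r = clamp(Math.round(y + 1.402f * v));
                int g = clamp(Math.round(y - 0.344f * u - 0.714f * v));
                int b = clamp(Math.round(y + 1.772f * u));
                int out = (row * width + col) * 4;
                rgba[out] = (byte) r;
                rgba[out + 1] = (byte) g;
                rgba[out + 2] = (byte) b;
                rgba[out + 3] = (byte) 0xFF; // opaque alpha
            }
        }
        return rgba;
    }

    private static int clamp(int x) {
        return x < 0 ? 0 : (x > 255 ? 255 : x);
    }
}
```

Note that MediaCodec may also pad rows to a stride wider than the visible width; query the output MediaFormat (KEY_STRIDE, KEY_SLICE_HEIGHT where available) and account for that before converting.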
An alternative approach is to use a SurfaceTexture and OpenGL ES to do the conversion on the GPU and then read the result back as RGBA data, but this requires a fair amount of setup code.
Upvotes: 0