cloudraven

Reputation: 2544

Hardware accelerated H.264/HEVC video decode on Android to OpenGL FBO or texture

I want to decode a video stream and write the decoded frames to an OpenGL texture (attached to an FBO) or some other memory representation that OpenGL can readily use, so that I can apply additional filters or transformations (through shaders) to it.

Is MediaCodec the standard for hardware-accelerated decoding on current Android? Can I use the resulting surface as input to OpenGL?

Coming from desktop development, I know many ways in which this process can be unnecessarily slowed down (e.g. using the CPU instead of dedicated silicon or the GPU for decoding, decoding on the GPU and dumping the results to system RAM only to copy them again to VRAM, etc.). I am not sure whether these issues matter on Android. For example, is there also a high cost to copying between memory usable by Java and memory usable by OpenGL, or is it negligible?

Upvotes: 2

Views: 2599

Answers (3)

solidpixel

Reputation: 12069

In general, OpenGL ES can import YUV video surfaces directly and access them natively if you import the surface as an external EGL image (see the OES_EGL_image_external extension: https://www.khronos.org/registry/OpenGL/extensions/OES/OES_EGL_image_external.txt).

The graphics driver will handle the color conversion from YUV to RGB in the correct color space (assuming the decoder reports the right color space, anyway), and this path is zero-copy. It is therefore the right approach, and more efficient than rolling your own color-conversion shaders, if you want to use OpenGL ES.

On Android, this external-surface import functionality is exposed in Java via the SurfaceTexture class (https://developer.android.com/reference/android/graphics/SurfaceTexture).
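To make the wiring concrete, here is a minimal sketch of that import path. The API calls (`GLES11Ext.GL_TEXTURE_EXTERNAL_OES`, `SurfaceTexture`, `Surface`) are the real Android ones, but the surrounding setup (GL context, texture coordinates, shader compilation) is omitted and assumed:

```java
// Sketch: create a texture bound as GL_TEXTURE_EXTERNAL_OES and wrap it
// in a SurfaceTexture; the Surface built from it can then be handed to
// the video decoder as its output surface.
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
Surface decoderSurface = new Surface(surfaceTexture);

// Each frame: call surfaceTexture.updateTexImage() on the GL thread,
// then draw with a fragment shader that samples the external texture:
String fragmentShader =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "uniform samplerExternalOES sTexture;\n" +
        "varying vec2 vTexCoord;\n" +
        "void main() { gl_FragColor = texture2D(sTexture, vTexCoord); }\n";
```

Note the `samplerExternalOES` sampler type: that is what lets the driver do the YUV-to-RGB conversion implicitly at sample time, keeping the whole path zero-copy.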

Upvotes: 4

E.Abdel

Reputation: 1992

Yes, you can configure a MediaCodec decoder with an output Surface and use GLSurfaceView to apply changes through shaders. This is the best way because you use only the necessary memory, especially for color conversion (you can't know in advance which color format your decoder uses) and for the YUV-to-RGB transformation.
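As a rough sketch of that configuration (the MediaCodec calls and the "video/avc" MIME type are real; `width`, `height`, `decoderSurface`, `surfaceTexture`, and `outputIndex` are assumed to come from your own setup, and error handling is omitted):

```java
// Sketch: configure the decoder to render directly into the Surface
// that wraps a SurfaceTexture, so frames never pass through Java memory.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
decoder.configure(format, decoderSurface, null, 0); // no crypto, decode mode
decoder.start();

// Feed encoded input buffers as usual; when releasing an output buffer,
// pass render = true so the frame is queued to the Surface:
decoder.releaseOutputBuffer(outputIndex, /* render = */ true);

// Then, on the GL thread (e.g. from onFrameAvailable), latch the frame:
surfaceTexture.updateTexImage();
```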

You can take a look here or here if you want to do it natively.

Upvotes: 2

Ketan

Reputation: 1015

You do have to use MediaCodec to access hardware decoders (under the hood this goes through libstagefright and OpenMAX).

  • If you want to do any processing, you need to get the decoded image (typically in a YUV420-type format), do the YUV-to-RGB conversion, and apply any other required processing, all of which can be done with OpenGL textures and GLSL before passing your texture to a consumer. However, this means you have to handle all the audio/video sync yourself. Avoid it if it is not required.

  • If the content is copyrighted/DRM-protected, you have to use a Surface and the DRM interface, and you cannot do any processing.

  • If you just want direct display, you can use a Surface, which handles the YUV-to-RGB conversion too, or simply use ExoPlayer.
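For the first option, the per-pixel YUV-to-RGB math you end up implementing (in a shader, or on the CPU) looks like the following. This is a minimal plain-Java sketch of the BT.601 video-range (16-235) conversion; real code must also match the color space and range the decoder actually reports:

```java
// BT.601 video-range YUV -> RGB conversion for a single sample.
// The clamp keeps each channel in [0, 255] after the matrix multiply.
public class Yuv2Rgb {
    static int clamp(double v) {
        return (int) Math.max(0, Math.min(255, Math.round(v)));
    }

    /** Converts one video-range BT.601 YUV sample to packed 0xRRGGBB. */
    public static int toRgb(int y, int u, int v) {
        double c = y - 16, d = u - 128, e = v - 128;
        int r = clamp(1.164 * c + 1.596 * e);
        int g = clamp(1.164 * c - 0.392 * d - 0.813 * e);
        int b = clamp(1.164 * c + 2.017 * d);
        return (r << 16) | (g << 8) | b;
    }
}
```

In practice you would let the GPU do this per pixel in a fragment shader (or via the external-image path, where the driver does it for you) rather than on the CPU.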

I don't know whether you can read a Surface's texture back into an ordinary OpenGL texture. I have implemented the first option above.

Upvotes: -1
