dac2009

Reputation: 3561

Using ByteBuffer as a Texture in OpenGL

I'm working on an application that should stream video data into OpenGL (ES2). For each frame of my video, I can get a ByteBuffer that contains all the data. I guess one way to go would be to convert that to an OpenGL texture and then use sampler2D in OpenGL. However, it feels like, since I already have that ByteBuffer, it would be more efficient to just send that data into OpenGL along with a width and height, and somehow read pixel data from it in the shader. The application will only show the video in 2D, but should use the fragment shader to do operations on the data.

My question is whether OpenGL has anything built in to simplify this (i.e. using a ByteBuffer array directly as a texture), and whether it is possible to use sampler2D or an equivalent on a ByteBuffer array.
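For context, the usual approach is the texture route: allocate the texture once, then re-upload the ByteBuffer into it each frame and sample it with sampler2D. A minimal sketch, assuming Android's GLES20 bindings, an RGBA frame in a direct ByteBuffer, and a current EGL context on the calling thread (class and variable names here are illustrative, not from any particular framework):

```java
import android.opengl.GLES20;
import java.nio.ByteBuffer;

// Sketch only: all calls below require a current EGL/GL context.
public final class VideoTexture {
    private final int texId;
    private final int width, height;

    public VideoTexture(int width, int height) {
        this.width = width;
        this.height = height;
        int[] ids = new int[1];
        GLES20.glGenTextures(1, ids, 0);
        texId = ids[0];
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texId);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        // Allocate storage once; passing null means "no initial pixel data".
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                width, height, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
    }

    // Call once per frame with the decoded pixels.
    public void upload(ByteBuffer frame) {
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texId);
        // Sub-image update: reuses the existing storage instead of reallocating.
        GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, width, height,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, frame);
    }
}
```

The fragment shader then reads the frame through an ordinary `uniform sampler2D`; there is no path in ES2 that lets a shader read from a CPU-side ByteBuffer without an upload like this.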

Upvotes: 2

Views: 898

Answers (1)

Christian Rau

Reputation: 45948

"However, it feels as I already got that ByteBuffer" - You got it, but OpenGL doesn't. So you have to put that data into your texture somehow. OpenGL is not able to use arbitrary CPU data for texturing.

You may however stream it by putting the video frame directly into a mapped PBO (pixel buffer object; ES 2.0 doesn't have them, but ES 3.0 does) instead of a ByteBuffer (if that is even possible within your framework) and then copying it from this PBO into the texture, which might buy you something. But no matter what, you have to call glTexSubImage2D (and be sure not to forget the Sub in there, otherwise you will reallocate the whole texture storage each frame).
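If targeting ES 3.0 is an option, the PBO route described above looks roughly like this (a sketch, assuming Android's GLES30 bindings; the texture is assumed to be already allocated with glTexImage2D at the given size, error handling and PBO double-buffering are omitted, and all names are placeholders):

```java
import android.opengl.GLES30;
import java.nio.ByteBuffer;

public final class PboUploader {
    // Requires a current ES 3.0 context on this thread.
    public static void uploadViaPbo(int pboId, int texId,
                                    int width, int height, ByteBuffer src) {
        int size = width * height * 4; // RGBA, one byte per channel

        // Orphan the old PBO storage so we don't stall on the previous frame.
        GLES30.glBindBuffer(GLES30.GL_PIXEL_UNPACK_BUFFER, pboId);
        GLES30.glBufferData(GLES30.GL_PIXEL_UNPACK_BUFFER, size, null,
                GLES30.GL_STREAM_DRAW);

        // Map the PBO and write the frame into it. Ideally the decoder
        // would write into this mapped region directly, skipping 'src'.
        ByteBuffer mapped = (ByteBuffer) GLES30.glMapBufferRange(
                GLES30.GL_PIXEL_UNPACK_BUFFER, 0, size,
                GLES30.GL_MAP_WRITE_BIT | GLES30.GL_MAP_INVALIDATE_BUFFER_BIT);
        mapped.put(src);
        GLES30.glUnmapBuffer(GLES30.GL_PIXEL_UNPACK_BUFFER);

        // With a PBO bound to GL_PIXEL_UNPACK_BUFFER, the last argument of
        // glTexSubImage2D is a byte offset into the PBO, not a CPU pointer.
        GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, texId);
        GLES30.glTexSubImage2D(GLES30.GL_TEXTURE_2D, 0, 0, 0, width, height,
                GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);

        GLES30.glBindBuffer(GLES30.GL_PIXEL_UNPACK_BUFFER, 0);
    }
}
```

Note that it is still glTexSubImage2D doing the final copy; the PBO only changes where the data comes from, letting the driver overlap the transfer with other work.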

Upvotes: 1
