Reputation: 5424
I have an Android 4.0 application that uses the GL_OES_EGL_image_external extension to render video onto an OpenGL surface. That works great. In addition, I would like to stretch/warp a few patches on top of that video. Currently I draw quads over the areas I'd like to warp and apply additional shaders to them, but I'm stuck on how to get the underlying color: how can the shader on a quad sitting on top of the video quad warp the underlying image? Is it possible?
Upvotes: 4
Views: 1302
Reputation: 690
I'm on iOS, but my app does something very similar.
How I've achieved it is based on some sample code from Apple (look at RippleModel.m in particular). It works by placing the video texture not on a quad, but on a highly tessellated grid, so you've got a ton of triangles with a ton of texture coordinates. It creates the vertices of this grid programmatically -- and, more importantly, it creates the texture coordinates programmatically as well -- and holds them in an array.
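Since you're on Android, here's a rough Java sketch of that grid setup; the grid size, NDC positions, and array layout are my own illustrative assumptions, not the actual Apple code:

```java
// A minimal sketch of a tessellated grid whose vertex positions are fixed
// while the texture coordinates get rewritten every frame.
public class WarpGrid {
    final int cols, rows;
    final float[] vertices;   // x,y per vertex, set once
    final float[] texCoords;  // s,t per vertex, updated per frame

    public WarpGrid(int cols, int rows) {
        this.cols = cols;
        this.rows = rows;
        vertices = new float[cols * rows * 2];
        texCoords = new float[cols * rows * 2];
        for (int y = 0; y < rows; y++) {
            for (int x = 0; x < cols; x++) {
                int i = (y * cols + x) * 2;
                // Positions span -1..1 in normalized device coordinates.
                vertices[i]     = 2f * x / (cols - 1) - 1f;
                vertices[i + 1] = 2f * y / (rows - 1) - 1f;
                // Texture coordinates start as an undistorted 0..1 mapping.
                texCoords[i]     = (float) x / (cols - 1);
                texCoords[i + 1] = (float) y / (rows - 1);
            }
        }
    }
}
```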
For each frame, it iterates through all the vertices and updates the texture coordinates of each, 'warping' them in a ripple pattern based on where the user has touched and on how much texture offset the surrounding vertices have. So the geometry isn't changed at all, and the warp isn't performed in the shader either; it's all done in the texture coordinates, and the shader just does a straight texture lookup on the coordinates it receives.
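Here's a hedged sketch of what that per-frame pass could look like; the radial sine "ripple" and its constants are stand-ins I made up for illustration, not Apple's actual math:

```java
// Per-frame texture-coordinate update over the grid from the sketch above.
static void warpTexCoords(float[] texCoords, int cols, int rows,
                          float touchS, float touchT,
                          float time, float amplitude) {
    for (int y = 0; y < rows; y++) {
        for (int x = 0; x < cols; x++) {
            int i = (y * cols + x) * 2;
            float s = (float) x / (cols - 1);
            float t = (float) y / (rows - 1);
            float ds = s - touchS;
            float dt = t - touchT;
            float dist = (float) Math.sqrt(ds * ds + dt * dt);
            // Radial wave that fades with distance from the touch point;
            // only the lookup coordinates move, never the vertices.
            float offset = amplitude * (float) Math.sin(dist * 40f - time * 6f)
                    / (1f + 20f * dist);
            texCoords[i]     = s + ds * offset;
            texCoords[i + 1] = t + dt * offset;
        }
    }
    // After this pass, re-upload texCoords (e.g. via glBufferSubData).
}
```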
So it's hard to say whether this approach will work for your needs, but if your warps only happen in 2D, and if you can figure out how to express your warp as texture-coordinate adjustments, this may help.
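And tying it back to your setup: with this approach the fragment shader for your GL_OES_EGL_image_external texture stays trivial, since all the warping lives in the texture coordinates. A minimal sketch (the varying/uniform names are assumed):

```java
// Plain lookup: the fragment shader needs nothing special, because the
// distortion is already baked into the per-vertex texture coordinates.
static final String FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "varying vec2 vTexCoord;\n" +
        "uniform samplerExternalOES uVideoTex;\n" +
        "void main() {\n" +
        "    gl_FragColor = texture2D(uVideoTex, vTexCoord);\n" +
        "}\n";
```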
Upvotes: 4