Reputation: 14435
I need to render something to an off-screen buffer and read the pixels back into CPU memory by calling glReadPixels. My code works great when compiled as desktop OpenGL under Windows, but to make it run under OpenGL ES 2.0 on iOS, I had to replace the GL_ALPHA (or GL_LUMINANCE) texture with a GL_RGBA one, meaning
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
instead of
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, width, height, 0, GL_ALPHA, GL_UNSIGNED_BYTE, NULL);
I wasn't able to run the code with an 8-bit texture. But since I don't need full color for this rendering, those extra 3 bytes per pixel are wasted. Before I try it again, here is my question:
Has anyone succeeded in rendering off-screen to an 8-bit texture and reading the bytes back into CPU memory? Is it possible at all?
Upvotes: 2
Views: 1850
Reputation: 1826
To make it run in OpenGL ES 3.0, you could use GL_R8 for the renderbuffer:
// Create and bind an off-screen framebuffer
glGenFramebuffers(1, &_framebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, _framebuffer);

// Back it with a single-channel, 8-bit renderbuffer
glGenRenderbuffers(1, &_renderBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_R8, 640, 480);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, _renderBuffer);

GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
    NSLog(@"failed to make complete framebuffer object %x", status);
}
Upvotes: 0
Reputation: 807
Set the glRenderbufferStorage format to GL_R8_EXT, and the glReadPixels/glTexImage2D formats to GL_RED_EXT.
http://www.khronos.org/registry/gles/extensions/EXT/EXT_texture_rg.txt
Edit: I've tested this and it works, but only on A5 and above (iPad2/3/4/mini, iPhone4S/5, iPod touch 5th gen). Unfortunately it's not available on A4 and older where it's needed most (iPhone4, iPod touch 4th gen, 3GS, iPad1).
Upvotes: 2