Reputation: 8160
I'm reading pixel data from a framebuffer, and everything seems to work except for the alpha value, which is always 1.0.
GLfloat lebuf[areasize * 4];
glReadPixels(xstart, ystart, partw, parth, GL_RGBA, GL_FLOAT, lebuf);
I've set the window creation code to support an alpha channel:
SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8);
Is there any other place I should look to check why the alpha channel is always 1.0? Better yet, is there another way (other than glReadPixels) to get the texture from the framebuffer into client memory?
edit: this is how I clear the buffer:
glClearColor(0,0,0,0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
Upvotes: 3
Views: 3496
Reputation: 1
Please use the following line; the problem will be solved:
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_ALPHA | GLUT_DEPTH | GLUT_STENCIL);
Upvotes: 0
Reputation: 1486
If you are using GLUT, remember you have to set your main window as follows:
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_ALPHA | GLUT_DEPTH | GLUT_STENCIL);
otherwise glReadPixels will always read the alpha channel as 1.
Upvotes: 3
Reputation: 11636
Could you check:
- how many alpha bits your framebuffer actually has (glGetIntegerv(GL_ALPHA_BITS, bits))?
- whether the clear color reaches the alpha channel (glClearColor)? What if you clear to 0.5 and retrieve the buffer before rendering? Do you retrieve 0.5?
- whether writing to the alpha channel is enabled (glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE))?
Upvotes: 6
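Pulled together, those checks might look roughly like this (a sketch only, not a complete program; it assumes a current GL context created beforehand, e.g. via SDL, and the helper name is made up):

```c
#include <stdio.h>
#include <GL/gl.h>

/* Diagnostics for "alpha is always 1.0"; call with a current GL context. */
static void check_alpha_setup(void)
{
    /* 1. Does the framebuffer have destination alpha at all? */
    GLint alpha_bits = 0;
    glGetIntegerv(GL_ALPHA_BITS, &alpha_bits);
    printf("GL_ALPHA_BITS = %d\n", alpha_bits); /* 0 => no alpha storage */

    /* 2. Is alpha writing masked off? */
    GLboolean mask[4];
    glGetBooleanv(GL_COLOR_WRITEMASK, mask);
    printf("alpha write mask = %d\n", mask[3]);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

    /* 3. Clear to a distinctive alpha and read it straight back. */
    glClearColor(0.0f, 0.0f, 0.0f, 0.5f);
    glClear(GL_COLOR_BUFFER_BIT);
    GLfloat px[4];
    glReadPixels(0, 0, 1, 1, GL_RGBA, GL_FLOAT, px);
    printf("alpha after clear = %g (expect 0.5)\n", px[3]);
}
```

If GL_ALPHA_BITS comes back 0, the context simply has no destination alpha and the SDL_GL_ALPHA_SIZE request was not honored; if the clear-and-readback step returns 1.0 despite nonzero alpha bits, look at the color mask and at where the read is targeted.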