Reputation: 473
(iPhone) I'm trying to draw a cube in ES2 with a different color on each face. Right now the colors aren't coming out right and I can't figure out why. Here's the relevant code:
- (void) DrawES2 {
    glViewport ( 0, 0, backingWidth, backingHeight );
    glClearColor ( 0.0f, 0.0f, 0.0f, 1.0f );
    glClear ( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
    glUseProgram ( programObject );

    int colorIndex = 0;
    BOOL newFace = NO;
    for ( int i = 0; i < 36; i += 3 )
    {
        GLfloat faceColor[] = { faceColors[colorIndex],   faceColors[colorIndex+1],
                                faceColors[colorIndex+2], faceColors[colorIndex+3] };

        // Load the vertex data
        glVertexAttribPointer ( 0, 3, GL_FLOAT, GL_FALSE, 0, vVertices );
        glEnableVertexAttribArray ( 0 );

        // Load the color data
        glVertexAttribPointer ( 1, 4, GL_UNSIGNED_BYTE, GL_FALSE, 0, faceColor );
        glEnableVertexAttribArray ( 1 );

        glDrawElements ( GL_TRIANGLES, 3, GL_UNSIGNED_BYTE, &indices[i] );

        newFace = ( i % 2 == 0 ) ? NO : YES;
        if ( newFace )
            colorIndex += 4;
    }
}
GLfloat vVertices[] = { -0.5f,  0.5f,  0.5f,
                        -0.5f, -0.5f,  0.5f,
                         0.5f, -0.5f,  0.5f,
                         0.5f,  0.5f,  0.5f,
                        -0.5f,  0.5f, -0.5f,
                        -0.5f, -0.5f, -0.5f,
                         0.5f, -0.5f, -0.5f,
                         0.5f,  0.5f, -0.5f };

// Used to draw cube more efficiently
GLubyte indices[36] = {
    4, 7, 3,  // top face
    4, 3, 0,
    5, 6, 7,  // front face
    5, 7, 4,
    3, 2, 1,  // back face
    0, 3, 1,
    6, 2, 3,  // right face
    6, 3, 7,
    5, 0, 1,  // left face
    5, 4, 0,
    5, 2, 6,  // bottom face
    5, 1, 2 };

const GLfloat faceColors[] = {
    0, 1,    0, 1,
    1, 0.5f, 0, 1,
    1, 0,    0, 1,
    1, 1,    0, 1,
    0, 0,    1, 1,
    1, 0,    1, 1
};
GLbyte vShaderStr[] =
"uniform mat4 t_matrix; \n"
"uniform mat4 r_matrix; \n"
"uniform mat4 u_proj_matrix; \n"
"attribute vec4 vPosition; \n"
"attribute vec4 a_color; \n"
"varying vec4 v_color; \n"
"void main() \n"
"{ \n"
" mat4 model_matrix = t_matrix * r_matrix; \n"
" mat4 mvp_matrix = u_proj_matrix * model_matrix; \n"
" gl_Position = mvp_matrix * vPosition; \n"
" v_color = a_color; \n"
"} \n";
GLbyte fShaderStr[] =
"precision mediump float; \n"
"varying vec4 v_color; \n"
"void main() \n"
"{ \n"
" gl_FragColor = v_color; \n"
"} \n";
Upvotes: 1
Views: 2038
Reputation: 2000
Just to mention it: if you want to use bytes for colors, use:
glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 0, faceColor);
(GL_TRUE instead of GL_FALSE); see Techniques for working with vertex data.
Upvotes: 1
Reputation: 3577
You have to enable GL_DEPTH_TEST; otherwise the order in which the triangles are drawn overrides the logical Z depth your eye expects to see. A very, very common initial error.
Try enabling it just after you clear the color and depth buffers:
glEnable(GL_DEPTH_TEST);
Cheers
Upvotes: 0
Reputation: 11
You have probably already found this, but your color data is declared and initialized as floats, while the vertex attribute array is set up to read unsigned bytes. That mismatch may be why the colors aren't what you expect.
Upvotes: 1
Reputation: 186088
You'll find that your logic is cleaner if you define vertex and element data via nested arrays:
GLfloat vVertices[][3] = {
{ -0.5f, -0.5f, 0.5f },
{ 0.5f, -0.5f, 0.5f },
...
};
GLubyte indices[][3] = {
{ 4, 7, 3 }, //top face
{ 4, 3, 0 },
...
};
GLfloat faceColors[][4] = {
{ 0, 1, 0, 1 },
{ 1, 0.5, 0, 1 },
...
};
It will also allow you to simply divide by two to get the color index, rather than using the awkward i % 2 trick, which, by the way, is where you'll probably find the bug: you are incrementing the color index after the first face (i is still zero at that point, so i % 2 == 0), so the second triangle of the top face gets a different color to the first one; the same problem will occur for all faces.
Upvotes: 0