Reputation: 21
I have written my own vertex and fragment shader for a small OpenGL 2.0 application I am working on. Everything seems to work except for one thing: the texture coordinate for every vertex seems to be (0, 0).
When I don't use my own shaders, that is, when I use OpenGL's fixed-function pipeline, everything is drawn fine.
When I activate my own shader the shape is still okay and the vertex positions are correct, but the texture coordinates all become (0, 0).
Here is the code for the vertex shader:
in vec2 position;
in vec2 texcoord;
out vec2 coord;

void main(void) {
    gl_Position = gl_ModelViewProjectionMatrix * vec4(position, 0, 1);
    coord = vec2(gl_TextureMatrix[0] * vec4(texcoord, 1, 1));
}
And this is the fragment shader:
uniform sampler2D texture;
uniform float redFactor;
uniform float greenFactor;
uniform float blueFactor;
in vec2 coord;
void main(void) {
    vec4 color = texture2D(texture, coord);
    float grey = color.r * redFactor + color.g * greenFactor + color.b * blueFactor;
    gl_FragColor = vec4(grey, grey, grey, color.a);
}
Again, it works fine without my own shader, so the VBOs are set up correctly.
By the way, the varying coord is correctly passed from the vertex shader to the fragment shader. For example, when I change the line
coord = vec2(gl_TextureMatrix[0] * vec4(texcoord, 1, 1));
in the vertex shader to this:
coord = vec2(0.5, 0.5);
I get the correct result.
Here is what my VBO's content looks like:
[0.0, 0.0,   0.0, 0.0,   // vertex 0: position, texcoord
 0.0, 1.0,   0.0, 1.0,   // vertex 1
 1.0, 1.0,   1.0, 1.0,   // vertex 2
 1.0, 0.0,   1.0, 0.0]   // vertex 3
Here is the IBO for the drawing:
[0, 1, 2, 2, 3, 0]
Here are the pointers:
VERTEX_ARRAY_POINTER(size = 2, stride = 16, offset = 0),
TEXCOORD_ARRAY_POINTER(size = 2, stride = 16, offset = 8)
Upvotes: 2
Views: 8795
Reputation: 858
It sounds like you need to enable the vertex attribute arrays, so be sure to call glEnableVertexAttribArray for both the positions and the texture coordinates.
For more details: http://www.opengl.org/wiki/Vertex_Specification#Vertex_Array_Object
That is assuming you are using vertex arrays, and from your question it seems like you are.
Personally, I enable all the arrays I'm using just before calling glDraw*(), and then disable all but the positions array after the draw call.
As for the index to use with glEnableVertexAttribArray, you can use 0 for your positions and 1 for your texcoords.
A more robust approach is to bind the attribute locations yourself. Note that glBindAttribLocation only takes effect when the program is (re)linked, so bind before calling glLinkProgram, or relink afterwards:

GLuint positionIndex = 0;
GLuint texcoordIndex = 1;
glBindAttribLocation(programId, positionIndex, "position");
glBindAttribLocation(programId, texcoordIndex, "texcoord");
glLinkProgram(programId);  // the bindings take effect here
glUseProgram(programId);
Then, before your glDraw*() call:
glEnableVertexAttribArray(positionIndex);
glEnableVertexAttribArray(texcoordIndex);
This is still a rather hard coded way of handling this, if you want a more generic method please leave a comment.
Since it was requested, here is a generic way of enumerating the vertex attributes your shader program is actually using:

glUseProgram(programId);
GLint attribCount;
glGetProgramiv(programId, GL_ACTIVE_ATTRIBUTES, &attribCount);
std::vector<GLuint> attribLoc(attribCount);
for (int attrib = 0; attrib < attribCount; ++attrib) {
    char szName[32];
    GLint size;
    GLenum vecType;
    // Query the name of each active attribute, then look up the location
    // the linker assigned to it. (Calling glBindAttribLocation here would
    // be pointless, since it only takes effect at link time.)
    glGetActiveAttrib(programId, attrib, sizeof(szName), NULL, &size, &vecType, szName);
    attribLoc[attrib] = glGetAttribLocation(programId, szName);
}
Then, when you go to draw:
for (int attrib = 0; attrib < attribCount; ++attrib) {
    glEnableVertexAttribArray(attribLoc[attrib]);
}
glDraw*();
for (int attrib = 0; attrib < attribCount; ++attrib) {
    glDisableVertexAttribArray(attribLoc[attrib]);
}
Upvotes: 4
Reputation: 43319
If you are using the old fixed-function vertex array pointer, texcoord array pointer, etc. API calls, then you need to use the corresponding built-in variables in your shader.
More precisely, instead of the generic vertex attribute in vec2 texcoord, you would use the built-in gl_TexCoord[0] whenever you want to access the texture coordinates for texture unit 0 in your shader.
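Concretely, the question's shaders could be rewritten in terms of the built-in inputs and varyings; a sketch, assuming a compatibility profile where these built-ins are still available (the greyscale uniforms from the question are omitted for brevity):

```glsl
// Vertex shader: built-in inputs and varyings instead of generic attributes.
void main(void) {
    gl_Position    = gl_ModelViewProjectionMatrix * gl_Vertex;
    gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
}

// Fragment shader: read the same built-in varying.
uniform sampler2D texture;
void main(void) {
    gl_FragColor = texture2D(texture, gl_TexCoord[0].st);
}
```

These built-ins are fed by glVertexPointer and glTexCoordPointer, which is why they pair with the fixed-function array calls.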
The fundamental problem here is that you are mixing deprecated API features (glTexCoordPointer) with newer GLSL constructs (in and out). NVIDIA drivers will actually alias calls like glTexCoordPointer (...) to a specific generic vertex attribute slot, but this is non-standard behavior and you should generally NEVER mix and match the two.
The only array pointer that is guaranteed by the OpenGL specification to be aliased to a specific attribute slot is the vertex pointer, which aliases to attribute slot 0.
In the end, you will want to switch to vertex attrib arrays because they are much more flexible and are actually supported by core OpenGL :)
Upvotes: 2