Ajit

Reputation: 3

OpenGL ES iOS Attribute issues

I'm trying to get OpenGL ES on iOS to work properly with more than 2 attributes, but for some reason it seems to be drawing the normals and not the actual object.

My vertex shader:

attribute vec4 position;
attribute vec3 tc;
attribute vec3 normal;

uniform vec3 color;
uniform mat4 modelViewProjectionMatrix;
uniform mat3 normalMatrix;

varying lowp vec3 Color;

void main()
{

    vec3 eyeNormal = normalize(normalMatrix * normal);
    vec3 lightDirection = vec3(0.0, 0.0, 1.0);

    float nDot = max(0.0, dot(eyeNormal, normalize(lightDirection)));

    Color = tc * nDot;
    gl_Position = modelViewProjectionMatrix * position;
}

I bind the attribute locations like you'd expect:

glBindAttribLocation(_program, GLKVertexAttribPosition, "position");
glBindAttribLocation(_program, GLKVertexAttribTexCoord0, "tc");
glBindAttribLocation(_program, GLKVertexAttribNormal, "normal");

I use an NSMutableArray to store Vector objects: the first Vector object represents the position, the next one the texture data, and the next one the normal. That pattern repeats, and the array usually holds around 5000 Vector objects. The contents of that array are read into a flat float buffer with this code:

float *buffer = (float*) calloc(vBuffer.count * 3, sizeof(float));
int bufferIndex = 0;

for (int i = 0; i < [vBuffer count]; i++) {

    vector3 *vec = (vector3*) vBuffer[i];

    buffer[bufferIndex] = vec.x;
    buffer[bufferIndex+1] = vec.y;
    buffer[bufferIndex+2] = vec.z;

    bufferIndex += 3;

}
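
For completeness, that float buffer gets uploaded into a VBO before the attribute pointers below are set up. Roughly like this (the handle name here is just illustrative, not the one from my project):

// Illustrative VBO upload for the interleaved buffer built above.
GLuint vertexBuffer;                              // illustrative name
glGenBuffers(1, &vertexBuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glBufferData(GL_ARRAY_BUFFER,
             vBuffer.count * 3 * sizeof(float),   // 3 floats per Vector object
             buffer,
             GL_STATIC_DRAW);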

Then I set up the glVertexAttribPointer stuff like you'd expect:

glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, 0, BUFFER_OFFSET(0));

glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
glVertexAttribPointer(GLKVertexAttribTexCoord0, 3, GL_FLOAT, GL_FALSE, 0, BUFFER_OFFSET(12));

glEnableVertexAttribArray(GLKVertexAttribNormal);
glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, 0, BUFFER_OFFSET(24));

I compile and this is drawn: https://i.sstatic.net/RdRfj.png. If I don't include the Vector object that represents the normal into my NSMutableArray, I get the intended result (albeit weirdly coloured but that's because I'm reading the texture coords as the color, ignore that): https://i.sstatic.net/ROAfV.png.

Why is everything screwed up when I add that third attribute into my buffer?

Here's the code that adds positions, textures, and normals to the array, based on parsing a face statement in an .obj file:

if(v < INT32_MAX && v != 0) {

    vector3 *position = (vector3*) _vertices[v-1];
    [vBuffer addObject:position];

}

if(vt < INT32_MAX && vt != 0) {

    vector3 *texture = (vector3*) _textures[vt-1];
    [vBuffer addObject:texture];

}

if(vn < INT32_MAX && vn != 0) {

    vector3 *normal = (vector3*) _normals[vn-1];
    [vBuffer addObject:normal];

}
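
(The v, vt and vn indices above come from parsing each face token. My actual parsing code is longer, but a minimal sketch of that step, assuming the usual v/vt/vn token format, looks like this:)

// Minimal sketch: pulling v/vt/vn out of one face token with sscanf.
// `token` is an NSString like @"12/7/12" (illustrative name).
// Indices that aren't present stay at INT32_MAX, which is what the checks above test for.
int v = INT32_MAX, vt = INT32_MAX, vn = INT32_MAX;
sscanf([token UTF8String], "%d/%d/%d", &v, &vt, &vn);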

I created a new test project based off Xcode's OpenGL template and added a third attribute, and it worked fine, so I'm not sure what's causing the problem in this particular project. Is it the size of the buffer? There are around 15,000 floats stored in it.

Upvotes: 0

Views: 1318

Answers (1)

rickster

Reputation: 126167

The man page for glVertexAttribPointer admittedly isn't very clear on the meaning of its fifth parameter (stride). Where they say it "specifies the byte offset between consecutive generic vertex attributes", it means something more like between attributes of the same kind, that is, the offset between the same attribute in consecutive vertices. You're seeing weird rendering most likely because you've passed 0 for that parameter, causing GL to interpret the normal data for the first vertex as the position data for the second (and so on).

Because you're interleaving multiple vertex attributes in your buffer (as recommended in Apple's programming guide), the value of the stride parameter should be a) nonzero, and b) the same for all attributes. Like this (assuming 32-bit GLfloat components):

[Image: glVertexAttribPointer stride layout]
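
In terms of the layout in the question (position, then tc, then normal, each with three GLfloat components), that works out to a 36-byte stride. A sketch of the same glVertexAttribPointer calls with the stride filled in:

GLsizei stride = 9 * sizeof(GLfloat);   // 36 bytes: one full interleaved vertex

glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, stride, BUFFER_OFFSET(0));

glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
glVertexAttribPointer(GLKVertexAttribTexCoord0, 3, GL_FLOAT, GL_FALSE, stride, BUFFER_OFFSET(12));

glEnableVertexAttribArray(GLKVertexAttribNormal);
glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, stride, BUFFER_OFFSET(24));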

Magic numbers are considered harmful, so I like to declare structs for my vertex data and use sizeof and offsetof when GL APIs call for widths and offsets:

typedef struct {
    GLKVector3 position;
    GLKVector3 normal;
    GLKVector3 texCoord0;
} Vertex;

glEnableVertexAttribArray(GLKVertexAttribPosition);
glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, 
    sizeof(Vertex), (const GLvoid *)offsetof(Vertex, position));
glEnableVertexAttribArray(GLKVertexAttribNormal);
glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, 
    sizeof(Vertex), (const GLvoid *)offsetof(Vertex, normal));
glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
glVertexAttribPointer(GLKVertexAttribTexCoord0, 3, GL_FLOAT, GL_FALSE, 
    sizeof(Vertex), (const GLvoid *)offsetof(Vertex, texCoord0));

This trick works regardless of whether the buffer containing my vertex data is actually an array of Vertex (though that's certainly otherwise convenient).
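
For example, the interleaved data from the question could be packed into an array of Vertex and uploaded with a single glBufferData call. A quick sketch, assuming the same position/texture/normal ordering as the vBuffer array in the question (and that a GL_ARRAY_BUFFER is already bound):

// Pack the question's (position, tc, normal) triples into Vertex structs.
NSUInteger vertexCount = vBuffer.count / 3;
Vertex *vertices = calloc(vertexCount, sizeof(Vertex));

for (NSUInteger i = 0; i < vertexCount; i++) {
    vector3 *p = vBuffer[3 * i];
    vector3 *t = vBuffer[3 * i + 1];
    vector3 *n = vBuffer[3 * i + 2];
    vertices[i].position  = GLKVector3Make(p.x, p.y, p.z);
    vertices[i].texCoord0 = GLKVector3Make(t.x, t.y, t.z);
    vertices[i].normal    = GLKVector3Make(n.x, n.y, n.z);
}

glBufferData(GL_ARRAY_BUFFER, vertexCount * sizeof(Vertex), vertices, GL_STATIC_DRAW);
free(vertices);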

Upvotes: 1
