Reputation: 783
I am modifying this OpenGL and GLKit tutorial from raywenderlich.com to render a cube from a Wavefront .obj file, without the per-vertex color information and with surface normals added. But what gets rendered looks nothing like a cube.
To parse the .obj file I have a method (createDrawable) that walks the file and saves the data into a struct (Drawable) containing four things: the vertex buffer, the index buffer, the number of faces in the object, and the object's transform matrix. (Here are the header, the .m file and the .obj file.)
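For context, here is a minimal sketch of what such a Drawable struct might look like (the actual definition is in the linked header, so the field names here are assumptions based on how they are used below):
typedef struct {
    GLuint vertexBuffer;   // name of the GL buffer holding the interleaved vertex data
    GLuint indexBuffer;    // name of the GL buffer holding the face indices
    int numFaces;          // number of triangular faces parsed from the .obj
    GLKMatrix4 matrix;     // model transform for this object
} Drawable;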
- (Drawable)createDrawable:(NSString *)objFileName {
    ......
    ......
    // Parsed obj and put info in vertexData and indices arrays.
    Drawable _drawable;
    _drawable.numFaces = numFaces;
    glGenBuffers(1, &_drawable.vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _drawable.vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertexData) * numFaces * 8, vertexData, GL_STATIC_DRAW);
    glGenBuffers(1, &_drawable.indexBuffer);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _drawable.indexBuffer);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices) * numFaces * 3, indices, GL_STATIC_DRAW);
    _drawable.matrix = GLKMatrix4Identity;
    _drawable.matrix = GLKMatrix4Translate(_drawable.matrix, 0.0f, 0.0f, 10.0f);
    return _drawable;
}
For rendering I use another method (renderDrawable) that binds an object's buffers, sets the attribute pointers into them, and then renders with glDrawElements(..).
- (void)renderDrawable:(Drawable)object {
    glBindBuffer(GL_ARRAY_BUFFER, object.vertexBuffer);
    glEnableVertexAttribArray(GLKVertexAttribPosition);
    glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, 0, (const GLvoid *) 0);
    glEnableVertexAttribArray(GLKVertexAttribTexCoord1);
    glVertexAttribPointer(GLKVertexAttribTexCoord1, 2, GL_FLOAT, GL_FALSE, 0, (const GLvoid *) 3);
    glEnableVertexAttribArray(GLKVertexAttribNormal);
    glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, 0, (const GLvoid *) 5);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, object.indexBuffer);
    glDrawElements(GL_LINES, sizeof(object.indexBuffer) * 3 * object.numFaces, GL_UNSIGNED_BYTE, (const GLvoid *) object.indexBuffer);
}
I think I am doing something wrong with the buffers (part of createDrawable is shown above; renderDrawable and the rest are in the .m file), but I just can't figure out what it is.
Upvotes: 0
Views: 396
Reputation: 10125
I think the "stride" param in glVertexAttribPointer should be set to 8 * sizeof(float) (the size of a single vertex), and note that the offset argument is measured in bytes, not in floats:
glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, 8 * sizeof(float), (const GLvoid *)(5 * sizeof(float)));
When stride is 0, that means the attribs are stored in a contiguous, tightly packed way... but you have interleaved attribs, so you must provide that extra info to OpenGL.
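Putting it together, a sketch of all three attribute pointers with explicit stride and byte offsets, assuming your interleaved layout of 3 position floats, 2 texture coordinates, then 3 normal floats per vertex:
GLsizei stride = 8 * sizeof(GLfloat);
glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, stride, (const GLvoid *) 0);
glVertexAttribPointer(GLKVertexAttribTexCoord1, 2, GL_FLOAT, GL_FALSE, stride, (const GLvoid *)(3 * sizeof(GLfloat)));
glVertexAttribPointer(GLKVertexAttribNormal, 3, GL_FLOAT, GL_FALSE, stride, (const GLvoid *)(5 * sizeof(GLfloat)));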
Another issue: what is sizeof(vertexData) * numFaces * 8 supposed to be? If vertexData is an array, sizeof(vertexData) already gives the size of the whole array in bytes, not the size of one element. I think the size should be sizeof(float) * numFaces * 3 * 8 (3 vertices per face, 8 floats per vertex).
And for the index buffer: sizeof(indices) * numFaces * 3 should likewise be sizeof(int) * numFaces * 3.
int is the type of your indices, but in the draw call you pass GL_UNSIGNED_BYTE (which means you can address only 256 different indices!), so the type there should be GL_UNSIGNED_INT to match.
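A sketch of the two buffer uploads with the sizes spelled out, assuming vertexData is a GLfloat array and indices is a GLuint array:
// 3 vertices per face, 8 floats per vertex:
glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat) * numFaces * 3 * 8, vertexData, GL_STATIC_DRAW);
// 3 indices per face:
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(GLuint) * numFaces * 3, indices, GL_STATIC_DRAW);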
When rendering with glDrawElements: you have an index buffer bound, so the last param is a byte offset into that buffer, not a pointer to your index array; set it to NULL.
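A sketch of the corrected draw call, assuming GLuint indices; the count is the total number of indices (3 per face), not a sizeof expression, and GL_TRIANGLES is assumed since you want solid faces:
glDrawElements(GL_TRIANGLES, object.numFaces * 3, GL_UNSIGNED_INT, NULL);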
Upvotes: 1
Reputation: 3970
I think your problem is that you pass GL_LINES to glDrawElements() instead of GL_TRIANGLES.
Upvotes: 1