Reputation: 351
This is probably a quick question, as I'm likely messing up something silly.
I've been trying to get a rendering system working on my old laptop in my free time. I was having issues and initially thought they had something to do with the original design of my program, but now I'm not so sure. I've reduced my render call to a simple triangle example and still get the error.
void RenderSystem::Render()
{
    glClear( GL_COLOR_BUFFER_BIT );

    GLuint vao;
    glGenVertexArrays(1, &vao);
    glBindVertexArray( vao );

    glUseProgram( shaderProgram );
    CheckForError();

    glEnableVertexAttribArray( coord3dAttrib );
    CheckForError();

    GLfloat triangle[] = {
         0.0,  0.8, 0.0,
        -0.8, -0.8, 0.0,
         0.8, -0.8, 0.0,
    };

    glVertexAttribPointer(
        coord3dAttrib,
        3,
        GL_FLOAT,
        GL_FALSE,
        0,
        triangle
    );
    CheckForError();

    glDrawArrays(GL_TRIANGLES, 0, 3);
    CheckForError();

    glDisableVertexAttribArray( coord3dAttrib );
    CheckForError();

    SDL_GL_SwapWindow( window );
}
Ignore the excessive CheckForError calls -- they're only there to pinpoint which function is causing the error. It definitely occurs right after my call to glVertexAttribPointer.
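(For reference, CheckForError is essentially just a glGetError loop -- something along these lines, though the exact body isn't important:)

#include <iostream>

// Rough sketch of the kind of check I mean; it just reports whatever
// glGetError has flagged at that point in the frame.
void CheckForError()
{
    GLenum err;
    // glGetError returns one flagged error per call, so drain the queue.
    while ( (err = glGetError()) != GL_NO_ERROR )
    {
        std::cerr << "GL error: 0x" << std::hex << err << std::dec << std::endl;
    }
}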
It's worth noting that all of the shaders and programs compiled properly. The context is initialized and handled by SDL2, and there have been no errors with regard to the context initialization. The only odd error I get is an invalid enumerant when calling SDL_GL_SetSwapInterval(0) to turn off vsync.
Also, this is an old laptop, and I'm starting to wonder whether its drivers/hardware are simply too old for this more modern style of GL rendering. In that case, is there a good way to test which version of GL I should target on this hardware? (I'm on CrunchBang, a Debian-based Linux distribution. IIRC I've set up and installed the relevant drivers and headers properly, so I would assume it should work, but if driver issues can cause this, I'll look into it more.)
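The only check I can think of is printing the version strings right after the context is created, something like this -- though I'm not sure whether that actually tells me which version I should target:

// Query what the driver actually gave me, right after SDL_GL_CreateContext.
std::cout << "GL_VERSION:  " << glGetString( GL_VERSION ) << std::endl;
std::cout << "GL_RENDERER: " << glGetString( GL_RENDERER ) << std::endl;
std::cout << "GLSL:        " << glGetString( GL_SHADING_LANGUAGE_VERSION ) << std::endl;

I guess running glxinfo from a terminal would report the same information.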
Any clues? I'm assuming that I'm missing something basic as it has been a while since I last worked with OpenGL.
Upvotes: 1
Views: 684
Reputation: 114
I think you are going wrong with your bindings. You need to generate a name for the VAO only once, bind it and set up your buffer, and then reuse that name to bind it later.
So, you should do the following.
Generate the name for your VAO in some initialization method (where you set up your buffers):
GLuint vao; // make this accessible in global scope or through some class
glGenVertexArrays(1, &vao);
Then, in your render method, call the bind function as you did before, using the name previously assigned to your variable vao:
glUseProgram( shaderProgram );
glBindVertexArray( vao );
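Putting the two pieces together, a minimal sketch of how I usually see it done would be something like this (InitBuffers is just a placeholder name, and vao / vbo are assumed to be members of your RenderSystem):

// One-time setup, called after the context and shaders are created.
void RenderSystem::InitBuffers()
{
    GLfloat triangle[] = {
         0.0f,  0.8f, 0.0f,
        -0.8f, -0.8f, 0.0f,
         0.8f, -0.8f, 0.0f,
    };

    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);

    // Upload the vertex data into a buffer object; with a VBO bound,
    // the last argument of glVertexAttribPointer is an offset, not a pointer.
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(triangle), triangle, GL_STATIC_DRAW);

    glEnableVertexAttribArray(coord3dAttrib);
    glVertexAttribPointer(coord3dAttrib, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);

    glBindVertexArray(0);
}

// Per-frame rendering: only bind the existing VAO and draw.
void RenderSystem::Render()
{
    glClear(GL_COLOR_BUFFER_BIT);

    glUseProgram(shaderProgram);
    glBindVertexArray(vao);   // reuse the name generated in InitBuffers

    glDrawArrays(GL_TRIANGLES, 0, 3);

    glBindVertexArray(0);
    SDL_GL_SwapWindow(window);
}

With the vertex data living in a buffer object, the last argument of glVertexAttribPointer becomes an offset into that buffer rather than a client-side pointer.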
I think this may be your answer; at least, this is how I'm used to seeing VAOs handled in books and tutorials.
Upvotes: 1