hanstar17

Reputation: 90

OpenGL ES glDrawElements function causes memory access violation

I'm learning OpenGL ES and copying and modifying some examples from the blue book. The example simply draws a red triangle on a black background; I implemented it and it worked.

So I changed it to draw a cube and it worked as well. But as soon as I change it to use a VBO and an IBO, it crashes in the glDrawElements function with a memory access violation (0x00000005).

I searched many sites to find out the reason, but I wasn't able to find anything that helped.

Can you see any problem in my code?

Change Note

I'm using OpenGL ES version 1.3.

struct Vertex
{
    // constructor so that Vertex( x, y ) below compiles; z defaults to 0
    Vertex( GLfloat x = 0, GLfloat y = 0, GLfloat z = 0 ) : x(x), y(y), z(z) {}
    GLfloat x;
    GLfloat y;
    GLfloat z;
};


void NewTriangle( Vertex*& vertices, GLuint& verticesCount, GLubyte*& indices, GLuint& indicesCount )
{
    verticesCount = 3;
    vertices = new Vertex[verticesCount];
    vertices[0] = Vertex( 0, 0 );
    vertices[1] = Vertex( -0.5, -0.5 );
    vertices[2] = Vertex( 0.5, -0.5 );
    indicesCount = 3;
    indices = new GLubyte[indicesCount];
    indices[0] = 0;
    indices[1] = 1;
    indices[2] = 2;
}

void NewVerticesAndIndices( Vertex*& vertices, GLuint& verticesCount, GLubyte*& indices, GLuint& indicesCount )
{
    NewTriangle( vertices, verticesCount, indices, indicesCount );
    //NewCube( vertices, verticesCount, indices, indicesCount );
}

void RenderCommon( Vertex*& vertices, GLuint& verticesCount, GLubyte*& indices, GLuint& indicesCount )
{
    const GLfloat color[] = { 1, 0, 0, 1 };
    glVertexAttrib4fv( 0, color );
    glEnableVertexAttribArray( 1 );
    glVertexAttribPointer( 1, 3, GL_FLOAT, GL_FALSE, 0, (const void*)vertices );

    glDrawElements( GL_TRIANGLES, indicesCount, GL_UNSIGNED_BYTE, (const void*)indices );
}

void RenderWithMemories( Vertex*& vertices, GLuint& verticesCount, GLubyte*& indices, GLuint& indicesCount )
{
    RenderCommon( vertices, verticesCount, indices, indicesCount );
}

void RenderWithVBO( const GLuint& vbo, const GLuint& ibo, Vertex*& vertices, GLuint& verticesCount, GLubyte*& indices, GLuint& indicesCount )
{
    glBindBuffer( GL_ARRAY_BUFFER, vbo );
    glBufferData( GL_ARRAY_BUFFER, verticesCount*sizeof(*vertices), (void*)vertices, GL_STATIC_DRAW );

    glBindBuffer( GL_ELEMENT_ARRAY_BUFFER, ibo );
    glBufferData( GL_ELEMENT_ARRAY_BUFFER, indicesCount*sizeof(*indices), (void*)indices, GL_STATIC_DRAW );

    GLuint vboOffset = 0;
    GLuint iboOffset = 0;
    RenderCommon( (Vertex*&)vboOffset, verticesCount, (GLubyte*&)iboOffset, indicesCount );
}

void BlueEyeApp::OnRender()
{
    glViewport( 0, 0, 640, 480 );
    glClear( GL_COLOR_BUFFER_BIT );

    glUseProgram(m_program);

    GLuint verticesCount;
    Vertex* vertices = NULL;
    GLuint indicesCount;
    GLubyte* indices = NULL;
    NewVerticesAndIndices( vertices, verticesCount, indices, indicesCount );

    //RenderWithMemories( vertices, verticesCount, indices, indicesCount ); // successfully output
    RenderWithVBO( m_vbo, m_ibo, vertices, verticesCount, indices, indicesCount ); // crashes

    eglSwapBuffers( GetDisplay(), GetSurface() );

    delete[] vertices;
    delete[] indices;
}

and I have this in my initialization:

bool BlueEyeApp::CreateBuffers()
{
    glGenBuffers( 1, &m_vbo );
    glGenBuffers( 1, &m_ibo );
    return true;
}

I wonder if it has something to do with the EGL version, since the major & minor versions returned by eglInitialize are 1.3. I don't know what those versions mean; I thought I had OpenGL ES 2.0 or higher.
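
For reference, this is roughly how an OpenGL ES 2.0 context gets requested through EGL, as far as I understand it. It's only a sketch; the attribute lists and names below are illustrative and not copied from my actual setup:

// display: the EGLDisplay returned by eglGetDisplay/eglInitialize
const EGLint configAttribs[] = {
    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,   // ask for an ES 2.0-capable config
    EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
    EGL_NONE
};
EGLConfig config;
EGLint numConfigs = 0;
eglChooseConfig( display, configAttribs, &config, 1, &numConfigs );

const EGLint contextAttribs[] = {
    EGL_CONTEXT_CLIENT_VERSION, 2,             // request a GL ES 2.0 context
    EGL_NONE
};
EGLContext context = eglCreateContext( display, config, EGL_NO_CONTEXT, contextAttribs );

// the GL ES version actually in use can then be checked at runtime
const GLubyte* glVersion = glGetString( GL_VERSION );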

I also checked all the GL/EGL calls for errors and there were none.
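
This is roughly the helper I use for those checks (the function name is just illustrative):

#include <cstdio>

void CheckGLError( const char* where )
{
    // drain the GL error queue and report anything found
    for ( GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError() )
        printf( "GL error 0x%04X at %s\n", err, where );
}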

Upvotes: 2

Views: 1329

Answers (2)

hanstar17

Reputation: 90

Quite an old post, but I solved this problem a long time ago.
As soon as I changed my OpenGL ES library file, everything worked fine.

It was very strange to me: I checked every single GL call with glGetError and nothing was detected, but as soon as glDrawElements was called, the program crashed. The problem went away once I changed the GLES library file.
I would like to say exactly which version I moved to, but I don't remember.

Hope this helps someone. :)

Upvotes: 0

Tim

Reputation: 35923

I'm not sure if this is your only problem, but you can't call glBindAttribLocation where you currently do.

glBindAttribLocation only takes effect the next time the program is linked after you call it. If you call it after linking, it does nothing.

Either bind your attributes before linking your shader, or use glGetAttribLocation to find the attribute locations after the program is linked.
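
For example (just a sketch; "a_position" stands for whatever your vertex shader actually calls the attribute):

// Option 1: bind the location yourself, *before* linking
glBindAttribLocation( program, 1, "a_position" );
glLinkProgram( program );

// Option 2: link first, then ask GL which location it picked
glLinkProgram( program );
GLint positionLoc = glGetAttribLocation( program, "a_position" );
glEnableVertexAttribArray( positionLoc );
glVertexAttribPointer( positionLoc, 3, GL_FLOAT, GL_FALSE, 0, 0 );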

Upvotes: 2
