aCuria

Reputation: 7205

OpenGL - problems on certain graphics cards

I have been running into issues with OpenGL rendering on different computers:

Works: Intel HD3000 (Sandy Bridge), ATI 6950, ATI 6970m, ATI 5670m, Quadro FX 2000

Does not work: Nvidia Mobility 9600 GT, Quadro FX 1800

When the line of code renderLines() is called, nothing appears on the screen on the graphics cards that "do not work". Without renderLines(), everything works as expected on all the graphics cards I have tested.

renderSprites() is very similar to renderLines(); the only difference is that it renders quads to the screen rather than lines.

void GraphicsEngineOGL3::update()
{
    this->renderSprites();
    this->renderLines(); // this is the offending line of code
    SDL_GL_SwapBuffers();
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    checkError();
}

void GraphicsEngineOGL3::renderLines()
{
    if(lineBuffer_.empty()) // note: lineBuffer_ is a std::vector<Vertex>
        return;

    glEnableClientState(GL_VERTEX_ARRAY);           // DEPRECATED in OGL 3.1
    glEnableClientState(GL_COLOR_ARRAY);

    // Note: glVertexPointer is deprecated, change to glVertexAttribPointer
    glVertexPointer(3, GL_FLOAT, sizeof(Vertex), &(lineBuffer_[0].x));  // DEPRECATED in OGL 3.1
    glColorPointer(4, GL_BYTE, sizeof(Vertex), &(lineBuffer_[0].r));

    glBindBuffer( GL_ARRAY_BUFFER, VBOs_[activeVBO_]);
    glBufferData( GL_ARRAY_BUFFER, lineBuffer_.size() * sizeof(Vertex), &(lineBuffer_[0]), GL_STREAM_DRAW);
    glDrawArrays( GL_LINES, 0, lineBuffer_.size());
    glBindBuffer( GL_ARRAY_BUFFER, 0); // Binding buffer object 0 switches off VBO operation.

    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_COLOR_ARRAY);

    lineBuffer_.clear();
    checkError();
}

Upvotes: 1

Views: 655

Answers (1)

Christian Rau

Reputation: 45948

At the moment you first set the vertex arrays to source their data from a RAM array (lineBuffer_), and only then bind a VBO and copy lineBuffer_'s data into it. This probably does not do what you want (though it is hard to say exactly what you intend there).

Always keep in mind that the gl...Pointer calls source their data from the currently bound GL_ARRAY_BUFFER, or from CPU RAM if none (i.e. buffer 0) is bound; glDrawArrays does not care about the currently bound VBO. So in your case the glBindBuffer call simply has no effect: your arrays source their data from the CPU array lineBuffer_ and not from the VBO. If you want them to use the VBO, you have to bind the buffer before the gl...Pointer calls, but in that case make sure the "address" you pass is actually only a byte offset into the buffer and not a real RAM address:

glBindBuffer( GL_ARRAY_BUFFER, VBOs_[activeVBO_]);
glBufferData( GL_ARRAY_BUFFER, lineBuffer_.size() * sizeof(Vertex), &(lineBuffer_[0]), GL_STREAM_DRAW);

glVertexPointer(3, GL_FLOAT, sizeof(Vertex), (const char*)0+offsetof(Vertex,x));  //use the current VBO
glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(Vertex), (const char*)0+offsetof(Vertex,r));

glBindBuffer( GL_ARRAY_BUFFER, 0);    //can already be unbound

glDrawArrays( GL_LINES, 0, lineBuffer_.size());

Note that I used GL_UNSIGNED_BYTE for the color data, which is more natural for colors than a signed type. With GL_BYTE your colors may even get transformed to [-1,1] instead of [0,1], and negative values are then clamped (not linearly remapped) to 0.

But if you really want the arrays to source their data from lineBuffer_ and not from the VBO (which I doubt), then the buffer function calls are unnecessary anyway and you can simply omit them.

Note that your code, though very strange and surely wrong, should nevertheless work, so I don't know if this really was the cause of the problem in the question; it definitely was a problem, though. If anything, I would suspect the use of GL_BYTE instead of GL_UNSIGNED_BYTE to confuse your colors or to be implemented oddly in some drivers, as it is not a commonly exercised path.

Upvotes: 4
