ArcaneEnforcer

Reputation: 124

OpenGL Draw Vertex Buffer Object

I have two std::vectors, one for indices and one for vertices, which I fill with push_back(). Then I do

glGenBuffers(1, &verticesbuffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, verticesbuffer);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, /*EDITED-->*/vertices.size() * sizeof(vertices[0])/*<--EDITED*/, &vertices[0], GL_STATIC_DRAW);

to create the buffers for each, and then attempt to draw the polygon with

glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);

glBindBuffer(GL_ARRAY_BUFFER, verticesbuffer);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indicesbuffer);

glDrawElements(
    GL_TRIANGLES,
    indices.size(),
    GL_UNSIGNED_INT,
    &indices[0]
    );

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

When I run the program, nothing shows up. I can get it to work using the glBegin()/glEnd() approach, but the indexed VBO just doesn't work (glGetError() also doesn't report any errors). I don't even know if this is remotely close to correct; I have searched through countless tutorials and other Stack Overflow questions and tried many different things to fix it. I should also mention that I called

glLoadIdentity();
glMatrixMode(GL_MODELVIEW);
glOrtho(0.0f, windowX, windowY, 0.0f, 0.0f, 1000.0f);

at the beginning of the program, and I have no idea whether that is correct either (as you can see, I am pretty new at this stuff).

Upvotes: 2

Views: 1551

Answers (3)

plasmacel

Reputation: 8530

You misunderstand how the sizeof operator works. It is evaluated at compile time and returns the size (in bytes) of the specified type or variable.

float f;
std::cout << sizeof(f); // prints 4
std::cout << sizeof(float); // prints 4

But what happens when we use sizeof on an array, or on a pointer to an array? Let's examine the following case:

float array1[50]; // static size array, allocated on the stack
float *array2 = new float[50]; // dynamic size array, allocated on the heap

std::cout << sizeof(array1); // prints 200, which is ok (50*4 == 200)
std::cout << sizeof(array2); // prints out the size of a float pointer, not the array

In the first case we use sizeof on a static array, which is allocated on the stack. Since the size of array1 is constant, the compiler knows it, and sizeof(array1) returns the actual size of the array in bytes.

In the second case we use sizeof on a dynamic array allocated on the heap. The size of array2 generally cannot be known at compile time (otherwise you could just use a static array, if it fits on the stack), so the compiler knows nothing about the size of the allocation, and sizeof(array2) returns the size of the pointer itself.

What happens when you use sizeof on std::vector?

std::vector<float> vec(50);
std::cout << sizeof(vec); // prints the size of the vector object itself (not 4*50)

But if sizeof(vec) returns the size of the vector, why doesn't it return 4*50? std::vector manages an underlying dynamically allocated array (the second case in the previous example), so the compiler doesn't know anything about the size of that underlying array. That's why sizeof(vec) returns the combined size of the vector object's encapsulated (hidden) members, including the pointer to the actual array data. If you want the number of elements in the underlying array, use vec.size(); to get the size of the underlying float array in bytes, use vec.size() * sizeof(float).
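To make the difference concrete, here is a minimal sketch (the exact value of sizeof(vec) is implementation-defined; 24 bytes is typical for a 64-bit build, and sizeof(float) is assumed to be 4):

#include <iostream>
#include <vector>

int main()
{
    std::vector<float> vec(50);
    std::cout << sizeof(vec) << "\n";                  // size of the vector object itself, e.g. 24
    std::cout << vec.size() * sizeof(vec[0]) << "\n";  // size of the element data in bytes: 50 * 4 == 200
}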

Fixing your code with the knowledge from above:

std::vector<float> vertices;
// ...add vertices with push_back()...
glBufferData(GL_ELEMENT_ARRAY_BUFFER, vertices.size() * sizeof(float), &vertices[0], GL_STATIC_DRAW);

or

std::vector<float> vertices;
// ...add vertices with push_back()...
glBufferData(GL_ELEMENT_ARRAY_BUFFER, vertices.size() * sizeof(vertices[0]), &vertices[0], GL_STATIC_DRAW);
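The same fix applies to the index buffer. A sketch, assuming the indices are stored in a std::vector<unsigned int> (matching the GL_UNSIGNED_INT type you pass to glDrawElements) and uploaded into the indicesbuffer from your question:

std::vector<unsigned int> indices;
// ...add indices with push_back()...
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indicesbuffer);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, indices.size() * sizeof(indices[0]), &indices[0], GL_STATIC_DRAW);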

Upvotes: 2

Tom Kulaga

Reputation: 138

In the future you can also use a graphics debugger to help with these issues. Depending on your card you can use AMD's GPU PerfStudio or NVIDIA Nsight on Windows, or the equivalent graphics debugger on Linux. This saves a lot of time and headaches.

If you get a blank screen again, run your app with the debugger attached and follow the pipeline.

You would see the data fed into the vertex shader; since it would be shorter than what you expected, that would flag the issue and you could start there.

Upvotes: 0

paddy

Reputation: 63451

The problem is that you expected sizeof(vertices) to give you the total number of bytes stored in the vector. However, it only gives the size of the vector object itself, not the dynamic data it contains.

Instead, you should use vertices.size() * sizeof(vertices[0]).
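Applied to your upload call (a sketch keeping the variable names and buffer target from your question):

glBufferData(GL_ELEMENT_ARRAY_BUFFER, vertices.size() * sizeof(vertices[0]), &vertices[0], GL_STATIC_DRAW);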

Upvotes: 3
