Reputation: 71
My best guess is that GLuint holds a pointer rather than the object itself, and hence it can "hold" any object, because it's actually just holding a pointer to a space in memory.
But if this is true, why do I not need to dereference anything when using these variables?
Upvotes: 3
Views: 267
Reputation: 473477
OpenGL object names are handles that reference OpenGL objects. They are not "pointers"; they are just unique identifiers that specify a particular object. For each object type, the OpenGL implementation keeps a map between object names and the actual internal object storage.
This dichotomy exists for historical reasons.
The very first OpenGL object type was the display list. You created one by calling the glNewList function with whatever integer name you wanted; the implementation would store the compiled list under that name. (glGenLists could reserve a contiguous range of unused names for you, but using it was optional.)
This is the foundational reason for the dichotomy: the user decides what the names are, and the implementation maps from each user-specified name to the implementation-defined data. The only limitation is that you can't have two objects with the same name at the same time.
The display list paradigm was modified slightly for the next OpenGL object type: textures. In the new paradigm, there is a function that lets the implementation create names for you: glGenTextures. But this function was optional. You could call glBindTexture on any integer you wanted, and the implementation would, at that moment, create a texture object that maps to that integer name.
As new object types were added, OpenGL kept the texture paradigm for them. They had glGen* functions, but those were optional, so the user could still specify whatever names they wanted.
Shader objects were a bit of a departure, as their glCreate* functions don't allow you to pick names. But they still used integers because... API consistency matters even when being inconsistent (note that the extension version of GLSL shader objects used pointers, but the core version decided not to).
Of course, core OpenGL did away with user-provided names entirely. But it couldn't get rid of integer object names as a concept without basically creating a new API. While core OpenGL is a compatibility break, it was designed such that, if you coded your pre-core OpenGL code "correctly", it would still work in core OpenGL. That is, core OpenGL code should also be valid compatibility OpenGL code.
And the path of least resistance for that was to not create a new API, even if it makes the API really silly.
Upvotes: 6