Reputation: 513
The line
glActiveTexture(GL_TEXTURE0+32);
generates GL_INVALID_ENUM (as I found by calling glGetError()), while
glActiveTexture(GL_TEXTURE0+31);
runs fine.
According to the documentation:
"GL_INVALID_ENUM is generated if texture is not one of GL_TEXTUREi, where i ranges from zero to the value of GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS minus one."
but in my case GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS is defined as 35661, and according to the same documentation
"The number of texture units is implementation dependent, but must be at least 80."
How can I solve this problem?
In case it matters, GL_TEXTURE0 is defined as 33984 and my OpenGL version is 2.1.
Upvotes: 2
Views: 2473
Reputation: 5157
GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS may be defined as 35661, but that's not what you want: that is just the value of an enum token, in the same way that GL_TEXTURE0 is defined as 33984. These token values say nothing about the actual limit.
You get the real number by querying it at runtime with
glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &texture_units);
which returns the number of texture units you can actually use. On OpenGL 2.1 hardware it is very likely to be 32.
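For illustration, a minimal sketch of that query in C (the helper name bind_to_unit is made up for this example, and it assumes your headers or extension loader already expose glActiveTexture, e.g. via GLEW):
#include <GL/glew.h>   /* assumption: GLEW (or any loader exposing glActiveTexture) is already initialized */
#include <stdio.h>

void bind_to_unit(GLuint texture, int unit)   /* hypothetical helper */
{
    GLint max_units = 0;
    /* Query the actual limit at runtime; typically 32 on OpenGL 2.1 hardware,
       not the enum token value 35661. */
    glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &max_units);

    if (unit < 0 || unit >= max_units) {
        fprintf(stderr, "texture unit %d is out of range (0..%d)\n", unit, max_units - 1);
        return;
    }

    glActiveTexture(GL_TEXTURE0 + unit);   /* valid because unit < max_units */
    glBindTexture(GL_TEXTURE_2D, texture);
}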
Upvotes: 5
Reputation: 162327
The documentation is a bit misleading. It means the value retrieved by
glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, ...)
not the numeric value of the token itself.
Upvotes: 4