Reputation: 1264
I'm putting together an emulator of the EXT_direct_state_access extension (spec full text: http://www.opengl.org/registry/specs/EXT/direct_state_access.txt) so that I can use the bindless programming model with OpenGL regardless of actual driver support.
Most of the extension is absolutely straightforward, but there are a few commands that make me scratch my head. Check out this fragment of the extension spec:
<OpenGL 1.2.1: New indexed texture commands and queries append
"Indexed" to name and add "uint index" parameter (to identify the
texture unit index) after state name parameters (if any) and before
state value parameters>
void EnableClientStateIndexedEXT(enum array, uint index);
void DisableClientStateIndexedEXT(enum array, uint index);
<OpenGL 3.0: New indexed texture commands and queries append "i"
to name and add "uint index" parameter (to identify the texture
unit index) after state name parameters (if any) and before state
value parameters>
void EnableClientStateiEXT(enum array, uint index);
void DisableClientStateiEXT(enum array, uint index);
<OpenGL 1.2.1: New indexed generic queries (added for indexed texture
state) append "Indexed" to name and add "uint index" parameter
(to identify the texture unit) after state name parameters (if any) and
before state value parameters>
void GetFloatIndexedvEXT(enum target, uint index, float *params);
void GetDoubleIndexedvEXT(enum target, uint index, double *params);
void GetPointerIndexedvEXT(enum target, uint index, void **params);
<OpenGL 3.0: New indexed generic queries (added for indexed texture
state) replace "v" for "i_v" to name and add "uint index" parameter
(to identify the texture unit) after state name parameters (if any)
and before state value parameters>
void GetFloati_vEXT(enum pname, uint index, float *params);
void GetDoublei_vEXT(enum pname, uint index, double *params);
void GetPointeri_vEXT(enum pname, uint index, void **params);
<OpenGL 1.2.1: Extend the functionality of these EXT_draw_buffers2
commands and queries for multitexture>
void EnableIndexedEXT(enum cap, uint index);
void DisableIndexedEXT(enum cap, uint index);
boolean IsEnabledIndexedEXT(enum target, uint index);
void GetIntegerIndexedvEXT(enum target, uint index, int *params);
void GetBooleanIndexedvEXT(enum target, uint index, boolean *params);
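If I read these correctly, the extended commands take a bare unit number rather than a GL_TEXTUREi selector. For instance, my own illustration (not from the spec) of the extended EXT_draw_buffers2 enable:

/* Enable 2D texturing on texture unit 1 directly, without touching
   the active texture selector. */
glEnableIndexedEXT(GL_TEXTURE_2D, 1);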
Perhaps I'm missing something very obvious, but I'd like to know:
Why is the texture unit index passed as GLuint and not GLenum (to handle GL_TEXTUREi values) like everywhere else where multitexturing is involved?
What per-texture-unit state is actually toggled or returned by these glEnable/DisableClientState() or glGet*() calls?
Upvotes: 1
Views: 581
Reputation: 1007
The short answer is "because of texture coordinate arrays". These have of course been removed from the core profile of more recent OpenGL versions, but they are still catered for by this extension.
I assume the indices were typed as GLuint rather than GLenum just to make the new functions more generic, in case they were ever used for other purposes.
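You can see the shift in convention side by side (a sketch assuming a current GL context with the extension's entry points loaded):

/* Classic multitexture: the unit is encoded as an enum, GL_TEXTURE0 + i. */
glClientActiveTexture(GL_TEXTURE0 + 2);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);

/* Indexed EXT entry point: the same unit is a bare GLuint. */
glEnableClientStateIndexedEXT(GL_TEXTURE_COORD_ARRAY, 2);

Both enable the texture coordinate array on unit 2, per the equivalence quoted below.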
Here's a relevant part from the extension specification you linked to:
"The following commands:
void EnableClientStateIndexedEXT(enum array, uint index);
void DisableClientStateIndexedEXT(enum array, uint index);
void EnableClientStateiEXT(enum array, uint index);
void DisableClientStateiEXT(enum array, uint index);
are equivalent (assuming no errors) to the following:
if (array == TEXTURE_COORD_ARRAY) {
    int savedClientActiveTexture;
    GetIntegerv(CLIENT_ACTIVE_TEXTURE, &savedClientActiveTexture);
    ClientActiveTexture(TEXTURE0 + index);
    XXX(array);
    ClientActiveTexture(savedClientActiveTexture);
} else {
    // Invalid enum
}
where index is the index parameter to the listed commands and XXX is
the name of the command without its "Indexed" or "i" suffix (either
EnableClientState or DisableClientState). The error INVALID_VALUE
is generated if index is greater than or equal to the implementation's
MAX_TEXTURE_COORDS value. The error INVALID_ENUM is generated if
array is not TEXTURE_COORD_ARRAY."
Upvotes: 2