paulscode

Reputation: 1069

Why does glGetIntegerv for GL_NUM_SHADER_BINARY_FORMATS generate GL_INVALID_ENUM?

I am using the official Android port of SDL 1.3 to set up the GLES2 renderer. It works on most devices, but for one user it does not. Log output shows the following error:

error of type 0x500 glGetIntegerv

I looked up 0x500, and it refers to GL_INVALID_ENUM. I've tracked the problem down to the following code inside the SDL library (the full source is quite large, so I've cut out the logging and basic error-checking lines; let me know if I haven't included enough information here):

/* Query how many binary shader formats the driver supports. */
glGetIntegerv( GL_NUM_SHADER_BINARY_FORMATS, &nFormats );
/* If a runtime shader compiler is available, count it as one more format. */
glGetBooleanv( GL_SHADER_COMPILER, &hasCompiler );
if( hasCompiler )
    ++nFormats;
/* Allocate storage for the format list and fetch it. */
rdata->shader_formats = (GLenum *) SDL_calloc( nFormats, sizeof( GLenum ) );
rdata->shader_format_count = nFormats;
glGetIntegerv( GL_SHADER_BINARY_FORMATS, (GLint *) rdata->shader_formats );

Immediately after the last line (the glGetIntegerv call for GL_SHADER_BINARY_FORMATS), glGetError() returns GL_INVALID_ENUM.
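For reference, here is a minimal sketch (not the actual SDL code) of the per-call error checking I used to narrow it down; the error queue is drained first, because glGetError() can return errors left over from earlier calls (printf is just a stand-in for whatever logging you use):

#include <stdio.h>

GLint nFormats = 0;
GLboolean hasCompiler = GL_FALSE;

/* drain stale errors so the next glGetError() reflects only the call below */
while( glGetError() != GL_NO_ERROR )
    ;

glGetIntegerv( GL_NUM_SHADER_BINARY_FORMATS, &nFormats );
printf( "GL_NUM_SHADER_BINARY_FORMATS: err=0x%x n=%d\n", (unsigned) glGetError(), nFormats );

glGetBooleanv( GL_SHADER_COMPILER, &hasCompiler );
printf( "GL_SHADER_COMPILER: err=0x%x has=%d\n", (unsigned) glGetError(), (int) hasCompiler );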

Upvotes: 2

Views: 2386

Answers (1)

the swine

Reputation: 11031

The problem is that the GL_ARB_ES2_compatibility extension is not properly supported on that user's system.

GL_INVALID_ENUM means the driver does not recognize the GL_NUM_SHADER_BINARY_FORMATS and GL_SHADER_BINARY_FORMATS enums, which are part of that extension.

In contrast, GL_SHADER_COMPILER was recognized, which is strange, since it belongs to the same extension.
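Before relying on those enums, it is worth checking whether the driver actually advertises the extension. A minimal sketch, assuming a desktop GL context where glGetString(GL_EXTENSIONS) returns a space-separated extension list (note the naive strstr match can false-positive on an extension whose name is a prefix of another):

#include <string.h>

/* sketch: returns nonzero if name appears in the extension string */
static int has_extension( const char *name )
{
    const char *ext = (const char *) glGetString( GL_EXTENSIONS );
    return ext != NULL && strstr( ext, name ) != NULL;
}

/* ... */
if( has_extension( "GL_ARB_ES2_compatibility" ) )
    glGetIntegerv( GL_NUM_SHADER_BINARY_FORMATS, &nFormats );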

You can try the GL_ARB_get_program_binary extension and use these two enums instead:

#define GL_NUM_PROGRAM_BINARY_FORMATS                                0x87fe
#define GL_PROGRAM_BINARY_FORMATS                                    0x87ff

Note that these are different from:

#define GL_SHADER_BINARY_FORMATS                                     0x8df8
#define GL_NUM_SHADER_BINARY_FORMATS                                 0x8df9

But they serve essentially the same purpose.
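A minimal sketch of the equivalent query path (assuming GL_ARB_get_program_binary is available and the two #defines above are in scope):

#include <stdlib.h>

GLint nFormats = 0;

glGetIntegerv( GL_NUM_PROGRAM_BINARY_FORMATS, &nFormats );
if( glGetError() == GL_NO_ERROR && nFormats > 0 ) {
    GLint *formats = (GLint *) malloc( nFormats * sizeof( GLint ) );
    glGetIntegerv( GL_PROGRAM_BINARY_FORMATS, formats );
    /* formats[0 .. nFormats-1] now hold the supported binary format tokens */
    free( formats );
}

Keep in mind that program binaries are retrieved from a linked program with glGetProgramBinary() rather than fed to glShaderBinary(), so the surrounding SDL code would need to change accordingly.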

Upvotes: 1
