Reputation: 159
I have code fully working under the graphics card's OpenGL driver.
However, when I switch to software OpenGL emulation using the GDI Generic OpenGL driver, I get a very strange error:
GL ERROR - Function glBindTexture(GL_TEXTURE_2D,1) generated error GL_INVALID_ENUM
The documentation of glBindTexture() says that GL_INVALID_ENUM can be returned only when the target is not an accepted enum. GL_TEXTURE_2D is a correct enum, however, and it works on the graphics card's OpenGL driver.
I'm sure that:
1) glBindTexture is generating that error - verified using the GLIntercept tracer with error logging.
2) The texture is allocated and has a size of 512 x 4.
3) The texture is assigned its data: glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, m_textureImage.width(), m_textureImage.height(), 0, GL_BGRA, GL_UNSIGNED_BYTE, m_textureImage.bits());
4) The OpenGL context does not change between assigning the texture data and actually using the texture, though the texture is unbound in between.
Are there other, undocumented reasons why it can return such an error? Any ideas how to track down the problem?
Upvotes: 0
Views: 3132
Reputation: 159
This has been identified as a problem in the GLIntercept image logger. The logger uses some OpenGL 1.2 enums that are not available in OpenGL 1.1.
The issue is being fixed in GLIntercept. Using GLIntercept without image logging should be safe with the GDI Generic OpenGL renderer.
Upvotes: 0
Reputation: 22358
glBindTexture(GL_TEXTURE_2D, 1): it's unusual to assign a fixed texture ID (1), as opposed to a value returned from glGenTextures. That said, an invalid value shouldn't return GL_INVALID_ENUM.
Are you binding the texture 'name' (ID) as a GL_TEXTURE_2D before assigning the texture data via glTexImage2D? Does your GL driver support non-power-of-two (NPOT) textures?
Finally, are you enabling texturing in the GL state with glEnable(GL_TEXTURE_2D)? Although I'm not convinced that would yield the error code you mention.
I don't really know. You might have done everything I've mentioned! I'm just trying to consider possible oversights.
Upvotes: 2