Reputation: 123
In my (C++/OpenGL) program, I am loading a set of textures and setting the texture parameters as follows:
//TEXTURES
glGenTextures(1, &texture1);
glBindTexture(GL_TEXTURE_2D, texture1);
// set the texture wrapping parameters
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
// set texture filtering parameters
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
I found out that anisotropic filtering would enhance the look of the scene. Therefore, I added this line to enable it:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY, 16);
While this line compiled without problems on my laptop (which has an AMD GPU), I cannot compile it on my other computer, which uses Intel(R) HD Graphics 530 (Skylake GT2). Specifically, compiling that piece of code with g++ outputs the following error:
error: ‘GL_TEXTURE_MAX_ANISOTROPY’ was not declared in this scope
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY, 16);
More specifically, running the following command in my Linux terminal:
glxinfo | grep -i opengl
reveals the following details about my GPU vendor and OpenGL support:
I understand that anisotropic filtering was introduced in the ARB_texture_filter_anisotropic
extension, but I honestly don't know how to check whether my GPU supports that extension, and, if it does, how do I enable anisotropic filtering?
BTW: I am using glfw3 and GLAD loader.
Upvotes: 1
Views: 6883
Reputation: 22328
The anisotropy value is a floating-point value, so use the variant of glTexParameter
with the f
suffix: e.g.,
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, value);
Where value
is a floating-point value. It's worth noting that although anisotropic filtering is technically not part of a GL standard, it can be considered a ubiquitous extension. That is, you can rely on its existence on all platforms that matter.
If you want to clamp to some maximum anisotropy available, try something like:
GLfloat value, max_anisotropy = 8.0f; /* don't exceed this value... */
glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &value);
value = (value > max_anisotropy) ? max_anisotropy : value;
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, value);
Upvotes: 6
Reputation: 45352
error: ‘GL_TEXTURE_MAX_ANISOTROPY’ was not declared in this scope
This GLenum
value was defined in GL_ARB_texture_filter_anisotropic
, which is also a core feature of OpenGL 4.6. It is not clear which mechanism for OpenGL extension handling you are using, or whether you use a particular GL loader library.
However, chances are that on your other system, the system-installed glext.h
, or some header of your loader like glew.h
or glad.h
or whatever you use, is not as recent as the one on the other system. As a result, this value is not defined.
In the case of anisotropic filtering, this is not a big issue, since GL_EXT_texture_filter_anisotropic
offers exactly the same functionality and has been around since the year 2000, so you can just switch to the constant GL_TEXTURE_MAX_ANISOTROPY_EXT
. The reason this extension took so long to be promoted to ARB
status and core GL functionality was a set of patents, which finally expired only recently.
Upvotes: 2