Reputation: 6594
I'm trying to set up an OpenGL 3.2 context on Lion. I've got this code to set up the window:
glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3);
glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 2);
glfwOpenWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
if (!glfwInit())
    return -1;
if (!glfwOpenWindow(640, 480, 8, 8, 8, 0, 0, 0, GLFW_WINDOW))
    return -1;
glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
// Get OpenGL details
int major, minor, rev;
glfwGetGLVersion(&major, &minor, &rev);
std::cout << "GL Version: " << major << "." << minor << "." << rev << std::endl;
The GL version comes out as 2.1, even though I'm running OS X Lion with an AMD Radeon 6750M, which apparently supports OpenGL 3.2. I also ran this bit of C code here, which reported 3.2. I'm using GLFW 2.7.8. Does anyone know what's going on?
Upvotes: 1
Views: 243
Reputation: 6594
It turns out I needed to call glfwInit() before the calls to glfwOpenWindowHint: hints set before GLFW is initialized don't take effect, so on OS X the window was created with the default legacy 2.1 context instead of the requested 3.2 core profile.
if (!glfwInit())
    return -1;
glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3);
glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 2);
glfwOpenWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
if (!glfwOpenWindow(500, 500, 8, 8, 8, 0, 0, 0, GLFW_WINDOW))
    return -1;
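For completeness, here is a minimal sketch of the full corrected program, assuming the GLFW 2.x header `<GL/glfw.h>` and linkage against GLFW 2.7.x on OS X (the compile flags will vary by setup). It re-checks the reported context version after the window is opened:

```c
/* Minimal sketch: init first, then hints, then open the window (GLFW 2.x API). */
#include <GL/glfw.h>
#include <stdio.h>

int main(void)
{
    /* Initialize GLFW before anything else; hints set earlier don't take effect. */
    if (!glfwInit())
        return -1;

    /* Request a 3.2 forward-compatible core profile context. */
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3);
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 2);
    glfwOpenWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
    glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    if (!glfwOpenWindow(500, 500, 8, 8, 8, 0, 0, 0, GLFW_WINDOW))
    {
        glfwTerminate();
        return -1;
    }

    /* Verify which context version was actually created. */
    int major, minor, rev;
    glfwGetGLVersion(&major, &minor, &rev);
    printf("GL Version: %d.%d.%d\n", major, minor, rev);

    glfwTerminate();
    return 0;
}
```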
Upvotes: 1