Jonathan Peck

Reputation: 1

GLEW crashing in Xcode

I'm trying to run a simple OpenGL program using GLFW (version 3.0.2) and GLEW (version 1.10.0) in Xcode (version 4.6.3) on OS X 10.8.4. The entire code is shown below.

#include <GLFW/glfw3.h>
#include <OpenGL/OpenGL.h>
#include <iostream>
using namespace std;

void RenderScene()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
}

void InitGL()
{
    glClearColor(1, 0, 0, 1);
}

void ErrorFunc(int code, const char *msg)
{
    cerr << "Error " << code << ": " << msg << endl;
}

int main(void)
{
    GLFWwindow* window;

    /* Report errors */
    glfwSetErrorCallback(ErrorFunc);

    /* Initialize the library */
    if (!glfwInit())
        return -1;

    /* Window hints */
    glfwWindowHint (GLFW_CONTEXT_VERSION_MAJOR,  3);
    glfwWindowHint (GLFW_CONTEXT_VERSION_MINOR,  2);
    glfwWindowHint (GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
    glfwWindowHint (GLFW_OPENGL_PROFILE,        GLFW_OPENGL_CORE_PROFILE);

    /* Create a windowed mode window and its OpenGL context */
    window = glfwCreateWindow(640, 480, "Hello World", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }

    /* Make the window's context current */
    glfwMakeContextCurrent(window);

    /* Initialize OpenGL */
    InitGL();

    /* Loop until the user closes the window */
    while (!glfwWindowShouldClose(window))
    {
        /* Render here */
        RenderScene();

        /* Swap front and back buffers */
        glfwSwapBuffers(window);

        /* Poll for and process events */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}

Most of this came straight from GLFW's documentation; only the rendering function and the GLEW initialization are mine. I have added the OpenGL, Cocoa and IOKit frameworks and linked against libGLEW.a and libglfw3.a. The program compiles successfully but crashes when it reaches functions GLEW was supposed to load. Here, the program crashes on glClearBufferfv. If I comment that out, I get a window with a black background. My guess is that GLEW is secretly not working, since it reports no errors but doesn't seem to be doing its job at all.

The exact error message Xcode throws at me is error: address doesn't contain a section that points to a section in a object file, with an error code of EXC_BAD_ACCESS. If I replace glClearBufferfv with glClearColor the program doesn't crash, but still has a black background when it should actually be red. When queried, OpenGL returns the version string 2.1 NVIDIA-8.12.47 310.40.00.05f01, which explains why calls to newer functions aren't working, but shouldn't GLEW have set up the correct OpenGL context? Moreover, GLFW's documentation says that they've been creating OpenGL 3+ contexts since GLFW 2.7.2. I really don't know what to do.

Upvotes: 0

Views: 1984

Answers (2)

TheAmateurProgrammer

Reputation: 9392

GLEW doesn't really work on Mac unless you enable the experimental flag. Set it after creating your context with GLFW, but before calling glewInit():

glewExperimental = GL_TRUE;

Edit:

You also need to request an OpenGL core context. With the GLFW 2.x API that was:

glfwOpenWindowHint( GLFW_OPENGL_VERSION_MAJOR, 3 );
glfwOpenWindowHint( GLFW_OPENGL_VERSION_MINOR, 2 );

In GLFW 3 the equivalent hints are GLFW_CONTEXT_VERSION_MAJOR and GLFW_CONTEXT_VERSION_MINOR, which the code in the question already sets.

Upvotes: 0

Andon M. Coleman

Reputation: 43319

glClearBuffer (...) is an OpenGL 3.0 function; it is not implemented in all versions of OS X (some only implement OpenGL 2.1). Because OS X does not load newer entry points as runtime extensions, GLEW is not going to fix this problem for you.

You will have to resort to the traditional method for clearing buffers in older versions of OS X (10.6 or older). This means setting the "clear color" and then clearing the color buffer as a two-step process. Instead of a single function call that can clear a specific buffer to a specific value, use this:

#define USE_GL3     // This code requires OpenGL 3.0; comment out if unavailable

void RenderScene()
{
    GLfloat color[] = {1.0f, 0.0f, 0.0f, 1.0f}; // RGBA: glClearBufferfv (GL_COLOR, ...) reads 4 floats

#ifdef USE_GL3      // Any system that implements OpenGL 3.0+
    glClearBufferfv (GL_COLOR, 0, color);
#else               // Any other system
    glClearColor    (color [0], color [1], color [2], color [3]);
    glClear         (GL_COLOR_BUFFER_BIT);
#endif
}
```

This is not ideal, however. There is no point in setting the clear color every frame. You should set the clear color once when you initialize the application and replace the ! USE_GL3 branch of the code with a single glClear (GL_COLOR_BUFFER_BIT); call.


Now, because you mentioned you are using Mac OS X 10.8, you can ignore a lot of what I wrote above. OS X 10.8 actually implements OpenGL 3.2 if you do things correctly.

You need two things for glClearBuffer (...) to work on OS X:

  1. Mac OS X 10.7+ (which you have)
  2. Tell glfw to create an OpenGL 3.2 core context

Before you create your window in glfw, add the following code:

glfwWindowHint (GLFW_CONTEXT_VERSION_MAJOR,  3);
glfwWindowHint (GLFW_CONTEXT_VERSION_MINOR,  2);
glfwWindowHint (GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
glfwWindowHint (GLFW_OPENGL_PROFILE,        GLFW_OPENGL_CORE_PROFILE);

Once you have an OpenGL 3.2 core context, you can also eliminate the whole ! USE_GL3 pre-processor branch from your code. This was a provision to allow your code to work on OS X implementations that do not support OpenGL 3.2.

Upvotes: 3
