atanamir

Reputation: 4953

Enabling OpenGL extensions

I'm trying to perform some integer operations (division and modulo) in my GLSL shader, but they don't seem to work, and I read that I need to enable EXT_gpu_shader4 to get integer operations. What I can't find, however, is how to do that. Is the line:

#version 330 core
#extension GL_EXT_gpu_shader4 : require

enough? Or do I need to enable it through the C API as well? At the moment I get a compilation error saying the extension is not supported. I'm on a GeForce 670, a fairly recent card.
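For reference, the #extension mechanism is meant for cases where the GLSL version itself lacks a feature. Here is a minimal sketch (the checkerboard math is purely illustrative, not from the original post) of a GLSL 1.20 fragment shader where the % operator genuinely requires the extension:

#version 120
#extension GL_EXT_gpu_shader4 : require

void main() {
    // Integer division exists in GLSL 1.20, but '%' on ints is
    // only available through GL_EXT_gpu_shader4 at this version.
    int x = int(gl_FragCoord.x);
    int checker = (x / 8) % 2;
    gl_FragColor = vec4(vec3(float(checker)), 1.0);
}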

Upvotes: 4

Views: 11401

Answers (1)

Nicol Bolas

Reputation: 473174

If you want to enable EXT_gpu_shader4, then yes, that line will do it.

However, you shouldn't be enabling EXT_gpu_shader4 at all. OpenGL 3.0 already incorporated everything in this extension into core functionality; in particular, integer division and the % operator are core GLSL from version 1.30 onward. There's no reason to enable an extension for functionality you already have thanks to your #version 330 declaration.
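To illustrate the point, here is the same checkerboard sketch as above, but written against #version 330: the integer operators work out of the box, with no #extension line anywhere:

#version 330 core

out vec4 fragColor;

void main() {
    // Integer division and '%' are core GLSL here;
    // no extension directive is needed.
    int x = int(gl_FragCoord.x);
    int checker = (x / 8) % 2;
    fragColor = vec4(vec3(float(checker)), 1.0);
}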

Upvotes: 7
