Thums

Reputation: 187

OpenGL 3.3 on Ubuntu 14.04 (Linux Mint 17.1) + Intel Graphics + LWJGL

I am trying to run OpenGL software built with LWJGL and GLSL 3.3, but I'm having trouble doing so under Linux Mint 17.1 with Intel Ivy Bridge (HD4000) graphics and Mesa 10.6.0-devel.

From what I've read, Mesa 10.1+ should support OpenGL and GLSL 3.3 on Sandy Bridge and more recent Intel CPUs.

glxinfo | grep OpenGL returns:

OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile x86/MMX/SSE2
OpenGL core profile version string: 3.3 (Core Profile) Mesa 10.6.0-devel (git-4348046 2015-05-02 trusty-oibaf-ppa)
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 10.6.0-devel (git-4348046 2015-05-02 trusty-oibaf-ppa)
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 10.6.0-devel (git-4348046 2015-05-02 trusty-oibaf-ppa)
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.00
OpenGL ES profile extensions:

(I'm guessing I need the plain "OpenGL shading language version string" to report 3.30, and not just the core profile one?)

When calling Display.create() (LWJGL) with no parameters I get the following error:

> 0:1(10): error: GLSL 3.30 is not supported. Supported versions are:
> 1.10, 1.20, 1.30, 1.00 ES, and 3.00 ES

and glGetString(GL_VERSION) returns:

> 3.0 Mesa 10.6.0-devel (git-4348046 2015-05-02 trusty-oibaf-ppa)

If I try to call Display.create() with core profile true, like so:

Display.create(new PixelFormat(), new ContextAttribs(3,3).withProfileCore(true));

I receive the following error:

0:11(6): error: operands of `==' must have the same type

and GL_VERSION is:

3.3 (Core Profile) Mesa 10.6.0-devel (git-4348046 2015-05-02 trusty-oibaf-ppa)

I'm not sure what this means or what I should do to be able to run OpenGL 3.3 on Intel integrated graphics. I'm positive this same code works on nVidia (4.4+ support).

Any help on this matter will be appreciated, thank you!

Edit: the shader that's causing the problem:

#version 330

in vec2 texCoord0;

uniform vec3 color;
uniform sampler2D sampler;

void main() {
    vec4 textureColor = texture2D(sampler, texCoord0.xy);

    if (textureColor == 0)
        gl_FragColor = vec4(color, 1);
    else
        gl_FragColor = textureColor * vec4(color, 1);
}

Comparing textureColor == vec4(0, 0, 0, 0) did work, sorry.

I can run it now, but I don't see any of the textures. I will try to track down the problem; is there anything obvious in my shader that could be causing it?

Upvotes: 0

Views: 1807

Answers (1)

derhass

Reputation: 45352

This one is quite obvious, isn't it?

vec4 textureColor = ...
[...]
if (textureColor == 0)

You can't compare a vec4 to a scalar. You could use something like if (textureColor == vec4(0)).

However, comparing floating-point values for equality is a bad idea in most cases. I would really recommend a different approach, like testing for length(textureColor) < 0.5/256.0.
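
For illustration, here is a sketch of how the fragment shader from the question might look with that kind of check in place. It also swaps the deprecated texture2D and gl_FragColor for texture() and a user-declared output, which strict core contexts may insist on; the fragColor name is just a placeholder.

#version 330 core

in vec2 texCoord0;

uniform vec3 color;
uniform sampler2D sampler;

// gl_FragColor is deprecated; declare an explicit output instead
out vec4 fragColor;

void main() {
    // texture() replaces the deprecated texture2D()
    vec4 textureColor = texture(sampler, texCoord0.xy);

    // near-zero test instead of exact floating-point equality
    if (length(textureColor) < 0.5 / 256.0)
        fragColor = vec4(color, 1.0);
    else
        fragColor = textureColor * vec4(color, 1.0);
}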

Upvotes: 0
