wan

Reputation: 190

How to recognise an Intel graphics card in a GLSL program?

On Nvidia cards, if you want a clip plane to be enabled, gl_ClipVertex must be assigned in the GLSL program. But on ATI cards, there is a problem if gl_ClipVertex is assigned.

For Nvidia/ATI compatibility, we write code like this:

// fix the clipping bug on both Nvidia and ATI
#ifdef __GLSL_CG_DATA_TYPES
    gl_ClipVertex = gl_ModelViewMatrix*gl_Vertex;
#endif

You can check this link for more information.

There is a problem. On Intel graphics cards (e.g. HD Graphics 3000), gl_ClipVertex must be assigned too; if it is not, the clip plane is useless. But as we know, __GLSL_CG_DATA_TYPES is only defined on Nvidia systems, so the gl_ClipVertex line is skipped on Intel. Now it seems hard to write a GLSL program that works correctly on Nvidia, ATI, and Intel cards alike.

Is there something like __GLSL_CG_DATA_TYPES that can identify an Intel graphics card in a GLSL program?

Upvotes: 4

Views: 1114

Answers (2)

D-rk

Reputation: 5919

As suggested by Nicol, you will have to detect the hardware outside the shader and pass in a define. You can pass additional source strings, holding the defines, when compiling the shader:

// prepend an extra source string holding the define; the NULL lengths
// argument means all strings are null-terminated
const char *vertSrcs[2] = { "#define INTEL\n", vShaderSrc };
glShaderSource(vId, 2, vertSrcs, NULL);

Something like this should work.
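For completeness, here is a minimal sketch of how the detection and the define might fit together. The compileVertexShader helper is hypothetical, and the substring test against GL_VENDOR is illustrative only, since vendor strings are not standardized:

#include <string.h>

// Hypothetical helper: choose an extra define based on the GL_VENDOR
// string and pass it as a second source string, as in the snippet above.
GLuint compileVertexShader(const char *vShaderSrc)
{
    const char *vendor = (const char *) glGetString(GL_VENDOR);
    const char *define = "";
    if (vendor && strstr(vendor, "Intel"))
        define = "#define INTEL\n";

    GLuint vId = glCreateShader(GL_VERTEX_SHADER);
    const char *vertSrcs[2] = { define, vShaderSrc };
    glShaderSource(vId, 2, vertSrcs, NULL);
    glCompileShader(vId);
    return vId;
}

One caveat: if vShaderSrc begins with a #version directive, a string prepended this way lands before it, which is strictly invalid GLSL (many drivers tolerate it); Nicol's answer below covers inserting the define after the #version line instead.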

Upvotes: 1

Nicol Bolas

Reputation: 474436

I assume that you're talking about a bug workaround. Well, the only real way to work around this is to #define __GLSL_CG_DATA_TYPES yourself from outside the shader (i.e., by inserting the string into the shader). Better yet, create your own #define that you insert into the shader string, after the #version declaration.

How you go about inserting the string into the shader is up to you. You could do some simple parsing of GLSL, finding the first non-comment line after the #version directive and doing the insertion there.

From outside the shader, you'll have to use the GL_RENDERER and GL_VENDOR strings to detect whether you should provide the #define or not.
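A minimal sketch of that insertion, assuming a hypothetical insertDefineAfterVersion helper and skipping the comment-aware parsing mentioned above (it simply splices the define in on the line after #version, or at the start if no directive is found):

#include <stdlib.h>
#include <string.h>

// Hypothetical helper: return a newly allocated copy of src with the
// define spliced in on the line following the #version directive.
char *insertDefineAfterVersion(const char *src, const char *define)
{
    const char *insertAt = src;
    const char *version = strstr(src, "#version");
    if (version) {
        const char *eol = strchr(version, '\n');
        insertAt = eol ? eol + 1 : src + strlen(src);
    }
    size_t head = (size_t)(insertAt - src);
    char *out = (char *) malloc(strlen(src) + strlen(define) + 1);
    if (!out)
        return NULL;
    memcpy(out, src, head);     // everything up to and including the #version line
    strcpy(out + head, define); // the injected #define line
    strcat(out, src + head);    // the rest of the original source
    return out;                 // caller frees
}

The decision itself could then hinge on something like whether the GL_VENDOR string contains "Intel", as in the other answer, with the same caveat that vendor strings vary between drivers.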

Upvotes: 4
