decden

Reputation: 719

Wrong integer math in WebGL shaders

I am using WebGL and custom shaders. In the fragment shader I need to do some precise integer math in order to index a tiled texture.

However, for some values I seem to get rounding errors, e.g. on the iPad 4. Are integer operations implemented in terms of floats?

As a proof of concept, I tried the following shader:

const int eleven = int(11.0);

// Should return 0.0 if integer division is exact (11 / 11 == 1),
// and 1.0 if it is not.
highp float f(const int nr11)
{
    if (int(nr11 / nr11) != 1)
        return 1.0;
    return 0.0;
}

void main()
{
    // Black (0.0) where integer math is exact, white (1.0) otherwise.
    gl_FragColor = vec4(f(eleven));
}

Here is a runnable link.

The shader produces a black background on my desktop, but a white one on my iPad. Can somebody explain what's going on?

Thanks

Upvotes: 1

Views: 521

Answers (1)

newprogrammer

Reputation: 2604

That's correct: WebGL uses GLSL ES 1.00 shaders, and that specification does not require implementations to support true integers, so `int` arithmetic may be emulated with floating point.

You can read about it on page 19 of the specification: https://www.khronos.org/registry/gles/specs/2.0/GLSL_ES_Specification_1.0.17.pdf

For the time being, you are not guaranteed true integer support in WebGL.
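As a sketch of how float-emulated integers can go wrong (an assumption about the driver, not confirmed behavior of the iPad's GPU): hardware often implements `a / b` as `a * (1.0 / b)`, and GLSL ES 1.00 only guarantees a 10-bit mantissa for `mediump` floats. This JavaScript model quantizes the reciprocal to that precision to show `11 / 11` landing just below 1, which truncates to 0:

```javascript
// Quantize x to a float with a 10-bit mantissa, the minimum relative
// precision GLSL ES 1.00 requires of mediump (simplified model).
function toMediump(x) {
  if (x === 0) return 0;
  const exp = Math.floor(Math.log2(Math.abs(x)));
  const scale = Math.pow(2, exp - 10); // 10 fractional mantissa bits
  return Math.round(x / scale) * scale;
}

// Model a GPU computing 11 / 11 as 11 * (1 / 11), with the reciprocal
// rounded to mediump precision.
const recip = toMediump(1 / 11);
const quotient = 11 * recip;

console.log(quotient);             // 0.99969482421875 (just below 1)
console.log(Math.trunc(quotient)); // 0 — int(nr11 / nr11) != 1
```

This is why a common workaround is to add a small bias (e.g. `+ 0.5`) before truncating a value that should already be integral.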

Upvotes: 1
