fuzic

Reputation: 2522

Three.js immediateRenderCallback with MeshShaderMaterial throws "index out of range" warnings

I have a Three.js scene containing objects which have an immediateRenderCallback method. I have also created a custom shader and I am trying to use MeshShaderMaterial(myShader).

The shader material works without warnings on basic Three.js objects. Regular materials work fine on my custom immediateRenderCallback objects.

The shader material throws warnings when I use it on my custom immediateRenderCallback objects. These are the warnings:

    WebGL: INVALID_VALUE: enableVertexAttribArray: index out of range
    WebGL: INVALID_VALUE: vertexAttribPointer: index out of range

Despite these warnings, everything seems to work: the objects appear and the shader behaves correctly. I do not understand why the warnings appear or whether they can be safely ignored.

Here is my immediateRenderCallback function:

THREE.Segment.prototype.immediateRenderCallback = function ( program, _gl, _frustum )
{
    if ( ! this.__webglPositionNormalBuffer ) this.__webglPositionNormalBuffer = _gl.createBuffer();
    if ( ! this.__webglStripBuffer ) this.__webglStripBuffer = _gl.createBuffer();

    _gl.bindBuffer( _gl.ARRAY_BUFFER, this.__webglPositionNormalBuffer );
    _gl.bufferData( _gl.ARRAY_BUFFER, this.interleavedData, _gl.STATIC_DRAW );
    _gl.enableVertexAttribArray( program.attributes.position );
    _gl.enableVertexAttribArray( program.attributes.normal );
    _gl.vertexAttribPointer( program.attributes.position, 3, _gl.FLOAT, false, 24, 0 );
    _gl.vertexAttribPointer( program.attributes.normal, 3, _gl.FLOAT, false, 24, 12 );

    _gl.bindBuffer( _gl.ELEMENT_ARRAY_BUFFER, this.__webglStripBuffer );
    _gl.bufferData( _gl.ELEMENT_ARRAY_BUFFER, this.stripData, _gl.STATIC_DRAW );

    for ( var i = 0; i < this.stripOffsets.length; i += 2 )
    {
        var size = this.stripOffsets[ i + 1 ];
        var offset = this.stripOffsets[ i ] * 2;
        _gl.drawElements( _gl.TRIANGLE_STRIP, size, _gl.UNSIGNED_SHORT, offset );
    }
};

Here is my shader:

'depthPacked': {
    uniforms: {},

    vertexShader: [
        "void main() {",
            "vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );",
            "gl_Position = projectionMatrix * mvPosition;",
        "}"
    ].join("\n"),

    fragmentShader: [
        "vec4 pack_depth( const in highp float depth ) {",
            "const highp vec4 bit_shift = vec4( 256.0, 256.0*256.0, 256.0*256.0*256.0, 0.0 );",
            "float power = floor(log2(depth));",
            "float mantissa = (power + 127.0) / 256.0;",
            "vec4 res = (depth/exp2(power)) * bit_shift;",
            "res = fract(floor(res) / 256.0);",
            "res.w = mantissa;",
            "return res;",
        "}",
        "void main() {",
            "gl_FragData[0] = pack_depth( gl_FragCoord.z );",
        "}"
    ].join("\n")
}

Thanks!

Upvotes: 2

Views: 2172

Answers (1)

fuzic

Reputation: 2522

I've figured this out from a related problem here: WebGL: glsl attributes issue, getProgramParameter returns wrong number of attributes

My custom immediateRenderCallback assigns values to the attributes position and normal, which are default attributes within Three.js. However, because my vertex shader never uses the normal attribute, the GLSL compiler optimizes the declaration out of the compiled program. getAttribLocation then returns -1 for normal, and passing that invalid location to enableVertexAttribArray / vertexAttribPointer is what triggers the "index out of range" warnings: there is no normal attribute left to assign data to!
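One defensive way to handle this in the callback (a sketch, not part of Three.js; safeVertexAttrib is a hypothetical helper) is to skip any attribute whose location came back as -1:

```javascript
// Hypothetical helper: enable and point an attribute only if the compiled
// shader actually kept it, i.e. its location is valid (not -1/undefined).
function safeVertexAttrib( gl, location, size, stride, offset )
{
    if ( location === undefined || location < 0 ) return false; // optimized out
    gl.enableVertexAttribArray( location );
    gl.vertexAttribPointer( location, size, gl.FLOAT, false, stride, offset );
    return true;
}
```

The callback would then call, e.g., `safeVertexAttrib( _gl, program.attributes.normal, 3, 24, 12 )` instead of enabling the attribute unconditionally, and the warnings would disappear without touching the shader.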

Because I'm using two different shaders (in two passes) on the same geometry, and the other (Phong) shader requires normal information, I kept the normal attribute alive in the depth shader by assigning it to a varying in the vertex shader. That dummy use prevents the compiler from optimizing the attribute out.
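For reference, the adjusted depth vertex shader ended up along these lines (a sketch in the same string-array style; vNormal is just a dummy varying and the fragment shader never has to read it):

```javascript
// Sketch of the adjusted 'depthPacked' vertex shader: routing the default
// 'normal' attribute through a varying stops the GLSL compiler from
// optimizing the attribute away, which silences the warnings.
var depthPackedVertexShader = [
    "varying vec3 vNormal;",
    "void main() {",
        "vNormal = normal;", // dummy use; keeps the attribute in the compiled program
        "vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );",
        "gl_Position = projectionMatrix * mvPosition;",
    "}"
].join("\n");
```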

Upvotes: 2
