vals

Reputation: 64194

Handling colors in WebGL

I have a WebGL example that works fine locally, but I can't make it work in a fiddle.

It works OK with this modified shader:

precision mediump float;
varying vec4 vColor;
void main(void) {
/*
    gl_FragColor = vColor;
*/
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}

So it's clear that something is wrong with the colors.

I set them with this:

gl.bindBuffer(gl.ARRAY_BUFFER, triangleVertexColorBuffer);
gl.vertexAttribPointer(shaderProgram.vertexColorAttribute, triangleVertexColorBuffer.itemSize, gl.FLOAT, false, 0, 0);

And previously I have set the buffer this way:

triangleVertexColorBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, triangleVertexColorBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(colors), gl.STATIC_DRAW);

triangleVertexColorBuffer.itemSize = 4;
triangleVertexColorBuffer.numItems = colors.length / 4;

colors is a standard JavaScript array.
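For reference, the layout that itemSize = 4 implies is one RGBA quadruple per vertex. A minimal sketch of such an array (the values are illustrative, not the ones from the fiddle):

```javascript
// Hypothetical colors array for a single triangle:
// one RGBA quadruple (4 floats) per vertex, 3 vertices = 12 floats.
const colors = [
  1.0, 0.0, 0.0, 1.0, // vertex 1: opaque red
  0.0, 1.0, 0.0, 1.0, // vertex 2: opaque green
  0.0, 0.0, 1.0, 1.0  // vertex 3: opaque blue
];
const data = new Float32Array(colors);     // what bufferData receives
const itemSize = 4;                        // components per vertex
const numItems = colors.length / itemSize; // 3 vertices
```

If the array length is not a multiple of 4, or the number of quadruples does not match the vertex count of the position buffer, the draw call reads garbage instead of the intended colors.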

This is the fiddle

Besides knowing what is going wrong here, I would like to know if there is some technique to track what is happening inside WebGL, to check where the problem is.

Upvotes: 0

Views: 256

Answers (2)

virtualnobi

Reputation: 1180

To track what is going on, there's the WebGL Inspector Chrome extension, from http://benvanik.github.io/WebGL-Inspector/. It will not really show what's going on, only what's there: it's not a debugger, but it shows the state of the WebGL data at every moment.
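Another lightweight technique is to wrap the context so that every call is immediately followed by a gl.getError() check, which is the idea behind the Khronos webgl-debug.js helper. A minimal sketch (makeDebugContext here is an illustrative re-implementation, not the real library):

```javascript
// Minimal sketch: wrap every function on a GL context so each call is
// immediately followed by a getError() check and a console report.
function makeDebugContext(gl) {
  const wrapper = {};
  for (const name in gl) {    // in browsers this also visits prototype methods
    const prop = gl[name];
    if (typeof prop === "function") {
      wrapper[name] = function (...args) {
        const result = prop.apply(gl, args); // forward the call to the real context
        const err = gl.getError();
        if (err !== gl.NO_ERROR) {
          console.error("GL error " + err + " after gl." + name + "()");
        }
        return result;
      };
    } else {
      wrapper[name] = prop;   // copy constants like gl.FLOAT unchanged
    }
  }
  return wrapper;
}
```

Rendering through the wrapper (const gl = makeDebugContext(canvas.getContext("webgl"));) reports the first failing call by name instead of leaving you with a silent black canvas.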

Upvotes: 2

vals

Reputation: 64194

Finally I have found the problem.

Corrected fiddle:

fiddle

The problem was here:

shaderProgram.vertexColorAttribute = gl.getAttribLocation(shaderProgram, "aVertexColor");
gl.enableVertexAttribArray(shaderProgram.vertexColorAttribute);
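A frequent failure in exactly this spot is a silent name mismatch: gl.getAttribLocation returns -1 when "aVertexColor" is not an active attribute of the linked program (wrong name, or optimized out because the shader never uses it), and passing -1 onward just does nothing. A hypothetical guard (initColorAttribute is an illustrative name) makes that loud:

```javascript
// Hypothetical helper: look up the color attribute and fail loudly
// instead of silently passing -1 to later GL calls.
function initColorAttribute(gl, shaderProgram) {
  const loc = gl.getAttribLocation(shaderProgram, "aVertexColor");
  if (loc === -1) {
    // -1 means the name matches no active attribute of the program
    throw new Error("aVertexColor not found in shader program");
  }
  gl.enableVertexAttribArray(loc);
  return loc;
}
```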

Upvotes: 0
