Reputation: 1
I have a very basic question. I am new to WebGL and am trying to draw a simple square. I am using the glMatrix library for matrix manipulation.
Javascript Code:
squareVertexPositionBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, squareVertexPositionBuffer);
vertices = [
0.9, 0.9, 0.0,1.0,
-0.9, 0.9, 0.0,1.0,
0.9, -0.9, 0.0,1.0,
-0.9, -0.9, 0.0,1.0
];
squareVertexPositionBuffer.itemSize = 4;
squareVertexPositionBuffer.numItems = 4;
mat4.identity(pMatrix);
mat4.perspective(45, gl.viewportWidth / gl.viewportHeight, 0.1, 100.0, pMatrix);
mat4.identity(mvMatrix);
mat4.translate(mvMatrix, [-1.5, 0.0, -7.0]);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);
gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, squareVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);
setMatrixUniforms();
gl.drawArrays(gl.TRIANGLE_STRIP, 0, squareVertexPositionBuffer.numItems);
Shader:
attribute vec3 aVertexPosition;
uniform mat4 uMVMatrix;
uniform mat4 uPMatrix;
varying vec3 debug;
void main(void) {
gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition.xyz, 1.0);
debug = aVertexPosition;
}
This seems to work out fine. Here I am passing the model-view and perspective matrices as uniforms to the shader program and multiplying them with the vertex coordinates there. But if I multiply the model-view and perspective matrices in the JavaScript and then pass the transformed vertices to the shader, it doesn't seem to work.
I'm not able to spot the mistake. Help highly appreciated!
Upvotes: 0
Views: 2320
Reputation: 2023
The issues outlined in John's answer are worth looking into, but if you're getting the primitive to render properly now then I don't think they are the root cause of this problem.
I've split my answer into two segments since I'm not sure which scenario applies to you:
You didn't show how you are constructing the JavaScript version of the MVP, but double check that your multiplication syntax is correct. It should be of the form:
mat4.multiply(p, mv, mvp);
where p is the projection matrix, mv is the modelview matrix, and mvp is the receiving modelview projection matrix (where the result will be stored).
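As a sanity check on the argument order, here is a hand-rolled sketch of what that composition does under glMatrix's column-major storage. Note that mat4multiply below is a made-up helper that mirrors the mat4.multiply(a, b, dest) argument order; it is not the library function itself:

```javascript
// Minimal stand-in for mat4.multiply(a, b, dest): computes a * b.
// Assumes glMatrix's column-major layout: element (row, col) lives at m[col * 4 + row].
function mat4multiply(a, b, out) {
  out = out || new Array(16);
  for (let col = 0; col < 4; col++) {
    for (let row = 0; row < 4; row++) {
      let sum = 0;
      for (let k = 0; k < 4; k++) {
        sum += a[k * 4 + row] * b[col * 4 + k];
      }
      out[col * 4 + row] = sum;
    }
  }
  return out;
}

// Identity "projection" and a translation by (-1.5, 0, -7), as in the question.
const p  = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  0,    0, 0,  1];
const mv = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  -1.5, 0, -7, 1];

const mvp = mat4multiply(p, mv);
console.log(mvp[12], mvp[13], mvp[14]); // translation column survives: -1.5 0 -7
```

With a real projection matrix in place of the identity, getting the order wrong (mv * p instead of p * mv) produces a matrix that silently puts everything in the wrong place, which matches the "nothing renders" symptom described.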
Similarly, if you are premultiplying the vertex positions themselves, the correct syntax is:
mat4.multiplyVec4(mvp, vertex, transformedVertex);
where mvp is the modelview projection matrix, vertex is the unmodified vertex (a vec4), and transformedVertex is a receiving vec4 where the values will be stored. If you use this approach, you'll have to flatten the array of vertices into a single contiguous array of floats before sending them to the GPU.
Be careful not to confuse mat4.multiplyVec3 with mat4.multiplyVec4. The fourth component of vertex should be set to 1 prior to multiplication; this way, the fourth component of transformedVertex can be used by the GPU to perform perspective division (this happens automatically, but you need to send the correct values).
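To make the role of the fourth component concrete, here is a small hypothetical sketch in plain JavaScript (not the glMatrix API) that transforms a vec4 the way mat4.multiplyVec4 would, using a toy projection-like matrix whose bottom row copies -z into w:

```javascript
// Hand-rolled equivalent of mat4.multiplyVec4(m, v): column-major 4x4 times vec4.
function multiplyVec4(m, v) {
  const out = [0, 0, 0, 0];
  for (let row = 0; row < 4; row++) {
    out[row] = m[row]      * v[0] +  // column 0
               m[4 + row]  * v[1] +  // column 1
               m[8 + row]  * v[2] +  // column 2
               m[12 + row] * v[3];   // column 3
  }
  return out;
}

// Toy projection-like matrix: its bottom row is (0, 0, -1, 0), so the
// output w becomes -z -- the same pattern a real perspective matrix uses.
const proj = [
  1, 0, 0,  0,   // column 0
  0, 1, 0,  0,   // column 1
  0, 0, 1, -1,   // column 2 (w row picks up -z)
  0, 0, 0,  0,   // column 3
];

const vertex = [0.9, 0.9, -7.0, 1.0]; // w must be 1 before multiplying
const clip = multiplyVec4(proj, vertex);
console.log(clip); // [ 0.9, 0.9, -7, 7 ] -- the GPU divides x, y, z by w = 7
```

If w had been left at 0 instead of 1, the translation column of a modelview matrix would be ignored entirely, which is a classic source of "my premultiplied vertices don't render" bugs.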
Note also that if you send vec3 attributes down to the GPU by mistake and the attribute is actually of type vec4, the fourth component defaults to 1.0. This is often convenient, but it can also lead to tricky bugs if you're premultiplying these values in JavaScript. As John says, try to keep the shader definitions consistent with the JavaScript side; this will lead to less confusion in the long run.
Upvotes: 1
Reputation: 3575
Right off the bat I notice a couple issues:
1) Your vertex shader input attribute does not match the parameters passed to vertexAttribPointer:
You specify that the attribute is 4 floats wide but your vertex shader specifies it as a vec3. These should be consistent.
2) I don't see a call to enableVertexAttribArray, something like:
gl.enableVertexAttribArray(shaderProgram.vertexPositionAttribute);
You must enable the vertex attribute arrays that your vertex shader uses.
3) From the code shown, you are only specifying a model matrix (identity, then translate). You need to create a view matrix as well. An easy way to do that is to call mat4.lookAt, which has the following definition:
mat4.lookAt = function (eye, center, up, dest)
eye is the position of the camera, center is the position the camera is looking at, and up should generally be (0, 1, 0).
An example would be:
eye = (0, 0, 1), center = (0, 0, 0)
That puts the camera 1 unit down the positive Z axis and has it looking at the origin.
Multiplying this with your model matrix will give you a modelview matrix.
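To illustrate, here is a rough hand-rolled version of the math mat4.lookAt performs (column-major layout assumed; normalize, cross, and lookAt below are illustrative helpers written for this sketch, not the glMatrix functions):

```javascript
// Helpers for the lookAt sketch.
function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return [v[0] / len, v[1] / len, v[2] / len];
}
function cross(a, b) {
  return [a[1] * b[2] - a[2] * b[1],
          a[2] * b[0] - a[0] * b[2],
          a[0] * b[1] - a[1] * b[0]];
}

// Build a view matrix from eye, center, up (column-major, like glMatrix).
function lookAt(eye, center, up) {
  const z = normalize([eye[0] - center[0], eye[1] - center[1], eye[2] - center[2]]);
  const x = normalize(cross(up, z));
  const y = cross(z, x);
  // Rotation basis in the first three columns, plus a translation
  // that moves the eye to the origin.
  return [
    x[0], y[0], z[0], 0,
    x[1], y[1], z[1], 0,
    x[2], y[2], z[2], 0,
    -(x[0] * eye[0] + x[1] * eye[1] + x[2] * eye[2]),
    -(y[0] * eye[0] + y[1] * eye[1] + y[2] * eye[2]),
    -(z[0] * eye[0] + z[1] * eye[1] + z[2] * eye[2]),
    1,
  ];
}

const view = lookAt([0, 0, 1], [0, 0, 0], [0, 1, 0]);
console.log(view[14]); // -1: the origin ends up 1 unit in front of the camera
```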
So much code is missing that it is difficult to determine whether your mistake lies somewhere else. For example, what does your fragment shader do?
When trying to get your first primitive on the screen, I recommend disabling culling and depth testing. Also vary your clear color over time in case your fragment output matches the clear color.
John
Upvotes: 2