Reputation: 2281
I have a WebGL renderer and I want to transform arbitrary world coordinates to screen coordinates in the fragment shader.
That would let me compare the computed screen coordinates against the current fragment's screen coordinates.
I believe I have all the information needed to do this; I'm just not sure of the right way to do the transform.
...
// available information:
// uCanvasWidth (float)
// uCanvasHeight (float)
// modelMatrix (mat4)
// modelViewMatrix (mat4)
// projectionMatrix (mat4)
vec4 someWorldCoordinates = vec4(1., 2., 3., 1.);
// map it to screen coordinates?
vec4 screenCoordinates = ???
// compare to current fragment location
if(screenCoordinates.x > gl_FragCoord.x){
// do stuff
}
...
Upvotes: 2
Views: 3817
Reputation: 8123
vec4 worldSpace = vec4(1., 2., 3., 1.);
// get homogeneous clip space coordinates
vec4 clipSpace = projectionMatrix * (modelViewMatrix * worldSpace);
// apply the perspective divide to get normalized device coordinates
vec3 ndc = clipSpace.xyz / clipSpace.w;
// do the viewport transform
vec2 screenSpace = (ndc.xy * .5 + .5) * vec2(uCanvasWidth, uCanvasHeight);
// note: gl_FragCoord has a bottom-left origin, so screenSpace is already
// directly comparable to it; only flip y if you need top-left (CSS-style)
// coordinates instead:
// screenSpace.y = uCanvasHeight - screenSpace.y;
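Put together, a minimal sketch of how this could look inside a fragment shader, assuming the matrices listed in the question are available there as uniforms (worldToScreen is a hypothetical helper name):

precision highp float;

uniform float uCanvasWidth;
uniform float uCanvasHeight;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;

// project a world-space point to window coordinates
// (bottom-left origin, matching gl_FragCoord)
vec2 worldToScreen(vec4 worldSpace) {
    vec4 clipSpace = projectionMatrix * (modelViewMatrix * worldSpace);
    vec3 ndc = clipSpace.xyz / clipSpace.w;
    return (ndc.xy * .5 + .5) * vec2(uCanvasWidth, uCanvasHeight);
}

void main() {
    vec2 screenCoordinates = worldToScreen(vec4(1., 2., 3., 1.));
    if (screenCoordinates.x > gl_FragCoord.x) {
        // do stuff
        gl_FragColor = vec4(1., 0., 0., 1.);
    } else {
        gl_FragColor = vec4(0., 0., 0., 1.);
    }
}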
Note that your modelViewMatrix likely contains the transforms of the model you're rendering. I would assume the points specified in the shader are meant to be already in "final" world space (i.e. not relative to the model you're rendering), so you might want to use just the view matrix here (preferably a premultiplied viewProjection matrix).
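If you go the premultiplied route, a short sketch of that variant (uViewProjectionMatrix is a hypothetical uniform name; you'd compute projection * view on the CPU and upload it yourself):

uniform mat4 uViewProjectionMatrix; // projection * view, uploaded from JS
...
vec4 worldSpace = vec4(1., 2., 3., 1.);
// a single matrix multiply per point instead of two
vec4 clipSpace = uViewProjectionMatrix * worldSpace;
// perspective divide and viewport transform proceed as above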
Upvotes: 2