SquareFeet

Reputation: 641

Projecting FBO value to screen-space to read from depth texture

EDIT: Updated the JSFiddle link as it wasn't rendering correctly in Chrome on Windows 7.

Context

I'm playing around with particles in THREE.JS and using a frame buffer / render target (double buffered) to write positions to a texture. This texture is affected by its own ShaderMaterial, and then read by the PointCloud's ShaderMaterial to position the particles. All well and good so far; everything works as expected.
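
For anyone unfamiliar with the technique, the setup looks roughly like this. It's a minimal sketch: the target size, the filter settings, and names like `simulationMaterial` are my own, not necessarily what the fiddle uses.

    // Two float render targets for the position texture, ping-ponged
    // each frame: read from one, write to the other, then swap.
    function makePositionTarget( size ) {
        return new THREE.WebGLRenderTarget( size, size, {
            minFilter: THREE.NearestFilter,
            magFilter: THREE.NearestFilter,
            format: THREE.RGBAFormat,
            type: THREE.FloatType // positions need float precision
        } );
    }

    var size = 256,
        positionTargetA = makePositionTarget( size ),
        positionTargetB = makePositionTarget( size );

    function stepSimulation() {
        // The simulation material reads the previous frame's positions...
        // (newer THREE.js versions would read positionTargetA.texture here)
        simulationMaterial.uniforms.tPositions.value = positionTargetA;

        // ...and writes the new positions into the other target.
        renderer.render( simulationScene, simulationCamera, positionTargetB );

        // Swap, so the PointCloud's draw material reads the fresh data.
        var temp = positionTargetA;
        positionTargetA = positionTargetB;
        positionTargetB = temp;
    }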

What I'm trying to do now is use my scene's depth texture to see if any of the particles are intersecting my scene's geometry.

The first thing I did was to reference my depth texture in the PointCloud's fragment shader, using gl_FragCoord.xy / screenResolution.xy to generate my uv for the depth texture lookup.

There's a JSFiddle of this here. It's working well - when a particle is behind something in the scene, I tell the particle to be rendered red, not white.
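
For reference, the relevant part of the PointCloud's fragment shader looks something like this. It's a sketch: the uniform names `tDepth` and `screenResolution` are assumptions, and I'm assuming the scene depth is readable from the red channel rather than packed into RGBA.

    uniform sampler2D tDepth;
    uniform vec2 screenResolution;

    void main() {
        // gl_FragCoord is in window coordinates, so dividing by the
        // resolution gives a [0, 1] UV into the depth texture.
        vec2 depthUV = gl_FragCoord.xy / screenResolution.xy;
        float sceneDepth = texture2D( tDepth, depthUV ).r;

        // gl_FragCoord.z holds this fragment's own depth; a larger
        // value means the particle is further away than the scene.
        if ( gl_FragCoord.z > sceneDepth ) {
            gl_FragColor = vec4( 1.0, 0.0, 0.0, 1.0 ); // occluded: red
        } else {
            gl_FragColor = vec4( 1.0 );                // visible: white
        }
    }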

My issue arises when I try to do the same depth comparison in the position texture shader. In the draw fragment shader I can use gl_FragCoord to get the particle's position in screen space and use that for the depth UV lookup, because the draw vertex shader uses the modelViewMatrix and projectionMatrix to set gl_Position.
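
In other words, the draw vertex shader does something along these lines. This is a sketch: `tPositions` and the per-particle `reference` attribute (the UV at which a particle's position lives in the texture) are assumptions about how the lookup is wired up.

    uniform sampler2D tPositions;

    // Per-particle attribute: where this particle's position is
    // stored in the position texture.
    attribute vec2 reference;

    void main() {
        vec3 pos = texture2D( tPositions, reference ).xyz;

        // modelViewMatrix and projectionMatrix are declared and set by
        // THREE.js for a material attached to a scene object, so this
        // gl_Position (and hence gl_FragCoord) is in the right space.
        gl_Position = projectionMatrix * modelViewMatrix * vec4( pos, 1.0 );
        gl_PointSize = 2.0;
    }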

I've tried doing this in the position fragment shader, but to no avail. By the way, what I'm aiming to do with this is particle collision with the scene on the GPU.

So... the question (finally!): how do I correctly project a particle's position, as read from the FBO texture, into screen space within the position shader, so I can use it as a UV for the depth texture lookup?

Help! And thanks in advance.

Upvotes: 1

Views: 780

Answers (1)

SquareFeet

Reputation: 641

After a lot of experimentation and research, I narrowed the issue down to the values of modelViewMatrix and projectionMatrix that THREE.js automatically assigns when one creates an instance of THREE.ShaderMaterial.

What I wanted to do worked absolutely fine in my 'draw' shaders, where THREE.js sets the modelViewMatrix to:

new THREE.Matrix4().multiplyMatrices( camera.matrixWorldInverse, object.matrixWorld)

It appears that when one creates a ShaderMaterial to render values to a texture (and thus not attached to an object in the scene/world), the object.matrixWorld is essentially an identity matrix. What I needed to do was to make my position texture shaders have the same modelViewMatrix value as my draw shaders (which were attached to an object in the scene/world).
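
In practice that meant computing the matrix myself and passing it in as a plain uniform. Something like this (a sketch, with hypothetical names `pointCloud`, `simulationMaterial`, `uModelViewMatrix`, and `uProjectionMatrix`):

    // Give the position (simulation) shader the same modelViewMatrix
    // the draw shaders get, i.e. the one derived from the PointCloud.
    var simulationModelViewMatrix = new THREE.Matrix4();

    function updateSimulationUniforms() {
        // Make sure the camera's world-space inverse is current before
        // composing the matrix.
        camera.updateMatrixWorld();
        camera.matrixWorldInverse.getInverse( camera.matrixWorld );

        simulationModelViewMatrix.multiplyMatrices(
            camera.matrixWorldInverse,
            pointCloud.matrixWorld
        );

        simulationMaterial.uniforms.uModelViewMatrix.value = simulationModelViewMatrix;
        simulationMaterial.uniforms.uProjectionMatrix.value = camera.projectionMatrix;
    }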

Once that was in place, the only other thing to do was make sure I was transforming a particle's position to screen-space correctly. I wrote some helper functions in GLSL to do this:

    // Transform a world-space coordinate to a clip-space coordinate.
    // Note that `mvpMatrix` here is `projectionMatrix * modelViewMatrix`.
    vec4 worldToClip( vec3 v, mat4 mvpMatrix ) {
        return mvpMatrix * vec4( v, 1.0 );
    }

    // Perform the perspective divide, scaled by 0.5 so the visible
    // range runs from -0.5 to 0.5 instead of the usual -1.0 to 1.0.
    vec3 clipToScreen( vec4 v ) {
        return v.xyz / ( v.w * 2.0 );
    }

    // Shift a screen-space coordinate from [-0.5, 0.5] to [0.0, 1.0]
    // for use as a texture UV lookup.
    vec2 screenToUV( vec2 v ) {
        return v + 0.5;
    }
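
Chained together in the position fragment shader, the lookup ends up looking something like this. The uniform and varying names are my own, and I'm again assuming the scene depth is readable from the red channel and that the default [0, 1] depth range is in use.

    uniform sampler2D tPositions;
    uniform sampler2D tDepth;
    uniform mat4 uModelViewMatrix;
    uniform mat4 uProjectionMatrix;

    varying vec2 vUv; // this particle's texel in the position texture

    void main() {
        vec3 pos = texture2D( tPositions, vUv ).xyz;

        vec4 clipPos = worldToClip( pos, uProjectionMatrix * uModelViewMatrix );
        vec2 depthUV = screenToUV( clipToScreen( clipPos ).xy );

        // Window-space depth of the particle, to match what the
        // scene's depth texture stores.
        float particleDepth = clipPos.z / clipPos.w * 0.5 + 0.5;
        float sceneDepth    = texture2D( tDepth, depthUV ).r;

        if ( particleDepth > sceneDepth ) {
            // Collision: respond here (this example just moves the
            // particle up, as in the fiddle).
            pos.y = 70.0;
        }

        gl_FragColor = vec4( pos, 1.0 );
    }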

I've made a JSFiddle to show this in action, here. I've commented it (probably too much), so hopefully it explains what's going on well enough for people who aren't familiar with this kind of thing.

Quick note about the fiddle: it doesn't look all that impressive, since all I'm doing is emulating what depthTest: true would do were that property set on the PointCloud. The difference is that I'm setting the y position of particles that have collided with scene geometry to 70.0, which is the white band near the top of the rendering screen. Eventually I'll do this calculation in a velocity texture shader, so I can do proper collision response.

Hope this helps someone :)

EDIT: Here's a version of this implemented with a (possibly buggy) collision response.

Upvotes: 0
