Reputation: 43457
If I have a shader that discards (or otherwise makes transparent) portions of a mesh, this (understandably) does not affect the behavior of raycasting. It should be possible to sample the Z buffer to obtain raycast positions, though of course there would be side effects, such as no longer being able to get any data about which object was "found".
Basically, if we can do a "normal" raycast and then check it against the Z buffer, we can comb through the complete set of raycast intersections to find the one that really corresponds to the visible thing we clicked on...
It's unclear if it is possible to sample the Z buffer with three.js. Is it possible at all with WebGL?
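Roughly, the idea I have in mind (assuming a normalized depth value could somehow be read back at the cursor; the names below are placeholders, not an existing API) would be something like:

//sketch of the idea: `depth` is a Z-buffer value in [0, 1] sampled at the cursor
var ndc = new THREE.Vector3(
    ( mouse.x / window.innerWidth ) * 2 - 1,
    - ( mouse.y / window.innerHeight ) * 2 + 1,
    depth * 2 - 1                        //Z-buffer [0,1] -> NDC z [-1,1]
);
var visiblePoint = ndc.unproject( camera ); //world-space point the Z buffer "sees"

var hits = raycaster.intersectObjects( scene.children, true );
for ( var i = 0; i < hits.length; i ++ ) {
    //keep the first intersection that matches the visible surface
    if ( hits[ i ].point.distanceTo( visiblePoint ) < epsilon ) {
        var visibleHit = hits[ i ];
        break;
    }
}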
Upvotes: 0
Views: 891
Reputation: 104823
No, Raycaster cannot sample the depth buffer.
However, you can use another technique referred to as "GPU-Picking".
By rendering the scene off-screen with a unique color per object and reading back the pixel under the mouse, you can figure out which object was selected. You can use a pattern like this one:
//render the picking scene off-screen
renderer.render( pickingScene, camera, pickingTexture );
//create buffer for reading single pixel
var pixelBuffer = new Uint8Array( 4 );
//read the pixel under the mouse from the render target (note the flipped Y)
renderer.readRenderTargetPixels( pickingTexture, mouse.x, pickingTexture.height - mouse.y, 1, 1, pixelBuffer );
//interpret the pixel as an ID
var id = ( pixelBuffer[0] << 16 ) | ( pixelBuffer[1] << 8 ) | ( pixelBuffer[2] );
//look up the original object from the ID
var data = pickingData[ id ];
//render the visible scene to the screen as usual
renderer.render( scene, camera );
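The pattern above assumes that pickingScene, pickingTexture, and pickingData already exist. A minimal sketch of that setup, following the pattern in the linked examples (names are placeholders), could look like this:

//off-screen render target that the picking pass renders into
var pickingTexture = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );

//separate scene holding one flat-colored clone per pickable mesh
var pickingScene = new THREE.Scene();
var pickingData = {};

var nextId = 1;
scene.traverse( function ( object ) {
    if ( object instanceof THREE.Mesh ) {
        //encode the ID as an RGB color (matches the decoding above)
        var pickingMaterial = new THREE.MeshBasicMaterial( { color: new THREE.Color( nextId ) } );
        var pickingMesh = new THREE.Mesh( object.geometry, pickingMaterial );
        pickingMesh.position.copy( object.position );
        pickingMesh.rotation.copy( object.rotation );
        pickingMesh.scale.copy( object.scale );
        pickingScene.add( pickingMesh );
        pickingData[ nextId ] = object; //map the ID back to the original object
        nextId ++;
    }
} );

Note that mouse.x and mouse.y in the snippet are window-relative pixel coordinates (with Y flipped when reading the pixel), not the normalized device coordinates that Raycaster uses.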
See these three.js examples:
http://threejs.org/examples/webgl_interactive_cubes_gpu.html
http://threejs.org/examples/webgl_interactive_instances_gpu.html
three.js r.84
Upvotes: 2