mjanisz1

Reputation: 1516

Mesh led by noise

Just looking at http://lab.samsy.ninja/ intro experiment and wondering how it is done.

It must be instanced geometry driven by noise computed via GPGPU.

But after fiddling a bit I got a headache wondering: how do you get the position for each vertex of a mesh from a simulated noise texture? It would be easy for particles, where each particle maps to a single pixel of the texture, but how do you achieve it for more complex geometry?

Maybe it is possible to read each pixel from such a simulated texture outside of the shader? Render it to a canvas, traverse each pixel, and assign it to the corresponding instance?

Any hints are more than welcome.

Upvotes: 0

Views: 1055

Answers (1)

M -

Reputation: 28502

This is done via GPGPU in two render passes per frame.

  1. First, they render a plane with a custom fragment shader that generates RGB noise. Instead of drawing to the canvas, it is rendered into a THREE.WebGLRenderTarget, so the result is stored in a texture that can be accessed via WebGLRenderTarget.texture.
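A minimal sketch of what the step-1 shader could look like. This uses a cheap hash-based noise rather than the demo's actual noise function, and all names here are illustrative assumptions, not the original code. In three.js you would put this fragment shader on a full-screen quad in its own scene, then render that scene with renderer.setRenderTarget(target) followed by renderer.render(noiseScene, noiseCamera) and renderer.setRenderTarget(null).

```javascript
// Hypothetical step-1 fragment shader (simple hash noise, NOT the original
// demo's noise). Each texel gets a pseudo-random RGB value that step 2 will
// later reinterpret as an XYZ offset.
const noiseFragmentShader = /* glsl */ `
  uniform float time;
  varying vec2 vUv;

  // Cheap per-channel hash noise (illustrative only)
  float hash(vec2 p) {
    return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
  }

  void main() {
    float r = hash(vUv + time);
    float g = hash(vUv * 2.0 + time);
    float b = hash(vUv * 3.0 + time);
    gl_FragColor = vec4(r, g, b, 1.0); // RGB read later as XYZ displacement
  }
`;
```

The only important property is that the shader writes into the render target rather than the screen; any noise function (including the psrdnoise mentioned below) can be swapped in.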

  2. Then they generated an instanced geometry with the colored bars you see in a second scene, passing the stored texture from step 1 to the material as a uniform, e.g. material.uniforms.noiseMap.value = target.texture;. In the vertex shader, you can read the RGB channels of each texel and assign them to the XYZ position of each instance.
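A sketch of how step 2 could be wired, under the assumption that each instance samples exactly one texel of the simulation texture via a per-instance UV attribute. The names instanceUV, refUV, and noiseMap are illustrative, not taken from the original demo:

```javascript
// Each instance reads one texel of the simulation texture. A per-instance
// "reference UV" attribute (an InstancedBufferAttribute in three.js) tells
// the vertex shader which texel belongs to which instance.
// simSize is the simulation texture's width/height in pixels.
function instanceUV(index, simSize) {
  const col = index % simSize;
  const row = Math.floor(index / simSize);
  // Sample texel centers: (col + 0.5) / size, (row + 0.5) / size
  return [(col + 0.5) / simSize, (row + 0.5) / simSize];
}

// Hypothetical vertex shader: sample the noise texture at the instance's
// reference UV and use the RGB value as an XYZ offset.
const vertexShader = /* glsl */ `
  uniform sampler2D noiseMap;   // render-target texture from step 1
  attribute vec2 refUV;         // per-instance texel coordinate
  void main() {
    vec3 offset = texture2D(noiseMap, refUV).rgb; // RGB -> XYZ displacement
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position + offset, 1.0);
  }
`;
```

Because the whole mesh of one instance shares the same refUV, every vertex of that instance gets the same offset, which is what moves a complex shape as a unit instead of per-pixel particles.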

Here's a similar example on the Three.js site where you can scan through the code: https://threejs.org/examples/?q=gpgpu#webgl_gpgpu_water

And here's a link to the standard GLSL noise generation functions: https://github.com/ashima/webgl-noise/tree/master/src — I think they used psrdnoise2D.glsl for this type of movement.

Upvotes: 1
