Reputation: 147
I have been trying to write a simple shader that adds noise, but I can't get the UV coordinates to work with the following setup.
Fragment Shader:
uniform float seed;
uniform sampler2D pass;
varying vec2 vUv;
void main() {
    // noise
    vec2 pos = gl_FragCoord.xy;
    pos.x *= seed;
    pos.y *= seed;
    float lum = fract(sin(dot(pos, vec2(12.9898, 78.233))) * 434658.5453116487577816842168767168087910388737310);
    vec4 tx = texture2D(pass, vUv);
    gl_FragColor = vec4(tx.r * lum, tx.g * lum, tx.b * lum, 1.0);
}
Vertex Shader:
varying vec2 vUv;
void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
Rendering:
OBJECT.material = OBJECT.mat.flat;  // a new THREE.MeshPhongMaterial({ color: 0xE40D59, shading: THREE.FlatShading })
RENDERER.render(SCENE, CAMERA, BEAUTY_PASS, false);
OBJECT.material = OBJECT.mat.noise; // a THREE.ShaderMaterial using the shaders above
RENDERER.render(SCENE, CAMERA);
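For reference, BEAUTY_PASS is a THREE.WebGLRenderTarget and the noise material is a THREE.ShaderMaterial wired up roughly like this (paraphrased, not my exact code; the sizes and the way the shader source is loaded are just placeholders):

var BEAUTY_PASS = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);

OBJECT.mat.noise = new THREE.ShaderMaterial({
    uniforms: {
        seed: { type: 'f', value: Math.random() },  // scales gl_FragCoord in the fragment shader
        pass: { type: 't', value: BEAUTY_PASS }     // first-pass render target, sampled with vUv
    },
    vertexShader: document.getElementById('noiseVert').textContent,
    fragmentShader: document.getElementById('noiseFrag').textContent
});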
I get the following error:
Error: WebGL: DrawElements: bound vertex attribute buffers do not have sufficient size for given indices from the bound element array @ http://threejs.org/build/three.min.js:439
I did some tests, and it does run if I pick the same coordinate for every pixel:
vec4 tx = texture2D(pass, vec2(0.5, 0.5));
which displays my object with a reddish, noisy color. However, the vUv varying works perfectly fine if I remove the first render pass (RENDERER.render(SCENE, CAMERA, BEAUTY_PASS, false)).
Why can't I get the UV coordinates on the second render? According to several examples, I should be able to render it using the same scene and camera, like in this example.
Upvotes: 0
Views: 873
Reputation: 104783
Without an initial texture, the geometry will not have the necessary baked-in WebGL UV buffers.
There are several solutions, but perhaps the easiest is to make sure that the material used for the first rendering of the mesh has a texture. A simple white one will do.
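For example, something like this (a sketch against the r58-era API; the 1x1 canvas is just one convenient way to get a plain white texture, any map will do):

// A 1x1 white canvas texture. Its only job is to make the Phong material
// sample a map, which forces three.js to create and upload the geometry's
// UV attribute buffer during the first (render-target) pass.
var canvas = document.createElement('canvas');
canvas.width = canvas.height = 1;
var ctx = canvas.getContext('2d');
ctx.fillStyle = '#ffffff';
ctx.fillRect(0, 0, 1, 1);

var whiteTexture = new THREE.Texture(canvas);
whiteTexture.needsUpdate = true;

OBJECT.mat.flat = new THREE.MeshPhongMaterial({
    color: 0xE40D59,
    shading: THREE.FlatShading,
    map: whiteTexture   // the presence of a map is what matters; it stays plain white
});

With that in place, the UV buffer already exists by the time the ShaderMaterial pass runs, and vUv should behave as expected.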
three.js r.58
Upvotes: 1