Reputation: 2114
After messing around with this demo of Three.js rendering a scene to a texture, I successfully replicated the essence of it in my project: in my main scene there's now a sphere, and a secondary scene is drawn onto it via a THREE.WebGLRenderTarget buffer.
I don't really need a sphere, though, and that's where I've hit a huge brick wall. When trying to map the buffer onto my simple custom mesh, I get an infinite stream of the following errors:
three.js:23444 WebGL: INVALID_VALUE: pixelStorei: invalid parameter for alignment
three.js:23557 Uncaught TypeError: Cannot read property 'width' of undefined
My geometry, approximating an annular shape, is created using this code. I've successfully UV-mapped a canvas onto it by passing {map: new THREE.Texture(canvas)} into the material options, but if I use {map: myWebGLRenderTarget} I get the errors above.
A cursory look through the call stack suggests that three.js assumes the presence of a texture.image attribute on myWebGLRenderTarget and attempts to call clampToMaxSize on it.
Is this a bug in three.js, or am I simply doing something wrong? Since I only need flat rendering (with MeshBasicMaterial), one of the first things I did when adapting the render-to-texture demo above was to remove all traces of the shaders, and it worked fine with just the sphere. Do I need those shaders back in order to use UV mapping with a custom mesh?
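For reference, here is a minimal sketch of the setup being described, with the scene/camera/geometry names (rtScene, rtCamera, annulusGeometry) as placeholders for whatever your project uses. Note that in recent three.js releases the sampleable texture is exposed as renderTarget.texture, and that (rather than the render target object itself) is what the map option expects:

```javascript
// Off-screen buffer that the secondary scene is rendered into.
const renderTarget = new THREE.WebGLRenderTarget(512, 512);

// Flat-shaded material sampling the render target's texture.
// Passing the render target itself as `map` is what triggers the
// texture.image / clampToMaxSize errors described above.
const material = new THREE.MeshBasicMaterial({ map: renderTarget.texture });
const mesh = new THREE.Mesh(annulusGeometry, material);
scene.add(mesh);

function animate() {
  requestAnimationFrame(animate);
  // First draw the secondary scene into the buffer...
  renderer.setRenderTarget(renderTarget);
  renderer.render(rtScene, rtCamera);
  // ...then draw the main scene (including the mapped mesh) to screen.
  renderer.setRenderTarget(null);
  renderer.render(scene, camera);
}
animate();
```

This relies only on MeshBasicMaterial and the mesh's UVs, with no custom shaders involved.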
Upvotes: 3
Views: 542
Reputation: 2114
For what it's worth, I was needlessly setting needsUpdate = true on my texture. (The handling of needsUpdate apparently assumes the texture is backed by a <canvas> whose pixels need re-uploading.)
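A before/after sketch of the fix, assuming renderTarget is the THREE.WebGLRenderTarget in question:

```javascript
// A render-target texture already lives on the GPU; there is no image or
// canvas for three.js to re-upload, so needsUpdate must be left alone.
const material = new THREE.MeshBasicMaterial({ map: renderTarget.texture });

// material.map.needsUpdate = true;  // <-- this line was the culprit; remove it
```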
Upvotes: 2