posit labs

Reputation: 9471

threejs fragment shader using recycled frame buffers

I'm trying to make an app that will simulate long exposure photography. The idea is that I grab the current frame from the webcam and composite it onto a canvas. Over time, the photo will 'expose', getting brighter and brighter. (see http://www.chromeexperiments.com/detail/light-paint-live-mercury/?f=)

I have a shader that works perfectly. It's just like the 'add' blend mode in Photoshop. The problem is that I can't get it to recycle the previous frame.

I thought that it would be something simple like renderer.autoClear = false; but that option seems to do nothing in this context.

Here's the code that uses THREE.EffectComposer to apply the shader.

        onWebcamInit: function () {    
            var $stream = $("#user-stream"),
                width = $stream.width(),
                height = $stream.height(),
                near = .1,
                far = 10000;

            this.renderer = new THREE.WebGLRenderer();
            this.renderer.setSize(width, height);
            this.renderer.autoClear = false;
            this.scene = new THREE.Scene();

            this.camera = new THREE.OrthographicCamera(width / -2, width / 2, height / 2, height / -2, near, far);
            this.scene.add(this.camera);

            this.$el.append(this.renderer.domElement);

            this.frameTexture = new THREE.Texture(document.querySelector("#webcam"));
            this.compositeTexture = new THREE.Texture(this.renderer.domElement);

            this.composer = new THREE.EffectComposer(this.renderer);

            // same effect with or without this line
            // this.composer.addPass(new THREE.RenderPass(this.scene, this.camera));

            var addEffect = new THREE.ShaderPass(addShader);
            addEffect.uniforms[ 'exposure' ].value = .5;
            addEffect.uniforms[ 'frameTexture' ].value = this.frameTexture;
            addEffect.renderToScreen = true;
            this.composer.addPass(addEffect);

            this.plane = new THREE.Mesh(new THREE.PlaneGeometry(width, height, 1, 1), new THREE.MeshBasicMaterial({map: this.compositeTexture}));
            this.scene.add(this.plane);

            this.frameTexture.needsUpdate = true;
            this.compositeTexture.needsUpdate = true;

            new FrameImpulse(this.renderFrame);

        },
        renderFrame: function () {
            this.frameTexture.needsUpdate = true;
            this.compositeTexture.needsUpdate = true;
            this.composer.render();
        }

Here is the shader. Nothing fancy.

        uniforms: {
            "tDiffuse": { type: "t", value: null },
            "frameTexture": { type: "t", value: null },
            "exposure": { type: "f", value: 1.0 }
        },

        vertexShader: [
            "varying vec2 vUv;",
            "void main() {",
            "vUv = uv;",
            "gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",

            "}"
        ].join("\n"),

        fragmentShader: [

            "uniform sampler2D frameTexture;",
            "uniform sampler2D tDiffuse;",
            "uniform float exposure;",
            "varying vec2 vUv;",

            "void main() {",
            "vec4 n = texture2D(frameTexture, vUv);",
            "vec4 o = texture2D(tDiffuse, vUv);",
            "vec3 sum = n.rgb + o.rgb;",
            "gl_FragColor = vec4(mix(o.rgb, sum.rgb, exposure), 1.0);",
            "}"

        ].join("\n")

Upvotes: 4

Views: 2011

Answers (3)

16807

Reputation: 1525

This is in essence equivalent to posit labs' answer, but I've had success with a more streamlined solution: I create an EffectComposer with only the ShaderPass I want recycled, then swap that composer's renderTargets with each render.

Initialization:

THREE.EffectComposer.prototype.swapTargets = function() {
    var tmp = this.renderTarget2;
    this.renderTarget2 = this.renderTarget1;
    this.renderTarget1 = tmp;
};

...

composer = new THREE.EffectComposer(renderer,  
    new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat })
);

var addEffect = new THREE.ShaderPass(addShader, 'frameTexture');
addEffect.renderToScreen = true;
composer.addPass(addEffect);

render:

composer.render();
composer.swapTargets();

A secondary EffectComposer can then take one of the two renderTargets and push it to the screen or transform it further.
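For reference, a minimal sketch of what that secondary composer could look like, assuming the TexturePass and CopyShader helpers from the three.js examples (they are not part of the original snippet above):

    // Sketch only: a second composer that takes one of the ping-ponged targets
    // and pushes it to the screen. Further transform passes could be inserted
    // between the TexturePass and the final copy.
    var displayComposer = new THREE.EffectComposer(renderer);

    var texturePass = new THREE.TexturePass(composer.renderTarget2);
    displayComposer.addPass(texturePass);

    var copyPass = new THREE.ShaderPass(THREE.CopyShader);
    copyPass.renderToScreen = true;
    displayComposer.addPass(copyPass);

    // each frame, after composer.render() and composer.swapTargets():
    texturePass.uniforms['tDiffuse'].value = composer.renderTarget2; // or renderTarget1, depending on the swap
    displayComposer.render();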

Also note that I pass "frameTexture" as the textureID when initializing the ShaderPass. This tells the ShaderPass to update the frameTexture uniform with the result of the previous pass.

Upvotes: 2

mrdoob

Reputation: 19602

Try with this:

this.renderer = new THREE.WebGLRenderer( { preserveDrawingBuffer: true } );
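A minimal sketch of how this could slot into the setup from the question (assuming the compositeTexture wrapping renderer.domElement stays as the feedback source, and the per-frame lines stay as in renderFrame):

    // Sketch only: preserveDrawingBuffer keeps the canvas contents between
    // frames, so the compositeTexture that wraps renderer.domElement can be
    // re-read each frame as the previous exposure.
    this.renderer = new THREE.WebGLRenderer({ preserveDrawingBuffer: true });
    this.renderer.autoClear = false;

    // per frame:
    this.frameTexture.needsUpdate = true;      // pull in the new webcam frame
    this.compositeTexture.needsUpdate = true;  // re-read the preserved canvas
    this.composer.render();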

Upvotes: 0

posit labs

Reputation: 9471

To achieve this kind of feedback effect, you have to alternate between writing to two separate instances of WebGLRenderTarget; otherwise the frame buffer gets overwritten before it can be read back. I'm not totally sure why this happens (presumably WebGL won't let you sample a texture that is also the current render target), but here is the solution.

init:

    this.rt1 = new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat });
    this.rt2 = new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat });

render:

    this.renderer.render(this.scene, this.camera);                  // draw the composite to the screen
    this.renderer.render(this.scene, this.camera, this.rt1, false); // and save it into rt1

    // swap buffers so rt2 always holds the most recent composite
    var a = this.rt2;
    this.rt2 = this.rt1;
    this.rt1 = a;
    this.shaders.add.uniforms.tDiffuse.value = this.rt2;            // feed it back into the add shader
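One detail these fragments leave implicit is where the add shader lives. Here is a sketch of one way to wire it up, using a ShaderMaterial on the question's plane instead of the EffectComposer pass (not necessarily the exact setup behind this answer):

    // Sketch only: run the add shader directly as the plane's material, with
    // tDiffuse pointing at the previous accumulation (rt2) and frameTexture at
    // the live webcam texture. The render/swap steps above then feed each new
    // composite back in on the next frame.
    var addMaterial = new THREE.ShaderMaterial({
        uniforms: THREE.UniformsUtils.clone(addShader.uniforms),
        vertexShader: addShader.vertexShader,
        fragmentShader: addShader.fragmentShader
    });
    addMaterial.uniforms.exposure.value = 0.5;
    addMaterial.uniforms.frameTexture.value = this.frameTexture; // live webcam frame
    addMaterial.uniforms.tDiffuse.value = this.rt2;              // previous accumulation

    this.plane = new THREE.Mesh(new THREE.PlaneGeometry(width, height, 1, 1), addMaterial);
    this.scene.add(this.plane);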

Upvotes: 1
