Postprocessing with multiple EffectComposers and WebGLRenderTargets


I am trying to make a video editor. I want users to be able to apply effects to their created scenes, something like this effect pipeline:

I made it so that each SceneContainer has a quad with a WebGLRenderTarget's texture as its map, and an EffectComposer using that render target as its output. The main scene then renders these quads to the canvas.

In the SceneContainer:

this.buffer = new WebGLRenderTarget( width, height, rtParameters )
const texture = this.buffer.texture
this.quad = new THREE.Mesh( new THREE.PlaneBufferGeometry( 2, 2 ), new THREE.MeshBasicMaterial( { map: texture } ) );
this.quad.frustumCulled = false;

this.effectComposer = new EffectComposer( renderer, this.buffer )
this.renderPass = new RenderPass( scene, camera )
this.sepiaPass = new ShaderPass( SepiaShader )
this.effectComposer.addPass( this.renderPass )
this.effectComposer.addPass( this.sepiaPass )
//this.sepiaPass.renderToScreen = true //<---

And in the main loop

    this.sceneContainers.forEach(sceneContainer => {
        sceneContainer.effectComposer.render()
    })
    renderer.render(mainScene, mainCamera) // <---

My problem is that when I try to render the scene quad in the main scene, it flickers between its pre-effect and post-effect texture, with the pre-effect one on top. If I switch the comments on the arrowed lines to make it render directly to the screen, it works fine. Does anyone know why it does this?

Also, is this a sane way to approach having multiple EffectComposers? Or would multiple WebGLRenderers be a better option?

No. The problem is that each instance of WebGLRenderer has its own WebGL context. Sharing data like render targets between contexts is not possible. However, using multiple effect composers is okay.
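A rough sketch of that layout with a single shared renderer (the function and pass names here are just illustrative examples, not your exact code):

```javascript
import * as THREE from 'three'
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js'
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js'
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js'
import { SepiaShader } from 'three/examples/jsm/shaders/SepiaShader.js'

// One WebGLRenderer means one WebGL context, shared by every composer.
const renderer = new THREE.WebGLRenderer()

function createSceneContainer( scene, camera, width, height ) {
  const buffer = new THREE.WebGLRenderTarget( width, height )
  const composer = new EffectComposer( renderer, buffer )
  composer.addPass( new RenderPass( scene, camera ) )
  // No pass has renderToScreen set, so the result stays in the
  // composer's render targets.
  composer.addPass( new ShaderPass( SepiaShader ) )
  return { composer, buffer }
}

// Per frame: run each composer into its target, then draw the main
// scene whose quads sample those targets.
function renderFrame( containers, mainScene, mainCamera ) {
  for ( const c of containers ) c.composer.render()
  renderer.render( mainScene, mainCamera )
}
```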

It would be easier to provide more help if you can show the flicker with a reduced testcase. Maybe as a live example?

Hello! Thanks for the answer. I had forgotten to set the quad mesh to transparent; after that it was fine, so I think it caused some weird z-fighting between the meshes.

Actually, the transparent: true parameter was only part of the problem. It seems I need to control the buffer swapping of the passes in my effect chain somehow, but I’m dumbfounded as to how.

I have it set up with an EffectComposer like the one in this example, but using a WebGLRenderTarget as the target (the structure from the original post).

If I do a composer.swapBuffers() after every render, then every other effect “blinks” between the previous “state” (rendered with the previous effects) and the new one. So I figured I would swap the buffers when effectComposer.passes.length % 2 === 1; however, this yielded the same results.
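From my reading of the three.js source, the swapping can be modelled like this (a simplified sketch, only distinguishing RenderPass from ShaderPass behaviour — the composer uses the target you pass in as renderTarget1/writeBuffer and a clone of it as renderTarget2/readBuffer):

```javascript
// Minimal model of EffectComposer's ping-pong buffering.
function finalResultTarget(passes) {
  let write = 'renderTarget1' // the target handed to the composer
  let read = 'renderTarget2'  // its clone
  let result = null
  for (const pass of passes) {
    if (pass.type === 'RenderPass') {
      // RenderPass draws the scene straight into readBuffer; needsSwap is false
      result = read
    } else {
      // A ShaderPass samples readBuffer, draws into writeBuffer,
      // then the composer calls swapBuffers()
      result = write
      ;[write, read] = [read, write]
    }
  }
  return result
}

// RenderPass + one ShaderPass: the result lands in renderTarget1,
// the very target handed to the composer, so sampling its texture works.
console.log(finalResultTarget([{ type: 'RenderPass' }, { type: 'ShaderPass' }]))
// → renderTarget1

// Add a second ShaderPass and the result lands in renderTarget2 instead,
// which would explain why a quad sampling this.buffer.texture shows a stale image.
console.log(finalResultTarget([
  { type: 'RenderPass' }, { type: 'ShaderPass' }, { type: 'ShaderPass' }
]))
// → renderTarget2
```

So whether the final image ends up in the target you passed in depends on the number of swapping passes, which would match the every-other-pass flicker.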

Any ideas on what is wrong?

I set up an example with a live demo at

It’s SSAARenderPass -> CopyShaderPass -> TestShader (which just outputs a random color to the screen). If you go to the graphics layer and add more TestShaders, every other pass will flicker.

I made a fiddle showcasing the flickering:

I’m not sure why the flickering happens in your setup. If I change the code to the usual workflow, it seems to work:
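By the usual workflow I mean something along these lines (a sketch, not your exact code — the composer manages its own internal render targets and the last pass draws to the canvas):

```javascript
import * as THREE from 'three'
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js'
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js'
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js'
import { SepiaShader } from 'three/examples/jsm/shaders/SepiaShader.js'

const renderer = new THREE.WebGLRenderer()
const scene = new THREE.Scene()
const camera = new THREE.PerspectiveCamera( 60, 1, 0.1, 100 )

// No explicit render target: the composer creates and manages its own
// internal pair of targets.
const composer = new EffectComposer( renderer )
composer.addPass( new RenderPass( scene, camera ) )

const sepiaPass = new ShaderPass( SepiaShader )
sepiaPass.renderToScreen = true // last pass draws straight to the canvas
composer.addPass( sepiaPass )

function animate() {
  requestAnimationFrame( animate )
  composer.render() // no manual swapBuffers() calls needed
}
animate()
```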

Yeah, I haven’t been able to figure out what is going wrong. I made a similar setup with a CanvasTexture as input here: which seems to be working fine. Maybe I should try something similar with the three.js scene, but it feels kind of wonky.

Did anybody ever discover a fix here? I am trying to do something similar and having a similar issue.

Hi @Mugen87, is there a demo example of rendering multiple effect composers over each other with depth preserved? I’d like to render one scene with a certain set of post effects, and another scene composed of different objects with another set of post effects.

I’ve tried this approach, where the result is only the last composer in the queue rendering to screen, and also this approach, where the result yields almost a half transparency on both scenes (the scenes being composed of different objects).

I’m not aware of something like that. Using multiple instances of EffectComposer is not that common.