Postprocessing on a WebGLRenderTarget

Hello! I’m trying to apply Postprocessing to a scene rendered inside a mesh (a cube) through a WebGLRenderTarget.

“Standard” postprocessing is quite clear to me, but applying it to a different render target gives me some difficulty.

For example, I would like to apply vignetting only to the scene rendered inside the cube, i.e. to a specific WebGLRenderTarget. Everything outside the cube, in the main scene, must not be affected.

Is it possible to do this?
And secondly, is it possible to achieve this without killing performance?

Thank you


I am not at all clear on what you are trying to achieve. Perhaps an illustration would help.

Ok I created a test scene.

As you can see, the main scene contains two toruses and one cube. On the cube’s material I render a second scene through a WebGLRenderTarget. In this second scene the background is yellow and a torus rotates.

I would like to apply postprocessing only to this second scene.

Here is the codesandbox
(it’s in React)

I think that’s fine. Maybe consider using the lowest resolution that still works for that render target. Otherwise it seems like the right approach to me.

Unfortunately I have not yet managed to get the expected result.
Among other things, as soon as I apply the postprocessing, the clear color of the main scene changes from aquamarine to yellow :man_shrugging:

Probably you want to pass the render target in as the read buffer of the EffectComposer. I don’t use React, so I’m not able to help with your sample code. Maybe @drcmda knows?

the react part is just the setup; it’s about the postprocessing bit inside useFrame. i don’t know enough about postprocessing to solve it. with jsm/EffectComposer you’d render into a WebGLRenderTarget and use the texture, but that seems different with pp?

Hello! Just yesterday I found the solution. “Found” is the wrong word: I opened an issue on the postprocessing GitHub and they told me how :slight_smile:

The solution can be seen here

Unfortunately the link to the codesandbox I posted above is no longer valid, because I foolishly changed it.

I made a small demo to test everything. As you can see, Glitch + Noise + Vignette is applied to the scene rendered inside the television, while the main scene gets SSAO + Bloom + Vignette.

Everything works, and the performance is honestly much better than I expected :slight_smile:


How did you get the screen space reflections?

It’s a gag :sweat_smile:
I deliberately created a RenderPass and a SavePass to save the scene, as seen from a certain position, to a texture.

Then I used that texture as a map on a transparent plane placed on top of the floor.
It’s definitely the wrong method, but it’s something :slight_smile:

I still have to investigate how to achieve a similar effect properly.

Maybe I should use three.js CubeCamera or Reflector to get that effect, but when I try to integrate them into the scene, nothing works anymore. I guess it’s because their onBeforeRender acts on the renderer in a way that is incompatible with what I did.