Technically you can read the contents of the canvas back with WebGLRenderer.readRenderTargetPixels() and upload them into a texture, but that round trip through the CPU is very resource-intensive and will tank your framerate. See this discussion for more details on that.
The recommended approach is to render into a WebGLRenderTarget and then use the resulting WebGLRenderTarget.texture as the texture input for your next render pass. Here's some quick pseudocode showing how to do that:
```js
const renderer = new WebGLRenderer();
const rt = new WebGLRenderTarget(window.innerWidth, window.innerHeight);

function animate() {
	requestAnimationFrame(animate);

	// Set rt as the destination of the first render
	renderer.setRenderTarget(rt);
	renderer.render(scene, camera);

	// Use the result of rt as input for the next pass,
	// e.g. as the map of a material in postScene
	material.map = rt.texture;

	// Passing null restores the canvas as the destination
	renderer.setRenderTarget(null);
	renderer.render(postScene, postCamera);
}
```
- You can keep chaining render targets one after the other for as many passes as you need.
- Make sure each pass reads from a texture it is not currently rendering into (e.g. use a different scene and material per pass), otherwise WebGL will block the draw call to prevent an infinite feedback loop.
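
When chaining many passes, the usual trick is to "ping-pong" between two render targets: each pass reads from the one the previous pass wrote and writes into the other, which also sidesteps the feedback-loop restriction above. Here is a minimal library-free sketch of that swap logic; the plain objects and the `renderPass` helper are hypothetical stand-ins for WebGLRenderTargets and an actual render call:

```javascript
// Two stand-in "render targets" (hypothetical plain objects,
// in real code these would be WebGLRenderTarget instances).
const targets = [
	{ name: 'rtA', texture: null },
	{ name: 'rtB', texture: null },
];
let writeIndex = 0;

function renderPass(passNumber) {
	const write = targets[writeIndex];
	const read = targets[1 - writeIndex];
	// In three.js this step would roughly be:
	//   material.map = read.texture;
	//   renderer.setRenderTarget(write);
	//   renderer.render(scene, camera);
	write.texture = 'pass-' + passNumber + '-output';
	// Swap roles so the next pass reads what we just wrote
	writeIndex = 1 - writeIndex;
	return write;
}
```

So pass 1 writes rtA, pass 2 reads rtA and writes rtB, pass 3 reads rtB and writes rtA again, and so on; only the final pass renders to the canvas (setRenderTarget(null)).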