How can I copy a WebGLRenderTarget texture to a canvas

I want to do a picture-in-picture effect with a bit of a twist. Rather than rendering the mini view inside the single scene canvas, I want to:

  1. Render the scene to a texture - with different camera and material.
  2. Paint that rendering into a canvas element in a div that overlays the three.js scene/canvas.

So I need some way to map the texture to a canvas. Is that possible?


Um, I don’t understand why you are working with a render target if you want to display your render result on a canvas element. Consider performing two renderings: the first one creates your effect, the second is the normal render pass. After the first rendering, you copy the contents of the renderer’s canvas to a different one. This approach is demonstrated in the following example:

The interesting code section is:

context.drawImage( renderer.domElement, 0, 0 );

context is the 2D rendering context of another canvas element.
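Put together, the two-pass approach might look something like this. This is only a minimal sketch: `effectScene`, `effectCamera`, and `overlayCanvas` are placeholder names for your own objects, not anything from the example.

```javascript
// Assumes: renderer is a THREE.WebGLRenderer, scene/camera are the normal
// pass, and overlayCanvas is the 2D canvas element in the overlay div.
const context = overlayCanvas.getContext( '2d' );

function animate() {

	requestAnimationFrame( animate );

	// First pass: render the effect (different camera/material) into the
	// WebGL canvas, then immediately copy it to the 2D overlay canvas.
	renderer.render( effectScene, effectCamera );
	context.drawImage( renderer.domElement, 0, 0 );

	// Second pass: the normal render, which is what stays on screen.
	renderer.render( scene, camera );

}
```

Because the `drawImage()` call happens synchronously, right after the render in the same frame, this should work without setting `preserveDrawingBuffer: true` on the renderer.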


The idea is I use the normal three.js canvas - webgl context - dedicated rendering direct to screen.
Now, in addition I will have one - or more - divs positioned above that context. The div will have a canvas - 2D context - into which I draw the contents of a render-to-texture pass. That pass is only drawn to the 2D context in the div. Not to the webgl context. Make sense?
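For completeness, the render-target route you originally described is also possible: read the pixels back from the target and push them into the 2D context. A rough sketch, assuming the current `setRenderTarget` API and placeholder names (`rtScene`, `rtCamera`, `context`):

```javascript
// Offscreen pass into a render target, then readback to the 2D canvas.
const rt = new THREE.WebGLRenderTarget( width, height );
const pixels = new Uint8Array( width * height * 4 );

renderer.setRenderTarget( rt );
renderer.render( rtScene, rtCamera );
renderer.setRenderTarget( null );

// Copy the target's pixels into a CPU-side buffer...
renderer.readRenderTargetPixels( rt, 0, 0, width, height, pixels );

// ...and paint them into the overlay's 2D context. Note that WebGL's
// origin is bottom-left, so the image arrives flipped vertically; a real
// implementation would flip the rows before (or while) displaying.
const imageData = new ImageData( new Uint8ClampedArray( pixels.buffer ), width, height );
context.putImageData( imageData, 0, 0 );
```

The readback stalls the GPU pipeline, though, which is one reason the `drawImage()` approach above is usually preferable.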

Actually, I just looked at the example you pointed me to and that should do the trick!

Thanks as always.


I have a similar issue where this solution works; however, in my case the canvases are different sizes, and calling renderer.setSize() to change the canvas size is really slow. That’s unfortunate, since these canvases only actually need to be resized on initialization and when the window resize handler fires. I have a workaround now, but it does not work if I want a different pixel ratio for each render (higher resolution for the main screen, lower for the minimap).
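One way to avoid per-frame setSize() calls might be to keep the renderer at the largest size you need and scale during the copy, letting each destination canvas imply its own effective pixel ratio. A sketch under those assumptions (`mainContext`, `miniContext`, `miniCanvas`, and the sizes are placeholders; not tested against your setup):

```javascript
// Size the renderer once, on init / window resize only.
renderer.setSize( mainWidth, mainHeight, false );

// Main view: full resolution, copied 1:1.
renderer.render( scene, mainCamera );
mainContext.drawImage( renderer.domElement, 0, 0 );

// Minimap: render into a small viewport, then copy just that region,
// scaled into the minimap canvas (an effectively lower pixel ratio).
renderer.setViewport( 0, 0, miniWidth, miniHeight );
renderer.render( scene, miniCamera );
renderer.setViewport( 0, 0, mainWidth, mainHeight );

// setViewport() measures from the bottom-left, while drawImage() source
// coordinates measure from the top-left, hence the y offset here.
miniContext.drawImage(
	renderer.domElement,
	0, renderer.domElement.height - miniHeight, miniWidth, miniHeight, // source region
	0, 0, miniCanvas.width, miniCanvas.height // destination, scaled
);
```

Since the main view is copied out with drawImage() before the minimap pass overwrites a corner of the WebGL canvas, the two renders don’t interfere.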