Multiple renderer & texture

Hi,

I use 2 WebGL renderers:

  • the first one is rendering inside a renderTarget (RT1)
  • RT1 is then used inside the second renderer to render an object (RT1.texture is applied to its material), and this object is rendered into a second render target (RT2)
  • RT2 is then used by the first renderer to display it on a sprite on screen (RT2.texture is applied to the sprite's material).

When I don't use RT1.texture in the second renderer everything is OK (the sprite renders), but as soon as I try to use it as a texture on a material, my object is black (as if RT1.texture had not been rendered correctly). Applying RT1.texture directly to a sprite in the first renderer confirmed that RT1.texture is rendered properly.

So my question is (at last :slight_smile: !): why is the texture generated in the first renderer not properly updated/interpreted in the second renderer?

I don’t know if I’m being clear enough; I hope you understand my concern and can help me.
Regards

Each renderer has its own WebGL rendering context. And it’s not possible to share WebGL resources like framebuffers or textures across multiple contexts.

Try to use just a single instance of WebGLRenderer.
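To illustrate, here is a minimal sketch of the same three-pass chain done with a single `WebGLRenderer`, so both render-target textures stay in one WebGL context. The names `sceneA`, `cameraA`, `sceneB`, `cameraB`, `objectB`, `sprite`, `mainScene`, and `mainCamera` stand in for your own scenes and objects:

```javascript
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();

const rt1 = new THREE.WebGLRenderTarget(512, 512);
const rt2 = new THREE.WebGLRenderTarget(512, 512);

// Pass 1: render the first scene into RT1.
renderer.setRenderTarget(rt1);
renderer.render(sceneA, cameraA);

// Pass 2: apply RT1's texture to an object, render that into RT2.
objectB.material.map = rt1.texture;
renderer.setRenderTarget(rt2);
renderer.render(sceneB, cameraB);

// Final pass: display RT2's texture on a sprite, on screen.
sprite.material.map = rt2.texture;
renderer.setRenderTarget(null); // back to the default framebuffer
renderer.render(mainScene, mainCamera);
```

Because every `render()` call goes through the same renderer, the framebuffer textures behind `rt1.texture` and `rt2.texture` never have to cross a context boundary.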

Thanks for your reply. I knew about OpenGL contexts (and was wondering if there was a way to share textures between them in three.js).
I had already gotten my dev working with a single WebGLRenderer … but I was trying to package it so I could give a texture to a method and generate a new one as output, without the method knowing about the renderer already instantiated outside of it. The solution I found (before your reply) is to pass the renderer as a parameter of my method, so I can reuse it to render into a renderTarget. It seems to be the only viable solution given what you advised ;).

I put your answer as “solution”.