I use 2 WebGL renderers:
- the first one renders a scene into a render target (RT1)
- RT1.texture is then used in the second renderer as the texture on an object's material, and that object is rendered into a second render target (RT2)
- RT2.texture is then used back in the first renderer, applied to a sprite's material, to display the result on screen.
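To make the setup concrete, here is a minimal sketch of the pipeline described above (all names — `rendererA`, `rendererB`, `sceneA`, `sceneB`, `sceneC`, `objectB` — are hypothetical, not from my actual code; render-target sizes and geometry are placeholders):

```javascript
import * as THREE from 'three';

// Two separate renderers — note that each one owns its own WebGL context.
const rendererA = new THREE.WebGLRenderer();
const rendererB = new THREE.WebGLRenderer();
document.body.appendChild(rendererA.domElement);

const RT1 = new THREE.WebGLRenderTarget(512, 512);
const RT2 = new THREE.WebGLRenderTarget(512, 512);

const camera = new THREE.PerspectiveCamera(50, 1, 0.1, 100);
camera.position.z = 3;

// Scene A: rendered by the first renderer into RT1.
const sceneA = new THREE.Scene();
sceneA.add(new THREE.Mesh(
  new THREE.BoxGeometry(),
  new THREE.MeshBasicMaterial({ color: 0xff0000 })
));

// Scene B: an object textured with RT1.texture,
// rendered by the second renderer into RT2.
const sceneB = new THREE.Scene();
const objectB = new THREE.Mesh(
  new THREE.PlaneGeometry(2, 2),
  new THREE.MeshBasicMaterial({ map: RT1.texture })
);
sceneB.add(objectB);

// Scene C: a sprite textured with RT2.texture,
// displayed on screen by the first renderer.
const sceneC = new THREE.Scene();
const sprite = new THREE.Sprite(new THREE.SpriteMaterial({ map: RT2.texture }));
sceneC.add(sprite);

function renderFrame() {
  rendererA.setRenderTarget(RT1);
  rendererA.render(sceneA, camera);
  rendererA.setRenderTarget(null);

  rendererB.setRenderTarget(RT2);
  rendererB.render(sceneB, camera);   // <-- objectB comes out black here

  rendererB.setRenderTarget(null);
  rendererA.render(sceneC, camera);   // the sprite itself renders fine
}
renderFrame();
```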
If I don't use RT1.texture in the second renderer, everything is fine (the sprite is rendered), but as soon as I apply it as a texture on the material, my object renders black (as if RT1.texture had not been rendered correctly). Applying RT1.texture directly to a sprite in the first renderer confirmed that RT1.texture itself is rendered correctly.
So my question is (at last!): why is the texture generated in the first renderer not properly updated/interpreted in the second renderer?
I don't know if I'm being clear enough; I hope you understand my concern and can help me.