This works when the screen is the render target (WebGLRenderer) for rendering stereo, but does it work when the render target is a texture?
While I haven’t managed to make this work, I can’t find any documentation that states or implies that it cannot be done.
This could eliminate the third pass (render the left camera to one texture, the right camera to another, then read both and combine them into a third texture).
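What I've been attempting is roughly the following sketch: render both eyes of a StereoCamera side by side into a single WebGLRenderTarget, using the target's own viewport/scissor properties to split it in half (render targets in three.js carry their own `.viewport`/`.scissor`/`.scissorTest`, so the renderer-level setViewport calls used for the screen don't apply). The sizes and function names here are my own, not from any documentation:

```javascript
import * as THREE from 'three';

// Assumed target size: 2048x1024, one 1024x1024 half per eye.
const W = 2048, H = 1024;
const target = new THREE.WebGLRenderTarget(W, H);
const stereo = new THREE.StereoCamera();

// Hypothetical helper: render left/right eyes into the left/right
// halves of `target`, mimicking what WebGLRenderer does on screen.
function renderStereoToTexture(renderer, scene, camera) {
  camera.updateMatrixWorld();
  stereo.update(camera); // derive cameraL / cameraR from the mono camera

  target.scissorTest = true;
  renderer.setRenderTarget(target);

  // Left eye into the left half of the render target.
  target.viewport.set(0, 0, W / 2, H);
  target.scissor.set(0, 0, W / 2, H);
  renderer.render(scene, stereo.cameraL);

  // Right eye into the right half.
  target.viewport.set(W / 2, 0, W / 2, H);
  target.scissor.set(W / 2, 0, W / 2, H);
  renderer.render(scene, stereo.cameraR);

  renderer.setRenderTarget(null);
  target.scissorTest = false;
}
```

If this is valid, `target.texture` could then be sampled directly in a later pass, with no separate combine step.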