It works great on the web, but in VR it's all messy. I tried the code from the Reflector example, also recommended on three.js's GitHub, that disables xr before rendering, but it still doesn't work.
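For reference, this is roughly the pattern I mean (adapted from what Reflector does in its onBeforeRender); `portalScene`, `portalCamera` and `renderTarget` here are placeholders from my own setup:

```js
// Temporarily disable xr (and shadow auto-update) while rendering into the
// render target, then restore the previous renderer state.
const currentRenderTarget = renderer.getRenderTarget();
const currentXrEnabled = renderer.xr.enabled;
const currentShadowAutoUpdate = renderer.shadowMap.autoUpdate;

renderer.xr.enabled = false;           // don't render the target with the XR cameras
renderer.shadowMap.autoUpdate = false; // don't re-render shadow maps for this pass

renderer.setRenderTarget(renderTarget);
renderer.render(portalScene, portalCamera);

renderer.xr.enabled = currentXrEnabled;
renderer.shadowMap.autoUpdate = currentShadowAutoUpdate;
renderer.setRenderTarget(currentRenderTarget);
```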
I get that it's hard to look at this without a proper repro demo. I can't put together a small repro right now, but I'd at least appreciate some guidance on using render targets in VR mode. Yes, I've looked at the issues, but the solutions proposed so far seem to revolve around disabling xr when rendering into a render target. For cases that manipulate the view/projection matrices, like the one above, I think there's more to the story.
I've looked at the sources, and it seems VR mode modifies the camera to match the info provided by the VR system. That would partly explain why the shader above didn't work: the fragment shader would use the VR camera's matrices to compute the clip-space coordinates of the portal vertices.
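If I read WebXRManager correctly, this is roughly what happens (sketch only, and the API is version-dependent): when xr is enabled, the renderer swaps in an array camera with one sub-camera per eye, so the built-in `projectionMatrix` / `viewMatrix` uniforms in the shader come from the headset, not from the camera I pass to `render()`.

```js
// In older three.js versions getCamera() takes the scene camera as an argument.
const xrCamera = renderer.xr.getCamera(); // THREE.ArrayCamera

console.log(xrCamera.cameras.length);              // 2 in a headset, one per eye
console.log(xrCamera.cameras[0].projectionMatrix); // per-eye projection, not camera.projectionMatrix
```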
I modified the shader so that the 'texture matrix' is computed manually in JS and sent to the shader as a uniform. The result looks less messy (at least I can see the scene inside the portal), but the proportions are still incorrect.
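This is how I build it each frame; the construction is the same one Reflector uses (an NDC to [0,1] bias matrix times projection * view * model), but `updateTextureMatrix` and `portalMesh` are names from my code, not a library API:

```js
import * as THREE from 'three';

const textureMatrix = new THREE.Matrix4();

// camera: the camera whose projection should drive the lookup
// portalMesh: the mesh whose material has a `textureMatrix` uniform
function updateTextureMatrix(camera, portalMesh) {
  // Maps clip space (-1..1) to texture space (0..1).
  textureMatrix.set(
    0.5, 0.0, 0.0, 0.5,
    0.0, 0.5, 0.0, 0.5,
    0.0, 0.0, 0.5, 0.5,
    0.0, 0.0, 0.0, 1.0
  );
  textureMatrix.multiply(camera.projectionMatrix);
  textureMatrix.multiply(camera.matrixWorldInverse);
  textureMatrix.multiply(portalMesh.matrixWorld);

  portalMesh.material.uniforms.textureMatrix.value.copy(textureMatrix);
}
```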
Reflector seems to work well in VR, so I looked at its source. It uses a different approach called projective texturing.
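For context, this is the core of its shader, simplified from the Reflector source (I've dropped the color blending and log-depth includes): the vertex shader projects the vertex with the virtual camera via `textureMatrix` and passes the full vec4 along, and the fragment shader does the perspective divide with `texture2DProj`.

```js
const vertexShader = /* glsl */ `
  uniform mat4 textureMatrix;
  varying vec4 vUv;

  void main() {
    vUv = textureMatrix * vec4( position, 1.0 );
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
  }
`;

const fragmentShader = /* glsl */ `
  uniform sampler2D tDiffuse;
  varying vec4 vUv;

  void main() {
    gl_FragColor = texture2DProj( tDiffuse, vUv );
  }
`;
```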
Unlike the approach above, it relies on the portal's virtual camera for the projection. So maybe that's the way to go, although I'm skeptical about it: the idea is to render the portal at full size but clip it to the portal mesh. Since the portal mesh is rendered in the main scene, it seems more appropriate to use the main camera's projection to compute the clip-space coordinates. It wouldn't make a difference for Reflector, because the same scene is rendered in the mirror with the same camera (only reflected about the mesh's normal). But in the portal case the two cameras can have different projection matrices.
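To make the distinction concrete, this is the screen-space variant I had in mind (my own sketch, not Reflector's code): the UVs come from the main camera's clip-space position of the portal vertex, which in VR means the per-eye XR projection, instead of from the portal's virtual camera.

```js
const vertexShader = /* glsl */ `
  varying vec4 vClipPos;

  void main() {
    // Clip-space position of the portal vertex as seen by the *main* camera
    // (in VR these built-ins are the per-eye XR matrices).
    vClipPos = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    gl_Position = vClipPos;
  }
`;

const fragmentShader = /* glsl */ `
  uniform sampler2D tDiffuse;
  varying vec4 vClipPos;

  void main() {
    // Perspective divide + NDC -> [0,1]: sample the render target in screen space.
    vec2 uv = ( vClipPos.xy / vClipPos.w ) * 0.5 + 0.5;
    gl_FragColor = texture2D( tDiffuse, uv );
  }
`;
```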