Portal clipping shader in VR

I was trying to recreate a portal demo by adapting the code from the Coding Adventure: Portals video on YouTube.

I use a render target with a custom shader to clip the target scene to the portal mesh.

const vertexShader = `
    varying vec4 v_pos;

    void main() {
        // Clip-space position of the portal vertex, as seen by the camera
        // rendering the main scene
        vec4 clip_pos = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        v_pos = clip_pos;
        gl_Position = clip_pos;
    }
`;

const fragmentShader = `
    varying vec4 v_pos;
    uniform sampler2D tImage;

    void main() {
        // Perspective divide, then remap NDC [-1, 1] to UV [0, 1] so the
        // render target is sampled in screen space
        vec2 uv = v_pos.xy / v_pos.w;
        uv = (uv + vec2(1.0)) * 0.5;
        gl_FragColor = texture2D(tImage, uv);
    }
`;

const material = new THREE.ShaderMaterial({
    vertexShader,
    fragmentShader,
    uniforms: {
        tImage: { value: null },
    },
});
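
For context, the rest of the setup looks roughly like this (a simplified sketch; apart from renderTarget, portalScene, portalCamera and the tImage uniform, the names and values are placeholders):

const renderTarget = new THREE.WebGLRenderTarget(1024, 1024);

// Secondary scene seen "through" the portal, and the camera that films it
const portalScene = new THREE.Scene();
const portalCamera = new THREE.PerspectiveCamera(70, 1, 0.1, 1000);

// Portal surface placed in the main scene, using the clipping material above
const portalMesh = new THREE.Mesh(new THREE.PlaneGeometry(2, 3), material);
scene.add(portalMesh);

material.uniforms.tImage.value = renderTarget.texture;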

It works great on the web, but in VR it's all messy. I tried the trick from the Reflector example (also recommended in three.js's GitHub issues) of disabling XR before rendering into the render target, but it still doesn't work:

renderer.xr.enabled = false; // temporarily bypass the XR camera handling
renderer.setRenderTarget(renderTarget);
renderer.render(portalScene, portalCamera);
renderer.xr.enabled = true;

I know the issue is in the shader, because the portal renders fine (just not clipped) when I use a regular material with default UV coordinates.


I get that it's hard to look into this without a proper repro demo. I can't put together a small repro right now, but I'd at least appreciate some guidance on using render targets in VR mode. Yes, I've looked at the GitHub issues, but the solutions proposed so far seem to revolve around disabling XR while rendering into a render target. For cases like the above that need to manipulate view/projection matrices, I think there's more to the story.

I've looked at the sources, and it seems that in VR mode the renderer replaces the camera with one built from the info provided by the VR system. That would in part explain why the shader above didn't work: the clip-space coordinates of the portal vertices (and therefore the UVs) end up being computed with the per-eye VR camera matrices.
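
Just to illustrate what I mean (not a fix; getCamera's exact signature varies between three.js versions), in XR the renderer uses an ArrayCamera with one sub-camera per eye:

renderer.setAnimationLoop(() => {
    if (renderer.xr.isPresenting) {
        const xrCamera = renderer.xr.getCamera(); // older versions: getCamera(camera)
        // xrCamera.cameras[0] and xrCamera.cameras[1] each have their own
        // matrixWorldInverse and (asymmetric) projectionMatrix, and those are
        // what end up in the built-in shader uniforms for each eye
        console.log(xrCamera.cameras.length);
    }
    renderer.render(scene, camera);
});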

I modified the shader so that I compute the 'texture matrix' manually in JS and pass it to the shader as a uniform. The result is less messy (at least I can see the scene inside the portal), but the proportions are still incorrect.
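
Roughly what I mean by that (again a simplified sketch; mainCamera and portalMesh stand in for my actual objects, and the textureMatrix uniform is added to the material next to tImage):

const textureMatrix = new THREE.Matrix4();

function updateTextureMatrix(mainCamera, portalMesh) {
    mainCamera.updateMatrixWorld(); // keeps matrixWorldInverse current

    // Bias that remaps clip space [-1, 1] to UV space [0, 1]
    textureMatrix.set(
        0.5, 0.0, 0.0, 0.5,
        0.0, 0.5, 0.0, 0.5,
        0.0, 0.0, 0.5, 0.5,
        0.0, 0.0, 0.0, 1.0
    );
    textureMatrix.multiply(mainCamera.projectionMatrix);
    textureMatrix.multiply(mainCamera.matrixWorldInverse);
    textureMatrix.multiply(portalMesh.matrixWorld);

    material.uniforms.textureMatrix.value = textureMatrix;
}

With the bias baked in, the vertex shader outputs v_pos = textureMatrix * vec4(position, 1.0), and the fragment shader only needs the divide by w (or texture2DProj) instead of the NDC-to-UV remap.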


Reflector seems to work well in VR, so I looked at its source. It seems to use a different approach, called projective texturing.
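
The relevant bits look roughly like this (paraphrased from Reflector.js, so not exact; the color blending in the real fragment shader is left out):

// vertex shader: textureMatrix already contains the mirror camera's
// projection and view matrices, the mesh's world matrix and the 0..1 bias
uniform mat4 textureMatrix;
varying vec4 vUv;

void main() {
    vUv = textureMatrix * vec4( position, 1.0 );
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}

// fragment shader
uniform sampler2D tDiffuse;
varying vec4 vUv;

void main() {
    gl_FragColor = texture2DProj( tDiffuse, vUv ); // perspective divide happens here
}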

Unlike my approach above, it relies on the portal's virtual camera for the projection. So maybe that's the way to go, although I'm skeptical about it: the idea is to render the portal view at full size but clip it to the portal mesh. The portal mesh is rendered in the main scene, so it seems more appropriate to use the main camera's projection to compute the clip-space coordinates. That makes no difference for Reflector, because the same scene is rendered in the mirror with the same camera (just reflected about the mesh's normal). But in the portal case the two cameras can have different projection matrices.


Did you manage to make it work? I need to render a secondary scene/object in 3D in VR.