I’ve seen, and followed, various ways to teleport in a VR session, and all of them work nicely, including the official example.
An increasing number of users have VR headsets with six degrees of freedom (6DoF), allowing them to physically move around their local environment. Say a headset viewer does these three things:
- Teleports to some point (x1, y, z), say because a controller ray has intersected the floor at that point.
- Then physically walks within their local area to a new point (x2, y, z).
- Then wants to teleport to the point (x3, y, z) where their controller ray intersects another spot on the floor.
How should this second teleport be computed? If I jump straight to the intersection point (x3, y, z), the teleport will not account for the physical movement I made to x2. It doesn’t seem to matter how I jump (e.g. lerping a rig that has the camera as a child, or creating a new reference space): the end result is that the physical movement to x2 is never accounted for.
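To make the problem concrete, here is a minimal sketch of the arithmetic I think is missing. The function name and numbers are mine, purely illustrative: the naive jump places the rig origin on the target, whereas a compensated jump would subtract the viewer's physical displacement inside the rig so that the *viewer* lands on the target.

```javascript
// Hypothetical helper: given the teleport target on the floor and the
// viewer's physical offset within the rig/reference space, compute the
// rig position that puts the viewer (not the rig origin) at the target.
function compensatedRigPosition(target, viewerOffset) {
  return {
    x: target.x - viewerOffset.x,
    y: target.y, // keep the floor height; don't subtract head height
    z: target.z - viewerOffset.z,
  };
}

// Naive teleport: the rig origin lands on the target, so a viewer who
// has physically walked 1 m in +x ends up 1 m past the intended point.
const naive = { x: 3, y: 0, z: -2 };

// Compensated teleport: the viewer, not the rig origin, lands on target.
const compensated = compensatedRigPosition(
  { x: 3, y: 0, z: -2 },  // target intersection (x3, y, z)
  { x: 1, y: 1.6, z: 0 }  // physical movement within the local space
);
// compensated is { x: 2, y: 0, z: -2 }
```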
I see that WebXR lets me call getViewerPose() on an XRFrame (which three.js exposes via the experimental renderer.xr.getFrame()), and that pose contains exactly the information I want, i.e. how much the viewer has moved within their local physical space:
```javascript
// Current XRFrame from the three.js WebXRManager
const xrFrame = renderer.xr.getFrame();
const baseReferenceSpace = renderer.xr.getReferenceSpace();
// Viewer pose expressed relative to the base reference space
const viewerPose = xrFrame.getViewerPose(baseReferenceSpace);
// Position of the headset within the local physical space
const physicalViewerMovement = viewerPose.transform.position;
```
BUT, this is an experimental feature with poor browser support.
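For what it's worth, here is a sketch of just the arithmetic I imagine feeding into the XRRigidTransform used with getOffsetReferenceSpace (the official example negates the target position; folding in the viewer pose position would, in principle, cancel the physical walk). The function name is mine and I haven't verified this in an actual session:

```javascript
// Sketch: the position to hand to XRRigidTransform when building the
// offset reference space for a teleport. Subtracting the viewer's pose
// position from the target before negating should compensate for
// physical movement. Illustrative only, not tested in a real session.
function offsetForTeleport(target, viewerPosition) {
  return {
    x: -(target.x - viewerPosition.x),
    y: -target.y,
    z: -(target.z - viewerPosition.z),
    w: 1,
  };
}

const offset = offsetForTeleport(
  { x: 3, y: 0, z: -2 },  // target intersection (x3, y, z)
  { x: 1, y: 1.6, z: 0 }  // viewerPose.transform.position
);
// offset.x is -2, offset.z is 2
```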
Am I missing something? What is the correct way to teleport a VR viewer while accounting for any physical movement they have made, given that I can’t assume the viewer is always standing or sitting still?