WebXR - Controllers

I’m creating a WebXR game app where the user can move around a virtual environment. I’m testing on an Oculus Go (3DOF). I’ve got the app to the stage where you can shoot a ray at a nav mesh and move to a new location; however, the controller gets left behind. Should I parent the controller to the camera, and if so, do I need to update any matrices? When I try repositioning the controller from the console in the emulator, it always jumps back to the default, so I suspect there is a matrix to update after forcing the position value. Any advice appreciated.

https://niksgames.com/webxr/example6 - work in progress

It looks like the solution is to attach the camera, the controller and the grip to an Object3D that tracks the path movement. On a 3DOF device the camera is hard-set at (0, 1.6, 0), i.e. eye height, and the right controller at (0.5, 1.5, -1). Attach all of these to an Object3D or Group and then move that for locomotion. Rotational values are then received from the XRInputSource by the framework.

You can also solve this issue in another way: keep the player always at the origin and move the scene instead.


Is there a preferred option? To me it feels more intuitive to move the camera and controllers. I’m assuming there is no performance advantage to either approach.
I notice on the Oculus Go that when you enter VR the image looks fairly pixelated. I haven’t done enough experimenting with it yet, but I guess this is the purpose of the framebuffer scaling property of the WebXRManager. I’m getting over 60 fps, so I wonder if I should be playing with that. Or would anti-aliasing achieve a better result without hitting performance?
I’m considering the UI options for switching the controller between a device for movement and a weapon for enemies. I’ll check out some games to see the preferred route. Very impressed with just how easy it was to get VR working using the THREE.js library. It even includes an excellent model of the controller.

Attaching the VR camera to a dolly object is actually the most common approach. I just wanted to highlight that keeping the player static and transforming the scene instead is a valid solution, too.

I’m not able to verify this since I can only test with an Oculus Quest. Framebuffer scaling can be used to trade rendering quality against performance. If you think you have some headroom on your device, you can of course give it a try. Anti-aliasing also improves image quality, but it definitely comes with a performance cost.
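For reference, the relevant renderer configuration might look like this (browser-only fragment, so it needs a WebGL context; 0.75 is an arbitrary example value, and the scale factor has to be set before the XR session starts):

```javascript
import * as THREE from 'three';

// Anti-aliasing is requested when the renderer is created.
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.xr.enabled = true;

// < 1.0 renders at lower resolution (faster, softer image);
// > 1.0 renders at higher resolution (sharper, more expensive).
// Must be called before the XR session starts to take effect.
renderer.xr.setFramebufferScaleFactor(0.75);
```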

Thanks for the advice, much appreciated.
