So I finally got VR working in my game. Being inside the world I’ve been building for almost a decade was enough to bring a tear to my eye. It was a powerful moment.
It works on my Reverb G2 on PC and even works right in the Quest 2’s native browser. I am so happy. I still have to set up controls and UI, though. Keyboard and mouse work, but of course not being able to see them is a problem. I haven’t tried the Three.js WebXR controller API yet, but I intend to soon. From the docs, I’m expecting the wiring to look roughly like the sketch below.
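This is untested on my end, just what I pieced together from the three.js examples; the scene/renderer setup is the usual boilerplate:

```js
import * as THREE from 'three';
import { VRButton } from 'three/addons/webxr/VRButton.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, innerWidth / innerHeight, 0.1, 100);
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
renderer.xr.enabled = true; // opt in to WebXR rendering
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer)); // "Enter VR" button

// Controllers: 'select' events map to the primary trigger on most devices.
const controller = renderer.xr.getController(0);
controller.addEventListener('selectstart', () => console.log('trigger down'));
controller.addEventListener('selectend', () => console.log('trigger up'));
scene.add(controller);

// setAnimationLoop (not requestAnimationFrame) so the XR session
// drives the render loop while in VR.
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```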
I do have a question, though. Is there a preferred method for handling UI in WebXR? I’m assuming HTML is out of the question (right?). Is rendering to a 2D canvas and attaching it to a plane as a texture a decent approach? Any pointers would be appreciated. Displaying chat is one of the things I’d like to do.
In the context of VR, yes. That kind of screen-space (overlay) UI does not work in an HMD; UI elements have to be spatial instead, placed in the 3D scene itself.
There are many existing resources on the theory behind UI design in VR. The distinction between diegetic, non-diegetic, spatial, and meta UIs is especially important for designers and developers to understand; that classification comes from Beyond the HUD – User Interfaces for Increased Player Immersion in FPS Games.
Yes, rendering your UI into a 2D canvas and mapping it onto a plane as a texture is the usual approach for implementing basic spatial UIs. A minimal sketch of that approach is below.
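This assumes an existing `scene` from your setup; the sizes, font, and panel position are placeholder values:

```js
import * as THREE from 'three';

// Draw UI (e.g. chat lines) into an offscreen 2D canvas.
const canvas = document.createElement('canvas');
canvas.width = 512;
canvas.height = 256;
const ctx = canvas.getContext('2d');

// Use the canvas as a texture on a plane placed in world space.
const texture = new THREE.CanvasTexture(canvas);
const panel = new THREE.Mesh(
  new THREE.PlaneGeometry(1, 0.5), // size in meters (world units)
  new THREE.MeshBasicMaterial({ map: texture, transparent: true })
);
panel.position.set(0, 1.5, -2); // roughly eye height, 2 m in front
scene.add(panel);

function drawChat(lines) {
  ctx.fillStyle = 'rgba(0, 0, 0, 0.6)';
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = '#fff';
  ctx.font = '24px sans-serif';
  lines.forEach((line, i) => ctx.fillText(line, 10, 32 + i * 28));
  texture.needsUpdate = true; // re-upload the canvas after each redraw
}

drawChat(['player1: hello', 'player2: hi!']);
```

To keep the panel readable, common options are billboarding it toward the camera each frame or parenting it to a controller so it behaves like a handheld menu.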
A few online resources about that topic:
From a development perspective, you may want to start with this topic: