VR doesn't render content

Hey there,
I got another issue…
I've built a few VR apps with three.js already and they worked pretty well. This time however it is not working, so I must have made some mistake that I can't see…

So first of all, to start simple, I enable VR:

initRenderer() {
        let w = this.m_domParent.offsetWidth;
        let h = this.m_domParent.offsetHeight;

        this.m_renderer = new THREE.WebGLRenderer({
            antialias: true,
            alpha: true
        });

        this.m_renderer.setPixelRatio(window.devicePixelRatio);
        this.m_renderer.setSize(w , h);
        this.m_renderer.shadowMap.enabled = true;
        this.m_renderer.shadowMap.type = THREE.VSMShadowMap;
        this.m_renderer.xr.enabled = true;
        this.m_domParent.appendChild(this.m_renderer.domElement);        
    }

Then I add the "Enter VR" button:

 initVR() {
        document.body.appendChild(VRButton.createButton(this.m_renderer));
    }

I also changed the animation loop:

animateScene() {
        let self = this;

        function animate() {
            self.m_controls.update();
            self.m_renderer.render( self.m_scene, self.m_camera );
        }

        self.m_renderer.setAnimationLoop(animate);
        //requestAnimationFrame( () => {self.animateScene(); } );
    }

On my desktop everything looks just fine.
On my Oculus in the "normal" browser view, everything looks just fine as well.
When I enter VR the background gets rendered (clear color) but I can't see any content.
To test, I changed the background color and the new color was applied in the VR scene, but still no content.

Then I tried forcing the camera to look at the point where the objects in the scene should be placed, and moved the camera close enough. This works in the normal browser view, but in VR there is still no content visible.

I am fairly sure I missed something, or my init order is wrong, something like that.

initUi() {
        this.initWindow();
        this.initScene();
        this.createVirtualFloor();
        this.initCamera();
        this.initRenderer();
        this.initSceneLigths();
        this.createSelectPointHelper();
        this.initControls();
        this.onResize();
        this.initSceneContextMenu();
        this.initVR();
        this.resetView();
        this.initRaycaster();
        //this.initOutlineSelection();
        this.initCadTree();
        this.createConstructionFloor();
        this.initCadRefferenceGroup();
        this.initFont();
        this.initDebugTools();
        this.animateScene();
        
    }

In initRenderer I enable the VR mode.
In initVR I add the VR button (I enabled the renderer's VR mode here at first, but that didn't work either).

Does anybody have an idea where I went wrong this time?

I just found out that the camera moves to the position 0,0,0 as soon as I enter VR, and it seems as if it won't move away from there; at least when leaving VR mode the camera is always placed at these coords. I also implemented the "sessionstart" event and put an alert() into the handler. Normally this kicks me out of VR, so I know that a certain function got called, but here it doesn't call anything.
So my current guess is: when in VR and the render loop starts, for some "unknown" reason an error occurs that I can't see.
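
For reference, this is roughly how I hook that event (assuming the three.js renderer.xr event name is "sessionstart"; the alert() is only there to prove the handler runs):

// hooked somewhere after initRenderer()
this.m_renderer.xr.addEventListener("sessionstart", () => {
    // an alert normally kicks me out of VR, so getting kicked out tells me the handler ran
    alert("XR session started");
});

this.m_renderer.xr.addEventListener("sessionend", () => {
    console.log("XR session ended");
});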

I just created a minimal example that works in the classic browser view but not in VR.

import * as THREE from "three";
import { OrbitControls } from '/assets/js/three/jsm/controls/OrbitControls.js';
import {VRButton } from '/assets/js/three/jsm/webxr/VRButton.js'

export class Widget_3D {
    constructor(domParent , project) {
        this.m_domParent = domParent;
        this.m_camera = undefined;
        this.m_scene = undefined;
        this.m_renderer = undefined;
        this.m_controls = undefined;

        this.m_domParent.innerHTML = "";
        this.m_domParent.style.padding = "0px";
        this.m_domParent.style.overflow = "hidden";        
        this.m_domParent.style.position = "relative";

        this.initUi();
        this.animate();
    }
    
    initUi() {
        this.m_scene = new THREE.Scene();
        this.m_scene.background = new THREE.Color(0x505050);

        let w = this.m_domParent.offsetWidth;
        let h = this.m_domParent.offsetHeight;
        let fov = 70;
        let near = 0.1;
        let far = 10;
        let aspectRatio = w / h;
        this.m_camera = new THREE.PerspectiveCamera(fov, aspectRatio, near, far);
        this.m_camera.position.set(0,1,3);
        this.m_scene.add(this.m_camera);

        const geometry = new THREE.BoxGeometry( 1, 1, 1 );
        const material = new THREE.MeshBasicMaterial( { color: 0x00ff00 } );
        const cube = new THREE.Mesh( geometry, material );
        this.m_scene.add( cube );

        this.m_scene.add( new THREE.HemisphereLight( 0xa5a5a5, 0x898989, 3 ) );
        

        this.m_renderer = new THREE.WebGLRenderer( { antialias: true } );
        this.m_renderer.setPixelRatio( window.devicePixelRatio );
        this.m_renderer.setSize( w , h );
        this.m_renderer.xr.enabled = true; // needed so render() targets the XR session
        this.m_domParent.appendChild(this.m_renderer.domElement);
        document.body.appendChild(VRButton.createButton(this.m_renderer));

        window.addEventListener("resize", () => {
            this.onResize();
        });
    }

    animate() {
        let self = this;
        this.m_renderer.setAnimationLoop(function() {
            self.render(); 
        });
    }

    render() {
        this.m_renderer.render(this.m_scene , this.m_camera);
    }

    onResize() {
        let w = this.m_domParent.offsetWidth;
        let h = this.m_domParent.offsetHeight;
        this.m_camera.aspect = w / h;
        this.m_camera.updateProjectionMatrix();
        this.m_renderer.setSize(w, h);
    }
}

This simply shows me a green cube.

But in VR I can only see the background…

The weirdest thing is that the three.js examples do work, like the teleporting example, the rollercoaster, or grabbing objects.
I just don't get it.

If you want to move the camera programmatically in VR, you cannot just set the camera's position and rotation. This is because the camera will use the data from the headset and ignore whatever values you give it. To overcome this, make an empty object, put the camera in it, and change the position of that object. This way you can control where the camera is.

var user = new THREE.Group();

user.add( camera );
scene.add( user );

// now you can change user's position and rotation
// in order to manually control the camera in VR
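
Applied to your Widget_3D it would look roughly like this (m_user is just a name I made up here):

// hypothetical: wrap the widget's camera in a "user" rig
this.m_user = new THREE.Group();
this.m_user.add( this.m_camera );
this.m_scene.add( this.m_user );

// move the rig instead of the camera while in VR
this.m_user.position.set( 0, 1, 3 );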

Setting the camera's position in VR is not the most important thing, but thanks for the information. I will give that a try.

What I have figured out so far:
If I change the scene's background in the "onsessionstart" event, the background does change!
If I add a standard cube to the scene, it gets rendered. I can even rotate the cube in the animation loop.
But everything else (what I am really interested in) doesn't get rendered.
My current idea is that this might be related to layers. I put every object on a specific layer and turned that layer on in the scene, and on my normal desktop this works just fine.
Maybe this layer system doesn't work when entering the VR scene, but that's what I will have to test next.
The good news is that the VR scene is rendering in general, and I am not going crazy hunting for an error that only appears when entering VR mode…
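
For context, this is roughly how I use the layers (mesh and the layer number are placeholders):

// hypothetical layer setup, roughly what my app does
const SELECTABLE_LAYER = 5; // placeholder layer number

mesh.layers.set( SELECTABLE_LAYER );             // the object lives only on this layer
this.m_camera.layers.enable( SELECTABLE_LAYER ); // the desktop camera renders it

// the raycaster only hits objects on that layer
const raycaster = new THREE.Raycaster();
raycaster.layers.set( SELECTABLE_LAYER );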

One question about the "user" trick…
Should I reset the camera's position to 0/0/0 when putting it in the group? I believe the camera's position stays relative to the group it is put in, just like with any other Object3D, right?
So to keep things simple, it would be best to have the camera's coords at 0 and only move the user, I suppose?
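
In other words, something like this is how I picture it (using the m_user name from above):

// camera sits at the rig's origin; only the rig is moved
this.m_camera.position.set( 0, 0, 0 );
this.m_user.position.set( 0, 1, 3 ); // the old camera position goes onto the rig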

I found the reason!!! :smiley:

If you use layers the way I wanted to, for selective raycasting, the objects won't get rendered.
I removed every use of layers in my code and everything works just as expected.
Two days of frustration to figure this out.

Hopefully, if somebody runs into a similar problem, they will read this and try getting rid of the layers, just to rule them out as the cause of the issue.

Some layers are reserved for VR/XR: the left and right eye cameras use layers 1 and 2, so keep your own objects off those and use layer 3 or higher.
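
A minimal sketch of what I mean (layer 3 is simply the first layer not used by the XR eye cameras):

// keep custom layers away from 1 and 2 (used by the XR eye cameras)
const MY_LAYER = 3;

cube.layers.set( MY_LAYER );      // the object lives only on layer 3
camera.layers.enable( MY_LAYER ); // the desktop camera still sees it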

I think there are 32 layers available and I put my objects on layer 31 (the last available layer), and that didn't work.
But I will give layer 3 a try as you suggested, since using layers is WAY easier for raycasting than grouping and regrouping everything depending on what the user should be able to select.
I personally like the layers system, so I will try to keep it.
The good thing is, in VR the layers aren't needed, if I have that right in my head. So maybe I can use onsessionstart and onsessionend to turn the layer system on / off, if I don't get it working.

But thanks for the advice about layers 1 and 2 in VR. That is really nice to know.

Maybe you want to read the other messages in that thread too. There is an example (and a solution) for the case where using layer 3 still does not show the objects:

Okay, I just tried using layer 3 as Mugen87 suggested, but that didn't work.
Then I tried enabling layer 3 on the XR camera as PavelBoytchev suggested, but I still get no content.
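
Roughly what I tried, in case it matters (done in the sessionstart handler so the XR camera and its per-eye cameras exist; depending on the three.js version, getCamera() may want the scene camera as an argument):

this.m_renderer.xr.addEventListener("sessionstart", () => {
    // enable my layer on the XR camera and its per-eye sub-cameras
    const xrCamera = this.m_renderer.xr.getCamera();
    xrCamera.layers.enable( 3 );
    for (const eyeCamera of xrCamera.cameras) {
        eyeCamera.layers.enable( 3 );
    }
});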

But that's okay for now. I now know where I have to look and what to try to get things running. For now, though, I will have to charge my VR device as I have 5% battery left :slight_smile:

The next thing I will try is to enable the layers when not in VR and disable them while in VR. The VR mode is there to look around, not to edit things (in this specific app).
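
Something like this is what I have in mind (untested sketch; applyAppLayers() would be my own helper that re-assigns my layer masks):

// switch the layer system off while a VR session is active
this.m_renderer.xr.addEventListener("sessionstart", () => {
    // put everything back on the default layer 0 so the XR cameras see it
    this.m_scene.traverse( (obj) => obj.layers.set( 0 ) );
});

this.m_renderer.xr.addEventListener("sessionend", () => {
    this.applyAppLayers(); // hypothetical helper: restore my selective layers for desktop raycasting
});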

But I am very happy to have a way to get this up and running; now it is more a question of maximizing coding comfort :smiley:

Thank you guys for all your patient help.