Hi guys
I’m playing a bit with layers (set, enable, etc.), and so far I can assign a specific mask to an Object3D (let’s say “A”) that can then be filtered by a camera or a raycaster. For instance, suppose I call A.layers.enable(2) so that I can configure a raycaster to pick up only objects on layer 2.
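For anyone following along, a three.js Layers object is just a 32-bit membership mask. A minimal sketch of the semantics (a stripped-down stand-in for THREE.Layers, not the real class) shows how enable() on an object pairs with set() on a raycaster:

```javascript
// Minimal stand-in for THREE.Layers: a 32-bit membership mask.
class Layers {
  constructor() { this.mask = 1; }            // everything starts on layer 0
  set(n)      { this.mask = (1 << n) >>> 0; } // member of ONLY layer n
  enable(n)   { this.mask |= 1 << n; }        // member of layer n in addition
  test(other) { return (this.mask & other.mask) !== 0; }
}

// Object A joins layer 2 (and stays on the default layer 0):
const objectLayers = new Layers();
objectLayers.enable(2);

// A raycaster restricted to layer 2 only:
const raycasterLayers = new Layers();
raycasterLayers.set(2);

console.log(raycasterLayers.test(objectLayers)); // true: both masks share bit 2
```

Two layer masks “see” each other whenever they share at least one bit, which is why an object enabled on layers 0 and 2 is picked up by a raycaster set to layer 2 alone.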
I noticed some issues when presenting to XR: it seems to automatically enable custom per-eye masks, so object A was visible only to the right eye. I tried A.layers.enableAll() (also on the camera), but the object is still visible to only one eye whenever .layers is set/enabled with value 2.
How do layers work when presenting for WebXR?
Is there a way to maintain assigned masks for immersive sessions?
When using WebXR, layers 1 and 2 are reserved for XR rendering. Objects assigned only to layer 1 are rendered for the left eye; layer 2 is assigned to the right eye. Objects on the default layer 0 are rendered in both views.
However, with other layers, visibility and raycasting control should work as usual. Can you give it a try with layer 3?
Yep, thanks!
Enabling (not setting) layers from 3 and up works, since it leaves bits 1 and 2 of the bitmask untouched.
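To spell that out in bitmask terms (plain JS mirroring the mask arithmetic, not the three.js API itself): with layers 1 and 2 reserved for the left and right eyes, set(2) leaves an object on the right-eye layer only, while enable(3) keeps the default layer 0 and adds a bit the XR path never touches:

```javascript
const LEFT_EYE  = 1 << 1; // layer 1, reserved by the WebXR render path
const RIGHT_EYE = 1 << 2; // layer 2, reserved by the WebXR render path

// layers.set(2): the mask becomes ONLY bit 2 -> right-eye-only in XR
const setMask = 1 << 2;
console.log((setMask & RIGHT_EYE) !== 0); // true: rendered for the right eye
console.log((setMask & LEFT_EYE) !== 0);  // false: invisible to the left eye

// layers.enable(3): default bit 0 stays set, reserved bits stay clear
let enableMask = 1;       // default layer 0
enableMask |= 1 << 3;     // add layer 3
console.log((enableMask & (LEFT_EYE | RIGHT_EYE)) === 0); // true: no clash
```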
Maybe it would be useful to add a note to https://threejs.org/docs/#api/en/core/Layers about layers 1 and 2 being reserved?
Hi, I have been having this issue too. I am working in AR on the phone using an XR context.
I have assigned items to layers 3, 4, and 5 using children[4].layers.set(4),
and then set my camera to those layers as well. But they do not appear:
camera.layers.enable(4)
camera.layers.set(4)
Please help
I came across the same issue as you, ShawnWhy, and after a bit of frustrating debugging I managed to solve it. Just in case someone else gets stuck on the same thing: if you have an object on layer 3 and want it to render in VR, this works:
// Enable the layer on the XR array camera...
const VRCamera = renderer.xr.getCamera();
VRCamera.layers.enable( 3 );

// ...and on each of its per-eye sub-cameras, which carry their own masks.
for ( const camera of VRCamera.cameras ) {
  camera.layers.enable( 3 );
}
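The key point above is that renderer.xr.getCamera() returns an array camera whose per-eye sub-cameras have their own layer masks, so the layer must be enabled on each of them. A minimal sketch of that pattern using plain mock objects (mask arithmetic only, no three.js; the camera shapes are assumptions standing in for the real classes):

```javascript
// Mock of the XR array camera: a parent camera plus one sub-camera per eye.
const makeCamera = () => ({
  layers: { mask: 1, enable(n) { this.mask |= 1 << n; } }
});
const xrCamera = Object.assign(makeCamera(), {
  cameras: [makeCamera(), makeCamera()] // left eye, right eye
});

// Enable a layer on the array camera AND on each per-eye camera.
function enableXRLayer(cam, n) {
  cam.layers.enable(n);
  for (const sub of cam.cameras) sub.layers.enable(n);
}

enableXRLayer(xrCamera, 3);
console.log(xrCamera.cameras.every(c => (c.layers.mask & (1 << 3)) !== 0)); // true
```

Enabling the layer only on the parent camera is not enough, because each eye's view is rendered with its own sub-camera and therefore its own mask test.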