Previously I was looking for answers about touching and selecting 3D objects in WebXR, which appears not to be possible.
I was thinking about adding moving div elements to cover 3D objects and make them selectable by clicking on these divs. I found out in the forum how to project an object's world position to screen space, but it acts as if the camera is stationary at (0, 0, 0) instead of moving with the mobile device.
This works in the browser but not in AR. Any ideas about this?
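For reference, the usual cause of this symptom is that outside the render callback the scene camera never receives the device pose; inside a WebXR session the pose-updated camera comes from renderer.xr.getCamera(). Below is a sketch of the projection step with the NDC-to-CSS conversion factored into a plain helper; the helper name ndcToCss is mine, not from this thread:

```javascript
// Convert normalized device coordinates (-1..1, y pointing up) to CSS
// pixel coordinates (origin at the top-left). Pure math, no three.js.
function ndcToCss(ndcX, ndcY, cssWidth, cssHeight) {
  return {
    x: (ndcX * 0.5 + 0.5) * cssWidth,
    y: (ndcY * -0.5 + 0.5) * cssHeight,
  };
}

// Inside the render loop of a WebXR session the projection would look
// roughly like this (sketch; assumes a three.js `renderer`, a `mesh`,
// a reusable Vector3 `tempV`, and the rendering `canvas`):
//
//   const xrCamera = renderer.xr.getCamera(); // pose-updated camera
//   mesh.getWorldPosition(tempV);
//   tempV.project(xrCamera);
//   const { x, y } = ndcToCss(tempV.x, tempV.y,
//                             canvas.clientWidth, canvas.clientHeight);
```

For example, the NDC origin (0, 0) maps to the center of an 800×600 canvas, i.e. (400, 300).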
Months ago I was struggling with a similar problem, namely creating user interfaces for VR. The DOM is ill-suited for this task, and it may also be for AR. I found that an alternative solution is to build user interfaces with three.js and use them in the scene like any other three.js object. So I did that, and ended up creating an entire library, three-mesh-ui. It supports flexbox-like layouts, large text, hidden overflow, and a lot of other stuff, and of course they are three.js objects, so you can just object.position.set( nextPosition ).
Hi @felixmariotto, thank you for the reply. I need this specifically for AR: there will be no controllers, only the mobile device, and the only way to interact with the objects is by touching the screen.
This is still a VR example; if this worked in AR with 3D content drawn over the camera feed, it could be an option. The only controller input available is a 'select' event, which does not convey any information about where on the screen or which object was clicked. I am trying to solve this by using a DOM overlay to detect where the screen was touched and work from there.
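For context, the DOM Overlay feature mentioned here has to be requested when the AR session starts; 'dom-overlay' and 'immersive-ar' are the official WebXR descriptor strings, while makeArSessionInit is a hypothetical helper of mine, not part of any API:

```javascript
// Build session options for an immersive AR session with a DOM overlay.
// 'dom-overlay' is the WebXR feature descriptor; the overlay element is
// passed as domOverlay.root. makeArSessionInit is a hypothetical helper.
function makeArSessionInit(overlayRoot) {
  return {
    optionalFeatures: ['dom-overlay'],
    domOverlay: { root: overlayRoot },
  };
}

// In a browser you would then start the session like this (sketch):
//
//   const session = await navigator.xr.requestSession(
//     'immersive-ar',
//     makeArSessionInit(document.getElementById('overlay')));
//
// Touch/click listeners on elements inside the overlay root keep firing
// during the AR session, which is what makes detecting the touched
// screen position possible at all.
```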
@Sarge I am trying to implement the same thing now. It works in the 3D view but not in AR mode. Have you been able to overlay the DOM onto a mesh in AR mode? If so, please let me know your process.
By the way, this is my code for overlaying the DOM element onto the mesh:
// World-space interaction annotation position update - Begin region
if (currentMeshForAnnotate != null) {
    annotationImage.style.display = "block";
    currentMeshForAnnotate.updateWorldMatrix(true, true);
    currentMeshForAnnotate.getWorldPosition(tempV);
    // Note: inside an active WebXR session, project with the
    // pose-updated camera (renderer.xr.getCamera()) instead of the
    // regular scene camera, otherwise the projection behaves as if
    // the camera never moves.
    camera.updateMatrixWorld();
    tempV.project(camera);
    // Convert the normalized device coordinates (-1..1) to CSS pixels
    const x = (tempV.x * 0.5 + 0.5) * canvas.clientWidth;
    const y = (tempV.y * -0.5 + 0.5) * canvas.clientHeight;
    // Move the element to that position, centered on the point
    annotationImage.style.transform = `translate(-50%, -50%) translate(${x}px, ${y}px)`;
} else {
    annotationImage.style.display = "none";
}
// World-space interaction annotation position update - End region
Sorry, I forgot to update this here. I fixed it by creating a Raycaster and subscribing to touch events. Every time a touch event listener receives an event, I cast a ray into the scene from the camera and check the intersections. You may also need to correct the offsets of the touch position depending on where your canvas is positioned on the page.
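A sketch of that approach, assuming a three.js scene; the coordinate conversion, including the canvas-offset correction mentioned above, is factored into a plain helper (the name touchToNdc is mine) so only the raycasting part needs three.js:

```javascript
// Convert a touch/click position in page coordinates to normalized
// device coordinates (-1..1), correcting for where the canvas sits on
// the page. `rect` is the canvas's getBoundingClientRect() result.
function touchToNdc(clientX, clientY, rect) {
  return {
    x: ((clientX - rect.left) / rect.width) * 2 - 1,
    y: -((clientY - rect.top) / rect.height) * 2 + 1,
  };
}

// Raycasting sketch (assumes three.js `camera`, `scene`, `canvas`):
//
//   const raycaster = new THREE.Raycaster();
//   canvas.addEventListener('pointerdown', (event) => {
//     const ndc = touchToNdc(event.clientX, event.clientY,
//                            canvas.getBoundingClientRect());
//     raycaster.setFromCamera(ndc, camera);
//     const hits = raycaster.intersectObjects(scene.children, true);
//     if (hits.length > 0) {
//       // hits[0].object is the nearest touched object
//     }
//   });
```

A touch in the exact center of the canvas maps to NDC (0, 0) regardless of where the canvas is offset on the page.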
No, I think you have misunderstood the requirement. To recap: I had implemented a DOM-based UI and set it up to work like a world-space UI.
Due to the current state of WebAR, it is not positioned over the meshes in AR (it kind of hovers somewhere over the mesh, or stays in the center in some cases).
I overcame this by using the 3DGUI module from GitHub and implementing the GUI as a pure 3D object, which I can easily raycast against to trigger click events.