Creating GPS Based AR-App

I am trying to create an AR app with Three.js. I have already been able to implement the following points

  • Camera image as background image
  • Drawing 3D elements on the background image
  • Controlling the camera with mobile phone sensors (DeviceOrientationControls)

My code looks like this:

Creating Three.js Scene

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera( 75, window.innerWidth / window.innerHeight, 0.01, 1000 );
const renderer = new THREE.WebGLRenderer({ alpha: true });
renderer.setSize( window.innerWidth, window.innerHeight );
const controls = new DeviceOrientationControls( camera );
document.body.appendChild( renderer.domElement );

Creating Cube and AxesHelper

const geometry = new THREE.BoxGeometry( 1, 1, 1 );
const material = new THREE.MeshBasicMaterial( { color: 0x00ff00 } );
const cube = new THREE.Mesh( geometry, material );
const geometry2 = new THREE.BoxGeometry( 1, 1, 1 );
const material2 = new THREE.MeshBasicMaterial( { color: 0xff0000 } );
const cube2 = new THREE.Mesh( geometry2, material2 );
const axeshelper = new THREE.AxesHelper( 5 );
camera.position.z = 2;

Converting lat/lon data to Cartesian coordinates and positioning the cube

const myPos = this.calcPosFromLatLonRad(52.498604, 13.391799, 1);
const myPosX = myPos[0];
const myPosY = myPos[1];
const myPosZ = myPos[2];
cube.position.set(myPosX, myPosY, myPosZ);
scene.add( cube );
scene.add( cube2 );
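The question does not include the body of calcPosFromLatLonRad. For reference, a common implementation of such a helper uses a standard spherical-to-Cartesian conversion; this is a hypothetical sketch, not necessarily what the author wrote:

```javascript
// Hypothetical sketch of calcPosFromLatLonRad: project a lat/lon pair
// onto a sphere of the given radius (standard spherical-to-Cartesian).
function calcPosFromLatLonRad(lat, lon, radius) {
  const phi = (90 - lat) * (Math.PI / 180);    // polar angle, measured from the north pole
  const theta = (lon + 180) * (Math.PI / 180); // azimuthal angle
  const x = -(radius * Math.sin(phi) * Math.cos(theta));
  const z = radius * Math.sin(phi) * Math.sin(theta);
  const y = radius * Math.cos(phi);
  return [x, y, z];
}
```

Note that with radius 1 every result lies on a unit sphere, so GPS coordinates that are only a few hundred meters apart map to nearly identical points, which matches the clustering behavior described below.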

Rendering the scene

const animate = () => {
	requestAnimationFrame( animate );
	controls.update(); // DeviceOrientationControls must be updated every frame
	renderer.render( scene, camera );
};
animate();
But now I don’t know whether simply converting the GPS coordinates to Cartesian coordinates is sufficient. It definitely does not work at the moment: after the conversion the points end up with almost the same values, so two points that should theoretically appear at completely different positions are rendered directly next to each other. I also think I need to look at the world from the user’s own position, but I don’t know how to do that.

GPS coordinates do not map directly to the Cartesian coordinate system used in Three.js. They are completely different.
To position Three.js assets relative to the real world, you’ll need more than GPS. Consumer GPS receivers have precision errors (~5 m), which makes them unviable for the level of precision you may need to position objects at room scale. Most AR solutions use photogrammetry and/or more complicated systems to position virtual objects in space. This can be pretty challenging to build on your own, and stacks like ARKit, WebXR, ARCore etc. try to make this easier for developers to build on top of.
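That said, for coarse, street-scale placement a common approach is to convert each GPS coordinate into meters relative to the user’s own GPS fix, rather than projecting onto a sphere, which also addresses the “look at the world from the user’s position” part of the question. A minimal sketch, assuming an equirectangular approximation; the function name and the axis convention (east → +x, north → −z) are my own choices, not from the original code:

```javascript
// Sketch (not production-ready): convert a target GPS coordinate to
// meters relative to the user's GPS position, using an equirectangular
// approximation. Reasonable for distances up to a few hundred meters.
const EARTH_RADIUS = 6371000; // mean Earth radius in meters

function gpsToLocalMeters(userLat, userLon, targetLat, targetLon) {
  const toRad = (deg) => deg * (Math.PI / 180);
  // east-west distance shrinks with cos(latitude)
  const x = toRad(targetLon - userLon) * EARTH_RADIUS * Math.cos(toRad(userLat));
  // north-south distance; north mapped to -z (the three.js camera looks down -z by default)
  const z = -toRad(targetLat - userLat) * EARTH_RADIUS;
  return { x, z };
}

// Usage: keep the camera at the origin (the user) and place objects around it, e.g.
// const { x, z } = gpsToLocalMeters(userLat, userLon, 52.498604, 13.391799);
// cube.position.set(x, 0, z);
```

Because everything is expressed in meters from the user, two points a hundred meters apart now land a hundred scene units apart instead of collapsing onto almost the same spot.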


I would love to use AR.js because it does exactly what I need, but I couldn’t get it running on my localhost: no matter what I tried, Google Chrome would not accept the certificate. WebXR is not supported by Safari, and I am developing a cross-platform, browser-based application, so it won’t run on iOS.

The certificate issue might not really be an issue; it only appears because you are testing on localhost. When the app is deployed, your hosting provider typically supplies the certificate for the domain and you won’t have to deal with it.

For developing locally some common flows to work around this are:

  • Use a self-signed certificate and add it to your trusted store.
  • Enable the chrome://flags/#allow-insecure-localhost flag in Chrome to avoid the issue
  • If you are using a build pipeline, there’s usually a way to circumvent this issue
  • Tweak the application you are working with to work with http instead of https

I’m not sure what the best approach is for AR.js.
Anyhow, these are all more about HTTPS and web security than Three.js.

WebXR not being supported by Safari is just sad to hear. Hopefully they start supporting it soon!