Update PerspectiveCamera position with the device motion: acceleration

Hi everyone :wave:

I would like to build an AR experience (without WebXR). The idea is to populate your surroundings with 3D cubes and be able to walk around them. I’ve created a GitHub/CodeSandbox with a minimal demo. If you want to test the code, run it from your favorite mobile phone. (It’s just an experiment with some code, so sorry if it doesn’t work on all phones; it has only been tested on iOS 13+ and Android OxygenOS 10.0 for now.)

What I did so far:
~Populate your surroundings with 3D objects :white_check_mark:
~Create a gyro camera to see all the objects around you :white_check_mark:
~Update the camera position using the accelerometer from your phone to walk around the 3D scene and objects

Details:

  • In my scene I have X number of cubes placed around me, each given a random distance and angle, as follows:
public async randomPosition(object: THREE.Object3D) {
    const normalizedDistance = new THREE.Vector3()
    const randomPosition = new THREE.Vector3()
    const distance = getRandomNumber(1, 10)
    const angleDeg = getRandomNumber(0, 360)

    // start from a vector pointing "forward" at the chosen distance
    normalizedDistance.copy(new THREE.Vector3(0, 0, distance))
    const angleRad = THREE.MathUtils.degToRad(angleDeg)

    // then rotate it around the vertical (Y) axis by the chosen angle
    randomPosition.copy(normalizedDistance.applyAxisAngle(new THREE.Vector3(0, 1, 0), angleRad))
    object.position.copy(randomPosition)
}
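
The placement math in this snippet boils down to rotating the vector (0, 0, distance) around the Y axis, which gives x = d·sin(θ), z = d·cos(θ). A minimal dependency-free sketch of the same math (the function name is mine, not from the demo):

```typescript
// Compute a position at a given distance and angle (degrees) around the
// origin by rotating the "forward" vector (0, 0, d) about the vertical
// Y axis -- the same thing Vector3.applyAxisAngle(new Vector3(0, 1, 0),
// angleRad) does to (0, 0, d) in the three.js snippet above.
function positionAt(distance: number, angleDeg: number): { x: number; y: number; z: number } {
  const angleRad = (angleDeg * Math.PI) / 180;
  return {
    x: distance * Math.sin(angleRad),
    y: 0,
    z: distance * Math.cos(angleRad),
  };
}
```

So an angle of 0° places the object straight ahead at (0, 0, d), and 90° places it to the side at (d, 0, 0).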
  • Then I used a gyro camera, and now when I move my phone around I’m able to see all the cubes around me. For now this code is based on DeviceOrientationControls.

  • Finally, I would like to use the devicemotion event to get the device’s acceleration on x/y/z and update the camera’s position with those values every frame, to be able to walk around my cubes. At the moment I’m doing this:

    const normalizedDistance = new THREE.Vector3()
    const currentPosition = new THREE.Vector3()
    const accX = this.findDistance(this.accelerationX, 0.9) // m/s^2 converted to a distance: 0.5 * acceleration * time ** 2
    const accY = this.findDistance(this.accelerationY, 0.9)
    const accZ = this.findDistance(this.accelerationZ, 0.9)

    normalizedDistance.copy(new THREE.Vector3(accX, accY, accZ))
    const angleRad = THREE.MathUtils.degToRad(this.rotationGamma)

    currentPosition.copy(
      normalizedDistance.applyAxisAngle(new THREE.Vector3(0, 1, 0), angleRad)
    )

    this.camera.position.copy(currentPosition)
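
For what it’s worth, d = ½·a·t² only holds for a constant acceleration starting from rest; when the acceleration changes every frame, the usual approach is to integrate twice per frame (acceleration → velocity → position) using the frame’s delta time. A rough dependency-free sketch of that per-axis integration (my own naming, not the code from the demo):

```typescript
// Naive per-frame double integration of acceleration (Euler method):
// velocity += acceleration * dt; position += velocity * dt.
class AxisIntegrator {
  private velocity = 0;
  private position = 0;

  // Feed one accelerometer sample (m/s^2) with the frame delta time (s).
  step(acceleration: number, dt: number): number {
    this.velocity += acceleration * dt;
    this.position += this.velocity * dt;
    return this.position;
  }
}
```

With a constant 1 m/s² this converges to the analytic ½·a·t² as dt shrinks; with real sensor data, though, noise and bias accumulate in the velocity term the same way every frame.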

The result I’m getting is that the camera jiggles a lot when I pass a value of ~1 s for the time in findDistance (see the comment in the code above: 0.5 * acceleration * time ** 2), and the camera doesn’t move at all when I use the clock delta time instead.
Also, when I move my phone the whole scene moves with it at the same time, so it doesn’t give the effect of a camera traveling through the 3D environment around you, if that makes sense.
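
Part of that jiggling is plain sensor noise. A common first step, which helps with the noise (though not with drift), is to low-pass filter each raw axis before integrating, e.g. with an exponential moving average. A hedged sketch, assuming one filter instance per axis:

```typescript
// Exponential moving average low-pass filter for one accelerometer axis.
// alpha in (0, 1]: smaller values smooth more but respond more slowly.
class LowPassFilter {
  private value: number | null = null;

  constructor(private readonly alpha: number) {}

  // Blend each new sample into the running value.
  filter(sample: number): number {
    this.value = this.value === null ? sample : this.value + this.alpha * (sample - this.value);
    return this.value;
  }
}
```

A step input converges smoothly toward the new level instead of jumping, which is exactly the behavior that damps frame-to-frame jitter.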

I’m looking for advice; maybe someone has already been through this and can tell me whether I’m on the right track. Thank you.

Hi! There is a long discussion with a set of links and resources about the same topic here: Device Motion === Accelerometer !== walking physical mobile device around a room. It does look like an unsolved problem.

I looked into it myself while attempting something similar to what you are doing and came up empty-handed: from my point of view, the data you get from the accelerometer is simply not reliable enough to infer movement from it.
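
To put a rough number on that unreliability: position comes from integrating acceleration twice, so even a small constant sensor bias grows quadratically with time — e.g. a 0.05 m/s² bias alone works out to about ½·0.05·10² ≈ 2.5 m of error after only 10 seconds. A quick throwaway illustration of that arithmetic (my own code, not from the thread):

```typescript
// Double-integrate a constant accelerometer bias to show position drift.
function driftAfter(biasMs2: number, seconds: number, dt = 0.001): number {
  let velocity = 0;
  let position = 0;
  for (let t = 0; t < seconds; t += dt) {
    velocity += biasMs2 * dt;
    position += velocity * dt;
  }
  return position; // meters of pure error, from the bias alone
}
```

That error is on top of per-sample noise, and nothing in the accelerometer data itself lets you correct it.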


What you’re describing is SLAM tracking.

Unfortunately you won’t be able to implement that by yourself unless you have some hardcore math/coding skills as well as months of time to do it. Using the device’s gyroscope isn’t enough; you’ll also need to analyse the camera feed, probably with some kind of neural network.

And you’ll have to run your code in WebAssembly in order to get sufficient performance for real time rendering.

AFAIK there aren’t any open source implementations of SLAM tracking in a web browser on mobile.
AR.js (an open source AR library for JS) doesn’t support it, only AR features like image tracking.

If you want to build an AR experience the user can walk around in, you’ll need to spend some money on a closed-source library to do it for you. I’ve used 8th Wall on a bunch of commercial projects; it seems to be the industry standard at the moment. I’m not aware of other platforms.


Thank you for your replies; they already helped me better understand the complexity of what I want to build.

I couldn’t get good results with this approach in the end, so I will try to use the phone’s GPS coordinates to achieve the same thing and see! If anyone has experience with this and would like to share it, any comment is welcome. Thanks.
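
In case it helps for the GPS route: near a given latitude you can approximate small lat/lon deltas as local meters with an equirectangular projection (about 111,320 m per degree of latitude, with longitude scaled by cos(latitude)). A rough sketch, assuming walking-scale distances (the names are mine):

```typescript
// Convert a GPS delta (degrees) to approximate local meters, using an
// equirectangular approximation valid over short walking-scale distances.
const METERS_PER_DEG_LAT = 111_320; // roughly constant over the Earth

function gpsDeltaToMeters(
  lat0: number, lon0: number,
  lat1: number, lon1: number,
): { east: number; north: number } {
  const latRad = (lat0 * Math.PI) / 180;
  return {
    north: (lat1 - lat0) * METERS_PER_DEG_LAT,
    east: (lon1 - lon0) * METERS_PER_DEG_LAT * Math.cos(latRad),
  };
}
```

One caveat: phone GPS is typically only accurate to several meters, so this suits coarse outdoor positioning rather than room-scale AR.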