I’ve been testing architectural models with various applications. These models often use a base point that is not (0, 0, 0) and is frequently very far from the origin. I’ve noticed that three.js doesn’t show many of the precision errors I see in other applications, where meshes rendered far from the origin start getting wacky due to floating point precision errors. Where could I get some insights on how this is achieved in three.js?
From what I can tell, the lighting values used in the shaders are in view space (i.e. their positions/directions are relative to the camera). This keeps the magnitudes of the values involved in the lighting math small for anything near the camera, rather than in the millions like raw world coordinates, which minimizes floating point precision errors.
How are you loading the models? As OBJ, glTF, something else? It’s possible three.js is automatically normalizing the coordinates of the geometry, which is how you would normally handle this kind of precision problem. Typically you’d store the geographic origin of the model separately, and the model geometry itself would stay centered around (0, 0, 0), instead of baking geographic coordinates directly into the geometry.
three.js does no explicit handling of precision-related issues.
I’m not sure how other 3D APIs work on desktop, but WebGL in general (and thus three.js) always executes shader code in highp (for floats and integers).