Hey guys,
This isn't a rant about why Three.js is bad or a promotional "please use my library" post. It's more about my insane journey and addiction to showcasing grand scale: millions of objects, 100k+ items rendered with the coolest post-processing filters. I even had one guy mistake a video of my library for movie VFX, which was pretty cool!
Hold on, this is going to be an epic story.
It started before the AI era. Back when I was a novice in Three.js, I did my fair share of tutorial projects. Then one day I visited the NASA Eyes website and was completely captivated by its beauty. I thought: no way, could I do this too? Since this was before GPT, I had it rough, but I did pretty well! I rendered the stars and planets as points using circle geometry, and the sun as a sphere (a minimal sketch of that first setup follows this paragraph). It was awesome. With orbit controls, it almost looked like 0.001% of NASA Eyes. I was so happy… until I decided to add labels to the planets and got completely stuck. In theory, putting the text in a group meant it would follow its planet, but the problem came when I clicked a planet to lerp the camera toward it: some mystery bug kept pinning the sun to the center of the screen. After days of trying to fix it, I gave up, shoved the project in the closet, forgot about it, and moved on.
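For anyone curious what that first version amounted to, here is a minimal sketch of the idea, not my original code: the whole starfield is a single THREE.Points object (one draw call), and the sun is a plain sphere. The counts and random spread are illustrative.

```ts
import * as THREE from 'three';

// Stars: one Points object renders thousands of stars in a single draw call.
const STAR_COUNT = 10_000;
const positions = new Float32Array(STAR_COUNT * 3);
for (let i = 0; i < positions.length; i++) {
  positions[i] = (Math.random() - 0.5) * 2000; // scatter in a big cube
}

const starGeometry = new THREE.BufferGeometry();
starGeometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
const stars = new THREE.Points(
  starGeometry,
  new THREE.PointsMaterial({ size: 1.5, sizeAttenuation: true })
);

// The sun: just a sphere mesh at the origin.
const sun = new THREE.Mesh(
  new THREE.SphereGeometry(10, 32, 16),
  new THREE.MeshBasicMaterial({ color: 0xffcc33 })
);
```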
Then AI came along, and boy, did I remember this project. Out of nowhere, I decided to let AI take a crack at it, and voila—it solved it! I kept improving the project from that point on. It’s a long story, but here’s what I did in order:
- Got LabelManagers and CameraManagers working, so clicking takes you to the planet just like on the NASA Eyes website.
- Added support for wormhole portals: a mini-game-like layer where you can fly around the solar system.
- Made it so you can cross wormholes to see other star systems (SceneRegistry).
- Moved from circular orbits to elliptical orbits following Kepler's laws (see the orbit-sampling sketch after this list).
- Parsed NASA exoplanet data so the Three.js layer could consume it (pulled 4,100 star systems!).
- Modularized it so a single JSON file is all it takes to create a star system; I even created fictional ones this way (see the schema sketch after this list).
- Added solar system textures, procedural planets, and post-processing, keeping the spaceship mode separate from the game layer.
- Added support for multi-star systems and a 3I/ATLAS mode to capture that hype.
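Here's the orbit-sampling sketch I promised. It's a minimal version of the Kepler step, not the project's actual code: solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E with Newton's method, then map E to coordinates in the orbital plane.

```ts
// Minimal Kepler-orbit sampler (illustrative, not the project's code).
// a: semi-major axis, e: eccentricity (0 <= e < 1), M: mean anomaly in radians.
function keplerPosition(a: number, e: number, M: number): { x: number; z: number } {
  // Solve M = E - e*sin(E) for the eccentric anomaly E (Newton's method).
  let E = M;
  for (let i = 0; i < 8; i++) {
    E -= (E - e * Math.sin(E) - M) / (1 - e * Math.cos(E));
  }
  // Position in the orbital plane, with the star at the focus.
  return {
    x: a * (Math.cos(E) - e),
    z: a * Math.sqrt(1 - e * e) * Math.sin(E),
  };
}

// Per frame, advance the mean anomaly linearly with time:
// M = 2 * Math.PI * t / period, which gives Kepler's equal-area sweep.
```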
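And the data-driven part: once parsing was in place, one JSON file was enough to spawn a whole system. The shape below is hypothetical (field names are mine, not the project's actual schema), just to show the idea:

```ts
// Hypothetical shape of a star-system JSON (field names are illustrative).
interface StarSystemJson {
  name: string;
  stars: Array<{ radius: number; temperature: number }>;
  planets: Array<{
    name: string;
    semiMajorAxis: number; // AU
    eccentricity: number;
    orbitalPeriod: number; // days
    texture?: string;      // fall back to a procedural planet if absent
  }>;
}

// Fictional systems are just hand-written files with the same shape.
```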
End result: 500k views on Reddit and mentions in a couple of "Site of the Day" galleries (orionrealms.com).
After this, I pondered and researched a lot: how could I make a walkable planet layer? The conclusion: it might be possible with offscreen rendering, a Data-Oriented Design (DOD) approach, and WebGPU (a sketch of the offscreen handoff follows).
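The offscreen part is a standard browser API. A minimal sketch of handing the canvas to a worker (the worker file name is made up):

```ts
// Main thread: transfer the canvas so all rendering happens in a worker.
const canvas = document.querySelector('canvas') as HTMLCanvasElement;
const offscreen = canvas.transferControlToOffscreen();
const renderWorker = new Worker('render-worker.js'); // hypothetical file
renderWorker.postMessage({ canvas: offscreen }, [offscreen]);
// Inside the worker, the received OffscreenCanvas behaves like a normal
// canvas: a WebGPU (or WebGL) context can be created on it directly.
```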
Enter the creation of the Axion Engine. In its early iterations it ran two web workers alongside the main thread (using R3F and Three/WebGPU), and it worked fabulously: it rendered a million objects via InstancedMesh and animated 100k objects by supplying transferable arrays (sketch below).
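The 100k-object animation trick is the transferable part: the sim worker writes transforms into a Float32Array and transfers the underlying buffer instead of copying it. A rough sketch, with an illustrative buffer layout:

```ts
// Sim worker context: one 4x4 matrix (16 floats) per object, 100k objects.
const COUNT = 100_000;
let transforms = new Float32Array(COUNT * 16);

function tick(time: number) {
  // ... write updated matrices into `transforms` ...
  // Transferring detaches the buffer on this side (zero-copy handoff),
  // so the receiver typically sends a recycled buffer back: ping-pong.
  postMessage({ time, transforms: transforms.buffer }, [transforms.buffer]);
}
```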
I thought, wow, this is great, a walkable planet is possible! But then came the biggest boss: a cell-based origin-rebasing architecture. It's an absolute must if I want this dream to land. I sketched out my sim worker and render worker again, until I got hit with another problem, specifically with R3F. In origin rebasing, there's an operation that shifts the positions of all objects in the visible 3x3x3 grid of cells; that grid is the minimum baseline (see the sketch below). Doing this shakes the R3F tree, causing scene rebuilds that drop the frame rate to 1 FPS!
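For context, the rebasing operation itself is conceptually simple. A minimal sketch over a flat position array (CELL_SIZE and the layout are illustrative):

```ts
const CELL_SIZE = 1000; // illustrative cell size in world units

// When the player drifts out of the center cell, shift the whole world by a
// whole number of cells so coordinates near the camera stay small. Keeping
// coordinates small is what avoids float32 jitter far from the origin.
function rebase(positions: Float32Array, player: { x: number; y: number; z: number }) {
  const dx = Math.round(player.x / CELL_SIZE) * CELL_SIZE;
  const dy = Math.round(player.y / CELL_SIZE) * CELL_SIZE;
  const dz = Math.round(player.z / CELL_SIZE) * CELL_SIZE;
  if (dx === 0 && dy === 0 && dz === 0) return;
  for (let i = 0; i < positions.length; i += 3) {
    positions[i] -= dx;
    positions[i + 1] -= dy;
    positions[i + 2] -= dz;
  }
  player.x -= dx;
  player.y -= dy;
  player.z -= dz;
}
```

In vanilla Three.js this is just a tight loop over arrays; routed through R3F's component tree, the same mass update triggered the rebuilds described above.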
I was like, okay, this is horrible, and I figured out the exact reason why. Next step: ditch React and R3F, and do it in vanilla Three.js.
It took some time to migrate the code, but finally… wow. It was like seeing magic for the first time. I could cross grid cells in any direction without triggering endless scene rebuilds, just like those big open-world games where you can spend days inside, lol. I was thinking, heck yes, I'm going to be the one to bring this to the web! I might go down in the history books like mrdoob! Well, until I hit more fundamental problems:
- I couldn't put a new material into the scene without dropping frame rates.
- Within 10 minutes of gameplay, a player could get hit with garbage collection (GC) stutters, ruining the experience (see the pooling sketch after this list).
- I tried a lot of caching strategies: cached lights, materials, geometry. I thought, okay, this works and looks cool. If I ignored the initial initialization lag and frame drops, it worked like magic; I could walk endlessly in any direction and see new objects and lights because they reused cached materials and geometries.
- Enter the breaking point: InstancedMesh. I never would have guessed that a feature designed for optimization and large scale would be the exact reason I ended up writing my own renderer.
- You might ask, "Why exactly is this a problem?" As I said, caching geometries and materials worked like magic until InstancedMesh got involved. It allocates a fixed block of memory and its instance count is fixed, so adding or updating an item guarantees a scene rebuild (not to be confused with animating existing instances, which works fine). In an origin-rebasing scenario, you want a setup that adds and removes items dynamically to maintain the illusion of an infinite world (the pre-allocation pattern and its limits are sketched after this list).
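On the GC stutters: the standard fix is to never allocate during gameplay, only preallocate and reuse. A tiny pooling sketch, purely illustrative:

```ts
// Reuse scratch buffers instead of allocating per frame, so the GC has
// nothing to collect mid-gameplay.
class BufferPool {
  private free: Float32Array[] = [];
  constructor(private size: number) {}
  acquire(): Float32Array {
    return this.free.pop() ?? new Float32Array(this.size);
  }
  release(buf: Float32Array): void {
    this.free.push(buf);
  }
}

const matrixPool = new BufferPool(16); // 4x4 matrices
```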
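And to make the InstancedMesh constraint concrete, here's the standard pre-allocation pattern (MAX is an illustrative number, not mine): you pick a capacity up front and vary `count`, but you can never grow past what you allocated, which is exactly what clashes with streaming cells in and out of an infinite world.

```ts
import * as THREE from 'three';

const MAX = 10_000; // capacity is fixed at creation time
const mesh = new THREE.InstancedMesh(
  new THREE.BoxGeometry(),
  new THREE.MeshStandardMaterial(),
  MAX
);
mesh.count = 0; // draw nothing yet; raising count reuses the same buffer

const tmp = new THREE.Matrix4();
function addInstance(x: number, y: number, z: number): void {
  if (mesh.count >= MAX) throw new Error('capacity exhausted'); // the core problem
  tmp.setPosition(x, y, z);
  mesh.setMatrixAt(mesh.count++, tmp);
  mesh.instanceMatrix.needsUpdate = true;
}
```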
If anyone wants to see the magic of the Axion Engine (infinite objects) and has the patience for the initial lag and the first 2-3 grid jumps (the time it takes to fully cache materials and geometries), you can check it out here: https://axion-engine.web.app/
Now, not to bore you too much, let me tell you about my custom WebGPU renderer and what I achieved with it:
- It's a fully DOD library that deals exclusively in ArrayBuffers; I believe that's the only thing that actually makes objects move at the end of the day (see the sketch after this list).
- It's a minimal wrapper, so I can experiment freely with it.
- I called it "Null-Graph" because of the absence of a scene graph. It does "null" things and has "zero" features out of the box, but in the right hands you can make your dreams come true with it.
- Live engine test: https://null-graph.web.app/
- WebGPU experiments compilation: https://www.youtube.com/watch?v=FUVB_oKp6jI
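To give a flavor of the ArrayBuffers-only idea from the first bullet above, here's a minimal struct-of-arrays sketch. The names and layout are mine, not Null-Graph's API:

```ts
// One ArrayBuffer, typed views instead of scene-graph objects.
const COUNT = 1_000_000;
const FLOATS_PER_OBJECT = 6; // xyz position + xyz velocity
const buffer = new ArrayBuffer(COUNT * FLOATS_PER_OBJECT * 4);

const positions = new Float32Array(buffer, 0, COUNT * 3);
const velocities = new Float32Array(buffer, COUNT * 3 * 4, COUNT * 3);

// The whole "simulation" is one cache-friendly linear pass: no objects,
// no per-frame allocation, nothing for the GC to chase.
function integrate(dt: number): void {
  for (let i = 0; i < COUNT * 3; i++) {
    positions[i] += velocities[i] * dt;
  }
}

// Each frame the same bytes can be uploaded straight into a GPU buffer
// (or, in the WebGPU case, written by a compute shader instead).
```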
Why haven't I integrated it with the Axion Engine yet, you might ask? Well, the current renderer setup is tightly coupled to Three/WebGPU, so migrating is going to take some time. For now, I've just been playing with my custom renderer, and man, it is sick AF. I posted it on LinkedIn and got random researchers, CEOs, CTOs, and PhD grads liking my posts, which is super cool. The experiments include renderings of a lot of physics and math papers.
Also, yes, I used AI to build my projects. In my defense, I feel like I obviously know how to code, but with AI, I save a ton of time that instead gets spent on building intuition for 3D. I might have full-blown AI psychosis, lol, but I guess it’s a ‘pick your poison’ kind of thing in the end. I could try building it all by myself without AI, reading through documentation like the old days, and reach a mediocre result—or I could do it with AI and reach the exact same place in half the time.
What do you guys think about my wild ideas, techniques, and shamelessness when it comes to pushing AI, GPUs, and CPUs to their absolute limits?