I am using InstancedBufferGeometry + ShaderMaterial to render a galaxy of 200,000 stars as glowing coloured orbs.
A little snag, imperceptible at first: the stars in front don’t always render on top of the stars behind.
I understand the normal three.js clipping and culling don’t work for InstancedBufferGeometry, because the final shape of the objects is decided on the GPU.
But is there a way to order the instances rendered? Or any other way to make instances closer to the camera render on top?
If it’s possible to skip rendering the ones behind, that’s a bonus. But at this point I’m fine with rendering all of them anyway.
To better illustrate the issue:
In the bottom-left corner, the occlusion follows the dashed line rather than the dotted one.
The choice is arbitrary, and I’d like to order the instances by their distance to the camera.
At the moment I’m tricking the perception by making all orbs cloudy and semi-translucent.
They are rendered in the order they are dispatched to the GPU.
You can sort the particles on the CPU, and write them back to the buffer in sorted order.
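A minimal sketch of that CPU-sort idea in plain JS (the function name and camera object here are hypothetical; `positions` stands in for the typed array behind an InstancedBufferAttribute, and after repacking you’d set `needsUpdate = true` on the attribute):

```javascript
// Sort instances back-to-front by distance to the camera, then repack
// the position buffer in that order so nearer stars are drawn last
// and blend on top.
function sortInstancesByDepth(positions, camera) {
  const count = positions.length / 3;
  const order = new Array(count);
  for (let i = 0; i < count; i++) {
    const dx = positions[i * 3] - camera.x;
    const dy = positions[i * 3 + 1] - camera.y;
    const dz = positions[i * 3 + 2] - camera.z;
    order[i] = { index: i, dist: dx * dx + dy * dy + dz * dz };
  }
  // Farthest first (back-to-front).
  order.sort((a, b) => b.dist - a.dist);

  const sorted = new Float32Array(positions.length);
  for (let i = 0; i < count; i++) {
    const src = order[i].index * 3;
    sorted.set(positions.subarray(src, src + 3), i * 3);
  }
  return sorted;
}
```

Note this is O(n log n) per frame for 200K instances, which is why the pre-sorted-index tricks below exist.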
Another approach might be to generate an index buffer… and use InstancedMesh. The index buffer is an array of instance indices… which you could create a few versions of, corresponding to sorting along the 4 cardinal axes… and then at runtime you pick which index buffer to use, based on camera orientation.
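The pre-sorting half of that could look something like this (names are hypothetical): build one index array per axis by sorting on the projection of each instance position onto that axis, farthest along the view direction first:

```javascript
// Build an index array for instances in `positions` (xyz triples),
// sorted back-to-front for a camera looking along `axis`.
// Instances with the largest projection onto `axis` are farthest
// from the camera, so they come first.
function buildSortedIndices(positions, axis) {
  const count = positions.length / 3;
  const indices = Array.from({ length: count }, (_, i) => i);
  const depth = i =>
    positions[i * 3] * axis[0] +
    positions[i * 3 + 1] * axis[1] +
    positions[i * 3 + 2] * axis[2];
  indices.sort((a, b) => depth(b) - depth(a));
  return indices;
}
```

You’d run this once per chosen direction up front, then only swap index buffers at runtime.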
Another (cheaper) trick is to use AdditiveBlending instead of the default NormalBlending, since additive blending works out the same no matter what the order is… you just might lower the brightness of your colors a bit so they don’t all just blow out to white when layered…
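For reference, that switch is just a material setting; a sketch assuming an existing ShaderMaterial like the one described in this thread (`vertexShader` / `fragmentShader` stand in for the existing shader strings):

```javascript
import * as THREE from 'three';

const material = new THREE.ShaderMaterial({
  vertexShader,     // the existing orb shaders, unchanged
  fragmentShader,
  transparent: true,
  blending: THREE.AdditiveBlending,
  depthWrite: false, // additive layers shouldn't occlude each other
});
```

With `depthWrite: false` the sum of overlapping orbs is order-independent, which is why the sorting problem disappears.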
BatchedMesh now supports instancing per mesh in addition to frustum culling and per-object sorting. You could try that.
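A rough outline of a BatchedMesh setup, based on the webgl_mesh_batch example; the exact method names (`addGeometry`, `addInstance`) have shifted between recent three.js releases, so treat this as a sketch rather than copy-paste code (`scene` is assumed to exist):

```javascript
import * as THREE from 'three';

const starMaterial = new THREE.MeshBasicMaterial();

// Args: maxInstanceCount, maxVertexCount, maxIndexCount, material.
const batched = new THREE.BatchedMesh(10000, 1024, 2048, starMaterial);
batched.sortObjects = true;            // per-object depth sorting each frame
batched.perObjectFrustumCulled = true; // skip instances outside the frustum

const geometryId = batched.addGeometry(new THREE.SphereGeometry(1, 8, 6));
const matrix = new THREE.Matrix4();
for (let i = 0; i < 10000; i++) {
  const instanceId = batched.addInstance(geometryId);
  matrix.setPosition(
    (Math.random() - 0.5) * 1000,
    (Math.random() - 0.5) * 1000,
    (Math.random() - 0.5) * 1000
  );
  batched.setMatrixAt(instanceId, matrix);
}
scene.add(batched);
```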
Very very interesting!
Tried the example, and it does survive at 200,000 instances: 11 FPS on an iMac M1, but it works.
https://threejs.org/examples/webgl_mesh_batch
I might use it for showing 5,000-10,000 nearby “stars” in high detail, while retaining more scalable InstancedBufferGeometry for the rest.
Remote stars blend into a mulchy fog anyway; who cares which one comes out on top. In fact I’m already washing out the remote stars’ colours to white, to reduce “quantum” colour shimmering during movement.
Great advice, thanks a lot @gkjohnson 
AdditiveBlending is an ace move!
No more weird overlaps, although clumps of matter get a washed-out white-grey look. Which in fact feels more natural.
I’m thinking of rendering close stars with the BatchedMesh suggested by @gkjohnson, and flipping on AdditiveBlending for the rest.
And your clues suggest a scalable solution for cases where AdditiveBlending doesn’t fit:
- Make what are currently per-instance attributes (star colour, size, position) shared across all instances
- Create a couple dozen buffers containing indices ordered by camera distance, each for a different angle of view — these become the per-instance attributes
- A scalar saying which index buffer to pick, based on the camera direction
This way everything is ordered, and nothing needs to be pumped to the GPU, except that scalar picking the view angle index.
20-30 angles pre-ordered into index buffers should be enough when looking at the galaxy from outside. If you fly close into the midst of the stars, it may not quite work in peripheral vision though: they are still ordered as if viewed from a remote distance.
A secondary solution may be needed, if these aberrations are noticeable.
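The “pick the right pre-sorted buffer” step from the plan above could be as simple as a dot-product search over the pre-ordered directions (a hypothetical sketch; `directions` are the unit vectors the index buffers were sorted for):

```javascript
// Return the index of the pre-sorted direction closest to the current
// camera view direction. The result is the scalar that selects which
// index buffer to bind for this frame.
function pickClosestDirection(directions, viewDir) {
  let best = 0;
  let bestDot = -Infinity;
  for (let i = 0; i < directions.length; i++) {
    const d = directions[i];
    const dot = d[0] * viewDir[0] + d[1] * viewDir[1] + d[2] * viewDir[2];
    if (dot > bestDot) { bestDot = dot; best = i; }
  }
  return best;
}
```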
But I’ll go with the first simpler option: majority on AdditiveBlending, BatchedMesh doing the nearest. For the good of the galaxy 
Thanks a lot @manthrax 
Yet another approach… render the dots as solid circles with alphaTest: 0.5, depthTest: true, depthWrite: true… (forces the correct ordering but has no glow/fuzz)…
then use an UnrealBloom postprocess to blob/bloom them out.
How does that work? How do you render those circles?
Do you mean 2D rendering using a pixel shader on some sort of rectangle filling the camera view flush?
Are your star blobs made with a soft blob texture? I mean, replace that with a hard-edged circle and enable alpha test.
Then they will sort correctly like solid geometry… but once you apply the bloom on top, they will glow…
(sorry I’m talking about getting correct depth sorting… not sure if I drifted off topic?)
No textures. My InstancedBufferGeometry has 200,000 instances of a single triangle.
The vertex shader places a triangle facing the camera, sized to contain the whole of the orb to render.
The pixel shader then draws coloured pixels, fading the alpha with distance from the centre of the triangle. That makes the centre a fuzzy spot, and the corners transparent.
This performs well on low-end GPUs: only 200K triangles.
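For anyone following along, the falloff described above boils down to something like this (a plain-JS mirror of the GLSL `smoothstep` math, not the actual shader from the repo):

```javascript
// GLSL-style smoothstep: 0 at edge0, 1 at edge1, smooth in between.
function smoothstep(edge0, edge1, x) {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
  return t * t * (3 - 2 * t);
}

// Orb alpha: 1 at the centre of the orb, fading smoothly to 0 at its
// radius, so the triangle's corners end up fully transparent.
function orbAlpha(distToCentre, radius) {
  return 1 - smoothstep(0, radius, distToCentre);
}
```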
Here’s the code: atlas/src/webgl/layers/mass-spot-mesh.js at 21d5328faa2b43c3d34e399914e438ef758be369 · mihailik/atlas · GitHub
So if I tweak the rendering to make triangles visible:
Hi Amy, this reply doesn’t show on the web somehow? I’ll reply from the email.
At the moment it’s one instance per star, so the vertex shader also runs once for each “star”.
It’d be a lot of work to do that sorting repeatedly for each star.
The ultimate solution may be a pre-pass: first sorting the input into some kind of texture buffer, then using that texture in the second pass for the actual rendering.
I’ve never done 2-pass rendering; I fear it’s a bit of complexity to deal with?
Thanks again for your help, I didn’t even expect such quick support from the community here 
Ahh I see… ok yeah, so if instead of doing a fancy smooth blob you do a circle blob… and then “discard” outside the circle. That gets you a solid circle, but rendered with depth test to get correct ordering. Then you run a bloom post-process on that, is what I’m thinking…
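A sketch of what that fragment shader could look like (the varying names are made up for illustration, not taken from the linked repo):

```javascript
// Hypothetical hard-edged circle shader: discarding outside the circle
// leaves clean, correctly depth-sorted solid discs in the framebuffer;
// a bloom pass applied afterwards restores the glow.
const fragmentShader = /* glsl */ `
  varying vec3 vColor;         // per-instance star colour
  varying vec2 vCenterOffset;  // offset from orb centre, in orb radii

  void main() {
    if (length(vCenterOffset) > 1.0) discard; // hard circular edge
    gl_FragColor = vec4(vColor, 1.0);         // fully opaque inside
  }
`;
```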