Objects jitter when too far from camera

I want stars that are far away to look like small dots, but when the distance gets too high they appear to flicker.

I’ve tried setting logarithmicDepthBuffer to true but the issue still remains.

What is actually causing this?

Can’t promise that’s the only issue, but at some point your stars seem to reach sub-pixel sizes, and antialiasing starts smudging them.

Setting renderer.setPixelRatio(2.0) (on a Mac) and doubling the size of the stars does seem to fix the blinking. For very distant stars you may want to limit the point size or apply some smooth fog, so the blinking is less evident.
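The point-size clamp can be sketched in plain JavaScript as a stand-in for the vertex-shader math (the function and parameter names here are made up for illustration, not three.js API):

```javascript
// Sketch of per-point size attenuation with a lower clamp, mirroring what a
// Points vertex shader typically does: size shrinks with distance, but is
// never allowed to drop below a couple of device pixels, so it can't go
// sub-pixel and start twinkling.
function pointSizePx(baseSize, distance, pixelRatio, minPx = 2.0, maxPx = 64.0) {
  const attenuated = (baseSize / distance) * pixelRatio; // perspective shrink
  return Math.min(Math.max(attenuated, minPx), maxPx);   // clamp both ends
}

console.log(pointSizePx(300, 10, 2));    // near star: large point
console.log(pointSizePx(300, 10000, 2)); // far star: held at minPx
```

In a real shader this would be the expression assigned to gl_PointSize; the clamp trades physical correctness for stability.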

Thanks, this does work, but I was looking for an effect where the stars slowly appear out of nothing while still being really small. I can see that the renderer handles very small points without flickering in this example: https://threejs.org/examples/?q=logarithmicDepthBuffer#webgl_camera_logarithmicdepthbuffer

but I can’t seem to get logarithmicDepthBuffer to work with custom shaders.

Basically, when your sprite (a quad in this case) is really far away, a small shift in its position changes the “UV” value (the point on the sprite) for the piece of the triangle being rasterized to the screen, which in turn determines the color of the pixel:

vec2 uv = vec2(gl_PointCoord);   // where this pixel lands on the sprite, in [0, 1]
vec2 centre = vec2(0.5, 0.5);
vec2 pos = centre - uv;          // offset from the sprite's center
float intensity = smoothstep(0.15, 0.02, length(pos)); // bright core, dim rim

Here, if length(pos) is large the pixel color will be dimmer, and if it’s small it will be brighter. So hopefully you can see how a small change in the uv value can change the brightness of the pixel from frame to frame, causing the jitter or twinkling here.

It may be easier to think about this through texture sampling: when a triangle is really small (the size of a pixel), which part of the texture should be sampled? To solve this, mipmaps are used, and the smaller the triangle is, the smaller the mipmap that gets used (it actually uses derivatives of the uv, but you get the idea). You can see how a GPU calculates which mipmap to use here. Because you’re not using textures here, you don’t get that for free, so you have to come up with a formula that accounts for this yourself using the dFdx and dFdy functions.
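The GPU’s mipmap choice can be sketched in plain JavaScript as a simplified version of the OpenGL level-of-detail formula (real hardware adds LOD bias, clamping, and anisotropy on top; the function name is just for illustration):

```javascript
// Simplified mipmap level-of-detail selection: rho is how many texels one
// screen pixel spans (from the uv derivatives), and lod = log2(rho).
function mipLevel(ddx, ddy, textureSize) {
  const rho = Math.max(Math.hypot(...ddx), Math.hypot(...ddy)) * textureSize;
  return Math.max(0, Math.log2(rho)); // level 0 = full-resolution texture
}

// Big on screen: one pixel covers less than one texel -> level 0 (sharp).
console.log(mipLevel([1 / 512, 0], [0, 1 / 512], 256));
// Tiny on screen: one pixel covers 64 texels -> level 6 (a 4x4 mip of a 256 texture).
console.log(mipLevel([1 / 4, 0], [0, 1 / 4], 256));
```

The fix below does the same thing by hand: it measures the uv derivatives with dFdx/dFdy and dims the star as that footprint grows.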

Now, keep in mind I’m not an expert on writing shaders that account for this type of problem, but I’ve messed around with your example and gotten it working so that there is no twinkling. This is all trial and error, so I’m sure there’s a better way to do this, but you can toy around with it more to get it looking the way you want. Here’s what I changed in the fragment shader:

This line:

float intensity = smoothstep(0.15, 0.02, length(pos));

becomes:

vec2 ddx = dFdx(uv);                 // how much uv changes per screen pixel horizontally
vec2 ddy = dFdy(uv);                 // ...and vertically
float delta_max = max(length(ddx), length(ddy)); // largest uv step per pixel

float mult = 1.0 - delta_max;        // fades toward 0 as the sprite shrinks to one pixel
float len = length(pos) * pow(mult, 3.0);
float intensity = smoothstep(0.15, 0.02, len) * mult;

Thank you, this works for the stars.

I’m at the beginning of “The Book of Shaders” and don’t know much about WebGL and rendering; I’m happy with how three.js abstracts away the difficult stuff.

Anyway, I can’t seem to grasp what’s happening there.
Is some kind of averaging of pixels being performed?

For example, if I don’t use smoothstep, as in a glow shader like this:

float dist = 1./len;
dist *= 0.05;
vec3 col = dist * vcolor;

I have no idea how to do it. Since I don’t get what’s going on, I can’t apply it elsewhere.

I think this topic is a bit complicated, and like I said I’m not an expert on it myself, so I’m not sure I’ll be successful in conveying all the details of what’s going on without a lot of diagrams etc.

If you’re just getting into shaders, I recommend getting a bit further along first; I’d consider this an advanced topic. If you want to understand the details of the problem you’re running into here, I’d look into texture mipmaps, how and why they’re generated, and how the GPU automatically selects which mipmap to use.
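That said, as a rough untested sketch of how the same idea could carry over to your glow: the derivative-based fade factor doesn’t care what brightness formula it multiplies. Here it is in plain JavaScript (deltaMax stands in for max(length(dFdx(uv)), length(dFdy(uv))); the function name is made up):

```javascript
// Sketch: the derivative-based fade applied to an inverse-distance glow.
// deltaMax grows as the sprite shrinks toward a single screen pixel, so the
// glow is dimmed to zero before it can start twinkling.
function glow(len, deltaMax) {
  const mult = Math.max(0, 1 - deltaMax); // same fade factor as the star fix
  return (0.05 / len) * mult;             // 1/len glow, scaled by the fade
}

console.log(glow(0.1, 0.01)); // large sprite: fade barely changes the glow
console.log(glow(0.1, 1.0));  // sub-pixel sprite: glow fully faded to 0
```

In the shader this would just be col = dist * vcolor * mult, but I haven’t verified it against your scene.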
