Ah, that’s almost perfect! Maybe tweak the edge smoothness with a uniform, adjusted depending on screen resolution? Hm. I think you would transform d like this:
d = 0.5 + sharpness * (d - 0.5); // remap around 0.5: sharpness > 1.0 hardens the edge
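For context, a minimal sketch of how that remap could be wired up with a sharpness uniform in a three.js ShaderMaterial; the uniform name, point size, and distance-based alpha are my assumptions, not the codepen's actual code:

```js
import * as THREE from 'three';

// Hypothetical round-point material; only the d-remap line comes from the thread.
const material = new THREE.ShaderMaterial({
  transparent: true,
  uniforms: {
    // Could be derived from renderer.getPixelRatio() to track screen resolution.
    sharpness: { value: 4.0 },
  },
  vertexShader: /* glsl */ `
    void main() {
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
      gl_PointSize = 20.0;
    }
  `,
  fragmentShader: /* glsl */ `
    uniform float sharpness;
    void main() {
      // Distance of this fragment from the point sprite's center (0 at center).
      float d = length(gl_PointCoord - 0.5);
      // The remap from the thread: larger sharpness -> harder edge.
      d = 0.5 + sharpness * (d - 0.5);
      gl_FragColor = vec4(vec3(1.0), 1.0 - clamp(d, 0.0, 1.0));
    }
  `,
});
```

With sharpness near 1.0 the edge stays soft; scaling it with the pixel ratio keeps the apparent smoothness constant across screen resolutions, which is what the uniform idea above is after.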
Hm, I don’t understand this. Isn’t sorting the particles by distance from camera what the vertex shader does with gl_Position.z? Is it a matter of z-fighting? In that case, could reducing the camera range be enough in most cases?
Just comment out the line sortPoints(collective); in my last codepen and you’ll see the difference.
It seems that, for objects with transparency enabled, the result depends on the order of vertices in the geometry; for an indexed geometry it depends on its .index (which we change by sorting in the last codepen here).
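For readers without the codepen open, a sketch of what a sortPoints helper in this spirit might look like; this body is my reconstruction from the discussion (sort the .index back-to-front by camera distance), not the codepen's actual implementation:

```js
import * as THREE from 'three';

// Hypothetical back-to-front index sort for a THREE.Points object.
// `points.geometry` is assumed to be an indexed BufferGeometry.
function sortPoints(points, camera) {
  const geometry = points.geometry;
  const position = geometry.attributes.position;
  const index = geometry.index;
  const vertex = new THREE.Vector3();
  const camPos = camera.position;

  // Pair every index entry with its squared distance to the camera...
  const order = [];
  for (let i = 0; i < index.count; i++) {
    const vi = index.getX(i);
    vertex.fromBufferAttribute(position, vi);
    points.localToWorld(vertex);
    order.push([vi, vertex.distanceToSquared(camPos)]);
  }

  // ...sort farthest-first, so near points are blended over far ones...
  order.sort((a, b) => b[1] - a[1]);

  // ...and write the new drawing order back into the index.
  for (let i = 0; i < order.length; i++) index.setX(i, order[i][0]);
  index.needsUpdate = true;
}
```

Back-to-front order matters because standard alpha blending is order-dependent: each particle is composited over whatever is already in the framebuffer.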
Are you manipulating the drawing order of the vertices by ordering the index, so as not to rely on correct depth testing and alpha blending? Isn’t drawing order device-dependent, since vertex processing is in principle parallel?
Whoah, that’s a strange effect! (On my computer, at least.) The (supposedly transparent) surrounding squares partly overwrite neighboring particles. I haven’t yet understood how/why you use(d?) alphaTest here.
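For reference, alphaTest is a standard three.js material option: fragments whose alpha falls below the threshold are discarded outright rather than blended. A minimal sketch (the texture path and threshold are assumptions):

```js
import * as THREE from 'three';

// Hypothetical sprite texture with an alpha channel ('sprite.png' is a placeholder).
const spriteTexture = new THREE.TextureLoader().load('sprite.png');

// alphaTest makes three.js discard fragments whose alpha is below the threshold
// (roughly `if (diffuseColor.a < alphaTest) discard;` in the fragment shader),
// so those fragments write neither color nor depth.
const material = new THREE.PointsMaterial({
  size: 10,
  map: spriteTexture,
  transparent: true,
  alphaTest: 0.5, // assumed threshold; 0 disables the test
});
```

Because discarded fragments never reach blending or the depth buffer, alphaTest can mask sorting artifacts for hard-edged sprites, at the price of aliased edges.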
Sorting points like that at every animation step is not sustainable or scalable. Hm. I still don’t understand why it is necessary. I am encountering the same problem with my spheres when I try to set opacity on them instead of discarding fragments (ultimately to achieve antialiasing). It is as if the points forcibly write the clear color on top of other points.
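A guess at the mechanism, not something stated in the thread: with depthWrite enabled, even the fully transparent corners of a point sprite write depth, so particles drawn later but sitting behind fail the depth test, and the already-cleared background stays visible through them. Two common mitigations, sketched:

```js
import * as THREE from 'three';

// Workaround A: keep blending, but stop transparent points from writing depth,
// so they no longer occlude one another.
const blendedMaterial = new THREE.PointsMaterial({
  transparent: true,
  opacity: 0.5,
  depthWrite: false,
});

// Workaround B: discard low-alpha fragments instead of blending them, which is
// effectively what the original discard-based spheres did. Discarded fragments
// write neither color nor depth.
const cutoutMaterial = new THREE.PointsMaterial({
  transparent: true,
  alphaTest: 0.5, // assumed threshold
});
```

Neither replaces sorting where overlapping semi-transparent particles must blend in the right order, but depthWrite: false removes the punched-out occlusion effect described above.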
Agree
If everything were so simple with transparency, we wouldn’t have such interesting topics as Depth peel and transparency.