[Solved] Problem with blending or fragment shader gl_FragDepthEXT

I am in the process of implementing a 3D billboard, but writing to gl_FragDepthEXT is not working as expected. It looks as though the pixel that gets drawn is whichever is furthest from the camera. I’ve also tried flipping the sign of gl_FragDepthEXT in case I got my camera orientation reversed, but to no avail. gl_FragColor is mapped from the fragment’s z depth, and the color/shade looks as expected, so I think the math is correct.

Here is a description of how the billboard is implemented: each “sphere” is actually two triangles that always face the camera. At each fragment position, the z-depth is calculated as if the surface were in fact spherical, and that value is written to gl_FragDepthEXT.
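To make that concrete, here is a minimal sketch of what such an impostor fragment shader looks like (this is illustrative, not my exact fiddle code; v_uv and the shading are placeholders):

```javascript
// Sketch of a sphere-impostor fragment shader (WebGL1 + EXT_frag_depth).
const fragmentSrc = `
#extension GL_EXT_frag_depth : enable
precision highp float;

varying vec2 v_uv;   // quad coords in [-1, 1], sphere centre at origin

void main() {
  float r2 = dot(v_uv, v_uv);
  if (r2 > 1.0) discard;           // outside the sphere's silhouette

  // z of the spherical surface at this fragment, in unit-sphere units
  float unit_offset_z = sqrt(1.0 - r2);

  // shade from the computed depth (this part already looked right)
  gl_FragColor = vec4(vec3(unit_offset_z), 1.0);

  // write the per-fragment depth
  gl_FragDepthEXT = 1. - unit_offset_z;
}`;
```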

I have created a jsfiddle to demo the issue. All the spheres are on the Y-axis and the camera is on the Z-axis.

And this is what it looks like. I think it’s either a blending issue or an issue with the depth buffer.
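In case it helps narrow it down, this is the kind of GL state the demo assumes (a sketch, not the exact fiddle code; setupDepthState is just a name I’m using here):

```javascript
// Sketch of the WebGL1 state needed for per-fragment depth writes.
function setupDepthState(gl) {
  // gl_FragDepthEXT only works if the extension is present and requested
  const ext = gl.getExtension('EXT_frag_depth');
  if (!ext) throw new Error('EXT_frag_depth not supported');

  gl.enable(gl.DEPTH_TEST);   // without this, draw order wins, not depth
  gl.depthFunc(gl.LEQUAL);    // keep the fragment nearest the camera
  gl.disable(gl.BLEND);       // opaque spheres: blending only confuses things
  return ext;
}
```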

So I solved this, but I’m not absolutely sure why it works. I think the reason is that all of the values written are now in the range [0, 1]. I replaced this:

gl_FragDepthEXT = unit_offset_z;

with this:

gl_FragDepthEXT = 1.-unit_offset_z;

So I guess the variable’s sign or the camera view was reversed. Previously I had tried this, but that produces negative values:

gl_FragDepthEXT = -unit_offset_z;
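For what it’s worth, my current understanding (hedged, pieced together from the GLSL wording rather than the extension text itself) is that gl_FragDepthEXT replaces gl_FragCoord.z, i.e. window-space depth: with the default glDepthRange(0, 1) the near plane maps to 0 and the far plane to 1, so values nearer the camera are smaller. That would explain both why negative values fail and why 1. - unit_offset_z works. A quick sanity check of that mapping (windowDepth and the near/far numbers are mine, not from the fiddle):

```javascript
// Eye-space z (negative in front of the camera) -> window depth,
// using a standard OpenGL perspective projection. near/far are illustrative.
function windowDepth(zEye, near, far) {
  const A = -(far + near) / (far - near);     // projection matrix terms
  const B = -2 * far * near / (far - near);
  const ndcZ = (A * zEye + B) / -zEye;        // clip.z / clip.w
  return 0.5 * ndcZ + 0.5;                    // default glDepthRange(0, 1)
}

const near = 0.1, far = 100;
console.log(windowDepth(-near, near, far));   // ≈ 0 (near plane)
console.log(windowDepth(-far, near, far));    // ≈ 1 (far plane)
// A point nearer the camera gets a SMALLER depth value, so a value that
// grows toward the camera (like unit_offset_z) has to be flipped.
```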

Oh well, please chime in if you know what the range of valid values for gl_FragDepthEXT is. I haven’t found a spec that describes this. Anyway, now it looks correct: