Hi there,
I’m currently working on rendering a mesh with a wireframe overlay. This is done by setting barycentric coordinates as vertex attributes on the geometry and using them in the fragment shader. That works pretty well.
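For context, a minimal fragment-shader sketch of that barycentric wireframe technique (the varying name vBarycentric and the line-width factor are my assumptions, not code from my project):

```glsl
// Each vertex of a triangle carries one of (1,0,0), (0,1,0), (0,0,1)
varying vec3 vBarycentric;

void main() {
  // Distance to the nearest edge in barycentric space
  float edgeDist = min(vBarycentric.x, min(vBarycentric.y, vBarycentric.z));
  // fwidth() gives a screen-space derivative, for a roughly constant-width line
  float line = 1.0 - smoothstep(0.0, fwidth(edgeDist) * 1.5, edgeDist);
  gl_FragColor = vec4(vec3(line), 1.0);
}
```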
The problem I face is that the wireframe appears brighter when zooming out, because the lines get closer together on screen. So my idea is to get the depth of the vertices and decrease the brightness based on it.
I’ve spent some time playing with the depthBuffer of a render target, especially with https://threejs.org/examples/#webgl_depth_texture. That example uses two draw passes: the first renders the objects, the second displays the depth from the depth texture on a quad with an orthographic camera.
I need to access the depth in the same pass as I draw the geometry, in my fragment shader. Does anyone know how I can achieve that?
In a vertex shader, you generally calculate gl_Position by multiplying the position with the model-view-projection matrix. What you can do is send, say, the view-space z as a varying variable to the fragment shader.
So
float viewZ = -(modelViewMatrix * vec4(position, 1.0)).z;
And now use viewZ inside your fragment shader.
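Put together, a sketch of the vertex/fragment pair could look like this (the varying name vViewZ and the falloff constant are placeholders, not a definitive implementation):

```glsl
// --- vertex shader ---
// Pass the view-space depth to the fragment shader
varying float vViewZ;

void main() {
  vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
  vViewZ = -mvPosition.z; // positive distance in front of the camera
  gl_Position = projectionMatrix * mvPosition;
}
```

```glsl
// --- fragment shader ---
// Fade the wireframe brightness with view depth
varying float vViewZ;

void main() {
  // 0.05 is an arbitrary falloff factor; tune it to your scene scale
  float brightness = 1.0 / (1.0 + 0.05 * vViewZ);
  gl_FragColor = vec4(vec3(brightness), 1.0);
}
```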
There might already be a viewPosition variable available in three.js.
You can also have a look at the implementation of e.g. SAOShader. It shows how you can extract the depth from a depth texture and then compute the viewZ value:
Functions like perspectiveDepthToViewZ() are defined in the built-in shader chunks. You can include these chunks in your custom shader via an include statement, e.g. #include <packing>.
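A sketch of that pattern, following SAOShader (the uniform names tDepth, cameraNear and cameraFar are assumptions you would wire up yourself):

```glsl
// Pulls in perspectiveDepthToViewZ() and friends from three.js shader chunks
#include <packing>

uniform sampler2D tDepth;  // depth texture from your render target
uniform float cameraNear;
uniform float cameraFar;

float getViewZ(const in vec2 screenUv) {
  float fragDepth = texture2D(tDepth, screenUv).x;
  // For a perspective camera; use orthographicDepthToViewZ() for an ortho camera
  return perspectiveDepthToViewZ(fragDepth, cameraNear, cameraFar);
}
```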