We have a scene rendered to a WebGLRenderTarget with a depth texture, which is passed to a ShaderMaterial in a different scene as tDepth. Below is the fragment shader for that ShaderMaterial. It does a depth check and then renders onto the canvas. (I could simply disable autoClear for depth and have it working, but I want a custom depth function, hence the code.)
Instead of using clipCoord, if I just use vUv, it works only when the other scene is a full-screen quad with an orthographic camera. How do I get the correct value of oldDepth in the case of other scenes?
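For reference, here is a minimal sketch of the kind of fragment shader described above (the uniform names width/height and the variable clipCoord follow the post; the exact depth comparison is an assumption). The point is that gl_FragCoord.xy divided by the canvas size gives valid lookup coordinates for any scene and camera, whereas vUv only matches the screen for a full-screen quad:

```glsl
uniform sampler2D tDepth;   // depth texture from the first scene's render target
uniform float width;        // canvas width in pixels
uniform float height;       // canvas height in pixels

void main() {
  // Normalized pixel coordinates of this fragment (0..1), valid for any camera
  vec2 clipCoord = gl_FragCoord.xy / vec2(width, height);

  // Depth previously written by the first scene at this pixel
  float oldDepth = texture2D(tDepth, clipCoord).r;

  // Custom depth test: discard fragments hidden by the first scene
  if (gl_FragCoord.z > oldDepth) discard;

  gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0); // edge color (placeholder)
}
```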
BTW: instead of using the two uniforms width and height, I suggest a single vec2 resolution. Also, the variable name clipCoord is misleading: you are not calculating clip coordinates but normalized pixel coordinates (with values from 0 to 1).
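Applied to the shader in question, that suggestion would look roughly like this (a sketch, not the poster's actual code; screenUv is a made-up name for the renamed variable):

```glsl
uniform sampler2D tDepth;
uniform vec2 resolution;    // canvas size in pixels, e.g. filled from renderer.getSize()

void main() {
  // Normalized pixel coordinates (0..1) -- not clip coordinates
  vec2 screenUv = gl_FragCoord.xy / resolution;

  float oldDepth = texture2D(tDepth, screenUv).r;
  if (gl_FragCoord.z > oldDepth) discard;

  gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);
}
```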
The problem I’m trying to solve is losing anti-aliasing the moment we render to a WebGLRenderTarget. Edges look really bad without AA.
The idea is: I draw the main object first, then do the post-processing using EffectComposer (I need OutlinePass for some things), then draw the edges directly to the canvas so that I don’t lose AA.
To be honest, when I posted this question I wasn’t getting the right value from tDepth, but when I tried to simplify the code, I started getting the right values — I was seeing edges on the other side as well.
However, there’s still a problem: the lines are jagged, which shouldn’t be the case.
I’ve refactored your CodePen so that the actual depth texture is rendered. You can still see an effect similar to missing anti-aliasing. You could try blurring the depth texture a bit — maybe that helps smooth your lines…
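A blur like that could be done in a separate full-screen pass over the depth texture, something like this 3×3 box filter (a sketch; tDepth, resolution, and vUv from a full-screen quad are assumed). Note one caveat: averaging depth values across an edge produces intermediate depths that belong to neither surface, so a very small kernel is probably safest:

```glsl
uniform sampler2D tDepth;   // the depth texture to be blurred
uniform vec2 resolution;    // size of the depth texture in pixels
varying vec2 vUv;           // UVs from a full-screen quad

void main() {
  vec2 texel = 1.0 / resolution;
  float sum = 0.0;

  // 3x3 box blur of the stored depth values
  for (int x = -1; x <= 1; x++) {
    for (int y = -1; y <= 1; y++) {
      sum += texture2D(tDepth, vUv + vec2(float(x), float(y)) * texel).r;
    }
  }

  gl_FragColor = vec4(vec3(sum / 9.0), 1.0);
}
```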