Depth texture banding

I render the scene to a RenderTarget with scene.overrideMaterial = depthMat to get a depth map, then sample that depth map on a full-screen triangle in a post-processing pass. The result shows heavy color banding. Is there any way to fix it? The bands throw off my calculations when I use a Sobel operator to detect edges.


// Reconstructs eye-space depth from the [0, 1] value stored in the depth
// texture. x/y/z/w follow Unity's _ZBufferParams convention.
float LinearEyeDepth(float depth) {
  float x = 1. - uCameraFar / uCameraNear;
  float y = uCameraFar / uCameraNear;
  float z = x / uCameraFar;
  float w = y / uCameraFar;
  return 1.0 / (z * depth + w);
}


// Sums absolute depth differences against the 4 axis-aligned neighbors
// (a cross filter rather than a full 3x3 Sobel kernel).
float SobelSampleDepth(sampler2D s, vec2 uv, vec3 offset) {
  float pixelCenter = LinearEyeDepth(texture2D(s, uv).r);
  float pixelLeft = LinearEyeDepth(texture2D(s, uv - offset.xz).r);
  float pixelRight = LinearEyeDepth(texture2D(s, uv + offset.xz).r);
  float pixelUp = LinearEyeDepth(texture2D(s, uv + offset.zy).r);
  float pixelDown = LinearEyeDepth(texture2D(s, uv - offset.zy).r);

  return abs(pixelLeft - pixelCenter) +
    abs(pixelRight - pixelCenter) +
    abs(pixelUp - pixelCenter) +
    abs(pixelDown - pixelCenter);
}
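The same neighborhood sum is easy to reason about on the CPU. A small JS sketch over a plain 2D array (`sobelDepth` is a hypothetical helper mirroring the shader logic):

```javascript
// Mirrors SobelSampleDepth: sum of absolute depth differences between the
// center pixel and its 4 axis-aligned neighbors. Zero on flat regions,
// positive across depth discontinuities.
function sobelDepth(depth, x, y) {
  const c = depth[y][x];
  return (
    Math.abs(depth[y][x - 1] - c) +
    Math.abs(depth[y][x + 1] - c) +
    Math.abs(depth[y - 1][x] - c) +
    Math.abs(depth[y + 1][x] - c)
  );
}
```

On a perfectly flat depth field the result is 0, which is why any quantization banding in the depth texture shows up directly as false edges.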


void mainImage(const in vec4 inputColor, const in vec2 uv, out vec4 outputColor) {

  vec4 outlineColor = vec4(0.0, 0.0, 0.0, 1.0);

  vec3 texel = vec3(1. / resolution.x, 1. / resolution.y, 0.0);

  float sobelDepth = SobelSampleDepth(uDepthTexture, uv, texel);

  sobelDepth = pow(abs(clamp(sobelDepth * uOutLineDepthMul, 0.0, 1.0)), uOutLineDepthBias);

  outputColor = vec4(vec3(sobelDepth), 1.0);
}

Depth textures are unfiltered (NearestFilter) by default, I think.

You could try:

depthTexture.minFilter = depthTexture.magFilter = THREE.LinearFilter;

after you create your depth texture.
Not sure, but worth a try.

I tried it, and when I set it to LinearFilter, the final output color was black. I couldn’t see anything. I don’t know what went wrong.

const depthTexture = useMemo(() => {
  const depthTexture = new DepthTexture(width, height)
  depthTexture.format = DepthFormat
  depthTexture.type = HalfFloatType
  depthTexture.minFilter = depthTexture.magFilter = LinearFilter
  return depthTexture
}, [width, height])

const depthBuffer = useFBO(width, height, {
  generateMipmaps: false,
  depthTexture,
})

What are your camera near/far settings?

If there's a large difference between near and far, you get less precision in depth.
So good values are something like near 0.1, far 1000, or a similar ratio…
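The precision loss is easy to see numerically. A small JS sketch of the standard OpenGL perspective depth mapping (`perspectiveDepth` is a made-up helper; it returns the [0, 1] value the depth buffer stores for an eye-space distance z):

```javascript
// [0, 1] depth stored by a standard perspective projection for an
// eye-space distance z. The mapping is hyperbolic, so most of the
// representable range is spent close to the near plane.
function perspectiveDepth(z, near, far) {
  const ndc = (far + near) / (far - near) - (2 * far * near) / ((far - near) * z);
  return ndc * 0.5 + 0.5;
}
```

With near = 0.1 and far = 1000, everything closer than 1 unit already consumes about 90% of the [0, 1] range, leaving very few distinct 16-bit values for the rest of the scene; hence the banding.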

near is 0.1 and far is 50. I tried changing it to near 1, far 10; the final result displayed on screen is gray.

near: 0.1, far: 50

near: 1, far: 10

You may have to adjust uOutLineDepthMul and uOutLineDepthBias when changing the ranges?
Or make it a function of near-far.

I tried sampling the depth map directly and converting it to linear [0, 1] to rule out any effect of the uniform values I pass in, but the result did not change.

float Linear01Depth(sampler2D depthTexture, vec2 coord) {
  float depth = texture2D(depthTexture, coord).x;
  // Same _ZBufferParams-style terms as LinearEyeDepth (z/w are not needed here).
  float x = 1. - uCameraFar / uCameraNear;
  float y = uCameraFar / uCameraNear;
  // Maps [0, 1] perspective depth to [near/far, 1].
  return 1.0 / (x * depth + y);
}
outputColor = vec4(vec3(Linear01Depth(uDepthTexture, uv)), 1.0);
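For reference, the [near/far, 1] output range of that mapping can be confirmed on the CPU (plain JS mirroring the shader formula; the helper name is made up):

```javascript
// JS port of Linear01Depth: maps [0, 1] perspective depth to [near/far, 1].
function linear01Depth(depth, near, far) {
  const x = 1 - far / near;
  const y = far / near;
  return 1 / (x * depth + y);
}
```

With near = 0.1 and far = 50 the output runs from 0.002 (= near/far) at depth 0 up to 1.0 at depth 1.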

By the way, without setting LinearFilter you can see the depth changing, but there is banding, which confuses me. I also tried visualizing the depth map in Babylon; there, if the depth map is stored as 32-bit, no banding is visible.

Have you tried in three with FloatType instead of HalfFloat?

Yeah, it's not much different from HalfFloatType.

const depthTexture = useMemo(() => {
  const depthTexture = new DepthTexture(width, height)
  depthTexture.format = DepthFormat
  depthTexture.type = FloatType
  // depthTexture.minFilter = depthTexture.magFilter = LinearFilter
  return depthTexture
}, [width, height])

const depthBuffer = useFBO(width, height, {
  generateMipmaps: false,
  depthTexture,
  type: FloatType,
})

result:

Try: depthTexture.minFilter = depthTexture.magFilter = NearestFilter;
Camera near values around 0.05-0.1 can give artefacts.
For linear depth without sending camera near/far uniforms:

float LinearEyeDepth(float depth) {
  // projectionMatrix[2][2] and projectionMatrix[3][2] already encode near/far.
  return projectionMatrix[3][2] / ((depth * 2.0 - 1.0) + projectionMatrix[2][2]);
}
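That projection-matrix form can be checked against explicit near/far values. A JS sketch that builds just the two matrix entries the shader reads (column-major, matching GLSL's projectionMatrix[2][2] and projectionMatrix[3][2] for a standard perspective projection; the helper name is made up):

```javascript
// Eye-space depth recovered from [0, 1] depth using only the two
// projection-matrix entries the shader version reads.
function eyeDepthFromProjection(depth, near, far) {
  const p22 = -(far + near) / (far - near);      // projectionMatrix[2][2]
  const p32 = -(2 * far * near) / (far - near);  // projectionMatrix[3][2]
  return p32 / ((depth * 2 - 1) + p22);
}
```

It returns near at depth 0 and far at depth 1, matching the uniform-based LinearEyeDepth from earlier in the thread.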

Almost white. I zoomed in and saw a little bit of gray, but there were still bands.

By the way, I also tried using the depth from the pmndrs postprocessing pipeline and converting it to linear [0, 1]. It shows obvious banding, just like the depthTexture I provided.

I wrote a demo for everyone to view and debug

demo


Commenting out the devicePixelRatio settings made it look a lot better.

I suspect it's because the EffectComposer and depth target use the window size while the renderer runs at a non-1 devicePixelRatio.
To get it working correctly with devicePixelRatio, you might have to multiply the width/height of the depth texture and composer by that ratio as well.

https://codesandbox.io/p/devbox/sharp-lucy-xdpqcr
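In other words, the render targets need to be sized in physical pixels, not CSS pixels. A minimal sketch of that sizing math (`physicalSize` is a hypothetical helper; its result would feed the width/height passed to DepthTexture/useFBO in the snippets above):

```javascript
// Convert CSS-pixel dimensions to the physical-pixel dimensions the
// renderer actually draws at when devicePixelRatio != 1.
function physicalSize(cssWidth, cssHeight, dpr) {
  return {
    width: Math.floor(cssWidth * dpr),
    height: Math.floor(cssHeight * dpr),
  };
}
```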


I'm glad to see your modified demo; the banding is indeed reduced. The best solution I've found is to replace the LinearEyeDepth function with Linear01Depth, i.e. map [near, far] down to [near/far, 1]. My guess at the specific reason: in linear eye space the range is [near, far], the rate of change is too large, and the depth map isn't precise enough to represent it. Converting to linear [0, 1] depth greatly reduces the banding.
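Numerically, the two mappings differ only by a constant scale (Linear01Depth equals LinearEyeDepth divided by far), which can be checked with plain-JS ports of the two shader formulas (helper names made up):

```javascript
// JS port of the shader's LinearEyeDepth: [0, 1] depth -> [near, far].
function linearEyeDepth(depth, near, far) {
  const x = 1 - far / near;
  const y = far / near;
  return 1 / ((x / far) * depth + y / far);
}

// JS port of the shader's Linear01Depth: [0, 1] depth -> [near/far, 1].
function linear01Depth(depth, near, far) {
  const x = 1 - far / near;
  const y = far / near;
  return 1 / (x * depth + y);
}
```

So for any depth d, linear01Depth(d) equals linearEyeDepth(d) / far up to floating point.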

This is the demo I modified. You can refer to it.
https://codesandbox.io/p/devbox/interesting-carlos-kcrkm5

The specific basis for this guess is that I reproduced the banding in Babylon, and the fix there was to force 32-bit storage of the depth map.

Here are the relevant screenshots for your reference:
