Severe shadow acne after switching to WebGPURenderer

I’m in the process of migrating our app from WebGLRenderer (0.161) to WebGPURenderer (0.182).

I am experiencing severe shadow map artifacts / acne, as can be seen in the following screenshots:

WebGLRenderer (0.161):

WebGPURenderer (0.182):

Both have been set up with identical settings:

  • Shadow map resolution (1024)
  • Shadow map camera near / far
  • Render camera near / far
  • Shadow bias (0)
  • Normal bias (0)
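For reference, a minimal sketch of that shared setup (the variable names and near/far values are illustrative, not our actual app code):

```javascript
// Hypothetical shadow setup used identically with both renderers.
// Near/far values are example placeholders.
const light = new THREE.DirectionalLight(0xffffff, 1);
light.castShadow = true;
light.shadow.mapSize.set(1024, 1024); // shadow map resolution
light.shadow.camera.near = 0.5;       // shadow camera near
light.shadow.camera.far = 500;        // shadow camera far
light.shadow.bias = 0;                // no depth bias
light.shadow.normalBias = 0;          // no normal bias

renderer.shadowMap.enabled = true;    // same flag on both renderers
```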

If I set the shadow bias to -0.0005, I can mitigate some of the sawtooth artifacts, but it does not resolve them completely (as can be seen on the left side), and it also introduces an unwanted offset to the shadows.

shadow.bias: -0.0005

Interestingly, choosing any positive shadow bias makes the scene completely shadowed.

I am running out of ideas as to what the problem could be. Do I need to configure the renderer in some specific way?

Any input would be highly appreciated!

Any chance to demonstrate the issue with a small live example?

We have observed multiple reports of shadow acne issues with WebGPURenderer. It is more often necessary to set a shadow bias, or a larger one, to mitigate artifacts. However, a larger bias leads to “Peter Panning” artifacts, meaning the shadow translates away from the geometry. We probably need an issue for this on GitHub. A live example would still be useful, though.

I ran into similar issues when testing WebGPURenderer. In the end, I reverted back to WebGLRenderer for production. WebGPU is promising, but it still has some way to go before it’s fully reliable for real-world applications, especially around shadows and stability. For now, I’m keeping WebGPU strictly experimental until it’s mature enough to be a true replacement for WebGL.

Interesting! This is unfortunate; so far I have been really happy with the overall functionality. I hope that I don’t run into stability issues as well.

Thanks for the input. Yes, the “Peter Panning” artifact is exactly what I experienced when increasing the shadow bias value. However, this should not even be necessary, as the WebGLRenderer result demonstrates.

Luckily, I was able to create a minimal example that reproduces the same issue:

For me, it points towards a bug in the WebGPU shadow mapping :thinking:


By the way, this is exactly the same scene using WebGLRenderer. It looks fine!

I created an issue on GitHub, as this is most likely a bug:


@TobiasNoell the following setting seems to fix the issue in your fiddle demo…

light.shadow.normalBias = 2;

I’m not sure if normalBias has to be so high due to the large scale of your scene but this looks like it cleans up the shadow acne.

You can remove shadow.bias altogether and this still works.

This is interesting, but it introduces artifacts at the edges of the sphere.

In WebGL this is not required, and it does not introduce this artifact.

Hi @TobiasNoell

I got this screenshot in JSFiddle with WebGLRenderer…

Similar issues:

Does tightening the shadow frustum help in the WebGPU case?

Here’s a fiddle with added light and shadow camera helpers, and we can see the frustum is huge, much larger than the content of the scene (typically not ideal): three.js dev template - module - JSFiddle - Code Playground

Here’s a fiddle with the shadow camera frustum adjusted to encompass the scene, but in this case, unlike mine, the much tighter frustum did not help: three.js dev template - module - JSFiddle - Code Playground
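For anyone trying the same adjustment, tightening a directional light’s shadow frustum looks roughly like this (the bounds below are placeholders; in practice they should come from the scene’s bounding box):

```javascript
// Fit the orthographic shadow camera to the scene's extent.
// Example values; derive them from your scene bounds in practice.
const cam = light.shadow.camera; // THREE.OrthographicCamera for a DirectionalLight
cam.left = -50;
cam.right = 50;
cam.top = 50;
cam.bottom = -50;
cam.near = 1;
cam.far = 200;
cam.updateProjectionMatrix(); // required after changing the bounds
```

A tighter frustum increases effective depth and texel precision, which is why it often reduces acne, even though it did not help in this particular case.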

Indeed, normalBias 2 does work well: three.js dev template - module - JSFiddle - Code Playground

I wish we had per-object shadow bias values (e.g. like Blender). A bias value per light is sometimes too inflexible.


If these bounds are the same then there is probably a bug somewhere.

Yeah, in general I’ve noticed various rendering differences across WebGLRenderer and WebGPURenderer that previously prevented me from migrating. If some algos are supposed to be identical, then yeah this shows they aren’t. I do imagine WebGPURenderer is the main focus now, and that algos will start to differ as WebGLRenderer becomes outdated (and eventually deprecated/removed?).


Thanks for the additional investigation.

It is true that normalBias and/or shadow bias can mitigate some of the effects.

But, as the WebGL case shows, it should not even be necessary to increase them here. In general, they should be kept as low as possible, since they can introduce unwanted artifacts of their own. In my opinion, the only viable explanations for why they need to be increased in the WebGPU case are that either

  • WebGPU has lower precision depth buffers, or
  • there is a bug in the WebGPU implementation introducing imprecision.

Both would make sense to be investigated further :slight_smile: