In three.webgpu.js there is only one place in the library where the logarithmicDepthBuffer renderer flag is used: a method called setupDepth() in the NodeMaterial class:
class NodeMaterial extends Material {

    // ...

    setupDepth(builder) {

        const { renderer } = builder;

        // Depth

        let depthNode = this.depthNode;

        if (depthNode === null) {
            const mrt = renderer.getMRT();
            if (mrt && mrt.has('depth')) {
                depthNode = mrt.get('depth');
            } else if (renderer.logarithmicDepthBuffer === true) {
                const fragDepth = modelViewProjection().w.add(1);
                depthNode = fragDepth.log2().mul(cameraLogDepth).mul(0.5);
            }
        }

        if (depthNode !== null) {
            depth.assign(depthNode).append();
        }

    }

    // ...

}
What code in three.webgpu.js must be changed to force depth-texture generation to remain non-logarithmic while still allowing logarithmic depth for the rest of the scene?
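For example, I imagine the change would be something along these lines: a guard around the logarithmic branch in setupDepth() that is skipped whenever the pass is rendering into a depth texture. This is only a sketch of my intent, not working code: isDepthTexturePass is a made-up placeholder, because the right way to detect a shadow/depth pass from inside setupDepth() is exactly the part I do not know.

    setupDepth(builder) {

        const { renderer } = builder;

        let depthNode = this.depthNode;

        if (depthNode === null) {
            const mrt = renderer.getMRT();

            // Placeholder: should be true while the current pass renders into a
            // depth texture (e.g. a shadow map). How to detect that from here is
            // the part I am unsure about -- maybe by inspecting the render target
            // currently set on the renderer?
            const isDepthTexturePass = false; // TODO: real detection goes here

            if (mrt && mrt.has('depth')) {
                depthNode = mrt.get('depth');
            } else if (renderer.logarithmicDepthBuffer === true && isDepthTexturePass === false) {
                // Only the main scene pass would get the logarithmic depth write;
                // depth-texture passes would fall through and keep the default depth.
                const fragDepth = modelViewProjection().w.add(1);
                depthNode = fragDepth.log2().mul(cameraLogDepth).mul(0.5);
            }
        }

        if (depthNode !== null) {
            depth.assign(depthNode).append();
        }

    }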
The reason I would like to do this is that I am trying to fix a bug involving shadows and the logarithmicDepthBuffer flag.
There is currently a bug in the three.js WebGPU build 168 whereby setting logarithmicDepthBuffer = true causes shadows to appear on meshes above the object casting the shadow. See the report here: mrdoob/three.js issue #29200, "WebGPU - Duplicate shadow when logarithmicDepthBuffer is true - Shadow appears below and above object (WebGPU build 167)".
I suspect the bug is caused by the depth texture generated for the shadow maps being logarithmic, while the depth values computed in the fragment shaders remain in their normal, non-logarithmic form. When a shader then compares the depth of a fragment against the value stored in the shadow depth texture, it is comparing apples and oranges, because one is logarithmic and the other is not.
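To give a concrete idea of the mismatch I mean, here is a rough back-of-the-envelope comparison. I am assuming that cameraLogDepth works out to 2 / log2(far + 1), analogous to the logDepthBufFC factor used by the WebGL logarithmic depth code, and I am using the usual [0, 1] perspective depth mapping for the non-logarithmic value; both are my assumptions, not something I have verified in the WebGPU build.

    // Rough comparison of the two depth values for the same fragment (run with Node.js).
    const near = 0.1;
    const far = 1000;
    const w = 10; // clip-space w for a fragment roughly 10 units from the camera

    // Logarithmic depth, following setupDepth():
    //   depth = log2(w + 1) * cameraLogDepth * 0.5
    const cameraLogDepth = 2 / Math.log2(far + 1); // assumed value of the node
    const logDepth = Math.log2(w + 1) * cameraLogDepth * 0.5;

    // Conventional [0, 1] perspective depth for the same fragment (assumed mapping).
    const standardDepth = (far / (far - near)) * (1 - near / w);

    console.log(logDepth.toFixed(2));      // ~0.35
    console.log(standardDepth.toFixed(2)); // ~0.99

If the shadow map stores one of these scales and the comparison uses the other, the depth test would be wildly off.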
I would like to see what happens if I disable the logarithmic calculation for the depth texture only, to confirm this theory.