Hi all, I have scoured the internet and researched this for six weeks, learning all about the new node-based shader system in an attempt to solve it, but have had no luck.

My question is this: in three.webgpu.js, how can I selectively disable the logarithmic depth buffer? Ultimately I want to disable it only for depth textures, while leaving it enabled for all other rendering.
In three.webgpu.js there is a small bit of code that makes the fragment depth value logarithmic when the `logarithmicDepthBuffer` option is true. Whatever `depthNode` is set to determines the depth strategy. This is the existing code on line 32506 of three.webgpu.js v168:
```js
// three.webgpu.js v168 (line 32506)
// https://unpkg.com/browse/three@0.168.0/build/three.webgpu.js
const fragDepth = modelViewProjection().w.add(1);
depthNode = fragDepth.log2().mul(cameraLogDepth).mul(0.5);
```
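For reference, my reading of that snippet (an assumption on my part, based on the classic WebGL logdepthbuf shader chunk rather than anything I have verified in the WebGPU source) is that `cameraLogDepth` is roughly `2 / log2( camera.far + 1 )`, so the written depth becomes `log2( w + 1 ) / log2( far + 1 )`. By contrast, I assume the standard non-logarithmic depth would just be the perspective divide of the clip-space position, something like this sketch (the variable names and the [0, 1] z range are my assumptions):

```js
// Sketch only: my guess at a "standard" depth node, not verified.
// modelViewProjection() is the clip-space position; dividing z by w should give
// the hardware-style depth, assuming a [0, 1] NDC z range on the WebGPU backend.
const clipPos = modelViewProjection();
const standardDepth = clipPos.z.div(clipPos.w);
```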
I would like to make use of a `uniform` node to selectively enable or disable the logarithmic depth buffer. This is what I have so far:
```js
const myUniform = uniform('float');
myUniform.value = 0; // 0 = false, 1 = true; boolean uniforms not supported yet

const fragDepth = modelViewProjection().w.add(1);
const logarithmicDepth = fragDepth.log2().mul(cameraLogDepth).mul(0.5);
depthNode = select(myUniform.equal(1), logarithmicDepth, ??? );
```
As you can see in my code above, what do I put in place of the `???` (question marks)? Whatever goes there should be just the standard (non-logarithmic) depth node. I have tried using the local variable `fragDepth`, the node `positionView.z`, and all sorts of constant float values. Some values cause the renderer to crash, and others still render the scene but the depths are all messed up. One of those attempts is sketched below.
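To make that concrete, here is one of the substitutions I tried (this assumes `positionView` is in scope at that point in the bundle; everything else matches my snippet above). With this, the scene still renders, but the depths come out wrong:

```js
// One failed attempt: fall back to view-space z when the uniform is 0.
// The scene renders, but the depths are wrong, presumably because positionView.z
// is a (negative) view-space distance rather than a normalized [0, 1] depth value.
depthNode = select(myUniform.equal(1), logarithmicDepth, positionView.z);
```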