WebGPURenderer with logarithmicDepthBuffer is not working!

The logarithmic depth buffer in WebGPU works very well; I use it myself. If you use the logarithmic depth buffer with custom raw shaders, you have to set the depthNode yourself.
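
In case it helps, here is a minimal sketch of what that looks like. It assumes an r167+ setup with the three.webgpu.js build; the `three/webgpu` and `three/tsl` import specifiers and the exact TSL helper names (`positionView`, `cameraFar`, `log2`, `float`) are assumptions on my side, so check the exports of the release you are on.

```js
import * as THREE from 'three/webgpu';            // assumption: resolves to build/three.webgpu.js
import { positionView, cameraFar, log2, float } from 'three/tsl'; // assumption: TSL entry point

// Any node material exposes a depthNode slot; if you build your material from
// raw WGSL (e.g. via wgslFn) you set it the same way.
const material = new THREE.MeshBasicNodeMaterial();

// Logarithmic depth the way three.js computes it in WebGL:
// depth = log2( 1 + w ) * ( 2 / log2( far + 1 ) ) * 0.5, where w is the view-space distance.
const w = positionView.z.negate();
const logDepthBufFC = float( 2.0 ).div( log2( cameraFar.add( 1.0 ) ) );

// Assigning depthNode tells the node system to write this value as the fragment depth.
material.depthNode = log2( w.add( 1.0 ) ).mul( logDepthBufFC ).mul( 0.5 );
```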

My whole app runs with the logarithmic depth buffer, and it is a big app. Works great :blush:
In WebGL you write the depth value from the fragment shader. In WebGPU there is a dedicated node for the depth output, because WGSL works a little differently.
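
Purely as an illustration of that difference (not tied to a specific three.js API): GLSL assigns the built-in gl_FragDepth, while a WGSL fragment entry point has to declare an output decorated with @builtin(frag_depth).

```js
// WebGL2 / GLSL ES 3.00: depth is written by assigning the built-in gl_FragDepth.
const glslFragment = /* glsl */ `
  out vec4 fragColor;

  void main() {
    gl_FragDepth = 0.5; // your logarithmic depth value goes here
    fragColor = vec4( 1.0 );
  }
`;

// WebGPU / WGSL: there is no writable built-in variable; the fragment entry point
// declares an output with @builtin( frag_depth ) instead.
const wgslFragment = /* wgsl */ `
  struct FragmentOut {
    @location( 0 ) color : vec4<f32>,
    @builtin( frag_depth ) depth : f32,
  };

  @fragment
  fn main() -> FragmentOut {
    var output : FragmentOut;
    output.depth = 0.5; // your logarithmic depth value goes here
    output.color = vec4<f32>( 1.0 );
    return output;
  }
`;
```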

Three.js even has an example with the logarithmic depth buffer in WebGPU:

https://threejs.org/examples/?q=loga#webgpu_camera_logarithmicdepthbuffer

In the screenshot you can see the WebGPURenderer and that the logarithmic depth buffer is active. The example is up to date with r167. It uses three.webgpu.js instead of three.module.js, but all of this already worked since at least r161 with three.module.js. The logarithmic depth buffer is therefore fully integrated into the WebGPURenderer. However, if you want to avoid the node materials and use your own shaders like I do, then you have to set the depthNode yourself. If you ignore it, you get the error messages you are seeing.
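
For the standard path with the built-in node materials nothing special is needed; here is a minimal sketch of that setup, assuming the renderer takes the same constructor flag as WebGLRenderer (which is what the linked example suggests) and that `three/webgpu` resolves to the three.webgpu.js build:

```js
import * as THREE from 'three/webgpu'; // assumption: resolves to build/three.webgpu.js

// Assumption: the flag mirrors WebGLRenderer's option, as in the linked example.
const renderer = new THREE.WebGPURenderer( { antialias: true, logarithmicDepthBuffer: true } );
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );

// Built-in node materials handle the logarithmic depth write themselves;
// only custom/raw shader materials need an explicit depthNode (see the sketch above).
const material = new THREE.MeshStandardNodeMaterial();
material.color.set( 0x44aa88 );

const scene = new THREE.Scene();
scene.add( new THREE.Mesh( new THREE.SphereGeometry( 1, 32, 16 ), material ) );
scene.add( new THREE.DirectionalLight( 0xffffff, 3 ) );

// A huge far plane is exactly the case where the logarithmic depth buffer pays off.
const camera = new THREE.PerspectiveCamera( 50, window.innerWidth / window.innerHeight, 0.1, 1e8 );
camera.position.z = 5;

renderer.setAnimationLoop( () => renderer.render( scene, camera ) );
```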

Regarding the general status of webgpu in three.js:
I work very intensively with WebGPU; it is all I use. Three.js changed its architecture with r167: the node system now lives in the src folder instead of examples/jsm/, which I personally think is very good. With three.webgpu.js, three.js is also completely set up for WebGPU, I assume so that WebGPU can be maintained completely independently of WebGL in the future. The WebGPU side already covers a lot. I don't miss anything from WebGL anymore, because the WebGPURenderer and the node system now offer everything I know from WebGL.
