I am trying to reconstruct the world-space position of any geometry contained within a volume (a BoxGeometry that the shader material is applied to) from the depth buffer, and I am trying to do this with the WebGPU renderer and TSL.
I’ve seen people tackle this challenge in three.js before, such as here – I think I’ve grasped the basic idea of applying the inverse view-projection matrix to the clip-space position, but I haven’t been able to make any of those solutions work so far.
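For reference, this is how I understand the idea on the CPU side, using plain three.js math classes (worldPosFromDepth is just a name I made up, and I’m assuming the WebGL-style convention where NDC x, y and z all run from -1 to 1, since that’s what the examples I found use):

import * as THREE from 'three';

// CPU-side sketch of the reconstruction (my understanding, assuming
// the WebGL-style convention where NDC x, y and z are all in [-1, 1])
function worldPosFromDepth(uv, depth, camera) {
  // screen UV [0,1] and depth [0,1] -> clip position in [-1,1]
  const pos = new THREE.Vector4(
    uv.x * 2 - 1,
    uv.y * 2 - 1,
    depth * 2 - 1,
    1
  );
  // clip -> view space via the inverse projection matrix
  pos.applyMatrix4(camera.projectionMatrixInverse);
  // perspective division
  pos.divideScalar(pos.w);
  // view -> world space via the camera's world matrix
  pos.applyMatrix4(camera.matrixWorld);
  return new THREE.Vector3(pos.x, pos.y, pos.z);
}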
Here’s how far I’ve got translating that into TSL:
import {
  Fn, vec4, div, screenUV,
  viewportDepthTexture,
  cameraProjectionMatrixInverse,
  cameraWorldMatrix,
} from 'three/tsl';

const getWorldSpaceFromDepth = Fn(() => {
  // sample the scene depth at this fragment
  const depth = viewportDepthTexture(screenUV).r;
  // screen UV [0,1] and depth -> clip position in [-1,1]
  const clipPos = vec4(
    screenUV.xy.sub(0.5).mul(2),
    depth.sub(0.5).mul(2),
    1.0
  );
  // clip -> view
  const viewPos = cameraProjectionMatrixInverse.mul(clipPos).toVar();
  // perspective division
  viewPos.assign(div(viewPos, viewPos.w));
  // view -> world
  const worldPos = cameraWorldMatrix.mul(viewPos);
  return worldPos.xyz;
});
// scale down so typical world coordinates land in a visible range
material.colorNode = vec4(getWorldSpaceFromDepth().mul(0.01), 1.0);
The output color, however, does not look like a world-space position: looking around and moving the camera changes it, which shouldn’t happen for the world position of static geometry.
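One thing I’m unsure about is the clip-space convention: as far as I understand, WebGPU’s NDC depth range is [0, 1] rather than WebGL’s [-1, 1], and I don’t know whether screenUV’s Y axis matches NDC Y or needs flipping. So maybe the clipPos construction inside getWorldSpaceFromDepth should look more like this (an untested guess on my part; vec2 would also need to be imported from 'three/tsl'):

// untested guess: keep depth as-is (WebGPU NDC z is already [0, 1])
// and flip Y when converting screen UV to NDC
const ndcXY = vec2(
  screenUV.x.mul(2).sub(1),
  screenUV.y.oneMinus().mul(2).sub(1)
);
const clipPos = vec4(ndcXY, depth, 1.0);

But I haven’t been able to verify whether that is actually the issue.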
Being fairly new to shaders and very new to TSL, I’m having a hard time figuring this one out, so any help would be appreciated!