WebGPUTextureRenderer: Make TextureNode from depth buffer

Hi everybody,

I want to modify the WebGPU RTT example (which renders the scene to a fullscreen quad) to try out post-processing with the WebGPURenderer and NodeMaterials.

As the title says, I also need the depth buffer as a node so I can use depth information in my shader. Currently I only have the color texture from the render target. My first idea was to create a TextureNode from a DepthTexture attached to the WebGPUTextureRenderer's render target, but unfortunately this doesn't seem to work. Here is the relevant code:

    // create a WebGPUTextureRenderer with a DepthTexture attached to its render target
    textureRenderer = new WebGPUTextureRenderer(renderer);
    textureRenderer.setSize(window.innerWidth * window.devicePixelRatio, window.innerHeight * window.devicePixelRatio);

    const depthTexture = new THREE.DepthTexture();
    depthTexture.format = THREE.DepthFormat;
    depthTexture.type = THREE.UnsignedShortType;

    textureRenderer.renderTarget.texture.minFilter = THREE.NearestFilter;
    textureRenderer.renderTarget.texture.magFilter = THREE.NearestFilter;
    textureRenderer.renderTarget.stencilBuffer = false;
    textureRenderer.renderTarget.depthTexture = depthTexture;

    // set up FX
    cameraFX = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
    sceneFX = new THREE.Scene();
    const geometryFX = new THREE.PlaneGeometry(2, 2);
    const materialFX = new Nodes.MeshBasicNodeMaterial({
        side: THREE.DoubleSide
    });

    // first try: just visualize the depth texture
    // this returns an error: "Multiple aspects selected in [TextureView]"
    materialFX.colorNode = new Nodes.TextureNode(textureRenderer.renderTarget.depthTexture);
    
    const quad = new THREE.Mesh(geometryFX, materialFX);
    sceneFX.add(quad);

Is my general approach right? If not, can anyone point me in the right direction? I also noticed that the depth texture's image object has width and height set to undefined; maybe that has something to do with the issue?

Thanks in advance for help!

Hi @geolehmann.

The logic is right; this PR added the feature to WebGPURenderer:


Thanks @sunag! For me this comes a bit too late (I have already switched to rendering with Rust's wgpu library via WASM), but for other three.js users this is important, since the depth buffer can now be sampled, e.g. for post-processing effects.
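
As a side note for anyone visualizing or post-processing the sampled depth: values stored in a perspective depth texture are non-linear, so displaying them directly tends to look almost uniformly white. Converting a stored depth value back to a linear [0, 1] range follows the helpers in three.js's `packing` shader chunk. A minimal standalone JavaScript sketch (the camera `near`/`far` values are illustrative):

```javascript
// Convert a non-linear perspective depth sample (range [0, 1], as stored
// in the depth texture) back to view-space Z, given the camera's near
// and far planes. Mirrors perspectiveDepthToViewZ from three.js's
// packing shader chunk.
function perspectiveDepthToViewZ(depth, near, far) {
    return (near * far) / ((far - near) * depth - far);
}

// Normalize (negative) view-space Z to a linear [0, 1] depth for
// visualization. Mirrors viewZToOrthographicDepth from the same chunk.
function viewZToOrthographicDepth(viewZ, near, far) {
    return (viewZ + near) / (near - far);
}

const near = 0.1, far = 100.0; // illustrative camera planes

// A fragment on the near plane stores depth 0 and linearizes back to 0
console.log(viewZToOrthographicDepth(perspectiveDepthToViewZ(0.0, near, far), near, far)); // ≈ 0

// A fragment on the far plane stores depth 1 and linearizes back to 1
console.log(viewZToOrthographicDepth(perspectiveDepthToViewZ(1.0, near, far), near, far)); // ≈ 1
```

Note how strongly non-linear the stored values are: with these planes, a stored depth of 0.5 linearizes to roughly 0.001, i.e. half of the depth-buffer range is spent on a sliver of the scene right in front of the camera. That is why the same two functions exist as GLSL helpers in three.js and are typically applied in the shader before displaying depth.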