Apply depth buffer from render target to the next render

Hello
I’ve been dealing with an issue for some days now. I have the following situation in my render loop:

  1. Render scene1 to a render target
  2. Render scene2

I need the depth buffer from the first render to be applied to the second render, so that the objects in scene2 are occluded by the scene1 objects. The main goal is to get the benefit of early depth testing, as the scene2 objects are likely to have expensive fragment shaders.

Setting renderer.autoClearDepth = false allows the depth buffer to survive into the second render, but only if the first render goes to the default framebuffer and not to a render target. This is the case regardless of whether I set depthBuffer: true in the RenderTarget options.
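For context, here is a minimal sketch of the setup described above (simplified; scene1, scene2, camera, and the target sizes are placeholders from my code, not a working repro):

```javascript
import * as THREE from 'three';

// Sketch of the two-pass setup. scene1/scene2/camera are placeholders.
const renderer = new THREE.WebGLRenderer();
renderer.autoClearDepth = false; // keep the depth buffer between renders

const target = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight, {
	depthBuffer: true // depth works *within* this target, but is not carried over
} );

function render() {
	renderer.setRenderTarget( target ); // pass 1: scene1 into the render target
	renderer.render( scene1, camera );

	renderer.setRenderTarget( null );   // pass 2: scene2 to the canvas
	renderer.render( scene2, camera );  // scene1's depth is NOT available here
}
```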

I understand I can render the depth of scene1 and do my own Z-testing by comparing depths in the fragment shader of the scene2 objects, but that’s still a texture lookup and requires me to modify the shaders of the scene2 objects. That’s why I’m looking to take advantage of the built-in depth testing instead.

Is there a way to achieve this in Three.js?
I know there are ways to do something like this in raw WebGL, as shown in WebGL2 Copy depth value to default renderbuffer error - Stack Overflow, but mixing both is problematic for me, as Three.js hides some internal WebGL state like framebuffers, textures, etc. However, if you know how to make that work as an alternative, I’d very much appreciate that as well :slight_smile:
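For completeness, the raw-WebGL2 approach from that Stack Overflow answer looks roughly like this (a sketch only; srcFramebuffer stands in for whatever framebuffer scene1 was rendered into, and the depth formats and sizes of source and destination must match, which is exactly what that question's error is about):

```javascript
// Sketch: blit a depth buffer from a source framebuffer into the
// default framebuffer. Requires a WebGL2 context.
const gl = renderer.getContext();

gl.bindFramebuffer( gl.READ_FRAMEBUFFER, srcFramebuffer ); // placeholder source
gl.bindFramebuffer( gl.DRAW_FRAMEBUFFER, null );           // default framebuffer

gl.blitFramebuffer(
	0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight, // source rect
	0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight, // destination rect
	gl.DEPTH_BUFFER_BIT,
	gl.NEAREST // depth blits must use NEAREST filtering
);
```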

If necessary I can write more details about the rendering process, but I don’t want to make this a wall of text. I’ll just mention that scene1 renders to an MRT, in case it’s important.

Thank you for reading! :hugs: :heartbeat:

It’s possible to use the depth render buffer from a render target in another render call. I have been doing this to use a gbuffer as a depth pre-pass.
You will have to get the depth buffer from the target, then attach it to the framebuffer before the render call.
To get the depth buffer:

const renderBufferProps = renderer.properties.get( gbuffer ); // internal (private) properties of the render target
const depthRenderBuffer = renderBufferProps.__webglDepthRenderbuffer || renderBufferProps.__webglDepthbuffer;

Then, before the renderer.render call and after renderer.setRenderTarget:

const _gl = renderer.getContext();
_gl.framebufferRenderbuffer( _gl.FRAMEBUFFER, _gl.DEPTH_ATTACHMENT, _gl.RENDERBUFFER, depthRenderBuffer );

After the renderer.render call:

const _gl = renderer.getContext();
_gl.framebufferRenderbuffer( _gl.FRAMEBUFFER, _gl.DEPTH_ATTACHMENT, _gl.RENDERBUFFER, null );
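Putting the pieces together, a whole frame looks roughly like this (a sketch, not exact code; gbuffer, the scenes, the camera, and targetForSecondPass are placeholders, and the __webgl* properties are private three.js internals that may change between versions):

```javascript
// Sketch: reuse a render target's depth renderbuffer for a second pass.
const _gl = renderer.getContext();

// 1. Render scene1 into the gbuffer target as usual.
renderer.setRenderTarget( gbuffer );
renderer.render( scene1, camera );

// 2. Grab the target's internal depth renderbuffer (private API).
const props = renderer.properties.get( gbuffer );
const depthRenderBuffer = props.__webglDepthRenderbuffer || props.__webglDepthbuffer;

// 3. Attach it to the framebuffer of the second pass.
renderer.setRenderTarget( targetForSecondPass );
_gl.framebufferRenderbuffer( _gl.FRAMEBUFFER, _gl.DEPTH_ATTACHMENT, _gl.RENDERBUFFER, depthRenderBuffer );

renderer.autoClearDepth = false; // don't wipe the borrowed depth values
renderer.render( scene2, camera );

// 4. Detach again so three.js state stays consistent for later frames.
_gl.framebufferRenderbuffer( _gl.FRAMEBUFFER, _gl.DEPTH_ATTACHMENT, _gl.RENDERBUFFER, null );
```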

I am not sure whether MRT targets use a different private variable, though. You can check the code in WebGLTextures.js for this.


Thank you so much for the reply :pleading_face:

I made a codepen with your changes, but I was unable to get the right result. Did I do something wrong?

Oh, my bad, I guess we cannot attach the depth buffer when not rendering to a texture.
You can use another render target, render to that, then copy it to the canvas.
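The copy step can be done with a simple fullscreen quad (a sketch of the idea, not the exact codepen code; secondTarget is a placeholder for the target scene2 was rendered into):

```javascript
import * as THREE from 'three';

// Sketch: draw a render target's color texture to the canvas
// with a fullscreen quad.
const copyScene = new THREE.Scene();
const copyCamera = new THREE.OrthographicCamera( -1, 1, 1, -1, 0, 1 );
const copyQuad = new THREE.Mesh(
	new THREE.PlaneGeometry( 2, 2 ),
	new THREE.MeshBasicMaterial( {
		map: secondTarget.texture, // placeholder: target holding the scene2 result
		depthTest: false,
		depthWrite: false
	} )
);
copyScene.add( copyQuad );

// After rendering scene2 into secondTarget:
renderer.setRenderTarget( null );     // default framebuffer (the canvas)
renderer.render( copyScene, copyCamera );
```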
I have edited the codepen: https://codepen.io/repalash/pen/ExoMRpO?editors=0010
