WebGLRenderTarget depthTexture behaves differently in Safari, Chrome and Firefox

Hey

I am running into behaviour I don’t understand when using WebGLRenderTarget. I am trying to implement shadow casting by following a tutorial. The algorithm is:

  1. Render the scene into a render target from the point of view of a light source, storing the depth data in a texture
  2. Render the scene on canvas using the main camera and read the depthTexture data to decide light/shadow colors for fragments

According to the examples I have seen this can be achieved by:

// Render target holding the scene's depth as seen from the light
var shadowTarget = new THREE.WebGLRenderTarget(textureSize, textureSize);
shadowTarget.depthBuffer = true;
shadowTarget.depthTexture = new THREE.DepthTexture();

//Render depth information from light
renderer.setRenderTarget(shadowTarget);
renderer.render(scene, shadowCamera);
  
//Render main scene
renderer.setRenderTarget(null);  
renderer.render(scene, camera);

An example:

https://codepen.io/anon/pen/OGKKKa

On some machines/browsers I get incorrect behaviour, which I suspect comes from my misunderstanding of how these textures are used.

  1. In Safari on a MacBook and in Chrome on Linux, the scene is rendered correctly.

  2. In Chrome on my MacBook, the scene initially shows stripes across the geometry, a known artifact of shadow mapping (shadow acne). The shader code compensates for it with a bias variable, but the scene is not updated correctly. One of several things that produces correct visuals is adding a camera helper (the checkbox in the example). Does adding the helper trigger some scene/renderer update that I could call explicitly?

  3. In Firefox, the resulting depthTexture is not used at all, and the console shows this warning:
    Error: WebGL warning: drawElements: Texture level 0 would be read by TEXTURE_2D unit 0, but written by framebuffer attachment DEPTH_ATTACHMENT, which would be illegal feedback.

As I understand it, the same texture cannot be bound as a sampler and as a framebuffer attachment in the same draw call. But by the time I read the texture, I am rendering to the canvas, not to the render target. The error only appears in Firefox, not in Chrome or Safari.

My questions:

  1. Why does the code give an error in Firefox but not in Chrome?
  2. What is the update/change that adding a helper triggers?
  3. Can I unbind the depthTexture from the material/program so I can use it for reading without undefined behaviour?
  4. How do I properly clone the depthTexture data if I need to use two WebGLRenderTarget objects?
  5. Am I the only one seeing this behaviour?

Thanks in advance!

I have debugged your code, and the problem is that you render the depth pass (the first invocation of render()) with the same material that is intended for the actual render pass, so that material samples the depth texture while the texture is still bound as the render target’s depth attachment. By specifying a cheap override material you can fix the errors in Chrome and Firefox :tada:
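
Something like this should do it (a sketch using the names from your snippet; MeshDepthMaterial is just one cheap option for the override):

// Depth pass: use a cheap override material so nothing samples
// shadowTarget.depthTexture while it is bound as the depth attachment
var depthPassMaterial = new THREE.MeshDepthMaterial();

scene.overrideMaterial = depthPassMaterial;
renderer.setRenderTarget(shadowTarget);
renderer.render(scene, shadowCamera);

// Main pass: restore the original materials; the depth texture is now only read
scene.overrideMaterial = null;
renderer.setRenderTarget(null);
renderer.render(scene, camera);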

Besides, you should not start your render loop with requestAnimationFrame(); a simple call of draw() is sufficient. Also consider optimizing the render target for the depth texture: you don’t need a stencil buffer, and you can use NearestFilter and RGB as the format.
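
For reference, the render target setup could look like this (the option values are just a suggestion):

// Leaner render target for the depth pass
var shadowTarget = new THREE.WebGLRenderTarget(textureSize, textureSize, {
    format: THREE.RGBFormat,        // no alpha needed in the color attachment
    minFilter: THREE.NearestFilter, // no filtering needed on the target's color texture
    magFilter: THREE.NearestFilter,
    stencilBuffer: false            // the shadow pass does not need a stencil buffer
});
shadowTarget.depthTexture = new THREE.DepthTexture();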

That fixed everything. Thank you!

Is this specific to this use case? Depth textures can make good use of stencil buffers; the problem is that they cannot be shared between targets :slightly_frowning_face:
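
For example, a depth texture can carry the stencil bits as well, but only for its own target (a sketch; size stands for whatever dimensions the target uses):

var target = new THREE.WebGLRenderTarget(size, size);
// Combined depth/stencil storage in a single depth texture
target.depthTexture = new THREE.DepthTexture(size, size);
target.depthTexture.format = THREE.DepthStencilFormat;
target.depthTexture.type = THREE.UnsignedInt248Type;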

That’s right.
