I am running into behaviour I don’t understand with WebGLRenderTarget while implementing shadow casting from a tutorial. The algorithm is:
- Render the scene into a render target from the point of view of a light source, storing the depth data in a texture
- Render the scene on canvas using the main camera and read the depthTexture data to decide light/shadow colors for fragments
According to the examples I have seen this can be achieved by:
```js
var shadowTarget = new THREE.WebGLRenderTarget(textureSize, textureSize);
shadowTarget.depthBuffer = true;
shadowTarget.depthTexture = new THREE.DepthTexture();

// Render depth information from the light's point of view
renderer.setRenderTarget(shadowTarget);
renderer.render(scene, shadowCamera);

// Render the main scene
renderer.setRenderTarget(null);
renderer.render(scene, camera);
```
On some machine/browser combinations I get wrong behaviour, which I suspect comes from me misunderstanding how textures are bound:
In Safari on a MacBook and in Chrome on Linux, the scene renders correctly.
In Chrome on my MacBook, the scene initially displays lines across the geometry ("shadow acne"), which is a known artifact of this shadow algorithm. The shader code corrects it with a bias variable, but the scene does not update correctly afterwards. One of several things that restores correct visuals is adding a camera helper (the checkbox in the example). Does adding the helper trigger some scene/renderer update that I could call explicitly?
In Firefox, the resulting depthTexture is not used at all, and the console shows:

```
Error: WebGL warning: drawElements: Texture level 0 would be read by TEXTURE_2D unit 0, but written by framebuffer attachment DEPTH_ATTACHMENT, which would be illegal feedback.
```
As I understand it, a texture can’t be both read and written within the same draw call. But by the time I read the texture, I am rendering to the canvas, not to the render target. The error only appears in Firefox, not in Chrome or Safari.
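In case it helps, this is roughly how I imagine breaking the potential feedback loop: detach the depth texture from the material before the shadow pass so the shader never samples the attachment it is writing to. This is an untested sketch, and `shadowUniforms.depthMap` is just a placeholder for whatever uniform your material actually uses:

```js
// Shadow pass: make sure the shader does NOT sample the texture
// that is currently bound as the framebuffer's depth attachment.
shadowUniforms.depthMap.value = null;
renderer.setRenderTarget(shadowTarget);
renderer.render(scene, shadowCamera);

// Main pass: re-attach the depth texture; now it is only read,
// never written, so there should be no feedback.
shadowUniforms.depthMap.value = shadowTarget.depthTexture;
renderer.setRenderTarget(null);
renderer.render(scene, camera);
```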
- Why does the code give an error in Firefox but not in Chrome or Safari?
- What is the update/change that adding a helper triggers?
- Can I unbind the depthTexture from the material/program so I can use it for reading without undefined behaviour?
- How do I properly clone the depthTexture data if I need to use two WebGLRenderTarget objects?
- Am I the only one seeing this behaviour?
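For the cloning question, the two-target setup I have in mind looks roughly like this: render depth into the first target, then do a full-screen copy pass that writes those depth values into the second target's color texture, which the main pass then samples. Untested sketch; the geometry/material names are just what I would try:

```js
// Target A holds the real depth attachment written by the shadow pass.
var targetA = new THREE.WebGLRenderTarget(textureSize, textureSize);
targetA.depthTexture = new THREE.DepthTexture(textureSize, textureSize);

// Target B receives a copy, so the main pass never reads a texture
// that is also bound as a framebuffer attachment.
var targetB = new THREE.WebGLRenderTarget(textureSize, textureSize);

// Full-screen quad that copies targetA's depth texture into targetB.
var copyScene = new THREE.Scene();
var copyCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
copyScene.add(new THREE.Mesh(
  new THREE.PlaneGeometry(2, 2),
  new THREE.MeshBasicMaterial({ map: targetA.depthTexture })
));

// 1) Shadow pass writes depth into targetA.
renderer.setRenderTarget(targetA);
renderer.render(scene, shadowCamera);

// 2) Copy pass reads targetA.depthTexture, writes targetB's color buffer.
renderer.setRenderTarget(targetB);
renderer.render(copyScene, copyCamera);

// 3) Main pass samples targetB.texture instead of targetA.depthTexture.
renderer.setRenderTarget(null);
renderer.render(scene, camera);
```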
Thanks in advance!