Hi,
I am working on a physics simulation project using GPGPU and was happy to hear that DataTexture2DArray is now supported in three.js. However, when trying to render to a DataTexture2DArray I have run into some problems.
Problem description
Rendering to a DataTexture2DArray and reading the pixels back from the render target afterwards works fine; the output is as expected. But when using renderTarget.texture as input for a subsequent render, it does not work, though only the first time.
I was following the “webgl2_rendertarget_texture2darray” example at three.js examples, where a DataTexture2DArray is used as a render target. Visually it looks like it works, but digging a bit deeper I believe that the first render succeeds while the subsequent render fails to use the render target as input, but only the first time. The following times it works, which can be verified visually. My guess is that the very first render (the output to the screen) is actually completely black. I believe this is a bug in three.js (r126 / latest) or in the example.
I have made a fiddle to reproduce the behavior: https://jsfiddle.net/sdr5vpxh/
The input texture (2x1x2) contains the values 1, 2, 3, 4, 5, 6, 7, 8 divided into two layers, and the shader simply outputs the input values.
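For context, the setup looks roughly like this (a sketch, not the exact fiddle code; the RGFormat/FloatType choice and the uniform and variable names are assumptions for illustration):

// Illustrative setup: a small DataTexture2DArray as input and a pass-through
// ShaderMaterial that samples one layer of it, rendered with a full-screen quad.
const inputData = new Float32Array( [ 1, 2, 3, 4, 5, 6, 7, 8 ] );

const inputTexture = new THREE.DataTexture2DArray( inputData, 2, 1, 2 ); // 2x1, two layers
inputTexture.format = THREE.RGFormat; // assumption: two channels, so the eight values fill 2x1x2
inputTexture.type = THREE.FloatType;  // assumption
inputTexture.needsUpdate = true;

const passThroughMaterial = new THREE.ShaderMaterial( {
	glslVersion: THREE.GLSL3, // sampler2DArray requires GLSL 3.00 (WebGL 2)
	uniforms: {
		uInput: { value: inputTexture }, // a DataTexture2DArray or renderTarget.texture
		uLayer: { value: 0 }
	},
	vertexShader: `
		out vec2 vUv;
		void main() {
			vUv = uv;
			gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
		}`,
	fragmentShader: `
		precision highp sampler2DArray;
		uniform sampler2DArray uInput;
		uniform int uLayer;
		in vec2 vUv;
		out vec4 outColor;
		void main() {
			// Pass-through: write the sampled input value out unchanged.
			outColor = texture( uInput, vec3( vUv, uLayer ) );
		}`
} );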
When rendering twice with a plain DataTexture (2x1, containing the values 1, 2, 3, 4):
Result after first render: [0, 1, 2, 3] (as expected)
Result after second render: [0, 1, 2, 3] (as expected)
But when rendering twice using a DataTexture2DArray (both layers), the output is:
Result after first render of layer 0: [0, 1, 2, 3] (as expected)
Result after second render of layer 0: [0, 0, 0, 0] (unexpected)
Result after first render of layer 1: [4, 5, 6, 7] (as expected)
Result after second render of layer 1: [4, 5, 6, 7] (as expected)
Thus, the second pass does not seem to use the first render target as input, resulting in only zeroes. When adding more layers it still works, but the second render of the first layer always fails to use the previous render target as input. I suspect three.js does not recognize renderTarget.texture as a texture that already exists on the GPU and reinitializes it the first time it is used as input.
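For reference, the per-layer sequence that produces the results above looks roughly like this (again a sketch, not the exact fiddle code; I am assuming two identically created layered render targets for the ping-pong, built as described further down):

// Simplified per-layer test sequence. renderTarget and renderTarget2 are two
// identical layered targets (see below); readBuffer is a typed array sized to one slice.

// Pass 1: input is the DataTexture2DArray, output goes to layer 0 of renderTarget.
passThroughMaterial.uniforms.uInput.value = inputTexture;
renderer.setRenderTarget( renderTarget, 0 ); // the second argument selects the layer
renderer.render( scene, camera );
renderer.readRenderTargetPixels( renderTarget, 0, 0, renderTarget.width, renderTarget.height, readBuffer );
// -> expected values, as listed above for the first renders

// Pass 2: now use the previous render target texture as input.
passThroughMaterial.uniforms.uInput.value = renderTarget.texture;
renderer.setRenderTarget( renderTarget2, 0 );
renderer.render( scene, camera );
renderer.readRenderTargetPixels( renderTarget2, 0, 0, renderTarget2.width, renderTarget2.height, readBuffer );
// -> only zeroes for layer 0, and only the very first time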
I do not know if the render target (for a DataTexture2DArray) should be created differently, but I have used the same method as in the original example (webgl2_rendertarget_texture2darray):
renderTarget.depth = xx;
renderTarget.setTexture( renderTargetTexture );
When using a DataTexture2DArray as a render target texture, the example shows that you need to explicitly set the texture. Maybe this is why it fails. I don’t know if it is possible to provide the depth from the beginning when creating the render target.
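For completeness, this is roughly how the layered render target ends up being created when following the example (width, height and layers stand in for the actual values; the format/type lines are my assumptions to match the float input data):

// Render target creation following the example: create a plain WebGLRenderTarget,
// then set the depth and swap in a DataTexture2DArray as its texture.
const renderTargetTexture = new THREE.DataTexture2DArray( null, width, height, layers );
renderTargetTexture.format = THREE.RGFormat; // assumption, matching the input texture
renderTargetTexture.type = THREE.FloatType;  // assumption

const renderTarget = new THREE.WebGLRenderTarget( width, height );
renderTarget.depth = layers;
renderTarget.setTexture( renderTargetTexture );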
I would be grateful if someone knows anything about this.
Thanks
Magnus