Render to DataTexture2DArray does not work correctly - is it a bug?

Hi,

I am working on a physics simulation project using GPGPU and was happy to hear that DataTexture2DArray is now supported in three.js. However, when trying to render to a DataTexture2DArray I have run into some problems.

Problem description
Rendering to a DataTexture2DArray and reading the pixels back from the renderTarget afterwards works fine; the output is as expected. But when using renderTarget.texture as input for a subsequent render, it does not work, but only the first time.

I was following the “webgl2_rendertarget_texture2darray” example at three.js examples, where a DataTexture2DArray is used as a render target. Visually it looks like it works, but digging a bit deeper, I believe that the first render succeeds while the subsequent render fails to use the render target as input, but only the first time. The following times it works, which can be verified visually. My guess is that the very first render (output to the screen) is totally black. I believe it is a bug in three.js (R126 / latest) or in the example.

I have made a fiddle to reproduce the behavior: https://jsfiddle.net/sdr5vpxh/

The input texture (2x1x2) contains the values 0,1,2,3,4,5,6,7 divided into two layers, and the shader simply outputs its input.
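
Roughly, the setup looks like this (a simplified sketch; the variable names, dimensions and RGBA float layout here are illustrative and may differ from the fiddle):

const width = 2, height = 1, depth = 2;

// one RGBA float texel per position, filled with increasing values
const data = new Float32Array( width * height * depth * 4 );
for ( let i = 0; i < data.length; i ++ ) data[ i ] = i;

const inputTexture = new THREE.DataTexture2DArray( data, width, height, depth );
inputTexture.format = THREE.RGBAFormat;
inputTexture.type = THREE.FloatType;

// pass-through fragment shader, used in a ShaderMaterial with glslVersion: THREE.GLSL3
const fragmentShader = `
    precision highp float;
    precision highp sampler2DArray;
    uniform sampler2DArray uInput;
    uniform int uLayer;
    in vec2 vUv;
    out vec4 outColor;
    void main() {
        outColor = texture( uInput, vec3( vUv, uLayer ) );
    }
`;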

When rendering two times with a DataTexture (2x1 = 0,1,2,3):
Result after first render: [0, 1, 2, 3] (as expected)
Result after second render: [0, 1, 2, 3] (as expected)

But when rendering two times using a DataTexture2DArray (both layers), the output is:
Result after first render of layer 0: [0, 1, 2, 3] (as expected)
Result after second render of layer 0: [0, 0, 0, 0] (unexpected)
Result after first render of layer 1: [4, 5, 6, 7] (as expected)
Result after second render of layer 1: [4, 5, 6, 7] (as expected)
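
The test sequence is essentially this (a simplified sketch with assumed names, using two render targets to avoid a feedback loop; see the fiddle for the exact code):

const buffer = new Float32Array( width * height * 4 );

// first pass: inputTexture -> renderTargetA
material.uniforms.uInput.value = inputTexture;
renderer.setRenderTarget( renderTargetA, layer );
renderer.render( scene, camera );
renderer.readRenderTargetPixels( renderTargetA, 0, 0, width, height, buffer );
console.log( buffer ); // first result

// second pass: renderTargetA.texture -> renderTargetB (the step that fails once)
material.uniforms.uInput.value = renderTargetA.texture;
renderer.setRenderTarget( renderTargetB, layer );
renderer.render( scene, camera );
renderer.readRenderTargetPixels( renderTargetB, 0, 0, width, height, buffer );
console.log( buffer ); // second result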

Thus, the second pass does not seem to use the first render target as input, resulting in only zeroes. When adding more layers it works as well, but the second render of the first layer always fails to use the previous render target as input. I suspect three.js does not recognize renderTarget.texture as a texture that already exists on the GPU, and reinitializes it upon its first use as input.

I do not know if the render target (for a DataTexture2DArray) should be created differently, but I have used the same method as in the original example (webgl2_rendertarget_texture2darray):

renderTarget.depth = xx; // xx = the number of layers
renderTarget.setTexture( renderTargetTexture );

When using a DataTexture2DArray as the render target texture, the example shows that you need to set the texture explicitly. Maybe this is why it fails. I don’t know if it is possible to provide the depth from the beginning, when creating the render target.
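
Put together, the whole setup looks roughly like this (a sketch following the example; the names and the RGBA/float format are illustrative):

const renderTargetTexture = new THREE.DataTexture2DArray( null, width, height, depth );
renderTargetTexture.format = THREE.RGBAFormat;
renderTargetTexture.type = THREE.FloatType;

const renderTarget = new THREE.WebGLRenderTarget( width, height );
renderTarget.depth = depth; // number of layers
renderTarget.setTexture( renderTargetTexture ); // replace the default Texture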

I would be grateful if someone knows anything about this.

Thanks

Magnus

After some debugging in three.js I could observe that the texture actually gets uploaded again in WebGLTextures.setTexture2DArray when it shouldn’t.

In the corresponding texture2D case it is not uploaded again; the previous framebuffer texture is used correctly.
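
The relevant check in WebGLTextures looks essentially like this (abbreviated):

// WebGLTextures.setTexture2DArray, essentially
function setTexture2DArray( texture, slot ) {
    const textureProperties = properties.get( texture );
    if ( texture.version > 0 && textureProperties.__version !== texture.version ) {
        // the version was bumped by needsUpdate = true in the constructor,
        // so the (empty) texture gets uploaded again, clobbering the
        // framebuffer texture that the first render just wrote to
        uploadTexture( textureProperties, texture, slot );
        return;
    }
    state.activeTexture( 33984 + slot );
    state.bindTexture( 35866, textureProperties.__webglTexture );
}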

I have finally found a workaround!

By setting the version back to 0, the texture is not uploaded again:

renderTargetTexture.version = 0;

I have updated the fiddle with this workaround: https://jsfiddle.net/trhxqmgb/
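
In context, the workaround goes right after creating the render target texture:

const renderTargetTexture = new THREE.DataTexture2DArray( null, width, height, depth );
renderTargetTexture.version = 0; // undo the needsUpdate = true from the constructor, so nothing is re-uploaded
renderTarget.setTexture( renderTargetTexture );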

Another way would be to fix it in the source code. One option could be to check, in the constructor of DataTexture2DArray, whether data contains anything, and only then mark the texture with needsUpdate:

// in the DataTexture2DArray constructor
if ( data ) {
    this.needsUpdate = true;
}

Which also works.

As I understand it, needsUpdate cannot be revoked from outside the texture (by setting it to false), because the version number has already been incremented by then.
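
The reason is the needsUpdate setter in Texture, which only ever increments the version and ignores false; essentially:

// src/textures/Texture.js, essentially
set needsUpdate( value ) {
    if ( value === true ) this.version ++;
}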

An even better fix might be for the constructor of WebGLRenderTarget to create the DataTexture2DArray itself, deciding whether it should render to a DataTexture2DArray instead of a Texture, perhaps by taking depth as an option. I leave this to the experts 🙂
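
Purely hypothetically, something like this (a depth option like this does not exist in the current WebGLRenderTarget constructor):

const renderTarget = new THREE.WebGLRenderTarget( width, height, { depth: 2 } );

// and inside the WebGLRenderTarget constructor, roughly:
// this.texture = ( options.depth !== undefined && options.depth > 1 )
//     ? new DataTexture2DArray( null, width, height, options.depth )
//     : new Texture();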

And perhaps there are better ways to work around this issue.


Hey, I contributed this render target feature. Thanks for letting us know, because I didn’t see that bug at the time.

Your workaround works, but that is indeed not super nice. I will make a PR in Three.js this weekend.

Thanks again for investigating that bug!

EDIT: I would say a good fix is basically just to set the version, as in your workaround. Changing needsUpdate in the DataTexture class makes sense to me as well anyway, but we would still have the issue that the render target generates a texture that has no version.


Thank you!

I am so happy that this feature, which you contributed, exists in three.js in the first place, because with a lot of particles the maximum texture size can “easily” be exceeded, so adding layers solves a lot of issues. I wrote some WebGL 2.0 code of my own that worked well in isolation, but when I tried to merge it with three.js, it failed big time, of course…!
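
Just to illustrate the limits involved, they can be queried from the WebGL2 context (the values in the comments are typical examples; they differ per GPU):

const gl = renderer.getContext();
console.log( gl.getParameter( gl.MAX_TEXTURE_SIZE ) ); // e.g. 16384 per side
console.log( gl.getParameter( gl.MAX_ARRAY_TEXTURE_LAYERS ) ); // e.g. 2048 layers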

The workaround works for me, but I agree it is not nice.

Yes, I saw that DataTexture also sets needsUpdate = true. I agree, it makes sense to make the change there as well. However, as I understand it, that causes no issues, because you never really render to a DataTexture; the render target texture is always a plain Texture by default, which works as long as it is not a multilayered texture. So that issue would never be encountered there.

Hey!

Can you confirm this PR fixed your issue, please?

If it didn’t, let me know and I will look into it further.


Yes, confirmed working.

I’ve updated the fiddle https://jsfiddle.net/8oLsczby/ to use the dev build, and it seems to work as expected – without the workaround.

Thanks a lot for your quick response and actions!

Still, maybe there should be a more elegant way to create a render target with depth in the future, but I am very happy with it as it is.
