As of now (r104), three.js doesn’t seem to have a WebGLRenderTarget implementation compatible with DataTexture3D and DataTexture2DArray. Would such an implementation be possible, and does it make sense?
The way I understand it, WebGL limitations make it impossible to render directly into an entire Texture3D. So such an implementation would have to bind individual z-slices of the texture as the render output and iterate over the depth dimension.
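To make the idea concrete, here is a rough sketch of that loop in raw WebGL2 (not the three.js API); `gl` is assumed to be a WebGL2 context and `program` a compiled full-screen-triangle program with a hypothetical `uSlice` uniform:

```js
// Sketch only: render into a 3D texture one z-slice at a time
// by attaching each layer with framebufferTextureLayer.
const size = 64;
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_3D, tex);
gl.texImage3D(gl.TEXTURE_3D, 0, gl.RGBA8, size, size, size, 0,
              gl.RGBA, gl.UNSIGNED_BYTE, null);
gl.texParameteri(gl.TEXTURE_3D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_3D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);

const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.useProgram(program);
gl.viewport(0, 0, size, size);

const uSlice = gl.getUniformLocation(program, 'uSlice');
for (let z = 0; z < size; z++) {
  // Attach slice z of the 3D texture as the color output.
  gl.framebufferTextureLayer(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, tex, 0, z);
  gl.uniform1f(uSlice, (z + 0.5) / size);
  gl.drawArrays(gl.TRIANGLES, 0, 3); // fullscreen pass writes this slice
}
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
```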
Is this possible with the current texture implementation, or would it require changes to the way the texture data is currently handled?
The main reason for this question is that I need DataTexture3D in a GPGPU pipeline, and without a way to render the result into a new 3D texture, any such attempt falls apart.
I understand that it is possible to emulate a DataTexture3D with a regular 2D texture (e.g. a tiled atlas of z-slices), but then it isn’t possible to sample along the depth axis with LinearFilter. Computing the depth interpolation manually in the shader (sketched below) seems to add too much overhead, which also makes it not a viable option.
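For completeness, this is roughly the kind of manual interpolation I mean. All names and the tiling scheme here are made up for illustration, assuming the volume is packed into a 2D atlas of z-slices:

```js
// Hypothetical GLSL chunk (names invented for illustration): a volume packed
// into a 2D atlas of z-slices, sampled with manual interpolation along z.
const sampleVolumeChunk = /* glsl */ `
uniform sampler2D uAtlas;     // tiled 2D texture holding the volume slices
uniform float uDepth;         // number of z-slices
uniform float uSlicesPerRow;  // tiles per atlas row

vec2 sliceUv(vec2 uv, float slice) {
  float col  = mod(slice, uSlicesPerRow);
  float row  = floor(slice / uSlicesPerRow);
  float rows = ceil(uDepth / uSlicesPerRow);
  return (uv + vec2(col, row)) / vec2(uSlicesPerRow, rows);
}

vec4 sampleVolume(vec3 p) {
  float z  = p.z * uDepth - 0.5;
  float z0 = clamp(floor(z), 0.0, uDepth - 1.0);
  float z1 = min(z0 + 1.0, uDepth - 1.0);
  // Two 2D fetches plus a mix() per sample, instead of one hardware-filtered
  // 3D fetch -- this is the overhead mentioned above.
  vec4 a = texture2D(uAtlas, sliceUv(p.xy, z0));
  vec4 b = texture2D(uAtlas, sliceUv(p.xy, z1));
  return mix(a, b, clamp(z - z0, 0.0, 1.0));
}
`;
```

With a real TEXTURE_3D and LinearFilter the hardware would do all of this in a single fetch, which is why the atlas route feels like a dead end for my case.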
Would appreciate any insights and perhaps different approaches to the problem.