How to use arrayTextures with wgslFn

I haven’t had a WebGPU question for a while, but one has been bothering me again for about a week and I haven’t found a solution yet.

I have a CodePen example here. I’m interested in passing a DataArrayTexture to a WGSL shader.

The example is as short as possible and therefore quite boring, but for me at the moment it’s all about correct use.

Does anyone know how to get a texture 2D array from three.js into the shader?

I get a long error message, and it looks as if the texture node was not designed for this.
If, on the other hand, you attach the texture node directly to the outputNode with uv() and a depth value, as in the WebGPU example, then it works. It seems as if usage is currently limited to that path, but that’s pure speculation on my part. If anyone knows better, please feel free to correct me.
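For reference, the direct path that does work looks roughly like this (a sketch based on my reading of the official WebGPU 2D-array example; variable names and the `.depth()` accessor are how I recall it, not guaranteed):

```javascript
import { MeshBasicNodeMaterial, texture, uv } from 'three/examples/jsm/nodes/Nodes.js';

// arrayTexture is a THREE.DataArrayTexture; attaching the texture node
// directly to the material output with uv() and a depth value works:
const material = new MeshBasicNodeMaterial();
material.colorNode = texture( arrayTexture, uv() ).depth( layerIndex );
```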

As far as I know, WGSL doesn’t have array textures.

WGSL has texture_2d_array. And in the table above you can see the sample types f32, i32, u32.

In my case then: texture_2d_array<f32>

So far WGSL has everything I know from GLSL, and I already use a lot of it intensively. Just no array textures yet.
But before I create a ticket in the developer forum on GitHub, I want to be as sure as possible that it is an actual issue and not a user error on my part.
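On the WGSL side, the declaration and sampling I have in mind look like this (plain WGSL sketch; binding indices are illustrative):

```wgsl
@group(0) @binding(0) var arrayTexture : texture_2d_array<f32>;
@group(0) @binding(1) var arraySampler : sampler;

fn sampleLayer( vUv : vec2<f32>, layer : i32 ) -> vec4<f32> {
  // textureSample takes the array layer index as an extra argument
  return textureSample( arrayTexture, arraySampler, vUv, layer );
}
```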

May be unrelated, but on line 48 did you mean arraySampler?

When corrected, does this bring up the shader error you’re experiencing?


You’re right, I’ll correct that right away, but that doesn’t fix the warning. I just recreated the CodePen from my computer. Thank you anyway; of course it has to be right.

P.S.

I’ve already used cube textures (texture_cube&lt;f32&gt;); there is a dedicated cubeTexture node for this in the node system.
For array textures I don’t see anything comparable in three/examples/jsm/nodes/Nodes.js. Therefore I suspect that the texture node is meant to cover this, but I don’t know. I get a type mismatch in the console.
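For comparison, the cube texture path that already works for me looks roughly like this (sketch; cubeMap stands for a CubeTexture loaded elsewhere):

```javascript
import { cubeTexture } from 'three/examples/jsm/nodes/Nodes.js';

// cubeMap is a THREE.CubeTexture; the dedicated cubeTexture node
// maps it to a texture_cube<f32> binding in the generated WGSL
material.colorNode = cubeTexture( cubeMap );
```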

@Attila_Schroeder it looks like your original GitHub issue from August last year was looked into and a solution was provided; is StorageTextureNode what you’re looking for?

That’s solved. Back then it was a story that lasted several months, about being able to generate textures with compute shaders, which wasn’t possible at the time. It all works wonderfully now; that’s what made my WebGPU ocean possible.
Postprocessing and much more also works very well in WebGPU now. But getting a 2D array texture into a WGSL shader using the wgslFn node is a new thing.

I don’t think you can do much wrong on the shader side, because I can follow the W3C documentation:

```wgsl
array_texture: texture_2d_array<f32>,
array_texture_sampler: sampler,
```
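For context, this is roughly how I’m trying to wire it up with wgslFn (a rough reconstruction of my CodePen, not guaranteed correct; the type mismatch happens here, which is exactly what I’m asking about):

```javascript
import { wgslFn, texture, uv } from 'three/examples/jsm/nodes/Nodes.js';

const sampleArrayTexture = wgslFn( `
  fn sampleArrayTexture(
    array_texture: texture_2d_array<f32>,
    array_texture_sampler: sampler,
    vUv: vec2<f32>,
    layer: i32
  ) -> vec4<f32> {
    return textureSample( array_texture, array_texture_sampler, vUv, layer );
  }
` );

// dataArrayTexture is a THREE.DataArrayTexture; passing it through the
// generic texture() node is what seems to cause the type mismatch
material.colorNode = sampleArrayTexture( {
  array_texture: texture( dataArrayTexture ),
  array_texture_sampler: texture( dataArrayTexture ),
  vUv: uv(),
  layer: 0
} );
```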

I’m just not sure yet whether there is an issue with the texture node in connection with array textures and the wgslFn node, or whether I’m simply missing the right node for it.
