In r167 of three.js, I wrote the code below, and it seems that the renderer doesn't support an array uniform of Texture. How can I deal with this?
this.fragmentNode = tslFn( () => {
    const uniformDiffuseTextures = uniforms( this.diffuseTextures );
    const _uv = uv();
    let color = vec4( 0.0 ).toVar( 'diffuseColor' );
    // blend all diffuse textures together
    loop( this.texturesCount, ( { i } ) => {
        const _texture = texture( uniformDiffuseTextures.element( i ), _uv );
        color.assign( color.mix( _texture, 0.5 ) );
    } );
    return color;
} )();
Three.js has supported a wide range of texture types since r166, including texture_2d_array.
Since I only work with wgslFn and raw WGSL shaders, I can't say whether that wide range of texture types also applies to tslFn. It certainly works in wgslFn, because I use texture arrays there.
I have little experience with tslFn. The reason I prefer wgslFn is that I can rely on the W3C documentation for WGSL, but that's definitely not for everyone.
Thanks a lot! Could you please give me some example code? A wgslFn example would also be helpful.
I actually noticed an issue with the texture arrays: they work, but the mipmaps are not generated correctly. This is currently being fixed by the developers.
I'll let you know as soon as this is resolved and share some example code.
OK, the issue has been solved by the developers. Here is example code showing how to use texture arrays in three.webgpu.js.
Note that it will only work properly from r168 on. I originally created the CodePen for error analysis, but now it can serve as an example. I hope it helps.
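In short, the idea looks roughly like this. This is a minimal sketch rather than the full CodePen: the arrayTexture variable is assumed to be a DataArrayTexture created elsewhere, and the import paths may differ between releases.

// Sketch only; import paths vary between releases, adjust for your setup.
import * as THREE from 'three/webgpu';
import { wgslFn, texture, uv, uint } from 'three/tsl';

// WGSL function that samples a single layer of a texture_2d_array.
// Per the wgslFn convention, the same texture node is passed for both
// the texture and the sampler parameter.
const sampleArrayLayer = wgslFn( `
    fn sampleArrayLayer(
        tex: texture_2d_array<f32>,
        texSampler: sampler,
        uvCoord: vec2<f32>,
        layer: u32
    ) -> vec4<f32> {
        return textureSample( tex, texSampler, uvCoord, layer );
    }
` );

// 'arrayTexture' is assumed to be a THREE.DataArrayTexture filled elsewhere.
const arrayTextureNode = texture( arrayTexture );

const material = new THREE.MeshBasicNodeMaterial();
material.fragmentNode = sampleArrayLayer( {
    tex: arrayTextureNode,
    texSampler: arrayTextureNode,
    uvCoord: uv(),
    layer: uint( 1 ) // which layer of the array to display
} );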
Using a DataArrayTexture is OK. But converting an image to a canvas may cost a lot of CPU time, I think, and could cause a performance issue.
Do you want single textures in an array? But that way you'll quickly hit the maximum number of textures allowed in a shader. Someone ran into this problem recently.
With a DataArrayTexture you don't have this problem, because the texture_2d_array is handled as a single texture binding. But I did make a performance error: avoiding that main-thread cost is exactly what OffscreenCanvas was developed for, and I've updated the example accordingly.
GetImageData( image ) {
    // Draw the image into an OffscreenCanvas so the raw RGBA pixels
    // can be read back without touching the DOM. The constructor
    // already sets the canvas dimensions.
    const offscreen = new OffscreenCanvas( image.width, image.height );
    const context = offscreen.getContext( '2d' );
    context.drawImage( image, 0, 0 );
    return context.getImageData( 0, 0, image.width, image.height );
}
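Packing the decoded pixels into the array texture then looks roughly like this. This is a sketch: the method name is illustrative, and it assumes all images share the same dimensions, which a texture_2d_array requires anyway.

// Sketch: pack same-sized images into one THREE.DataArrayTexture,
// using the GetImageData helper above.
CreateArrayTexture( images ) {
    const width = images[ 0 ].width;
    const height = images[ 0 ].height;
    const depth = images.length;

    // 4 bytes per pixel (RGBA), one slice per image.
    const data = new Uint8Array( width * height * 4 * depth );

    images.forEach( ( image, i ) => {
        const imageData = this.GetImageData( image );
        data.set( imageData.data, width * height * 4 * i );
    } );

    const arrayTexture = new THREE.DataArrayTexture( data, width, height, depth );
    arrayTexture.needsUpdate = true;
    return arrayTexture;
}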
Alternatively, you can use "fast-png". That way you can decode the RGBA values directly and don't need a canvas at all. But OffscreenCanvas is a good solution if you don't want to pull in an additional library like fast-png.
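With fast-png, that step would look roughly like this. A hedged sketch: the function name is illustrative, and fast-png reports the channel count, so you may need to expand RGB data to RGBA yourself.

// Sketch: decode a PNG into raw pixel data without any canvas.
import { decode } from 'fast-png';

async function loadPixels( url ) {
    const response = await fetch( url );
    const buffer = await response.arrayBuffer();
    const png = decode( buffer ); // { width, height, channels, data, ... }
    // If png.channels === 3 the data is RGB and would need to be
    // expanded to RGBA before uploading to a DataArrayTexture.
    return png;
}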
If you really want to pass individual textures in an array to a shader, I haven't looked into that yet, as it never seemed like a good solution to me.
But that's just my personal opinion, not the measure of all things.
P.S. I also made the code asynchronous. That means that as long as the textures are not yet loaded, you can use dummy colors or textures in the shader, because the main thread is no longer blocked.
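Roughly like this, as a sketch that reuses the hypothetical helpers from above (plus vec4 from the same TSL imports):

// Sketch: show a placeholder color until the real array texture is ready.
const material = new THREE.MeshBasicNodeMaterial();
material.fragmentNode = vec4( 1.0, 0.0, 1.0, 1.0 ); // dummy magenta

Promise.all( urls.map( ( url ) =>
    fetch( url )
        .then( ( response ) => response.blob() )
        .then( ( blob ) => createImageBitmap( blob ) )
) ).then( ( bitmaps ) => {
    const arrayTexture = this.CreateArrayTexture( bitmaps ); // sketch from above
    const node = texture( arrayTexture );
    material.fragmentNode = sampleArrayLayer( {
        tex: node,
        texSampler: node,
        uvCoord: uv(),
        layer: uint( 0 )
    } );
    material.needsUpdate = true;
} );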
Yes, that's what I want. Actually, some of these textures are RenderTarget.texture, and others may be an Image or a Canvas. The number of textures is 1, 2, 3, or more, but not too many, and their dimensions may differ. In older versions of three.js I could use a plain array of textures in JavaScript and write something like sampler2D textures[3] in GLSL, as in the example below.
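For reference, this is roughly what that older WebGLRenderer approach looked like (the texture names here are placeholders):

// Sketch: a fixed-size array of sampler2D uniforms with ShaderMaterial.
// Note that GLSL ES 1.0 only allows constant indices into sampler arrays.
const material = new THREE.ShaderMaterial( {
    uniforms: {
        textures: { value: [ tex0, tex1, tex2 ] } // plain JS array of THREE.Texture
    },
    vertexShader: `
        varying vec2 vUv;
        void main() {
            vUv = uv;
            gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
        }
    `,
    fragmentShader: `
        uniform sampler2D textures[ 3 ];
        varying vec2 vUv;
        void main() {
            vec4 color = texture2D( textures[ 0 ], vUv );
            color = mix( color, texture2D( textures[ 1 ], vUv ), 0.5 );
            color = mix( color, texture2D( textures[ 2 ], vUv ), 0.5 );
            gl_FragColor = color;
        }
    `
} );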
In the meantime, I've come to see the benefits of OffscreenCanvas and DataArrayTexture from your answer. Maybe I should modify some logic in my app to use these effective features.