Hi,
I have a `ShaderMaterial` and I'm just changing a uniform (a `Texture`) in it.
First I call `.dispose()` on the texture (`(material.uniforms.uPalette.value as THREE.Texture).dispose()`), then assign a new one.
What I find weird is that in the stats, `renderer.info.memory.textures` keeps increasing, but the memory usage stays roughly the same (sometimes it even goes down a bit), so I'm assuming the texture really is being disposed…
Is my assumption wrong? If not, what could be the reason for the growing texture count in the stats?
That is strange, since when you call `dispose()` on a texture, the memory count should go down as long as the texture's source data is not used by other textures. Do you mind demonstrating the issue with a live example?
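For reference, the swap-and-dispose pattern I'd expect to keep the texture count stable looks roughly like this (a sketch; the structural interfaces below stand in for `THREE.Texture` / `THREE.ShaderMaterial` so the snippet is self-contained, and `uPalette` is the uniform name from your snippet):

```typescript
// Structural stand-ins for THREE.Texture / THREE.ShaderMaterial, just so
// the sketch is self-contained; in real code these come from three.js.
interface TextureLike { dispose(): void; }
interface MaterialLike { uniforms: { [name: string]: { value: unknown } }; }

function swapPaletteTexture(material: MaterialLike, next: TextureLike): void {
  const old = material.uniforms.uPalette.value as TextureLike | undefined;
  old?.dispose(); // frees the GPU-side copy; the JS object stays valid
  material.uniforms.uPalette.value = next;
}
```

Disposing the old texture only frees its GPU copy, so the order of the `dispose()` call and the reassignment shouldn't matter much.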
Thanks @Mugen87, I did some debugging and it looks like this was caused by a `CubeTexture` I was generating. I created a new `WebGLCubeRenderTarget` for every new cube texture, and even though I called dispose on the texture itself, that was not enough… Now I reuse the same render target every time, and so far it looks perfectly fine (I still have that one extra texture lying around in the render target, but I can live with that for now).
When using render targets, you should call `dispose()` on the render target, not on the render target's texture. Maybe this created the confusion?
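To illustrate the ownership with a toy model (the classes below are illustrative, not three.js internals): the target owns both its color texture and the framebuffer/renderbuffer allocated with it, and only disposing the target itself releases everything.

```typescript
// Toy ownership model; class and field names are made up for illustration.
class GpuHandle {
  freed = false;
  free(): void { this.freed = true; }
}

class ToyRenderTarget {
  readonly texture = new GpuHandle();     // color attachment
  readonly framebuffer = new GpuHandle(); // allocated alongside the texture

  // Mirrors the idea behind WebGLRenderTarget.dispose(): disposing the
  // target releases everything it owns, framebuffer included.
  dispose(): void {
    this.texture.free();
    this.framebuffer.free();
  }
}
```

Freeing only `target.texture` would leave the framebuffer allocated, which is why disposing just the render target's texture is not enough.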
Yeah, I guess I was under the impression that calling `.dispose()` on the render target would destroy its texture as well, so I was just trying to get rid of the texture on its own.
Now I do the following (the render target is a static prop on the class):
- create a new texture using the render target
- assign the texture to the shader uniform
- repeat…
So the last created texture is always “stuck”, but I'm fine with that for now.
I wonder whether disposing the render target every time, right after assigning the texture to the uniform, would be a better/more performant approach, but I don't think so.
Definitely not. A lot of resources (frame/renderbuffers) are allocated in the background when render targets are created, so it's best to avoid `dispose()` whenever possible.
Makes sense.
While I have you here, Master @Mugen87: do you maybe have some pointers on how I should approach the following issue (or whether it's possible at all)?
Building on the above: instead of using a loop to generate many normal/canvas/cube textures and blocking the main thread while doing so, I want to generate a texture sprite in a web worker using `OffscreenCanvas`. It works really well for normal textures (gradients/text/etc.), but now I somehow need to add these cube textures as well… I need to get them onto the canvas so I can use `.transferToImageBitmap()` and send everything as `Transferable` objects to the main thread.
Here is the code I use for generating these:
```ts
private cubeTextureScene = new THREE.Scene();

private cubeRenderTarget = new THREE.WebGLCubeRenderTarget(this.resolution, {
  format: THREE.RGBAFormat,
  generateMipmaps: true
});

private cubeCamera = new THREE.CubeCamera(0.01, 10, this.cubeRenderTarget);

private cubeMesh = new THREE.Mesh(
  new THREE.SphereGeometry(0.5, this.sphereSegments, this.sphereSegments)
);

public getCubeTexture(
  radius: number,
  uniforms: { [uniform: string]: THREE.IUniform },
  vertexShader: string,
  fragmentShader: string,
  side = THREE.DoubleSide
): THREE.CubeTexture {
  this.cubeMesh.scale.set(radius, radius, radius);
  this.cubeMesh.updateMatrix();
  this.cubeMesh.material = new THREE.ShaderMaterial({ side, uniforms, vertexShader, fragmentShader });

  // Render the sphere into all six faces of the (reused) cube render target.
  this.cubeCamera.update(App.renderer, this.cubeTextureScene);
  this.cubeMesh.material.dispose();

  const texture = this.cubeRenderTarget.texture;
  texture.colorSpace = THREE.SRGBColorSpace;
  return texture;
}
```
Is there any way to get the actual images out of this? (I couldn't really find anything helpful so far…) If yes, and I put them on the sprite canvas, what would then be the best way to re-create the cube texture from the sprite parts on the main thread? Or, even better, could I just use the sprite as-is and somehow pinpoint the relevant part as the cube texture(s)?
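One direction that might work (a hedged, untested sketch): `WebGLRenderer.readRenderTargetPixels()` accepts an `activeCubeFaceIndex` argument for cube render targets, so each face could be read back into a typed array and drawn onto the sprite. The helper below just computes where a face lands in a simple row layout; `renderer`, `cubeRenderTarget`, `ctx`, and `size` in the comments are placeholders.

```typescript
// Pure layout helper: where face `i` (0..5, in +X, -X, +Y, -Y, +Z, -Z
// order) lands inside a sprite that packs the six faces in a single row.
interface Rect { x: number; y: number; w: number; h: number; }

function cubeFaceRect(faceIndex: number, faceSize: number): Rect {
  return { x: faceIndex * faceSize, y: 0, w: faceSize, h: faceSize };
}

// In the worker, roughly (not runnable here without a GL context):
//
//   const pixels = new Uint8Array(size * size * 4);
//   for (let face = 0; face < 6; face++) {
//     renderer.readRenderTargetPixels(
//       cubeRenderTarget, 0, 0, size, size, pixels, face);
//     const image = new ImageData(new Uint8ClampedArray(pixels), size, size);
//     const { x, y } = cubeFaceRect(face, size);
//     ctx.putImageData(image, x, y); // rows may need a vertical flip
//   }
//
// On the main thread, the six faces could then be cut back out of the
// sprite and assigned to a CubeTexture's `images` array, one per face.
```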