Use TexturePacker atlas in an InstancedMesh

Hi all!

I’m trying to render a sh*t ton of icons with the help of an InstancedMesh :slight_smile: .

The geometry of each instance is just a simple PlaneGeometry/Quad.

The possible different icons are all included in a texture atlas created with TexturePacker.
TexturePacker generates the PNG for the texture (trimming transparent pixels, adding padding/margins to prevent texture bleeding, removing duplicates, etc.) along with a JSON file where each icon is defined by a frame object that looks like this (basically the rectangle of that specific icon in the generated image):
"frame": {"x":68, "y":4028, "w":64, "h":64}

The ideal generated image looks more or less like this and is 2048x4096px (I have different ones for different resolutions, types, etc):

Thanks to other threads in the forum (like this one or this one) I managed to somewhat draw a portion of the texture atlas using a modified vertex shader from MeshLambertMaterial:

shader.vertexShader = `
	uniform float texScale;
	uniform float texSize;
	attribute vec2 texOffset;
	${shader.vertexShader}
`;

shader.vertexShader = shader.vertexShader.replace(
	'#include <uv_vertex>',
	`
	#include <uv_vertex>
	vec2 customUV = (uv * texSize * texScale) + texOffset * texScale;
	vMapUv = customUV;
	`
);

But that only works with an ‘ideal’ texture atlas image where:

  • all icons have the same size
  • there is no variable ‘empty’ space in the texture (TexturePacker packs everything it can into the top left corner of the image, but UV coordinates start at the bottom left, which can be empty pixels…)

Even if I tweak the texture to make all icons the same size and to avoid empty space at the bottom of the image, it kinda works, but with no precision, as it doesn’t take into account the real position of the icon in the atlas because of the padding, etc. (notice the missing top and right debug red border):

That’s why I need to modify the shader further, so that I can feed each instance, as attributes, the four important values contained in each ‘frame’ object created by TexturePacker: the x and y offset and the width and height.
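Something like this is what I mean by feeding the frame data to each instance (a rough sketch; frames, instanceCount, geometry and the attribute names are just placeholders):

// Rough sketch: two vec2 instanced attributes per icon, filled from the
// TexturePacker frame data. 'frames', 'instanceCount' and 'geometry' are placeholders.
const texOffset = new Float32Array(instanceCount * 2); // frame.x, frame.y
const texSize = new Float32Array(instanceCount * 2);   // frame.w, frame.h

frames.forEach((f, i) => {
	texOffset.set([f.frame.x, f.frame.y], i * 2);
	texSize.set([f.frame.w, f.frame.h], i * 2);
});

geometry.setAttribute('texOffset', new THREE.InstancedBufferAttribute(texOffset, 2));
geometry.setAttribute('texSize', new THREE.InstancedBufferAttribute(texSize, 2));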

My shader understanding is sadly not as good as I would like, so any help would be very appreciated, thx!

You can store the bounding boxes of the items in some of your atlas pixels… so like… a strip at the bottom that stores bounding boxes as pixel colors… R = x, G = y, B = width, A = height or something (this only works if your total atlas size is < 255… if your atlas is larger, you might need a more clever decoding scheme)
You can read this bounding box in the vertex shader… and pass it into the fragment shader via a varying.

This is just one approach… you might be able to store the bounding boxes in an array uniform instead…
You could also store them in a smaller separate FloatType DataTexture (might be simpler conceptually, gets around the 255 w/h limit)
lots of ways to approach these things with different tradeoffs etc.
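
A rough sketch of the separate DataTexture variant (assuming one RGBA texel per icon holding x, y, w, h in pixels; frames, rectTex and iconIndex are illustrative names, not from any existing code):

// Sketch: pack one frame rect per texel into a FloatType DataTexture.
const count = frames.length;
const data = new Float32Array(count * 4);
frames.forEach((f, i) => {
	data.set([f.frame.x, f.frame.y, f.frame.w, f.frame.h], i * 4);
});
const rectTex = new THREE.DataTexture(data, count, 1, THREE.RGBAFormat, THREE.FloatType);
rectTex.needsUpdate = true;

// In the vertex shader (WebGL2) the rect can then be fetched by a per-instance icon index:
// uniform sampler2D rectTex;
// attribute float iconIndex;
// vec4 rect = texelFetch(rectTex, ivec2(int(iconIndex), 0), 0); // x, y, w, h in pixels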


@kikedao, you can try @mapbox/potpack if you need to programmatically generate the texture.
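
If it helps, potpack’s API is tiny; it takes an array of {w, h} boxes, writes the packed x/y positions back into them, and returns the overall size (a minimal sketch):

import potpack from 'potpack';

// Each box only needs w/h; potpack adds x/y in place.
const boxes = [
	{w: 64, h: 64},
	{w: 128, h: 64},
	{w: 32, h: 32},
];

const {w, h, fill} = potpack(boxes); // overall atlas size and fill ratio
console.log(w, h, fill, boxes[0].x, boxes[0].y);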

@manthrax, I’m rewriting the three-nebula particle system (so far, so good :crossed_fingers:), and that’s exactly what they’re using with a points shader; you can see it here under the GPURenderer/TextureAtlas directory.


Haha awesome… small world. :smiley:
I helped introduce potpack into that library… :smiley: @rohan-deshpande is a good friend of mine!

That’s really cool you’re digging into it… I’m sure he will be stoked. Feel free to hit me up if you have any questions or whatnot!


Thx for your answers @manthrax and @Fennec!

My problem is not passing these rectangles to the shader; I can do that with no problem using a couple of vec2 attributes per instance.

What I don’t know is how to calculate the correct mapping from these pixel coordinates to ‘vMapUv’ coordinates (I don’t think there is even a need to touch the fragment shader if vMapUv is calculated correctly).

shader.vertexShader = `
	attribute vec2 texOffset; // The x, y offset from the TexturePacker JSON
	attribute vec2 texSize; // The width and height of the icon from the TexturePacker JSON
	${shader.vertexShader}
`;

shader.vertexShader = shader.vertexShader.replace(
	'#include <uv_vertex>',
	`
	#include <uv_vertex>
	vec2 customUV = // ??? what calculation using texOffset and texSize to get the correct per instance transformed uv?
	vMapUv = customUV;
	`
);

I figured out the solution so I will post it here in case anyone needs it :slight_smile: .

This is how I create the instanced mesh with the modified vertex shader:

this.surface.iconsAtlasImg.anisotropy = this.surface.renderer.capabilities.getMaxAnisotropy();

const symbolsIconsInstancedMeshMat = new THREE.MeshLambertMaterial({
	map: this.surface.iconsAtlasImg,
	// color: 0xffffff,
	transparent: true,
});

this.symbolsIconsInstanceTexturePositions = new THREE.InstancedBufferAttribute(new Float32Array(this.symbols.length * 2), 2, true);

symbolsIconsInstancedMeshGeo.setAttribute('texOffset', this.symbolsIconsInstanceTexturePositions);

symbolsIconsInstancedMeshMat.onBeforeCompile = (shader) => {
	// The total size of the texture atlas
	shader.uniforms.atlasSize = {value: new THREE.Vector2(this.surface.iconsAtlasJson?.meta.size.w, this.surface.iconsAtlasJson?.meta.size.h)};
	// The size of each icon in the atlas; we get it from the first frame. All icons have to be the same size for this to work
	shader.uniforms.texSize = {value: new THREE.Vector2(this.surface.iconsAtlasJson?.frames[0].frame.w, this.surface.iconsAtlasJson?.frames[0].frame.h)};

	shader.vertexShader = `
		uniform vec2 atlasSize;
		uniform vec2 texSize;

		attribute vec2 texOffset;

		${shader.vertexShader}
	`;

	shader.vertexShader = shader.vertexShader.replace(
		'#include <uv_vertex>',
		`
		#include <uv_vertex>

		// Calculate UV coordinates for the texture atlas
		float uOffset = texOffset.x / atlasSize.x;
		float vOffset = 1.0 - ((texOffset.y + texSize.y) / atlasSize.y);

		vMapUv = (uv * (texSize / atlasSize)) + vec2(uOffset, vOffset);
		`
	);
};
        
symbolsIconsInstancedMeshMat.defines = {"USE_MAP" : ""};
this.symbolsIconsInstancedMesh = new THREE.InstancedMesh(symbolsIconsInstancedMeshGeo, symbolsIconsInstancedMeshMat, this.symbols.length);

Thanks to that, I’m able to use a texture atlas of any size; the only requirement is that all textures in the atlas have to be the same size.
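
Plugging the sample frame from the top of the thread into that math ({x:68, y:4028, w:64, h:64} in a 2048x4096 atlas) shows what the shader computes, including the flip from TexturePacker’s top-left origin to UV space’s bottom-left origin:

// Worked example for frame {x:68, y:4028, w:64, h:64} in a 2048x4096 atlas
const atlas = {w: 2048, h: 4096};
const frame = {x: 68, y: 4028, w: 64, h: 64};

const uOffset = frame.x / atlas.w;                      // 68 / 2048 = 0.0332
const vOffset = 1 - (frame.y + frame.h) / atlas.h;      // 1 - 4092 / 4096 = 0.00098
const uvScale = [frame.w / atlas.w, frame.h / atlas.h]; // [0.03125, 0.015625]

// vMapUv = uv * uvScale + [uOffset, vOffset]
// uv (0,0) maps to (0.0332, 0.00098), the bottom left of the icon
// uv (1,1) maps to (0.0645, 0.01660), the top right of the icon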

Then in my proxy icon objects I set each instance texture like this:

set slug(value: string) {
    if (this._slug === value) return;
    this._slug = value;

    if (this.instancedMesh && this.mesh.symbolsIconsInstanceTexturePositions) {
        const iconImageData = this.mesh.surface.iconsAtlasMap?.get(this._slug);
        if (iconImageData) {
            this.mesh.symbolsIconsInstanceTexturePositions.array[this.instanceId * 2] = iconImageData.frame.x;
            this.mesh.symbolsIconsInstanceTexturePositions.array[this.instanceId * 2 + 1] = iconImageData.frame.y;

            this.mesh.symbolsIconsInstanceTexturePositions.needsUpdate = true;
        }
    }
}

With that code everything now works perfectly; here is an example of what I’m able to render with flawless performance thanks to the instancing:



This needs a PR… @drcmda any ideas how this can be reused?