Sprite Instancing with UV mapping

I have a quick question since my understanding of instancing is not aligned with what I’m seeing happen.

Some background: I have been using canvases as the underlying source for sprite text labels, and this works really well (very fast, about 0.01 ms per sprite draw time), but each sprite costs me a draw call plus an additional megabyte of THREE texture memory. The extra memory is constant: regardless of the underlying canvas size, I always see an additional megabyte used. I don't understand why, but that is not my main question.

To reduce memory consumption and the number of draw calls, I figured I'd use an InstancedMesh (some code I found explained it well). Now that I have a single canvas with different labels inside, I am doing UV mapping for each vertex. This works, except that the UV mapping for the last 4 sprite vertices seems to be applied to all instanced sprites, which is not what I'd expect. Nothing in the code below throws an error, but I don't understand how the UV mapping works in this scenario.

My question is: why do the UV mappings in my last 4 vertices apply to all instances in the code below?


let geometry = new THREE.InstancedBufferGeometry();

// 4 vertices per sprite (figured to just do the raw UV mapping directly on each point below)
let positions = new Float32Array(10 * 3 * 4);
let uvs = new Float32Array(10 * 2 * 4);

// this is copied from the sprite code in three, and it positions on the screen fine
// to simplify this, imagine a texture that is 100x100
// and each new label starts at x = 0, y = (i * 10) (so a column of 10 rows)
for (let i = 0, j = 0, k = 0; i < 10; ++i) {
	
	// x1, y1, x2, y2, are the texture cutouts for each sprite
	let x1, x2, y1, y2;
	
	// to keep it really explicit, only the first sprite has a unique UV cutout
	// the other 9 would be the same in this example
	if (i == 0) {
	
		x1 = 0;
		x2 = 1;

		y1 = 0;
		y2 = .1;
	
	// i = 1 to 9 are all what should be the second text label
	} else {

		x1 = 0;
		x2 = 1;

		y1 = .1;
		y2 = .2;

	}
	
	// so this is the issue above
	// expected the first sprite to be the first label
	// and all the rest would be the second label
	// except the first sprite is affected by the else block above
	// so all ten labels have the label of sprite 2
	// or whichever x1, y1, x2, y2 is last set

	// BL
	positions[j++] = -.5;
	positions[j++] = -.5;
	positions[j++] = 0;

	uvs[k++] = x1;
	uvs[k++] = y1;

	// BR
	positions[j++] = .5;
	positions[j++] = -.5;
	positions[j++] = 0;

	uvs[k++] = x2;
	uvs[k++] = y1;

	// TR
	positions[j++] = .5;
	positions[j++] = .5;
	positions[j++] = 0;

	uvs[k++] = x2;
	uvs[k++] = y2;

	// TL
	positions[j++] = -.5;
	positions[j++] = .5;
	positions[j++] = 0;

	uvs[k++] = x1;
	uvs[k++] = y2;
}

geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setAttribute('uv', new THREE.BufferAttribute(uvs, 2));

// position and scale set on an Object3D here, then setMatrixAt() is called
// this all works fine
for (let i = 0; i < text3s.length; ++i) {
 ...
}

// mesh is created with a custom shader material that uses a custom vertex and fragment shader
// these shaders are basically the same as the ones for a sprite
// only modification is to use modelViewMatrix * instanceMatrix in the vertex shader
// which is properly setting the position of each sprite
// no issues in these shaders, everything looks as expected
let mesh = new THREE.InstancedMesh(...)

My guess is that because each vertex has the same position in the array buffer, that is somehow affecting the UV mapping, but since each sprite is correctly positioned, I don't understand why the texture offsets seem to be stepping on each other.

I’ve gotten this to work as I’d expect by using the full texture size:

x1 = 0;
x2 = 1;

y1 = 0;
y2 = 1;

And tracking my offsets and repeats:

geometry.setAttribute('uvOffset', new THREE.InstancedBufferAttribute(uvOffsets, 2));
geometry.setAttribute('uvRepeat', new THREE.InstancedBufferAttribute(uvRepeats, 2));

And using (in the vertex shader):

vUv = (uv * uvRepeat) + uvOffset;
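For completeness, the per-instance arrays for the column layout from my example (10 labels, each spanning the full width and a tenth of the texture height) get filled roughly like this (a sketch, variable names match the attribute setup above):

```javascript
// Per-instance UV data for 10 labels stacked in a single column:
// each label spans the full texture width and one tenth of its height.
const count = 10;
const uvOffsets = new Float32Array(count * 2);
const uvRepeats = new Float32Array(count * 2);

for (let i = 0; i < count; ++i) {
	uvOffsets[i * 2] = 0;             // u offset: every label starts at the left edge
	uvOffsets[i * 2 + 1] = i / count; // v offset: row i of the column
	uvRepeats[i * 2] = 1;             // u repeat: full width
	uvRepeats[i * 2 + 1] = 1 / count; // v repeat: one row tall
}
```

With vUv = (uv * uvRepeat) + uvOffset, instance 0 then samples v in [0, 0.1], instance 1 samples [0.1, 0.2], and so on, which is the same cutout my per-vertex attempt was trying to express.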

But I’m still curious why my attempt to internalize these offsets and repeats directly in the UV array didn’t work as expected, since in my limited understanding of shaders the vUv calculated in the shader should be the same value in both scenarios.


I’m curious as to why you needed to use sprites here. Was 3D text not possible? I can’t stand the pixelation of sprites when zoomed, and I ran into a similar issue with wanting to have multiple text geometries in the same instanced wrapper. However, the InstancedMesh creates clones of the same geometry for the ‘count’ of instances. Know that it’s possible to prerender your 3D text upfront and then create a meshWrapper with a different geometry at each vertex. I dunno, maybe I understood your question wrong, but when I see people use sprites it’s usually for text, and usually simply to make the text always face the camera. There are better ways…

I wanted full unicode support and an easy way to outline text. Drawing text onto a canvas made this quick and easy. 3D text seemed like it would be more work with little benefit as the sprites are never close or big enough to look pixelated.

Makes sense. generateShapes is pretty simple though. The only downside used to be all the geometry, but with instancing, under 50k texts is very easy, with little optimization needed. Anyways, glad you found the solution that fits. I have to view my content on big screens as well, and it’s very noticeable when you see the edge of sprite text next to a 3D object’s edge on a large screen. Not so bad on a monitor. Instancing does add a level of complexity though. But it sounds like you know what you need…

Happy Modeling!

Hi, this is a multi-part question :slight_smile:

I’m doing a similar thing right now, adding labels to thousands of points.
The texture contains columns and rows of text, I’m doing the offsetting in the vertex shader, something like this (index is basically the instanceId):

// vTextureCoord is used in the fragment shader:
// float opacity = texture(uTexture, vTextureCoord).r
vTextureCoord = vec2(
  (floor(index / rows) + uv.x) * (1.0 / columns), 
  (mod(index, rows) + uv.y) * (1.0 / rows)
);
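To make the grid lookup above easy to sanity-check, the same math can be written as plain JavaScript (a throwaway port for testing, not part of the shader; rows and columns are whatever your atlas uses):

```javascript
// JavaScript port of the shader's atlas-cell lookup, for checking the math.
// index: the instance id; rows, columns: grid dimensions of the text atlas;
// u, v: the quad's built-in UV corner values (each 0 or 1).
function cellUV(index, rows, columns, u, v) {
	return [
		(Math.floor(index / rows) + u) * (1 / columns),
		(index % rows + v) * (1 / rows),
	];
}
```

For example, in a 4x4 atlas, instance 5 lands in column 1, row 1, so its (0, 0) corner maps to (0.25, 0.25) and its (1, 1) corner to (0.5, 0.5).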

And this is how I keep the planes “spritey” :slight_smile: :

vec4 mvPosition = viewMatrix * modelMatrix * instanceMatrix * vec4(0.0, 0.0, 0.0, 1.0);
mvPosition.xy += position.xy * uScale * -mvPosition.z;

gl_Position = projectionMatrix * mvPosition;

So far so good…

But there are a few things I couldn’t wrap my head around until now:

  • Is there any way to change the anchor/pivot point of the instances, to the top-left for example? So they would not “rotate” around the middle.
    EDIT: I figured this out: instead of simply using position.xy in the “spritey” example, use an offset vector, like:

    vec2 offset = vec2(OFFSET_X, OFFSET_Y);
    mvPosition.xy += offset * uScale * -mvPosition.z;
    
  • How should I modify the raycast method in InstancedMesh to use the same transforms as in the shader? Because in actuality the planes are not looking at the camera all the time; it’s just shader trickery, and raycasting does not work.

  • How can I change the position of the instances at any point in time? I tried this, but it doesn’t seem to do anything:
    EDIT: instead of mesh.updateMatrix() , I had to use mesh.instanceMatrix.needsUpdate = true

    const mesh = this.instancedMesh;
    const transform = new THREE.Object3D();
    
    for (let i = 0; i < count; i++) {
      transform.position.set(pos[i].x, pos[i].y, pos[i].z);
      transform.updateMatrix();
    
      mesh.setMatrixAt(i, transform.matrix);
    }
    
    // WRONG:
    // mesh.updateMatrix();
    
    // CORRECT:
    mesh.instanceMatrix.needsUpdate = true;
    
  • And lastly, I want to add a horizontal offset to the labels, so their position stays the same as the point they belong to, but there is a gap between them (if I move the actual instance itself, it won’t look nice). The way I wanted to do this is to keep uv.x at 0 until it reaches a threshold, then start to sample the texture. This is what I’m doing now, but it just stretches the texture… :confused:

    // if uv.x < offset, keep uv.x at 0, then start from 0
    // example: if offset=0.2 and uv.x=0.1 => uvX=0, then if uv.x=0.5 => uvX = 0.4
    
    // <-offset->
    // β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
    // β”‚    0    β”‚ uv.x - offset * (1. - uv.x) β”‚
    // β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
    
    float uvX = (uv.x - offset * (1.0 - uv.x)) * ceil(uv.x - offset);
    // and I replace original uv.x to uvX in the previous example:
    vTextureCoord = vec2(
      (floor(index / rows) + uvX) * (1.0 / uColumns), 
      (mod(index, rows) + uv.y) * (1.0 / rows)
    );
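As a sanity check, the formula does reproduce the numbers from the example comment when run outside the shader (a plain JavaScript stand-in for the GLSL, with offset = 0.2):

```javascript
// The threshold formula from the shader above, as plain JavaScript.
// Math.ceil of a small negative value is -0, which still zeroes the
// product, so everything below the offset clamps to 0 as intended.
function uvX(u, offset) {
	return (u - offset * (1 - u)) * Math.ceil(u - offset);
}
```

uvX(0.1, 0.2) gives 0 and uvX(0.5, 0.2) gives 0.4, so the piecewise part behaves exactly as the comment describes; the stretching must come from how the remaining range is sampled, not from the branch itself.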
    

Any help is greatly appreciated!