Instancing within bounds of mesh

Hello community,

I have made a sphere with "land masses" on it, and I am trying to find the best way of instancing geometries only on the land masses, having them face the right way up without going outside the lands' bounds. I've been looking into using the position attribute of the land geometry, but the vertices are uneven and very sporadic (it's low poly with an uneven distribution of topology). Does anyone know a lightweight approach to achieving this, or some resources to point me in the right direction? Or is it possible to use some sort of weight mapping to decide where to place instances?

Any help would be greatly appreciated!!

PS: I'm using r126.

Would probably start with THREE.MeshSurfaceSampler:

https://threejs.org/examples/?q=scatter#webgl_instancing_scatter
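
The basic pattern from that example looks roughly like the sketch below. This is a hedged illustration, not the example's exact code: `landMesh`, `shrubGeometry`, `shrubMaterial`, `scene` and `count` are assumed names standing in for your own objects, and the normal-based orientation assumes the sampled normal points outward from the sphere.

```javascript
import * as THREE from 'three';
import { MeshSurfaceSampler } from 'three/examples/jsm/math/MeshSurfaceSampler.js';

const count = 500;
const sampler = new MeshSurfaceSampler( landMesh ).build();
const instanced = new THREE.InstancedMesh( shrubGeometry, shrubMaterial, count );

const position = new THREE.Vector3();
const normal = new THREE.Vector3();
const dummy = new THREE.Object3D();
const up = new THREE.Vector3( 0, 1, 0 );

for ( let i = 0; i < count; i ++ ) {

	// pick a random point (and its surface normal) on the land mesh
	sampler.sample( position, normal );

	dummy.position.copy( position );
	// align the instance's local "up" with the surface normal,
	// so each instance faces away from the sphere's surface
	dummy.quaternion.setFromUnitVectors( up, normal.normalize() );
	dummy.updateMatrix();
	instanced.setMatrixAt( i, dummy.matrix );

}

instanced.instanceMatrix.needsUpdate = true;
scene.add( instanced );
```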

Creating a particle system in Blender, then converting to individual meshes before export to glTF, would be another workflow. Would need to do some extra work after loading to make an InstancedMesh from this setup though, otherwise lots of draw calls.

hey @donmccurdy,

yeah, i was looking to base it off of this example…

it's ideal in certain regards, but it seems it's placing instances at vertex positions, right? unless i'm wrong? if it's placing them at random points anywhere on the mesh, ignoring vertex positions, i think this will be the way to go. will look into it!

thank you man!!

MeshSurfaceSampler selects a triangle “randomly”, where each triangle’s probability of being chosen is proportionate to its area. Then it selects a random point in that triangle. So instances will not end up on vertices most of the time. There’s also an option of using vertex colors (or another attribute) to weight certain triangles higher or lower, e.g. to keep a path empty of shrubbery.
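
That weighting option is a one-line addition to the sampler setup. A minimal sketch, assuming `landMesh` is your land mesh and that its geometry has a `'color'` vertex attribute painted with the weights (the sampler reads the first component of the named attribute):

```javascript
import { MeshSurfaceSampler } from 'three/examples/jsm/math/MeshSurfaceSampler.js';

// triangles whose vertices all have weight 0 will never receive an instance;
// higher values make a triangle proportionally more likely to be sampled
const sampler = new MeshSurfaceSampler( landMesh )
	.setWeightAttribute( 'color' )
	.build();
```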

ohh ok!! amazing!! yeah i had a small revelation finding this in the code…

that’s amazing help! thank you man, i will look into implementing this and let you know how it goes!!

thanks man!!

hey @donmccurdy

i’ve managed to get some ok results so far, and instancing on the “land” is working great. a problem i have encountered is that if i rotate the parent object of the land, the sampled positions don’t update with everything else. i’ve tried a few approaches to get that functionality but with no luck, so i’ve had to rotate the entire scene instead, which is fine for the use case at the minute. but do you know if it’s possible to update the sampled positions if i need to only rotate the parent of the globe?

thanks for your help so far man, really cool!!

EDIT: ^^^ i’ve found the solution for updating the instanced mesh rotation by simply rotating like so…

Mesh = new THREE.InstancedMesh( gasGeometry, gasMaterial, count );
Mesh.rotation.y = 1 * Math.PI / 180; // one degree, converted to radians

but i’m having a bit of difficulty with scaling. would you know how i could assign a variable as the scale of the instanced geometries, so that the scale of all instances updates when the variable changes, without having to iterate through every instance? i think i’m stuck with the problem in the following image…

image

i’m basically looking for a way to update the scale of all instances at the same time, to the same scale, without iterating Mesh.setMatrixAt( i, dummy.matrix ). is this even possible?

I don’t believe that’d be possible without using a uniform and a custom shader. There’s an example somewhat like this in three.js examples. May or may not be worth it, depending on how much performance you’re losing by updating all the instances individually.
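
For reference, a sketch of that uniform-plus-custom-shader idea using `onBeforeCompile`, so no standalone ShaderMaterial is needed. This is an untested illustration: `gasMaterial` is assumed from the earlier post, and the injected line scales each instance about its own origin because it runs before the instance matrix is applied.

```javascript
const instanceScale = { value: 1.0 };

gasMaterial.onBeforeCompile = ( shader ) => {

	shader.uniforms.instanceScale = instanceScale;

	// declare the uniform, then scale the vertex position in local space
	shader.vertexShader = shader.vertexShader
		.replace( '#include <common>',
			'#include <common>\nuniform float instanceScale;' )
		.replace( '#include <begin_vertex>',
			'#include <begin_vertex>\ntransformed *= instanceScale;' );

};

// later, e.g. in the render loop: every instance rescales at once,
// with no per-instance attribute upload
instanceScale.value = 1.5;
```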

hey @donmccurdy

i managed to sort it out with a bit of effort like this (where “two” is the variable i’m feeding it)…

    var prevScale = 1;
    function updateParticle() {
        var currentScale = two / 100;
        if ( currentScale !== prevScale ) {
            // scale by the ratio of new to previous scale,
            // since geometry.scale() is cumulative
            Mesh.geometry.scale( 1, 1, currentScale / prevScale );
            prevScale = currentScale;
        }
    }

would be great to hear what you think about this as a practice, and whether it has any pitfalls etc.? at the minute it seems to do what i needed it to do, which is change all instance scales at one time with a fluctuating variable…

thanks for your help on this!! really appreciated the direction!!

Hadn’t thought of that, but it’s definitely an option! The tradeoffs here are:

  1. mesh.setMatrixAt(...) → need to upload an attribute with length 16 * numInstances
  2. geometry.scale(...) → need to re-upload base geometry
  3. mesh.scale.set(...) → wrong effect / shifts instance origins

My guess would be that you should do whichever requires uploading less data. If you’re instancing a very simple geometry, uploading the geometry again could indeed be cheaper than updating lots of instance transforms. But I haven’t tested this; definitely curious if there are other tradeoffs I’ve missed…
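
For completeness, option 1 is usually written with one reusable dummy `Object3D` rather than allocating anything per instance. A sketch (assuming `mesh` is the `InstancedMesh` and `count` its instance count):

```javascript
const dummy = new THREE.Object3D();

function setUniformScale( mesh, count, s ) {

	for ( let i = 0; i < count; i ++ ) {

		// read the existing matrix, swap only the scale, write it back
		mesh.getMatrixAt( i, dummy.matrix );
		dummy.matrix.decompose( dummy.position, dummy.quaternion, dummy.scale );
		dummy.scale.setScalar( s );
		dummy.updateMatrix();
		mesh.setMatrixAt( i, dummy.matrix );

	}

	mesh.instanceMatrix.needsUpdate = true; // flag the attribute for re-upload

}
```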

mesh.setMatrixAt(...) is clearly the least expensive in that case, then. The only thing is that I’m requesting JSON data from an API that only allows a limited number of requests over time, so iterating that data for every instance was preventing any data from getting through at all. I may be overlooking a crucial part of formatting, though; I’m going to have another look in the morning with fresh eyes. There isn’t a method like mesh.setMatrixAt(all, dummy.matrix), is there? Sounds ridiculous when I read it back, aha!

@donmccurdy

In terms of weighting: in Maya I use weight maps for things like rigging, painting mesh weights so that joints have more effect on lighter parts of the map and less effect on darker parts (this can be a color map too, with the same idea). What’s the best way of approaching this in three? I’m thinking it will probably be a Blender job to paint vertices, right? I’m just wondering what color information will read as null areas vs. populated areas when importing the glTF into three. Is it a case of using black and white, like an alpha map, or would it be color oriented? It’s my first time trying this and I would love to understand how the pipeline works for achieving it.

EDIT: i have found the weight painting tool in blender, i’m wondering which tool will be the right approach to have the weights embedded in the gltf when exporting to three… i’m using this weight painting tool…

is this the correct tool to use, or should i be using the “Vertex Paint” tool like this, as an “alpha” sort of map…

or is there a defined approach to achieving this that i’m missing?

Sorry for being a pest you’ve really helped me out a tonne already, thank you!!

i’ve managed to get results using the “Vertex Paint” tool in blender and using the color attribute in three.js, which makes the instances avoid the painted area… i’ve been looking online but can’t find a way to then not render the color information. i’d like to just keep the texture info on top, or make the painted vertex color attribute invisible. do you have any advice on how to do this?

sorted! it’s odd to figure out, as the color comes from a bufferGeometry attribute, but i found you have to set material.vertexColors to false.

That works, or another option would be:

var paintWeights = geometry.attributes.color;
geometry.deleteAttribute('color');

// use any unreserved name:
geometry.setAttribute('my_custom_attribute', paintWeights); 

Blender will also let you export more than one set of vertex colors per mesh if you want. Only the first is rendered. The vertex weight painting tool’s data cannot currently be exported, see discussion in Export Vertex Groups · Issue #1232 · KhronosGroup/glTF-Blender-IO · GitHub.

ahhh cool! so 'my_custom_attribute' is not recognised as a renderable “color information” attribute! that’s cool, then you can assign that to…

sampler = new MeshSurfaceSampler( landobj )
    .setWeightAttribute( api.distribution === 'weighted' ? 'my_custom_attribute' : null );

as it stands this works exactly how i need it to, pretty much, so no need for weight painting, but i guess it sure would be nice in the future: instead of “on” and “off” you could have each color of the weight spectrum vary the scale, or the colour, or even the mesh altogether, in theory, right? i’ll read the issue properly and see why it’s not possible :slight_smile:

can the instances be a gltf with animation embedded by any chance? or is that completely out of the scope of possibility?

thank you @donmccurdy, you’re a genius!!

Currently THREE.InstancedMesh does not support skinning or morph target animation. But you can move/rotate/scale the instances individually of course. It’s also possible to write a custom shader that animates the vertices of different instances differently, although this is more difficult and doesn’t really accommodate complex characters well. See the last example on Three.js NodeMaterial introduction.
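
Moving instances individually per frame follows the same matrix-update pattern as before. A sketch, with `mesh`, `count` and `dummy` assumed from the earlier snippets (this is transform animation, not skinning):

```javascript
function animateInstances( mesh, count, time ) {

	for ( let i = 0; i < count; i ++ ) {

		mesh.getMatrixAt( i, dummy.matrix );
		dummy.matrix.decompose( dummy.position, dummy.quaternion, dummy.scale );
		// spin each instance, offsetting the phase by its index
		dummy.rotation.y = time * 0.001 + i;
		dummy.updateMatrix();
		mesh.setMatrixAt( i, dummy.matrix );

	}

	mesh.instanceMatrix.needsUpdate = true;

}
```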

thanks for the info man, i wasn’t planning to implement it, just wanted to understand more :slight_smile:

any chance you could point me to some documentation, or the right direction, to make instanced meshes with alpha-test transparency cast alpha-mapped shadows?! i’ve tried a few approaches from out-of-date documentation and nothing seems to work in r126. it would be great to keep the texture on the instances too…

Edit: i sorted the shadows by using the custom depth material from the original cloth example here…

https://threejs.org/examples/webgl_animation_cloth.html
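
The relevant pattern from that example boils down to a custom depth material that knows about the alpha map, so the shadow pass discards the same texels as the color pass. A sketch, with `instancedMesh` and `alphaMapTexture` as assumed names for your mesh and texture:

```javascript
instancedMesh.castShadow = true;

// the depth material is what renders into the shadow map, so it needs
// the same map and alphaTest as the visible material
instancedMesh.customDepthMaterial = new THREE.MeshDepthMaterial( {
	depthPacking: THREE.RGBADepthPacking,
	map: alphaMapTexture,
	alphaTest: 0.5
} );
```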