Merging multiple BoxGeometries with different materials

Hello everyone,

I’m pretty new to THREE, so I would like to apologise upfront if my question is silly.

Here’s the context:
Three version: 0.135.0
I have multiple voxels in the scene, over 65k.
I also have around 11 material arrays that could potentially be applied to these voxels.

The reason I have material arrays is that, depending on how many faces of a voxel are exposed, each face should look slightly different.

I did a ‘dry’ run and managed to render all 65k blocks as independent meshes; obviously the performance was suboptimal.

What I need is to figure out a way to merge these geometries into a single mesh, but somehow preserve the materials for each voxel.

I found many examples over the internet, but it seems three.js updates quite a lot, and most of the stuff I saw in these examples / tutorials no longer exists…

Can somebody point me in the right direction / examples?

Thanks in advance.

As an FYI: The closest ‘solution’ I found to my problem is this one: three.js - Three js: Merge multiple objects into one - Stack Overflow

Although it has 2 basic problems:

1 - I can’t see how I would be able to have multiple materials;
2 - Probably because of the age of the post (~2018), the code below simply won’t yield any visual results.

const voxel1 = new THREE.BoxBufferGeometry(1,1,1)
const mesh1 = new THREE.Mesh(voxel1, beachTextureArr)
mesh1.position.x = 1

const voxel2 = new THREE.BoxBufferGeometry(1,1,1)
const mesh2 = new THREE.Mesh(voxel2, rockyTextureArr)
mesh2.position.x = 2

const singleGeometry = new THREE.BufferGeometry()
singleGeometry.merge(mesh1.geometry, mesh1.matrix, 0)
singleGeometry.merge(mesh2.geometry, mesh2.matrix, 1)

return <mesh geometry={singleGeometry} position={[0,0,0]} material={ oceanTextureArr }/>
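For context on why the snippet above fails: `BufferGeometry.merge` never took a material index (its second parameter was an attribute offset), and it has since been removed from three.js. The supported path is `BufferGeometryUtils.mergeBufferGeometries(geometries, useGroups)` from `three/examples/jsm/utils/BufferGeometryUtils.js`; with `useGroups = true` the merged geometry gets one group per source geometry, and each group's `materialIndex` selects a slot in the mesh's material array. Below is a dependency-free sketch of that group bookkeeping (the helper name is hypothetical; the real three.js calls are shown in the trailing comment):

```javascript
// Sketch of what mergeBufferGeometries(geometries, true) does for groups:
// one group per source geometry, so materialIndex i selects the i-th entry
// of the merged mesh's material array.
// `indexCounts` holds the number of indices per source geometry
// (a unit box has 36: 6 faces * 2 triangles * 3 vertices).
function buildMergeGroups(indexCounts) {
  const groups = [];
  let start = 0;
  indexCounts.forEach((count, materialIndex) => {
    groups.push({ start, count, materialIndex });
    start += count;
  });
  return groups;
}

// Two merged unit boxes -> two groups of 36 indices each:
const groups = buildMergeGroups([36, 36]);
console.log(groups);
// [ { start: 0, count: 36, materialIndex: 0 },
//   { start: 36, count: 36, materialIndex: 1 } ]

// With three.js (import style may vary by version), roughly:
//   voxel2.translate(2, 0, 0); // bake the position into the geometry
//   const merged = BufferGeometryUtils.mergeBufferGeometries([voxel1, voxel2], true);
//   <mesh geometry={merged} material={[beachMaterial, rockyMaterial]} />
```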

I believe it’s worth mentioning that I’m using react-three-fiber…

Any ideas, suggestions, links or something?


you can have instances with varying colors. or create your own ShaderMaterial and feed it texture uvs per instance via buffer attributes. i think that’s how games do it, they have one god-material but they’re instancing it.
since you’re using r3f also check out drei/Instances and Merged. Merged especially allows you to feed it meshes (with distinct materials) and then it creates instances for them. though i would only use this for a few hundred, maybe a few thousand; it has a clear limitation.


Thanks for your reply.

By the way: I absolutely love your library. Is there a way to donate? or even get involved?

Back to my problem: What I’m trying to do is represent a real mine block model, you can think about it like a minecraft map.

As I progress (I hope) I’ll obviously implement chunks and camera distance thresholds, but as of right now I would like to render a 256x256 grid, which equals 65,536 voxels / BoxBufferGeometries.

Do you think drei / merge is still a good solution?

Also, do you know where I can find some threejs material about shadermaterials and how to feed it via bufferattributes?

Thanks a lot!

Do you think drei / merge is still a good solution?

that would be 65536 instances, no way. use plain THREE.InstancedMesh. say you have 10 materials, then you create 10 instanced meshes, which makes 10 draw calls. you don’t have 65536 different materials i hope.
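The “one InstancedMesh per material” idea above boils down to plain bookkeeping: bucket the 65,536 voxel records by material id, then turn each bucket into one `THREE.InstancedMesh` filled via `setMatrixAt` (both real three.js APIs). A minimal sketch, assuming a `{x, y, z, materialId}` record shape:

```javascript
// Bucket voxels by material id; each bucket maps to one InstancedMesh,
// i.e. one draw call per material. The voxel record shape is an assumption.
function bucketByMaterial(voxels) {
  const buckets = new Map();
  for (const v of voxels) {
    if (!buckets.has(v.materialId)) buckets.set(v.materialId, []);
    buckets.get(v.materialId).push(v);
  }
  return buckets;
}

// A 256x256 grid with a fake 10-material assignment:
const voxels = [];
for (let x = 0; x < 256; x++)
  for (let z = 0; z < 256; z++)
    voxels.push({ x, y: 0, z, materialId: (x + z) % 10 });

const buckets = bucketByMaterial(voxels);
console.log(buckets.size); // 10 buckets -> 10 InstancedMeshes -> 10 draw calls

// With three.js, each bucket would then become roughly:
//   const mesh = new THREE.InstancedMesh(boxGeometry, materials[id], list.length);
//   const matrix = new THREE.Matrix4();
//   list.forEach((v, i) => {
//     matrix.setPosition(v.x, v.y, v.z);
//     mesh.setMatrixAt(i, matrix);
//   });
//   mesh.instanceMatrix.needsUpdate = true;
```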

Also, do you know where I can find some threejs material about shadermaterials and how to feed it via bufferattributes?

book of shaders? i mean, this is going way deep from here, you’d have to know how to code glsl. if you know the basics you could make a ShaderMaterial that reads buffer attribute uvs quite easily. you could then feed it a single texture atlas, but there i am completely clueless. i’d make this another topic if that’s where you want to go.
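As a rough idea of what “feeding uvs per instance” means in practice: it reduces to building a flat `Float32Array` of atlas tile offsets, one (u, v) pair per instance, which three.js would consume as a `THREE.InstancedBufferAttribute` read in a custom vertex shader. A sketch, where the attribute name and tile layout are assumptions:

```javascript
// Build a per-instance attribute array of atlas tile offsets in uv space.
// Assumes a square atlas divided into tilesPerRow x tilesPerRow tiles, and
// that each voxel stores a tile index.
function buildTileOffsets(tileIndices, tilesPerRow) {
  const data = new Float32Array(tileIndices.length * 2);
  const tileSize = 1 / tilesPerRow; // tile extent in uv space
  tileIndices.forEach((tile, i) => {
    data[i * 2] = (tile % tilesPerRow) * tileSize;                // u offset
    data[i * 2 + 1] = Math.floor(tile / tilesPerRow) * tileSize;  // v offset
  });
  return data;
}

const offsets = buildTileOffsets([0, 1, 4], 4); // 4x4 atlas
console.log(Array.from(offsets));
// [0, 0, 0.25, 0, 0, 0.25]

// In three.js this would be attached roughly as:
//   geometry.setAttribute('tileOffset', new THREE.InstancedBufferAttribute(offsets, 2));
// and sampled in the vertex shader along the lines of:
//   vUv = tileOffset + uv / 4.0;
```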

ps, found this: three.js: BufferGeometry with texture coordinates - Stack Overflow

is there a way to donate? or even get involved?

react-three-fiber - Open Collective 🙂 and sure you can get involved. make prs or talk about your ideas on the poimandres discord.


A texture atlas can be quite robust. You can fit a lot of tiny surfaces into a single atlas. Consider the size of your voxels: you can create hundreds of surfaces from a single atlas, and the texture is loaded to the GPU only once. If your surfaces (i.e. faces) are small enough, you can divide a 256x64 px texture into a lot of small squares. As long as you plan on using similar geometry for each voxel, achieving this goal with InstancedMesh or BufferGeometry is an easy choice. A larger texture atlas might work too if your upper limit is only 256x256. You may not get 65k textures, but it will be more than, say, 16 textures for sure. Try dividing your texture into 16x16 squares and using those as your surfaces. Change your math to fit 8x8 / 28x28 / however you want to divide it.
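The atlas arithmetic above can be made concrete: a 256x64 px atlas cut into 16 px tiles yields 16 columns by 4 rows, i.e. 64 distinct surfaces, and each tile index maps to a pixel rectangle. A small sketch (the helper itself is hypothetical; the sizes match the ones mentioned above):

```javascript
// How many tilePx-sized tiles fit in an atlasW x atlasH atlas,
// and where tile i sits in pixel coordinates.
function atlasLayout(atlasW, atlasH, tilePx) {
  const cols = Math.floor(atlasW / tilePx);
  const rows = Math.floor(atlasH / tilePx);
  return {
    tileCount: cols * rows,
    tileRect(i) {
      return {
        x: (i % cols) * tilePx,
        y: Math.floor(i / cols) * tilePx,
        w: tilePx,
        h: tilePx,
      };
    },
  };
}

const layout = atlasLayout(256, 64, 16);
console.log(layout.tileCount);    // 64 distinct 16x16 surfaces
console.log(layout.tileRect(17)); // { x: 16, y: 16, w: 16, h: 16 }
```

Swapping `tilePx` for 8 or any other divisor is the “change your math” part: the same helper then yields a different tile count and rectangles.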

I just did what you’re trying to do, by the way. I took a vox model that applied to a 256x256x256 space and flattened the solution to 2D. Now it can be either 2D or 3D.

Also, it is better to simply store the data for each instance in a separate array. An instance isn’t a transferable object, so it’s just static after render, and having it carry its own data is pointless. Since your instances already have an index in the InstancedMesh matrix, you can simply use this index value as the identifier for its data in a separate array[ i ]. This is also better because, with your data decoupled, you can reset data in the data array/object with a service and simply rebuild your InstancedMesh with the new data, which makes it “dynamic” in essence. This can be done seamlessly and I do it all the time. Even if your model is in the middle of a transition, the update is instantaneous.
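The decoupling described above can be sketched as a plain data store parallel to the instance indices: instance `i` in the InstancedMesh corresponds to `records[i]`, and a dirty flag tells the render loop when to rebuild. The record shape and class name here are assumptions:

```javascript
// Per-voxel data kept outside the InstancedMesh, keyed by instance index.
class VoxelData {
  constructor(count) {
    this.records = Array.from({ length: count }, () => ({ materialId: 0, hp: 100 }));
    this.dirty = false; // set when the InstancedMesh needs a rebuild/update
  }
  get(i) {
    return this.records[i];
  }
  set(i, patch) {
    Object.assign(this.records[i], patch);
    this.dirty = true;
  }
}

const data = new VoxelData(65536);
data.set(42, { materialId: 3 });
console.log(data.get(42).materialId); // 3
console.log(data.dirty);              // true

// A render loop would check `data.dirty`, rebuild or update the
// InstancedMesh matrices/attributes from `data.records`, then clear the flag.
```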