Modifying BufferGeometry vertices for low-poly randomness

Vertices were removed from Geometry in r125, and despite a lot of posts I’m still having trouble translating this legacy code:

mesh.geometry.vertices.forEach(v => v.multiplyScalar(Math.random()));

For example, modifying a low poly sphere to create unique instances of rocks.

const geometry = new THREE.SphereGeometry(0.3, 4, 4);
const material = new THREE.MeshStandardMaterial({ color: 0xffffff });
const mesh = new THREE.Mesh(geometry, material);
scene.add(mesh);

mesh.geometry.vertices.forEach((v) =>
  v.multiplyScalar(Math.random() * 0.5 + 0.5),
);

Whether I use indexed or non-indexed BufferGeometry, I always end up with gaps between faces.

const geometry = new THREE.SphereGeometry(0.3, 4, 4);
const material = new THREE.MeshStandardMaterial({ color: 0xffffff });
const mesh = new THREE.Mesh(geometry, material);
scene.add(mesh);

const positionAttribute = geometry.getAttribute("position");

for (let i = 0; i < positionAttribute.count; i++) {
  const vertex = new THREE.Vector3().fromBufferAttribute(
    positionAttribute,
    i,
  );
  vertex.multiplyScalar(Math.random() * 0.5 + 0.5);
  positionAttribute.setXYZ(i, vertex.x, vertex.y, vertex.z);
}

positionAttribute.needsUpdate = true;

Likewise, tuning the tolerance parameter of BufferGeometryUtils.mergeVertices() doesn’t seem to solve this, and isn’t viable as a generic approach.

import * as BufferGeometryUtils from "three/addons/utils/BufferGeometryUtils.js";

geometry.deleteAttribute("normals");
const mergedGeometry = BufferGeometryUtils.mergeVertices(geometry)

I could probably just write new geometry classes, although that wouldn’t help with animation.

Any updated tips on this old issue?

References:

Note on that - vertices are almost 1:1 the same as the current geometry.getAttribute('position'). Only the way of direct access changed (i.e. you now use attribute.setXYZ(vertexIndex, x, y, z) instead of modifying vertex vectors directly), but the data remains in the same form.

That’s indeed most likely caused by the geometry not being indexed - so you’re shifting pseudo-shared vertices in different directions. A good (and also way more optimal, looking at the amount of variations and geometries you’ve shown) solution would be to use InstancedMesh instead of a normal Mesh, and apply a custom vertex displacement shader to it. You can then sample a small 256x256 noise texture based on world position and / or instance index - and apply the vertex modifications on the GPU. That way you also won’t need to care whether the geometry is indexed or not, since the displacement is applied based on position, not on vertex index.
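
Something along these lines (an untested sketch - ROCK_COUNT, the transform ranges, and the cheap hash are arbitrary placeholders standing in for a proper noise texture):

const ROCK_COUNT = 100;
const geometry = new THREE.SphereGeometry(0.3, 4, 4);
const material = new THREE.MeshStandardMaterial({ color: 0x999999 });

// Inject a per-vertex offset into the standard material's vertex stage.
material.onBeforeCompile = (shader) => {
  shader.vertexShader = shader.vertexShader.replace(
    "#include <begin_vertex>",
    `
    #include <begin_vertex>
    // Cheap hash keyed on local position and instance index (WebGL2).
    float n = fract(sin(dot(position + float(gl_InstanceID), vec3(12.9898, 78.233, 37.719))) * 43758.5453);
    transformed += normal * n * 0.1;
    `
  );
};

// One draw call for all rocks; each instance gets a random transform.
const rocks = new THREE.InstancedMesh(geometry, material, ROCK_COUNT);
const dummy = new THREE.Object3D();

for (let i = 0; i < ROCK_COUNT; i++) {
  dummy.position.set((Math.random() - 0.5) * 10, 0, (Math.random() - 0.5) * 10);
  dummy.rotation.set(Math.random() * Math.PI, Math.random() * Math.PI, 0);
  dummy.scale.setScalar(Math.random() * 0.5 + 0.5);
  dummy.updateMatrix();
  rocks.setMatrixAt(i, dummy.matrix);
}
rocks.instanceMatrix.needsUpdate = true;
scene.add(rocks);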


@jasonsturges Another issue you may encounter…
The SphereGeometry has UV mapping with discontinuities, and merging the vertices will not merge vertices that have differing UVs… which results in a seam along one rib of the sphere…
so to get a truly vertex-merged sphere, you can remove the uv attribute and the normal attribute first… then do the mergeVertices… then computeVertexNormals() on the resulting geometry, and then recompute UVs with a sphere unwrap or something similar (if you need them).

I too was initially frustrated by the changes to geometry, but the old way was incredibly inefficient, and as mesh densities have increased it became much more problematic from a performance perspective. Vertex manipulation is a bit more complicated now, but it’s soooo much more performant that it’s totally worth it.


@mjurczyk Makes sense - most modern approaches I use are now shader-based. I miss this one-liner for simple low poly examples, though.

@manthrax After removing uv and normal, I still struggle with low-poly geometry. Merging vertices feels like I’m working around an index issue that I don’t understand. Understood, though.

Really appreciate the insights - felt like I was missing something dumb here regarding indexed vertices. Thanks for the sanity check.


Some randoroids:

(they kinda look like walnuts but you get the idea…)

Live demo: https://chlorinated-cuddly-capacity.glitch.me

Code:


I’m late to the party, with an InstancedMesh option :sweat_smile:
Picture:


Demo: https://codepen.io/prisoner849/full/PoMJyNY


@manthrax Ah, I see - it was the order of operations. The geometry needs its uv and normal attributes removed and its vertices merged before modifying the positions:

let geometry = new THREE.SphereGeometry(0.3, 4, 4);

// Drop uv and normal so shared positions can actually merge,
// then collapse duplicates into a properly indexed geometry.
geometry.deleteAttribute("uv");
geometry.deleteAttribute("normal");
geometry = BufferGeometryUtils.mergeVertices(geometry);

const positionAttribute = geometry.getAttribute("position");

for (let i = 0; i < positionAttribute.count; i++) {
  const vertex = new THREE.Vector3().fromBufferAttribute(positionAttribute, i);
  vertex.multiplyScalar(Math.random() * 1.5 + 0.5);
  positionAttribute.setXYZ(i, vertex.x, vertex.y, vertex.z);
}

positionAttribute.needsUpdate = true;

// Recompute normals after displacing positions so lighting matches the new shape.
geometry.computeVertexNormals();

const material = new THREE.MeshStandardMaterial({ color: 0x999999 });
const mesh = new THREE.Mesh(geometry, material);
scene.add(mesh);

Gotcha - that makes perfect sense now.

Think I’ll mark that as the solution, since that was exactly what I was trying to do.

@prisoner849 Awesome, really appreciate the example - I combined it with the approach I was using:


const geometry = new THREE.SphereGeometry(0.3, 4, 4);
const material = new THREE.ShaderMaterial({
  uniforms: {
    time: { value: 0.0 },
  },
  vertexShader: `
    uniform float time;
    varying vec3 vNormal;
    float mod289(float x) { return x - floor(x * (1.0 / 289.0)) * 289.0; }
    vec4 mod289(vec4 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; }
    vec4 perm(vec4 x) { return mod289(((x * 34.0) + 1.0) * x); }

    float noise(vec3 p) {
      vec3 a = floor(p);
      vec3 d = p - a;
      d = d * d * (3.0 - 2.0 * d);

      vec4 b = a.xxyy + vec4(0.0, 1.0, 0.0, 1.0);
      vec4 k1 = perm(b.xyxy);
      vec4 k2 = perm(k1.xyxy + b.zzww);

      vec4 c = k2 + a.zzzz;
      vec4 k3 = perm(c);
      vec4 k4 = perm(c + 1.0);

      vec4 o1 = fract(k3 * (1.0 / 41.0));
      vec4 o2 = fract(k4 * (1.0 / 41.0));

      vec4 o3 = o2 * d.z + o1 * (1.0 - d.z);
      vec2 o4 = o3.yw * d.x + o3.xz * (1.0 - d.x);

      return o4.y * d.y + o4.x * (1.0 - d.y);
    }

    void main() {
      vec3 transformed = position;
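      // Note: gl_InstanceID is 0 for a regular Mesh; it only varies per
      // instance when this material is used with an InstancedMesh.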
      float n = noise(transformed * 10.0 + float(gl_InstanceID) + time);
      transformed += normalize(position) * n * 1.0;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(transformed, 1.0);
      vNormal = normalMatrix * normalize(normal);
    }
  `,
  fragmentShader: `
    varying vec3 vNormal;
    void main() {
      vec3 color = vec3(0.5, 0.5, 0.5) * (vNormal * 0.5 + 0.5);
      gl_FragColor = vec4(color, 1.0);
    }
  `,
});

const mesh = new THREE.Mesh(geometry, material);
scene.add(mesh);

renderer.setAnimationLoop(() => {
  material.uniforms.time.value += 0.01;
  renderer.render(scene, camera);
});