I have a model that will be exported as glTF and imported into my React Three Fiber scene. I want to be able to change the shape of the model by scaling a section of it, so it’s not a uniform scale on the entire model. I made a screen recording of the scaling effect I want to achieve. I was thinking maybe I need to look into morph targets, but I felt that would limit the sizes users can achieve. Are morph targets really the way to go, or is there a simpler way to scale a section of a model?

If you want to replicate what 3D editing software does, you have to write your app as if it were 3D editing software. I.e., you shouldn’t look at this as scaling a mesh/group, but rather as displacing and modifying the position buffer (which contains the vertices) directly.

With the exception of features like Blender modifiers, 3D apps don’t store user operations as a chain of matrix transformations of translation/rotation/scale; they just move the vertices directly. And to do that easily, it’s likely best that all your meshes have a scale of 1/1/1 at all times.
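To make that concrete, here’s a minimal, framework-free sketch of “baking” a scale into the position buffer itself, so the mesh’s own scale can stay at 1/1/1. The flat `[x0, y0, z0, x1, y1, z1, ...]` layout mirrors how three.js stores a position `BufferAttribute`, but `bakeScale` is just an illustrative name, not a three.js API:

```javascript
// Bake a (possibly non-uniform) scale directly into a flat position buffer,
// the way a 3D editor would, instead of changing mesh.scale.
function bakeScale(positions, sx, sy, sz) {
  for (let i = 0; i < positions.length; i += 3) {
    positions[i] *= sx;     // x
    positions[i + 1] *= sy; // y
    positions[i + 2] *= sz; // z
  }
  return positions;
}

const verts = new Float32Array([1, 1, 1, -1, 2, 0]);
bakeScale(verts, 2, 1, 1); // stretch along x only; mesh.scale stays (1, 1, 1)
console.log(Array.from(verts)); // [2, 1, 1, -2, 2, 0]
```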

Great, okay, I am pretty close to what I want, **however** I don’t know the equation to get all the vertices to move outward, like in my first screen recording where I scaled the vertices in the 3D app. Currently they all move in one direction, not outward relative to their positions. I tried adding 1 to each vertex, and also multiplying by 1.1; each gave a similar result (everything moving in one direction). The relevant code:

```
onClick={(e) => {
  const positionAttribute = nodes.table_A_Base_1.geometry.attributes.position;
  const positions = positionAttribute.array;
  for (let i = 0; i < positions.length; i += 3) {
    const y = positions[i + 1]; // Y coordinate of the vertex
    if (y < 2) {
      positions[i] *= 1.1;     // Scale the X coordinate
      positions[i + 2] *= 1.1; // Scale the Z coordinate
    }
  }
  positionAttribute.needsUpdate = true;
  nodes.table_A_Base_1.geometry.computeBoundingSphere();
}}
```
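The drift happens because `positions[i] *= 1.1` scales every coordinate about the world origin `(0, 0, 0)`: if the selected section sits away from the origin, all its vertices move away from the origin in roughly the same direction, which looks like a translation. One hedged alternative (separate from the normal-based approach below) is to scale the selection about its own centroid; `scaleAboutCentroid` and the `select` callback are illustrative names, not three.js APIs:

```javascript
// Scale the x/z of selected vertices about the selection's own centroid,
// so the section grows outward in place instead of drifting from the origin.
function scaleAboutCentroid(positions, factor, select) {
  // Find the centroid of the selected vertices (x/z only, selection by y).
  let cx = 0, cz = 0, count = 0;
  for (let i = 0; i < positions.length; i += 3) {
    if (select(positions[i + 1])) {
      cx += positions[i];
      cz += positions[i + 2];
      count++;
    }
  }
  cx /= count;
  cz /= count;
  // Scale relative to the centroid rather than (0, 0, 0).
  for (let i = 0; i < positions.length; i += 3) {
    if (select(positions[i + 1])) {
      positions[i] = cx + (positions[i] - cx) * factor;
      positions[i + 2] = cz + (positions[i + 2] - cz) * factor;
    }
  }
  return positions;
}

const p = [5, 0, 5, 7, 0, 5]; // two selected vertices, centered at x = 6
scaleAboutCentroid(p, 2, (y) => y < 2);
console.log(p); // [4, 0, 5, 8, 0, 5] -- the pair spreads apart in place
```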

OK, after more tinkering around (ChatGPT), I was able to get the effect I wanted. All I had to do was also factor the vertices’ normals into the increments.

```
<mesh
  onClick={(e) => {
    const positionAttribute = nodes.table_A_Base_1.geometry.attributes.position;
    const normalAttribute = nodes.table_A_Base_1.geometry.attributes.normal;
    const positions = positionAttribute.array;
    const normals = normalAttribute.array;
    const moveDistance = 0.1; // Distance to move along the normal (adjust as needed)
    for (let i = 0; i < positions.length; i += 3) {
      const y = positions[i + 1]; // Y coordinate of the vertex
      const nx = normals[i];
      const ny = normals[i + 1];
      const nz = normals[i + 2];
      if (y < 2) {
        positions[i] += nx * moveDistance;
        // positions[i + 1] += ny * moveDistance;
        positions[i + 2] += nz * moveDistance;
      }
    }
    positionAttribute.needsUpdate = true;
    nodes.table_A_Base_1.geometry.computeBoundingSphere();
  }}>
```
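One caveat worth noting: after displacing the positions, the normals stored in the geometry still describe the *old* surface, so repeated clicks keep pushing along stale directions. In three.js you can rebuild them with `geometry.computeVertexNormals()`. Conceptually, that means recomputing each face normal from the new positions via a cross product, sketched here without three.js (`faceNormal` is an illustrative name):

```javascript
// Compute the unit normal of a triangle (a, b, c) from its vertex positions,
// which is the core of what rebuilding normals after displacement does.
function faceNormal(ax, ay, az, bx, by, bz, cx, cy, cz) {
  // Edge vectors u = b - a and v = c - a.
  const ux = bx - ax, uy = by - ay, uz = bz - az;
  const vx = cx - ax, vy = cy - ay, vz = cz - az;
  // Cross product u x v gives a vector perpendicular to the face.
  const nx = uy * vz - uz * vy;
  const ny = uz * vx - ux * vz;
  const nz = ux * vy - uy * vx;
  const len = Math.hypot(nx, ny, nz);
  return [nx / len, ny / len, nz / len];
}

// Triangle lying in the xz-plane; its normal points straight up (+y).
console.log(faceNormal(0, 0, 0, 0, 0, 1, 1, 0, 0)); // [0, 1, 0]
```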

That’s pretty incredible if ChatGPT solved that for you.

Lol, yeah, it’s true. In fact, let me put the exact prompt and the answer ChatGPT gave me here, verbatim.

ME:

OK now how can I move my vertices relative to their normals direction?

CHATGPT:

To move vertices along their normals, you need to follow these steps:

- Retrieve the positions and normals of each vertex.
- Scale the normal vectors by a certain factor to determine the direction and distance of movement.
- Add the scaled normal vectors to the vertex positions.

Here’s how you can achieve this in Three.js:

```
const positionAttribute = nodes.table_A_Head_1.geometry.attributes.position;
const normalAttribute = nodes.table_A_Head_1.geometry.attributes.normal;
const positions = positionAttribute.array;
const normals = normalAttribute.array;
const moveDistance = 0.1; // Distance to move along the normal (adjust as needed)
for (let i = 0; i < positions.length; i += 3) {
  const x = positions[i];
  const y = positions[i + 1];
  const z = positions[i + 2];
  const nx = normals[i];
  const ny = normals[i + 1];
  const nz = normals[i + 2];
  // Move the vertex along its normal
  positions[i] += nx * moveDistance;
  positions[i + 1] += ny * moveDistance;
  positions[i + 2] += nz * moveDistance;
}
// Update the buffer attribute array with the modified values
positionAttribute.needsUpdate = true;
// Notify Three.js that geometry has changed
nodes.table_A_Head_1.geometry.computeBoundingSphere(); // Update bounding sphere (optional)
```

In this code:

- We iterate through each vertex in the geometry.
- For each vertex, we retrieve its position (x, y, z) and normal vector (nx, ny, nz).
- We move the vertex along its normal vector by scaling the normal vector by the desired move distance and adding the result to the vertex position.
- Finally, we update the buffer attribute array with the modified vertex positions and notify Three.js that the geometry has changed.

Adjust the `moveDistance` variable to control how far the vertices move along their normals. Positive values move the vertices outward from the surface, while negative values move them inward.
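That outward/inward behavior is easy to sanity-check with a tiny, framework-free sketch, using plain arrays in place of buffer attributes (`displaceAlongNormals` is an illustrative name for the loop above):

```javascript
// Move every vertex along its own normal by moveDistance; positive values
// push outward from the surface, negative values pull inward.
function displaceAlongNormals(positions, normals, moveDistance) {
  for (let i = 0; i < positions.length; i += 3) {
    positions[i] += normals[i] * moveDistance;
    positions[i + 1] += normals[i + 1] * moveDistance;
    positions[i + 2] += normals[i + 2] * moveDistance;
  }
  return positions;
}

// A single vertex at (1, 0, 0) whose normal points along +x:
const pos = [1, 0, 0];
const nrm = [1, 0, 0];
displaceAlongNormals(pos, nrm, 0.5);
console.log(pos); // [1.5, 0, 0] -- moved outward
displaceAlongNormals(pos, nrm, -0.5);
console.log(pos); // [1, 0, 0] -- moved back inward
```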