Calculate vertex normals for indexed PlaneBufferGeometry in vertex shader after displacement

I’ve been banging my head on this for a while now. I have an indexed PlaneBufferGeometry I’m using for a GPGPU cloth physics simulation, and I can’t for the life of me calculate the normals correctly in the final vertex shader after rendering the simulation to the texture.

This is my current setup (relevant parts included), and I think it’s not working because I need a way to know the order of the current vert and its neighbours in the “face”. Just can’t quite get it right.

javascript:

// get faces
const indices = geometry.index.array;
const faces = [];
for(let i = 0; i < indices.length; i += 3)
{
    faces.push([indices[i + 0] * 3, indices[i + 1] * 3, indices[i + 2] * 3]);
}

const vertices = geometry.attributes.position.array;

// begin loop

// initializing data texture vertex positions
dataTexturePixels[index * 4 + 0] = vertices[index * 3 + 0];
dataTexturePixels[index * 4 + 1] = vertices[index * 3 + 1];
dataTexturePixels[index * 4 + 2] = vertices[index * 3 + 2];
dataTexturePixels[index * 4 + 3] = 0;

// storing lookup uv's in an attribute for looking up positions
positionReference[index * 3 + 0] = (index % size) / size;
positionReference[index * 3 + 1] = Math.floor(index / size) / size;

// end loop

This is where my brain is tripping up. I’ve tried using the face index values from the faces array in various ways, but because shared vertices appear in multiple faces, the per-vertex data keeps getting overwritten. I can’t think of how to properly store the vertex index information for each face so it can be looked up in the vertex shader using the positionReference (or some other way).

vertex shader / after simulation runs:

// how I'd calculate the normals if I could get a proper ordered reference
vec2 coord1 = faceVert1UvReference.xy;
vec3 pos1 = texture2D(tPositions, coord1).xyz;

vec2 coord2 = faceVert2UvReference.xy;
vec3 pos2 = texture2D(tPositions, coord2).xyz;

vec2 coord3 = faceVert3UvReference.xy;
vec3 pos3 = texture2D(tPositions, coord3).xyz;

vec3 tangent = pos3 - pos2;
vec3 bitangent = pos1 - pos2;
vec3 normal = normalMatrix * normalize(cross(tangent, bitangent));

fragment shader / lighting:

vec3 lightDirection = normalize(lightPosition); // also tried normalize(lightPosition - vWorldPosition);
vec3 normal = normalize(vNormal);
float lightValue = max(0.0, dot(normal, lightDirection)) * lightIntensity;
finalColor.rgb *= lightValue;

Not sure if I’m missing something obvious/doing something dumb, or if this problem is indeed hard. Without posting the many failed ways I’ve tried, does anyone have any ideas?

Any help is greatly appreciated.


Edit: I’ve added a couple of examples: this one uses flat shading with face normals, and this one shows my current messed-up smooth vertex normal progress. Having a hard time finding my error…

Hi mystaticself,

It is indeed a complex scenario you are simulating. I believe the best would be to set up a JSFiddle with the latest/closest results, so we have a base to work on and a picture of what you want to achieve by calculating the normals.

What needs to be clarified is which normal you are trying to calculate (face or vertex)? I am assuming it is the face normal.

[quote=“mystaticself, post:1, topic:354”]
Can’t think of how to properly store the vertex index information
[/quote]

Yes, you will have to pass the vertex index to the shader in order to calculate the face normal (if that is what you are trying to achieve). As you may already know, the vertex shader works on each vertex (in parallel) and does not hold any info about their relations. You don’t need to extract the uv coordinates to get your normals; the index should suffice.
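
For example, something along these lines (just a sketch of what I mean; the attribute name aVertexIndex is made up):

// give every vertex its own index as a float attribute so the
// vertex shader knows which vertex it is working on
const count = geometry.attributes.position.count;
const vertexIndices = new Float32Array(count);
for (let i = 0; i < count; i++) {
    vertexIndices[i] = i;
}
geometry.addAttribute('aVertexIndex', new THREE.BufferAttribute(vertexIndices, 1)); // setAttribute in newer versions

Then declare attribute float aVertexIndex; in the vertex shader and derive whatever lookup you need from it.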

Is there a special reason why you are not using the THREE.js built-in normals and need to recalculate them?

Hi INF1N1T,

I’ll try to get a fiddle up later today; it’s a pretty involved setup with my build tools, but I’ll see if I can port it.

I’m trying to calculate vertex normals for smooth shading. My uv coordinates comment was a little misleading (I’ve updated it a bit); I simply meant using the positionReference.xy (was uvReference) texture coordinates to look up the neighbouring vertices and their order within the current face somehow. I just can’t figure out how to store this info so it’s easy to look up for each vertex.

The built-in normals aren’t correct once I start animating the vertex positions, so I need to recalculate them. As I write that, I’m thinking there may (also) be an issue with my lighting calculations.

I’ve updated the original question with an example of my lighting; if something looks off, please let me know.

Thanks again.

Re @mystaticself,

I know there is a way to do it within the fragment shader in Shader Model 4, but I am not aware of such a feature in WebGL 1.0.

The trick with the uv map might work; you will have to try it with a simple test case and see if it effectively works. You could use the built-in shader debugger in the Firefox dev tools to hack this one.

Is there a reason why you are not calculating the vertex normals on the CPU side and passing the info to the shader?

PS: see this post for more details on the subject: https://gamedev.stackexchange.com/questions/75313/calculating-per-vertex-normal-in-geometry-shader

I’d rather not do it on the CPU because I’m not using the bufferGeometry’s position attribute; I’m rendering the positions to a data texture. I haven’t tried it, but I assume looping through each pixel/position and recalculating normals on the CPU each frame would be pretty expensive. This also has to run on mobile devices. Worth a perf test though.

Edit: It doesn’t even seem possible to read back the pixel values from a Float32Array-based data texture, which means I can’t update the position attribute and recalculate the normals on the CPU.

re @mystaticself

Normal calculation is not that expensive, but it could become a bottleneck if there are a lot of vertices.

How many vertices are there anyway?

You could use the THREE.js position data structure and pass a uniform containing the vertex relations to calculate normals on the GPU.

I would try the CPU approach first and see if it scales properly (up to 40K vertices on a PC should be a breeze); otherwise you will have no choice but to do it on the GPU.
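
Something like this, just as a sketch (simulatedPositions stands in for wherever your updated positions live on the CPU each frame):

// copy the latest simulated positions into the geometry and let
// THREE.js rebuild smooth vertex normals on the CPU
const positionAttribute = geometry.attributes.position;
positionAttribute.array.set(simulatedPositions); // Float32Array of xyz triplets
positionAttribute.needsUpdate = true;
geometry.computeVertexNormals(); // recalculates the normal attribute
geometry.attributes.normal.needsUpdate = true;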

I ended up scrapping the shared-face approach above and used a neighbour-lookup approach instead (thanks to @luruke for the tip). I had previously tried this but was using incorrect values when grabbing the neighbours from the lookup texture, which made me think it wasn’t working.

Here’s how I’m calculating the normals now in the vertex shader.

float diff = 0.06; // tweak this value to yield different results.
vec2 coord = positionReference.xy;
vec3 transformed = texture2D(tPositions, coord).xyz;
vec3 neighbour1 = texture2D(tPositions, coord + vec2(diff, 0.0)).xyz;
vec3 neighbour2 = texture2D(tPositions, coord + vec2(0.0, diff)).xyz;
vec3 tangent = neighbour1 - transformed;
vec3 bitangent = neighbour2 - transformed;            
vec3 nrml = cross(tangent, bitangent);
vNormal = normalMatrix * -normalize(nrml); // pass to fragment shader

Much easier approach… sigh.
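
One thing I may still change: the hard-coded diff of 0.06 could presumably be derived from the simulation texture size instead (one texel in uv space) and passed in as a uniform. Roughly like this (uNeighbourOffset and the shader variable names are just placeholders for my setup):

// one texel step in uv space for a size x size simulation texture
const renderMaterial = new THREE.ShaderMaterial({
    uniforms: {
        tPositions: { value: null }, // set to the simulation output each frame
        uNeighbourOffset: { value: 1 / size } // replaces the hard-coded diff
    },
    vertexShader: renderVertexShader,
    fragmentShader: renderFragmentShader
});

The vertex shader would then sample coord + vec2(uNeighbourOffset, 0.0) and coord + vec2(0.0, uNeighbourOffset) instead of using diff.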
