InstancedBufferGeometry recalculating normals after vertex transform

Hello. I’m trying to use a shader via onBeforeCompile with InstancedBufferGeometry to position blades of grass. The rotation and position look good, but the normals (which I’m rotating with the same quaternion as the vertex positions) don’t quite match up. If I do the same thing with InstancedMesh, the normals look correct, so there must be something wrong with how I’m recalculating the normals.
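For context, the geometry side is set up roughly like this. This is a simplified sketch rather than the exact codepen code, and the sizes, instance count, and random placement are just placeholders, but the per-instance attribute names offset and rotation match the shader snippet below:

// Base blade geometry whose attributes are shared by every instance
const blade = new THREE.PlaneGeometry(0.1, 1, 1, 4);

const geometry = new THREE.InstancedBufferGeometry();
geometry.index = blade.index;
geometry.attributes.position = blade.attributes.position;
geometry.attributes.normal = blade.attributes.normal;
geometry.attributes.uv = blade.attributes.uv;

const count = 10000;
const offsets = new Float32Array(count * 3);   // per-instance position
const rotations = new Float32Array(count * 2); // per-instance yaw quaternion as (y, w)

for (let i = 0; i < count; i++) {
    const angle = Math.random() * Math.PI * 2;
    offsets.set([Math.random() * 50 - 25, 0, Math.random() * 50 - 25], i * 3);
    // Matches how the shader builds the quaternion: vec4(0, rotation.x, 0, rotation.y)
    rotations.set([Math.sin(angle / 2), Math.cos(angle / 2)], i * 2);
}

geometry.setAttribute('offset', new THREE.InstancedBufferAttribute(offsets, 3));
geometry.setAttribute('rotation', new THREE.InstancedBufferAttribute(rotations, 2));
geometry.instanceCount = count;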

This is how I’m doing the shader transforms:

// Rotates v by the quaternion q: v' = v + 2 * cross(q.xyz, q.w * v + cross(q.xyz, v))
vec3 rotateVectorByQuaternion(vec3 v, vec4 q){
    return 2.0 * cross(q.xyz, v * q.w + cross(q.xyz, v)) + v;
}

vec3 vPosition = position;

// Quaternion for a rotation about the Y axis, built from the per-instance rotation attribute
vec4 direction = vec4(0.0, rotation.x, 0.0, rotation.y);
vPosition = rotateVectorByQuaternion(vPosition, direction);
vNormal = rotateVectorByQuaternion(vNormal, direction);

vec3 transformed = vPosition + offset;
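And this is roughly how I’m injecting that snippet with onBeforeCompile. Again a simplified sketch, not the exact codepen code; I’m assuming MeshStandardMaterial here, but any of the lit built-in materials would be wired up the same way:

const material = new THREE.MeshStandardMaterial({ color: 0x4c8c2b });

material.onBeforeCompile = (shader) => {
    shader.vertexShader = shader.vertexShader
        // Declare the per-instance attributes and the helper function before main()
        .replace(
            'void main() {',
            `
            attribute vec3 offset;
            attribute vec2 rotation;

            vec3 rotateVectorByQuaternion(vec3 v, vec4 q){
                return 2.0 * cross(q.xyz, v * q.w + cross(q.xyz, v)) + v;
            }

            void main() {`
        )
        // Replace the chunk that normally does `vec3 transformed = vec3( position );`
        .replace(
            '#include <begin_vertex>',
            `
            vec3 vPosition = position;
            vec4 direction = vec4(0.0, rotation.x, 0.0, rotation.y);
            vPosition = rotateVectorByQuaternion(vPosition, direction);
            vNormal = rotateVectorByQuaternion(vNormal, direction);

            vec3 transformed = vPosition + offset;`
        );
};

// geometry is the InstancedBufferGeometry from the setup sketch above; scene is assumed to exist
const mesh = new THREE.Mesh(geometry, material);
scene.add(mesh);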

[Screenshot: grass_normals]

This is a side-by-side comparison, InstancedMesh vs. InstancedBufferGeometry. You can see that, facing the light, the InstancedBufferGeometry normals are still reflecting the light.

Here’s the codepen for that: https://codepen.io/jkstrawn/pen/YzNdMmQ?editors=0010

Sidenote: I’m trying to use InstancedBufferGeometry because I assume it will be more performant than InstancedMesh (which already works). Does anyone have any thoughts on that?

I don’t think there should be a measurable performance difference.

Besides, I can’t seem to reproduce the issue from your screenshot with your codepen. Both approaches appear to work as expected.