Shader geometry variables space

Hello :wave:

I’ve been tinkering with the built-in materials (Phong in particular), but I’m running into an issue with specular lighting: the specular highlight stays in the same place regardless of camera movement. I suspect I’m not operating in the same space as most ShaderChunks.
So here is my question: in what space is shading done in three.js shaders? In particular, inside lights_fragment_begin.glsl.js there is this geometry input data:

GeometricContext geometry;

geometry.position = - vViewPosition;
geometry.normal = normal;
geometry.viewDir = ( isOrthographic ) ? vec3( 0, 0, 1 ) : normalize( vViewPosition );

I am providing my own geometry data, but I’m not sure in what spaces to provide it. Am I right to assume vViewPosition is the fragment position in eye space, i.e. viewMatrix * modelMatrix * vec4( position, 1.0 );?
But then in what space is geometry.normal supposed to be? :thinking:
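
For concreteness, here is roughly what I am doing in my fragment shader right now (a trimmed sketch; vMyPosition and vMyNormal are my own varyings, and the spaces are my guesses):

GeometricContext geometry;

geometry.position = vMyPosition; // ( viewMatrix * modelMatrix * vec4( position, 1.0 ) ).xyz from my vertex shader
geometry.normal = normalize( vMyNormal ); // world space? view space?
geometry.viewDir = normalize( - vMyPosition ); // direction from fragment toward the camera?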

Does anyone familiar with shader chunks know?

That’s a simple search, dolphin: 1 and 2


Geometry data used for lighting must all be computed in the same space. As the shader code indicates, that space is view space.
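
For example, this is roughly how the built-in vertex shaders produce those values (a condensed sketch of project_vertex.glsl.js plus the Phong vertex shader, not verbatim):

vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 ); // object space -> view (eye) space
vViewPosition = - mvPosition.xyz; // note the negation; geometry.position = - vViewPosition undoes it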

If you look at the code in defaultnormal_vertex.glsl.js, the normal is transformed by normalMatrix, which appears to be computed from the object’s modelViewMatrix; see WebGLRenderer.js line 1720 and line 1252.
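
In other words, the normal ends up in view space too. Roughly (a condensed sketch of defaultnormal_vertex.glsl.js and normal_vertex.glsl.js):

vec3 transformedNormal = normalMatrix * objectNormal; // object space -> view space
vNormal = normalize( transformedNormal );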

That said, you should provide everything in object space when you upload your geometry data, since the three.js shaders handle the matrix transformations for you. The example webgl_interactive_buffergeometry might be helpful.
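
Putting it together, a minimal custom vertex shader following the same convention could look like this (a sketch; with ShaderMaterial, three.js injects modelViewMatrix, projectionMatrix, normalMatrix and the position / normal attributes for you):

varying vec3 vNormal; // view space
varying vec3 vViewPosition; // negated view-space position, matching the built-in chunks

void main() {
	vNormal = normalMatrix * normal; // object space -> view space
	vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 ); // object space -> view space
	vViewPosition = - mvPosition.xyz;
	gl_Position = projectionMatrix * mvPosition;
}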


All default uniforms are described here: three.js docs

// = inverse transpose of modelViewMatrix
uniform mat3 normalMatrix;
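
In case anyone wonders why it is the inverse transpose rather than just the upper-left 3x3 of modelViewMatrix: normals have to stay perpendicular to surface tangents under non-uniform scaling. If a tangent transforms as t' = M t, the normal must transform as n' = (M^-1)^T n, because then dot( n', t' ) = n^T M^-1 M t = dot( n, t ) = 0 is preserved. For pure rotations and uniform scales the two coincide up to scale, so the distinction only bites with non-uniform scaling.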

Thank you guys for the confirmation and the links! :smiling_face_with_three_hearts:
Rendering conventions can be all over the place. Is there any particular reason (performance, perhaps?) why shading is done in view space in three.js, or is it just a preference?