I have an array of points I need to check the distance against in a shader pass. When using world coordinates, the points end up relative to the camera. How can I convert them to local space?
Hi.
To convert world coordinates to local (or object) space coordinates in a shader, you typically need to apply a transformation that goes from world space to object space. Here’s how you can do that in your shader:
1. Obtain the model matrix: make sure you have the model matrix of the object for which you want to transform world coordinates to local space. This matrix is usually provided by your rendering pipeline.
2. Invert the model matrix: world coordinates are converted to object coordinates with the inverse of the model matrix. If you don't have the inverse readily available, you can compute it in your CPU code, or use the built-in inverse() function if your shader language supports it (like GLSL).
3. Transform the coordinates: in your shader, multiply the world coordinates by the inverse of the model matrix to get local coordinates, as sketched below.
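A minimal vertex-shader sketch of step 3, assuming a three.js ShaderMaterial (so modelMatrix, modelViewMatrix, projectionMatrix and position are pre-declared) and a custom uniform I'm calling worldPoint for the point you want to convert:

```
// worldPoint: your test point, given in world space (hypothetical uniform name)
uniform vec3 worldPoint;
// vLocalPoint: the same point expressed in this object's local space
varying vec3 vLocalPoint;

void main() {
    // inverse(modelMatrix) maps world space back to object (local) space.
    // Note: inverse() needs GLSL ES 3.00 / GLSL 1.40+; if it isn't available,
    // invert the matrix on the CPU (e.g. three.js Matrix4.invert()) and pass
    // the result in as a uniform instead.
    vLocalPoint = ( inverse( modelMatrix ) * vec4( worldPoint, 1.0 ) ).xyz;

    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
```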
The points you're passing to the shader are in local space by default. If you pass position through a varying, you'll get that local position in the fragment shader as well (see the sketch below).
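For instance, a minimal sketch (vPosition and localPoint are just assumed names; position, modelViewMatrix and projectionMatrix are the built-ins a three.js ShaderMaterial provides):

```
// --- vertex shader ---
varying vec3 vPosition;

void main() {
    vPosition = position; // local (object-space) position, straight from the geometry
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}

// --- fragment shader ---
varying vec3 vPosition;
uniform vec3 localPoint; // assumed: your test point, already expressed in local space

void main() {
    float d = distance( vPosition, localPoint ); // distance check in local space
    gl_FragColor = vec4( vec3( d ), 1.0 );       // visualize the distance
}
```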
Actually, the point I am testing with is relative to the camera; whenever the camera moves, it "follows". I need a point that stays static. Also, this is a shader pass; I don't know if that changes anything.
Unfortunately it doesn't work, the point is always relative to the camera.
@pavlo_fetisov scene0 always stays relative to the camera: when it orbits, the point moves with it.
Here is the current code:
```
void main() {
    vUv = uv;
    scene0 = inverse( modelMatrix ) * vec4( 1.0, 1.0, 4.0, 2.0 );
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
```
So I subtract the camera position, but there are some perspective issues:
```
scene0 = inverse( modelMatrix ) * vec4( 2.0, 3.0, 1.0, 1.0 );
test = scene0.xyz - cameraWorldPos;
```