In TSL compute shaders I would like to know how to read the data from a displacement map and then use that to update the mesh's geometry (position buffer attribute). I'm still having some trouble understanding the new API.
You can also do the displacement directly in the vertex shader. But if you want to do it with a compute shader, that's also possible. In both cases (compute / vertex shader) you need textureLoad, because neither compute nor vertex shaders accept sampler textures.
But now I'm assuming that you're using wgslFn, which very few people do. I think TSL with the Fn node handles this automatically when you assign textures, but I'm not that familiar with pure TSL shaders. I like wgslFn because it lets me work in WGSL directly, so I can use the W3C documentation for it.
As for the communication between compute shader and vertex shader: instead of attributes, you then need storage buffers.
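The compute-plus-storage route could look roughly like this in pure TSL (Fn, not wgslFn). This is only a sketch: names like `displacementTexture`, the plane size, and the z-axis displacement are my assumptions, and I'm assuming you already have a WebGPURenderer as `renderer`; check the current three.js webgpu examples for the exact API, since TSL is still moving.

```javascript
import * as THREE from 'three/webgpu';
import { Fn, storage, textureLoad, instanceIndex, ivec2, vec3 } from 'three/tsl';

const width = 256, height = 256; // texel resolution of the displacement map (assumption)
const geometry = new THREE.PlaneGeometry(10, 10, width - 1, height - 1);
const count = geometry.attributes.position.count;

// copy of the original positions for the compute shader to read from
const basePositions = storage(
    new THREE.StorageBufferAttribute(geometry.attributes.position.array.slice(), 3),
    'vec3', count
);

// buffer the compute shader writes and the vertex stage later reads
const displacedPositions = storage(
    new THREE.StorageBufferAttribute(count, 3), 'vec3', count
);

const displacementTexture = new THREE.TextureLoader().load('displacement.png'); // placeholder

const computeDisplacement = Fn(() => {
    // map the 1D invocation index to 2D texel coordinates
    const y = instanceIndex.div(width);
    const x = instanceIndex.sub(y.mul(width));

    // textureLoad needs no sampler, so it works in a compute shader
    const disp = textureLoad(displacementTexture, ivec2(x, y)).r;

    const base = basePositions.element(instanceIndex);
    displacedPositions.element(instanceIndex).assign(
        base.add(vec3(0, 0, disp)) // displace along local z (assumption)
    );
})().compute(count);

// run before rendering (once, or every frame if the map animates)
renderer.computeAsync(computeDisplacement);

// the material reads the storage buffer instead of the position attribute
const material = new THREE.MeshBasicNodeMaterial();
material.positionNode = displacedPositions.toAttribute();
```

The point is the last line: the vertex stage never touches the original position attribute, it reads the storage buffer the compute shader filled.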
Here is an illustration of wgslFn. It allows you to work in a very similar way to RawShaderMaterial from WebGL.
import * as THREE from 'three/webgpu';
import { wgslFn, attribute, cameraProjectionMatrix, cameraViewMatrix, modelWorldMatrix } from 'three/tsl';

//corresponds to uniforms
const vertexShaderParams = {
    projectionMatrix: cameraProjectionMatrix,
    cameraViewMatrix: cameraViewMatrix,
    modelWorldMatrix: modelWorldMatrix,
    position: attribute("position")
};

const vertexShader = wgslFn(`
    fn main_vertex(
        projectionMatrix: mat4x4<f32>,
        cameraViewMatrix: mat4x4<f32>,
        modelWorldMatrix: mat4x4<f32>,
        position: vec3<f32>
    ) -> vec4<f32> {
        var outPosition = projectionMatrix * cameraViewMatrix * modelWorldMatrix * vec4<f32>(position, 1.0);
        return outPosition;
    }
`);

//corresponds to RawShaderMaterial
const material = new THREE.MeshBasicNodeMaterial();
material.vertexNode = vertexShader(vertexShaderParams);
//note: materials have no colorSpace property; set renderer.outputColorSpace = THREE.SRGBColorSpace instead
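And if you want to skip the compute pass entirely, the same wgslFn vertex shader can read the displacement map itself with textureLoad. A rough sketch; the `uv` attribute, the `displacementMap` name, and displacing along z are my assumptions:

```javascript
import * as THREE from 'three/webgpu';
import { wgslFn, attribute, texture, cameraProjectionMatrix, cameraViewMatrix, modelWorldMatrix } from 'three/tsl';

const displacementTexture = new THREE.TextureLoader().load('displacement.png'); // placeholder

const vertexShaderParams = {
    projectionMatrix: cameraProjectionMatrix,
    cameraViewMatrix: cameraViewMatrix,
    modelWorldMatrix: modelWorldMatrix,
    position: attribute("position"),
    uv: attribute("uv"),
    displacementMap: texture(displacementTexture)
};

const vertexShader = wgslFn(`
    fn main_vertex(
        projectionMatrix: mat4x4<f32>,
        cameraViewMatrix: mat4x4<f32>,
        modelWorldMatrix: mat4x4<f32>,
        position: vec3<f32>,
        uv: vec2<f32>,
        displacementMap: texture_2d<f32>
    ) -> vec4<f32> {
        //textureLoad takes integer texel coordinates and a mip level, no sampler
        let size = textureDimensions(displacementMap, 0);
        let texel = vec2<i32>(uv * vec2<f32>(size));
        let disp = textureLoad(displacementMap, texel, 0).r;

        let displaced = position + vec3<f32>(0.0, 0.0, disp); //displace along local z (assumption)
        return projectionMatrix * cameraViewMatrix * modelWorldMatrix * vec4<f32>(displaced, 1.0);
    }
`);

const material = new THREE.MeshBasicNodeMaterial();
material.vertexNode = vertexShader(vertexShaderParams);
```

Note the texture is passed as a texture_2d&lt;f32&gt; parameter; since there is no sampler, the uv has to be converted to texel coordinates by hand.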
In my ocean repo I use a lot of compute shaders that read textures. Do you know it? Maybe it will help you. The compute shaders calculate displacement textures, which the vertex shader then uses to displace the vertices. But as mentioned, you can also store the displacements in storage buffers and read them like attributes, if you prefer that way.