Currently, I have two same-sized textures loaded onto two canvases: an RGB image (rgbTexture) and a grayscale depth image (depthTexture). In my three.js scene I want to build a 3D mesh whose color comes from the RGB texture and whose depth (the z displacement of each vertex) comes from the depth texture. Below is my current attempt, but it isn't working. Could anyone provide any suggestions? Thank you!
geometry = new THREE.BufferGeometry(); // note: has no position/uv attributes yet, so nothing is drawn
const uniforms = {
  iChannel0: { value: rgbTexture },
  iChannel1: { value: depthTexture },
  uTextureProjectionMatrix: { value: camera.projectionMatrix },
};
const shaderMaterial = new THREE.ShaderMaterial({
  uniforms: uniforms,
  vertexShader: vertShader,
  fragmentShader: fragShader,
  transparent: false,
});
const mesh = new THREE.Mesh(geometry, shaderMaterial);
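One likely problem with the setup above is that the BufferGeometry is empty, so there are no vertices to displace. A segmented THREE.PlaneGeometry would work, or the position/uv/index arrays can be built by hand and passed to geometry.setAttribute via THREE.BufferAttribute. A minimal sketch in plain JS (the function name makeGridAttributes and the centered XY layout are my own choices, not anything from three.js):

```javascript
// Build flat position/uv arrays for a grid of segX x segY cells in the
// XY plane, centered on the origin, plus an index array with two
// triangles per cell. The results can be fed to
// geometry.setAttribute('position', new THREE.BufferAttribute(..., 3)),
// geometry.setAttribute('uv', new THREE.BufferAttribute(..., 2)), and
// geometry.setIndex([...]).
function makeGridAttributes(width, height, segX, segY) {
  const positions = [];
  const uvs = [];
  for (let y = 0; y <= segY; y++) {
    for (let x = 0; x <= segX; x++) {
      const u = x / segX;
      const v = y / segY;
      positions.push((u - 0.5) * width, (v - 0.5) * height, 0);
      uvs.push(u, v);
    }
  }
  const indices = [];
  for (let y = 0; y < segY; y++) {
    for (let x = 0; x < segX; x++) {
      const a = y * (segX + 1) + x; // lower-left corner of this cell
      const b = a + 1;              // lower-right
      const c = a + segX + 1;       // upper-left
      const d = c + 1;              // upper-right
      indices.push(a, b, c, b, d, c);
    }
  }
  return {
    positions: new Float32Array(positions),
    uvs: new Float32Array(uvs),
    indices: new Uint32Array(indices),
  };
}
```

The more segments the grid has, the finer the displacement detail the mesh can show; matching segX/segY to the depth texture's resolution captures every texel.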
vertShader = `
  varying vec2 vUv;
  uniform sampler2D iChannel1;
  void main() {
    vUv = uv;
    // Sample the depth map's red channel for the per-vertex offset.
    float displacement = texture2D(iChannel1, uv).r;
    vec3 newPosition = position;
    newPosition.z += displacement;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(newPosition, 1.0);
  }`;
fragShader = `
  uniform sampler2D iChannel0;
  uniform mat4 uTextureProjectionMatrix;
  varying vec2 vUv;
  void main() {
    // ShaderMaterial defaults to GLSL1: use texture2D and gl_FragColor,
    // not texture() and a custom out variable.
    vec3 col = texture2D(iChannel0, vUv).rgb;
    gl_FragColor = vec4(col, 1.0);
  }`;