Help please! Triplanar mapping of texture is not working

I am trying to do triplanar texture mapping in three.js on a terrain created from heightmap data applied to a plane mesh.

To do that, I load the diffuse texture with:

let diffuseTex;
loader.load(
  "./resources/images/satellite.png",
  function (texture) {
    diffuseTex = texture;
  },
  undefined,
  function () {
    console.error("An error happened.");
  }
);

I then use the texture created from the diffuse map image that I have locally as a uniform:

var uniforms1 = {
  diffuseTexture: { type: "t", value: diffuseTex },
};

const _meshMaterial = new THREE.RawShaderMaterial({
  uniforms: uniforms1,
  vertexShader: terrainShader._VS,
  fragmentShader: terrainShader._FS,
});

My shader code is:

const _VS = `#version 300 es
precision highp float;

uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;

in vec3 position;
in vec3 normal;

out vec3 vNormal;
out vec3 vPosition;

#define saturate(a) clamp( a, 0.0, 1.0 )

void main(){
    vNormal = normal;
    vPosition = position.xyz;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    vPosition = vec3(gl_Position.xyz)*0.5 + 0.5;
}
`;

const _FS = `#version 300 es

precision mediump sampler2DArray;
precision highp float;
precision highp int;

uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform vec3 cameraPosition;
uniform sampler2D diffuseMap;

in vec3 vNormal;
in vec3 vPosition;

out vec4 out_FragColor;

vec3 blendNormal(vec3 normal){
    vec3 blending = abs(normal);
    blending = normalize(max(blending, 0.00001));
    blending /= vec3(blending.x + blending.y + blending.z);
    return blending;
}

vec3 triplanarMapping (sampler2D tex, vec3 normal, vec3 position) {
  vec3 normalBlend = blendNormal(normal);
  vec3 xColor = texture(tex, position.yz).rgb;
  vec3 yColor = texture(tex, position.xz).rgb;
  vec3 zColor = texture(tex, position.xy).rgb;
  return (xColor * normalBlend.x + yColor * normalBlend.y + zColor * normalBlend.z);
}

void main(){
    vec3 color = triplanarMapping(diffuseMap, vNormal, vPosition);
    out_FragColor = vec4(color, 1.0);
}
`;
export const terrainShader = { _VS, _FS };
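Outside the shader, the weighting in `blendNormal` can be sanity-checked; a minimal JavaScript sketch of the same math (plain arrays in place of GLSL vectors):

```javascript
// Same weighting as the GLSL blendNormal(): take the absolute value of the
// normal, guard against zero, and divide by the component sum so the three
// projection weights add up to 1. (The intermediate normalize() in the GLSL
// version cancels out in the final division.)
function blendNormal(normal) {
  const b = normal.map((c) => Math.max(Math.abs(c), 0.00001));
  const sum = b[0] + b[1] + b[2];
  return b.map((c) => c / sum);
}

// A face pointing mostly up (+y) is dominated by the top-down projection.
console.log(blendNormal([0.1, 0.9, 0.1])); // y weight is the largest
```

If the three weights do not sum to 1, the blended color darkens or blows out, which is one quick thing to check when a triplanar result looks wrong.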

Then I create a mesh with the geometry, which is filled with the vertex data, and the material created with the custom shader:

let geometry = new THREE.PlaneGeometry(
  partial.x,
  partial.y,
  partial.x - 1,
  partial.y - 1
);
geometry.setFromPoints(chunk[i]);
geometry.computeVertexNormals();

let mesh = new THREE.Mesh(geometry, _meshMaterial);
mesh.castShadow = false;
mesh.receiveShadow = true;

scene.add(mesh);

I don't know what I am doing wrong, because I am getting this as the terrain output:

I was expecting this texture to be mapped over the terrain.

texture to map:

image

I am new to this and struggling. Can anyone please help me?

This is the complete code. I put all of it into a fiddle, but three.js's loader.load(image url) is throwing a CORS error that does not go away even with loader.setCrossOrigin("anonymous");

Because of that the demo is not working, but the fiddle is there.

You are trying to use a WebGL 2.0 (GLSL ES 3.00) shader as a WebGL 1.0 shader.

You don't happen to use any SharedArrayBuffers, do you? If so, change them to ArrayBuffer.

Is there anything wrong with the shader?

I am sorry, can you please help me understand where I can use that?

Test one projection:

void main(){
    out_FragColor = texture(diffuseMap, vPosition.yz);
}

If it is black, then run this in the console:
diffuseTex.needsUpdate = true;

With that texture, doing this:

loader.load(
  "./resources/images/satellite.png",
  function (texture) {
    texture.needsUpdate = true;
    texture.minFilter = THREE.NearestMipmapNearestFilter;
    texture.magFilter = THREE.LinearFilter;
    texture.wrapS = texture.wrapT = THREE.RepeatWrapping;
    // ...
  }
);

and the shader code as:

vec3 color = triplanarMapping(diffuseTexture, vNormal, vPosition);
out_FragColor = vec4(color, 1.0);

Output is:

This is zoomed in; it is the same image, tiled. I just wanted to mention it in case it helps.

Even after I removed the repeat wrapping and used this in the shader:

out_FragColor = vec4(texture(diffuseTexture, vPosition.yz).rgb, 1.0);

the output is the same as with:

vec3 color = triplanarMapping(diffuseTexture, vNormal, vPosition);
out_FragColor = vec4(color, 1.0);

which is:

The output is no different even after I created the material in the load callback, i.e.:

const loader = new THREE.TextureLoader();
loader.setCrossOrigin("anonymous");

loader.load(
  "./resources/images/satellite.png",
  function (texture) {
    texture.needsUpdate = true;
    texture.minFilter = THREE.NearestMipmapNearestFilter;
    texture.magFilter = THREE.LinearFilter;
    // texture.wrapS = texture.wrapT = THREE.RepeatWrapping;

    var uniforms1 = {
      diffuseTexture: { type: "t", value: texture },
    };

    _meshMaterial = new THREE.RawShaderMaterial({
      uniforms: uniforms1,
      vertexShader: terrainShader._VS,
      fragmentShader: terrainShader._FS,
    });
    createMainMesh();
    // createChunkFromMesh();
    lightingSetUp();
  },
  undefined,
  function (err) {
    console.error("An error happened.", err);
  }
);

Please help me, I am clueless on this.

It works well if you comment out these lines:

//#version 300 es
//vPosition = vec3(gl_Position.xyz)*0.5 + 0.5;

and add glslVersion: THREE.GLSL3 to the material:

const material = new THREE.RawShaderMaterial({
  glslVersion: THREE.GLSL3,
  uniforms: {
    diffuseMap: { value: texture },
  },
  vertexShader,
  fragmentShader,
  side: THREE.DoubleSide,
});

For the stretched texture:

void main(){
    vNormal = normal;
    vPosition = position.xyz/10.0;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    //vPosition = vec3(gl_Position.xyz)*0.5 + 0.5;
}

Texture code:

var texture_loader = new THREE.TextureLoader();
var texture = texture_loader.load("test.jpg");
texture.wrapS = texture.wrapT = THREE.RepeatWrapping;

I tried this but there is no change in the output. Do you think I am passing the vertex position to the shader correctly?

I do nothing but declare

in vec3 position;

in the shader.

If the landscape is in chunks, then you may need to add the chunk position to its vertex position in the shader.

I would need the full zip archive.


The terrain is not in chunks; it is one complete mesh.

This is my terrain. I used the normalized normal as the color, so it is red and green. I am posting this to show my terrain.

The mesh I have is 4104x1856 in length and breadth, and the mapping texture is also 4104x1856.

The texture is supposed to map onto the mesh.

My texture is:

image

I saw that you were dividing the position by a value of 10, so I did this:

 vPosition = vec3(position.x/4104.0, position.y/1000.0, position.z/1856.0);
 gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);

I am dividing y by 1000 because I get the y value of the vertices from an elevation file, which stores elevations between 0 and 1, and I multiply by 1000 to get the final height:

const geometry = new THREE.PlaneGeometry(4104, 1856, 4103, 1855);

const vertices: Array<number> = (geometry as THREE.BufferGeometry).attributes
  .position.array as Array<number>;

// one height value per vertex, so iterate vertices.length / 3 times
for (let i = 0, j = 0, l = vertices.length / 3; i < l; i++, j += 3) {
  vertices[j + 1] = data[i] * 1000;
}

data is the array of height values from the elevation file that generated the terrain above.
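For reference, the loop walks the flat position buffer in xyz triples, so vertices[j + 1] is the y (height) component of vertex i. A small self-contained sketch of that layout, with made-up sample elevations:

```javascript
// PlaneGeometry stores positions as a flat [x0, y0, z0, x1, y1, z1, ...] array.
// Writing data[i] * 1000 into vertices[j + 1] sets the height of vertex i.
const data = [0.0, 0.5, 1.0, 0.25]; // sample normalized elevations
const vertices = new Float32Array(data.length * 3); // x, y, z per vertex

for (let i = 0, j = 0, l = data.length; i < l; i++, j += 3) {
  vertices[j + 1] = data[i] * 1000;
}

console.log(vertices[4]); // y of vertex 1 → 500
```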

so I divided position.y by 1000 in:

vPosition = vec3(position.x/4104.0, position.y/1000.0, position.z/1856.0);

I removed

texture.wrapS = texture.wrapT = THREE.RepeatWrapping;

because the texture is now the same size as the mesh.

And this time I am getting something that I believe is progress compared to what I had earlier.

Big thanks to you.

But the problem now is that it is not mapped correctly.

And this is the output for

out_FragColor = vec4(vPosition.xyz, 1.0);

Nowhere in the mesh do I see white, so I don't know what position values the shader is getting; I thought the position attribute got its value automatically.

Another case is with this texture:

image

output is:

The star texture shows incorrectly: the star is moved to the left. PlaneGeometry is built around the center 0,0, so the geometry extends to the left and up from the origin. You can add an offset to vPosition:

void main(){
    vNormal = normal;
    vPosition = vec3(position.x/4104.0 + 0.5, position.y/1000.0, position.z/1856.0 + 0.5);
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
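The offset works because PlaneGeometry is centered on the origin: x runs from -width/2 to +width/2, so dividing by the width gives [-0.5, 0.5] and adding 0.5 shifts that into the [0, 1] range the texture lookup expects. A quick JavaScript check of that mapping, using the mesh width from this thread:

```javascript
// Map a centered plane coordinate into [0, 1] texture space.
function toTexCoord(x, size) {
  return x / size + 0.5;
}

console.log(toTexCoord(-2052, 4104)); // left edge  → 0
console.log(toTexCoord(0, 4104));     // center     → 0.5
console.log(toTexCoord(2052, 4104));  // right edge → 1
```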

Are you sure the UVs are properly mapped inside your mesh?


I did this:

vNormal = normal;
vNormalView = normalize(normalMatrix * normal);
float _positiony = position.y * heightScale;
vPosition.x = ((position.x / 15216.0) * -0.5) + 0.5;
vPosition.z = ((position.z / 7424.0) * 0.5) + 0.5;
float _heightScale = 1000.0 * heightScale;
vPosition.y = position.y / _heightScale;

and this is the output now, after applying directional lighting:

Thank you very, very much for guiding me.

I have heightmap data for 4104x1856 vertices,

So I create the mesh with this:

const geometry = new THREE.PlaneGeometry(
  8 * 4104,
  8 * 1856,
  4104 - 1,
  1856 - 1
);

I multiplied by 8 because a higher number looks much better than 4104; I don't know the reason.

I apply the topographic map image, which is also 4104x1856, in the fragment shader with triplanar sampling.

I am not sure I understand how this works, but the problem is that when I zoom in, the output is this:

Is there anything I can do to make this smooth, or at least something that won't look weird like this?

Looks good. I use a different triplanar method. For one texture:

void main(){
    vec3 blend_weights = vNormal * vNormal;
    float maxBlend = max(blend_weights.x, max(blend_weights.y, blend_weights.z));
    blend_weights = max(blend_weights - maxBlend * 0.5, 0.0);
    blend_weights.y *= 0.5;
    float rcpBlend = 1.0 / (blend_weights.x + blend_weights.y + blend_weights.z);
    vec3 weights = blend_weights * rcpBlend;
    vec3 color = weights.x * texture(diffuseMap, vPosition.zy).rgb
               + weights.y * texture(diffuseMap, vPosition.xz).rgb
               + weights.z * texture(diffuseMap, vPosition.xy).rgb;
    out_FragColor = vec4(color * light_c, 1.0);
}

For two textures:

void main(){
    vec3 blend_weights = vNormal * vNormal;
    float maxBlend = max(blend_weights.x, max(blend_weights.y, blend_weights.z));
    blend_weights = max(blend_weights - maxBlend * 0.5, 0.0);
    blend_weights.y *= 0.5;
    float rcpBlend = 1.0 / (blend_weights.x + blend_weights.y + blend_weights.z);
    vec3 weights = blend_weights * rcpBlend;
    vec3 color = weights.x * texture(diffuseMap2, vPosition.zy).rgb
               + weights.y * texture(diffuseMap, vPosition.xz).rgb
               + weights.z * texture(diffuseMap2, vPosition.xy).rgb;
    out_FragColor = vec4(color * light_c, 1.0);
}
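The blend-weight math in this variant can be checked in plain JavaScript: square the normal, suppress everything but the dominant axes, halve the y contribution, and renormalize so the weights sum to 1 (a sketch of the same arithmetic, not three.js code):

```javascript
// Mirrors the GLSL above: blend_weights = n * n, subtract half the maximum,
// clamp at zero, halve the top-down (y) weight, then renormalize to sum to 1.
function blendWeights(n) {
  let w = n.map((c) => c * c);
  const maxBlend = Math.max(w[0], w[1], w[2]);
  w = w.map((c) => Math.max(c - maxBlend * 0.5, 0));
  w[1] *= 0.5;
  const rcp = 1 / (w[0] + w[1] + w[2]);
  return w.map((c) => c * rcp);
}

// An almost-vertical normal keeps only the top-down projection.
console.log(blendWeights([0.05, 0.99, 0.05])); // y weight ≈ 1, x and z are 0
```

Compared with the simple absolute-value blend, this sharpens the transition between projections, which is why it can look better on steep terrain.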

It looks smoother where the plane is flat, but very unusual where there is a hill or elevation, and at close proximity there is an artifact like this:

and this is on a hill:

but overall the image looks smooth from far away:

What do you suggest?

Try changing the order like this: vPosition.zy to vPosition.yz, and vPosition.xy to vPosition.yx:
vec3 color = weights.x * texture(diffuseMap2, vPosition.yz).rgb + weights.y * texture(diffuseMap, vPosition.xz).rgb + weights.z * texture(diffuseMap2, vPosition.yx).rgb;

I did this, but the texture mapping got even worse! Even when I don't have a texture mapped, there is a pixelation-like artifact visible, which I believe is the height difference, but the height data of the vertices in the elevation map file is very correlated, like this:

Maybe one side will look good overall, but if you look at the back side it will again be incorrect, because it is not a seamless texture.