Regarding Displacement Maps: Does Three.js do Vector Displacement?

Three.js materials can handle grayscale displacement maps by moving the geometry vertices vertically.
But when a three.js material is given a multi-colored displacement map, does three.js displace the geometry vertices along all 3 axes (vector displacement)?
From what I have seen I am guessing not, but I just want to confirm and to determine if there are any shader modifications that could enable three.js to do so…
(I realize that, under normal circumstances, the normal map would also have to be recomputed. However, that is not an issue in my case because I already have a normal map computed using vector displacement.)

Not quite. It shifts along normals. See this shader chunk: three.js/src/renderers/shaders/ShaderChunk/displacementmap_vertex.glsl.js at 39ab305a37b4f1e1e8709672c1078183e393a30a · mrdoob/three.js · GitHub
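That chunk boils down to displacement along the normalized objectNormal, scaled by one channel of the map. A rough plain-JavaScript sketch of the per-vertex math (the function name and array representation are mine, not three.js API):

```javascript
// Plain-JS sketch of what displacementmap_vertex does per vertex:
// transformed += normalize(objectNormal) * (texel.x * displacementScale + displacementBias)
// Note that only one channel of the map (x / red) is read.
function displaceVertex(position, objectNormal, texelR, scale, bias) {
  const len = Math.hypot(objectNormal[0], objectNormal[1], objectNormal[2]);
  const n = objectNormal.map(c => c / len);          // normalize(objectNormal)
  const amount = texelR * scale + bias;              // texel.x * scale + bias
  return position.map((p, i) => p + n[i] * amount);  // transformed + n * amount
}

// A flat plane facing +Y: every vertex moves straight "up" along its normal.
console.log(displaceVertex([2, 0, 3], [0, 1, 0], 0.5, 10.0, 0.0)); // [2, 5, 3]
```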


So, if you displace a flat plane, it will all displace along a line that is perpendicular to the surface of the plane (i.e. straight out)? And if you displace a sphere, it will displace along a line that runs from the center of the sphere? That makes sense if you want to wrap a displacement map around an object.

The program I am working with gives me both a 3-way displacement map and a normal map computed from that 3-way displacement map. If I assign both to the material, would three.js use that normal map to make the displacement? If so, that would seem to produce the kind of 3-way displacement I was hoping for.

Here is an interesting discussion which confirms that three.js does not re-compute normal maps when the user provides a displacement map - which means that the user needs to provide both a displacement map and a normal map. And the message above indicates that three.js uses the normal map to determine the direction of the displacement. Ipso facto rexum flaxit (as my logics professor used to say), three.js does vector displacement! Cool beans!

After posting the above epiphany, I realized that, if three.js does displace vertices horizontally, the edges of any planes that I use this on should not be straight, but should be ragged. As this shows, the edges are perfectly straight. Apparently, I am misunderstanding how the code works. Oh well!

I am guessing that the “objectNormal” used in three.js is the normal value for the object, before and without regard to any changes made by the normalMap. Thus, on a sphere, the objectNormal will be the normal for that sphere face. As noted above, this makes sense because the expected workflow would be (1) original object with original normals; (2) vertex displacement (displacementMap implemented in the vertex shader); and (3) shading (normalMap implemented in the fragment shader) - not the reverse.

Amazing. It is the first time in my life Google gives me exactly one hit, and it is from this forum!!!
(I tried to look up the meaning of "Ipso facto rexum flaxit".)

Yes, Google is very good at picking up posts from this forum. I have often posted a question on this forum and, when searching Google, the next day, one of the top results is my question.
As for that phrase, my professor had a sense of humor. The first two words are real, the rest is nonsense.

p.s. Actually he was my commercial law professor. I said logics professor because I didn’t want to sound pretentious. But the study of law is the study of logic - or, often, the lack thereof. I just hope I was able to add a little “class” to this forum :sunglasses:

Using that as a guideline, I am trying to create an extension that will displace a vertex in all 3 directions.
Here is what I added to the definition for a material that will be applied to a flat plane:

    onBeforeCompile: shader => {
        shader.uniforms.dmap = wav_.Dsp;	// Displacement Map
        shader.vertexShader = `
            uniform sampler2D dmap;
            varying vec2 vUv;
            ` + shader.vertexShader.replace(
            `#include <begin_vertex>`,
            `#include <begin_vertex>
                vUv = uv;
                transformed += vec3(0.0,1.0,0.0) * (texture2D(dmap, vUv).xyz * 10.0 + 0.0);
            `
        );
    }

However, nothing happens. To keep things simple, I am using a plane with the same number of segments as the dimensions of the displacementMap (512x512). I have tried various alternatives, including this: transformed.y = texture2D(dmap, vUv).y * 10.0;

But nothing seems to cause a displacement.

With much assistance, I had previously created an extension that used a sine wave to displace transformed.y and it worked perfectly. So I am baffled as to why this is not working.

Does wav_.Dsp have that format {value: _displacement_texture_}?

It should, since if I use it as the displacementMap for the material it works fine.

I have tried this:

let gu = {
	dmap: {value: THREE._displacement_texture_},
};

then after the value for wav_.Dsp is determined, I have: gu.dmap = wav_.Dsp;
and I have changed the first line in the extension to: shader.uniforms.dmap = gu.dmap;

But I am still not getting any displacement.

I see that the index for the displacementMap has gotten a bit more complicated than in the “old” days when they used “uv”. It is now called vDisplacementMapUv and is defined as:

vDisplacementMapUv = (displacementMapTransform * vec3(DISPLACEMENTMAP_UV, 1)).xy
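For what it's worth, that transform is just a 3x3 matrix (a THREE.Matrix3, stored column-major) applied to vec3(uv, 1), keeping xy - a sketch with a hypothetical helper name:

```javascript
// Sketch of the vDisplacementMapUv computation:
// (displacementMapTransform * vec3(u, v, 1)).xy with a column-major 3x3 matrix.
// transformUv is my own illustration, not a three.js function.
function transformUv(m, u, v) {  // m: 9 numbers, column-major
  return [
    m[0] * u + m[3] * v + m[6],
    m[1] * u + m[4] * v + m[7],
  ];
}

// With the identity transform it reduces to plain uv, as in the "old" days:
console.log(transformUv([1, 0, 0, 0, 1, 0, 0, 0, 1], 0.25, 0.75)); // [0.25, 0.75]
```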

Or would it be possible to enter wav_.Dsp into material.displacementMap, so that it will compute values like vDisplacementMapUv, but then replace - or supplement - the displacement instructions so that they will add x and z values to transformed? Is “begin_vertex” the right place to do that?

It’s still not clear to me what wav_.Dsp contains - what its structure is.

It’s from that iFFT wave generator that I updated.
The entire code is here. The specific shader that generates the map is at lines 677-718 of the JS code.

Here is what it looks like when the map is used as a texture.

But perhaps it would save time to note that it appears to be a vec4 table, since the last instruction in the shader is:

gl_FragColor = vec4(outputA,outputB);

I am guessing that the standard displacementMap is a vec3 table.

That may be messing up the index, but I would think that, even if it is picking up the wrong values, I would be getting some displacement, even if it is the wrong displacement.

From what I see in the codepen, if wav_.Dsp is this: wav_.Dsp = this.displacementMapFramebuffer.texture;

Then, when you do this gu.dmap = wav_.Dsp;, you’ve got gu.dmap = some_texture, whereas it has to be gu.dmap = {value: some_texture}

Uniforms on js side have to have this structure: {value: some_value}

Have a look in the docs on Uniform
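A minimal sketch of the required shape (stand-in objects so it runs without a renderer):

```javascript
// The renderer reads uniform.value each frame, so the JS-side uniform must be
// an object with a `value` property, not the raw texture itself.
const shader = { uniforms: {} };          // stand-in for the compiled shader
const someTexture = { isTexture: true };  // stand-in for a THREE.Texture

shader.uniforms.dmap = { value: someTexture };  // correct: { value: ... } wrapper
// shader.uniforms.dmap = someTexture;          // wrong: there is no .value to read

console.log(shader.uniforms.dmap.value === someTexture); // true
```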

That’s it! I forgot that uniforms need that syntax, even though the iFFT program is loaded with correct examples of how to load values into uniforms.

I have now changed that line to gu.dmap = {value: wav_.Dsp};
and am getting horizontal displacement.

I have also changed the replace section to take into account that, in the standard displacementMap, height is the x value:

        shader.vertexShader = shader.vertexShader.replace(
            `#include <begin_vertex>`,
            `#include <begin_vertex>
                vec3 dsp = vec3(1.0,1.0,1.0) * (texture2D(dmap, uv).rgb * 1.0 + 0.0);
                transformed.x += dsp.y;
                transformed.y += dsp.x;		// displacementMap stores height in x
                transformed.z += dsp.z;
            `
        );

I will have to figure out which displacementMap values go with which transformed values.
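Pulling the thread together, here is a consolidated sketch with stand-ins so it runs anywhere (wav_.Dsp and gu are the names from the posts above; extendVertexShader is my own helper name, not part of the actual code):

```javascript
// Consolidated sketch: the two fixes were (1) wrapping the uniform as
// { value: ... } and (2) adding all three map channels to `transformed`.
const gu = { dmap: { value: null } };  // create the wrapper once; set .value later

function extendVertexShader(vertexShader) {
  return 'uniform sampler2D dmap;\n' + vertexShader.replace(
    '#include <begin_vertex>',
    `#include <begin_vertex>
      vec3 dsp = texture2D( dmap, uv ).rgb;
      transformed.x += dsp.y;
      transformed.y += dsp.x; // standard displacementMap stores height in x
      transformed.z += dsp.z;`
  );
}

// Inside the material definition this would be applied as:
//   onBeforeCompile: shader => {
//     shader.uniforms.dmap = gu.dmap;
//     shader.vertexShader = extendVertexShader(shader.vertexShader);
//   }
// and, once the wave generator has produced its texture: gu.dmap.value = wav_.Dsp;

const out = extendVertexShader('void main() {\n#include <begin_vertex>\n}');
console.log(out.includes('transformed.z += dsp.z')); // true
```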

If you are using WebGL2 (which is the three.js default), the “texture2D” apparently needs to be replaced with “texture”.

Given the ease of adding vector displacement, have the three.js people considered adding this capability to three.js? Three.js only uses the x value with Displacement Maps - the y and z values are ignored.

One challenge would be to create a shader to compute normal values, but I would assume that this is not a huge challenge and has already been done many times. Or they could require users to create and provide the correct normal map (which is what the iFFT program is doing).
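For the height-only case, computing normals is indeed standard - a central-difference sketch (my own illustrative helper; true vector displacement would need the full tangent-space version):

```javascript
// Approximate the normal of a displaced heightfield from four neighboring
// heights by central differences: n is proportional to (-dh/dx, 1, -dh/dz),
// then normalized to unit length.
function normalFromHeights(hLeft, hRight, hDown, hUp, texelSize) {
  const nx = (hLeft - hRight) / (2 * texelSize);  // -dh/dx
  const nz = (hDown - hUp) / (2 * texelSize);     // -dh/dz
  const len = Math.hypot(nx, 1, nz);
  return [nx / len, 1 / len, nz / len];
}

// A flat region yields the straight-up normal:
console.log(normalFromHeights(0, 0, 0, 0, 1)); // [0, 1, 0]
```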

Just curious.