Discontinuous Normals on a Normalized Cube Mesh with Displacement Texture

In my project, I have created a normalized cube using six plane meshes. To improve its visual quality, I added a displacement texture and calculated the normals for that texture to ensure proper lighting. However, I’m facing an issue where the resulting normals appear discontinuous at the edges, revealing the underlying cube structure rather than achieving a smooth, sphere-like shape.
You can see it here: EXAMPLE

I have been working on this problem for weeks, but I have yet to make significant progress. During my research, I came across a relevant Stack Overflow question titled “How can I eliminate normal map seams on Cube Mapped Three.js BoxBufferGeometry?”, which you can see here.

One of the responses suggests not using the tangent frame provided by Three.js and instead generating tangents from the vertex data so that they align at the seam. However, this is beyond my understanding of three.js. I am unfamiliar with this technique and would appreciate some clarification.

I would greatly appreciate any guidance or suggestions on how to resolve the issue of discontinuous normals and achieve a smooth appearance for the normalized cube with a displacement texture. Thank you in advance for your help.

Unfortunately I have no time to go deeper into this issue, but it looks like the following:

  • to get the direction of the normal vector, you sample three points – one central and two along the U and V directions – this is fine for the interior (blue) points
  • however, when you are at the edge, one of the extra points falls outside the texture (the red point). In that case, the correct value must be taken from the texture adjacent to the current one. In the code, you read the value from the current texture instead, which is wrong.

To get the value from the appropriate texture, you may want to use 3D textures or cube maps (a cube map is actually 6 textures – one for each side of a virtual cube).
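To make the cube-map suggestion concrete, here is a minimal sketch in plain JavaScript (function names are hypothetical, not a three.js API): a cube map is sampled by a 3D direction instead of a 2D UV, so a sample that would fall "off the edge" of one face simply lands on the adjacent face – there is no out-of-bounds read.

```javascript
// Direction for a point on the normalized cube: from the cube center
// through the (spherified) vertex position.
function cubeSampleDirection(pos, center) {
  const d = [pos[0] - center[0], pos[1] - center[1], pos[2] - center[2]];
  const len = Math.hypot(d[0], d[1], d[2]);
  return d.map(c => c / len);
}

// Which of the 6 faces the direction hits (largest absolute component),
// mirroring what the GPU does for samplerCube lookups.
function cubeFace(dir) {
  const [ax, ay, az] = dir.map(Math.abs);
  if (ax >= ay && ax >= az) return dir[0] > 0 ? "+X" : "-X";
  if (ay >= az)             return dir[1] > 0 ? "+Y" : "-Y";
  return dir[2] > 0 ? "+Z" : "-Z";
}
```

Because the face is chosen per sample from the direction, two neighboring samples that straddle a cube edge automatically read from two different faces, which is exactly the behavior the per-plane 2D lookup is missing.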


On CodePen, I have an example that displays animated waves on 4 adjacent planes. The computation of normals is in lines 133-142 (credit to prisoner849). You may not need the animation, but the computation of normals results in a seamless display.


I don’t think this is the same issue, because all the meshes in your example use the same textures, and in addition my normals are generated on the fly from a displacement texture. I think PavelBoytchev is correct that my normal function is reading past the texture and causing issues. I just don’t know yet what approach to take to fix this, or how 3D textures or cube maps would fix it.

I should rephrase: instead of saying, “I don’t know how 3D textures or cube maps would fix this issue,” I meant that I don’t know how a cube texture would work with my setup. I’m generating a cube from six planes, and passing a cube texture to a plane causes distortion.

Yes, I believe that Pavel is correct. They do not appear to be quite lining up.

Part of the reason that I shared my example was that I was having a similar problem with my meshes not quite lining up until prisoner849 helped. And I couldn’t tell, but I assumed that you were working with tileable textures - like my examples.

A couple of other normal shaders that I know about are (1) the three.js NormalMapShader.js example in examples/jsm/shaders; and (2) a normal shader that computes the correct normals for a displacement map that displaces vertices in three dimensions – not just along the normal line (aka vector displacement).

I think the approach I’m currently going with is using a CubeTexture as Pavel said. But that’s giving me problems because I’m not using threejs BoxGeometry but my custom cube, made up of 6 PlaneGeometry. Getting a CubeTexture to work with PlaneGeometry so far seems impossible.

Usually dFd functions are used to peek at neighboring pixel values; this works because pixels are rendered in small 2×2 batches. Instead of using an epsilon, try this:

  vec4 displacementMapToNormalMap(sampler2D displacementMap, vec2 vUv) {
    float scale    = 0.9;  // controls how strongly the gradient tilts the normal
    float strength = 1.0;
    float center = texture2D(displacementMap, vUv).r;                   // sample displacement at this fragment
    float dx = texture2D(displacementMap, vUv + dFdx(vUv)).r - center;  // gradient toward the neighboring pixel in X
    float dy = texture2D(displacementMap, vUv + dFdy(vUv)).r - center;  // gradient toward the neighboring pixel in Y
    vec3 normalMap = normalize(vec3(dx * scale, dy * scale, 1.0));      // build the normal vector
    normalMap *= strength;                                              // keep strength at 1.0 so the packing below stays in [0, 1]
    return vec4(normalMap * 0.5 + 0.5, 1.0);                            // pack [-1, 1] into the [0, 1] color range
  }
Okay, looking at your code, I see that what you are doing is more complex, especially since you are displacing the vertices in at least 2 directions.

In a more advanced wave generator, the program computes a displacement map designed to displace the vertices of a flat plane in 3 directions. (In the program, I have modified the standard displacement shader to do this.) In addition, the program computes a corresponding normal map (lines 739-774) using 4 points of reference, rather than just 2.

As you note, my program benefits from the fact that the textures are identical so that, e.g., my bottom values are equal to the top values plus 1. But could you do the same thing by (1) changing to a 4 point computation that can wrap around and (2) either (a) using the same Displacement Maps for all planes; or (b) at least making the edge point values identical for all planes?
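A rough sketch of the 4-point idea in plain JavaScript (the function and its parameters are hypothetical): central-difference normals over a tileable height field, with wrap-around indexing so that samples past the edge read the value from the opposite side. This is only valid under the assumption above – that the planes share the same tiling displacement map, or at least have identical edge values.

```javascript
// height: flat array of w*h samples; returns one [nx, ny, nz] per texel.
function computeNormals(height, w, h, scale = 1.0) {
  const wrap = (i, n) => ((i % n) + n) % n;           // wrap-around index
  const at = (x, y) => height[wrap(y, h) * w + wrap(x, w)];
  const normals = [];
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      // 4 reference points: left/right and down/up neighbors
      const dx = at(x + 1, y) - at(x - 1, y);
      const dy = at(x, y + 1) - at(x, y - 1);
      const n = [-dx * scale, -dy * scale, 2.0];      // 2.0 = two-texel baseline
      const len = Math.hypot(...n);
      normals.push(n.map(c => c / len));
    }
  }
  return normals;
}
```

The `wrap` helper is the whole trick: at x = 0 the "left" sample comes from x = w − 1 instead of reading garbage (or a clamped duplicate) past the edge.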

To perform the wrap-around, you may have to recompute the displacement values for the outliers or save the Map values in target buffers.

I hope this is somewhat helpful.

Currently, this is the best solution. It’s not perfect :person_shrugging:

A CubeTexture is actually quite easy to read from: you just give it a 3D direction vector. So for the cube, or any of the 6 planes, it would just be (world position − world cube center), normalized (in fact, I’m not sure you even need to normalize).

Since you generate the depth yourself, you can produce a seamless normal map by rendering the cube-map displacement into a cube normal map. When rendering it as a sphere, the neighbouring texels at the edges of the 6 sides are then taken into account.

An alternative to tangent space is object space, though for that you should use at least 16-bit maps.

I think I might have thought of an idea how to use 2D textures and still get continuous normals. I will try it when I get home and if successful, I’ll post here the result.


Taking PavelBoytchev’s advice to use a CubeTexture, and makc3d’s advice on how to read from one, I finally got a seamless sphere.
I don’t know if the way I computed the normals for a CubeMap is correct, but it works!
Thank you.



I still would like to see your idea when you get a chance, if you don’t mind.

I’m glad that you have solved the issue.

My idea was to add a thin border (1 pixel might be enough, but 2 or 4 are better if you have mipmapping or anisotropic filtering). This border contains pixels from the adjacent textures. That way, the shader can read pixels from the surrounding textures without having access to the full textures.
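The border idea can be sketched in a few lines of plain JavaScript (names and the edge-array layout are assumptions for illustration): each n×n face map is copied into an (n+2)×(n+2) padded map whose outer ring is filled from the neighboring faces’ edge rows/columns, so a shader sampling one texel past the edge still lands inside the texture.

```javascript
// face: flat n*n array; leftEdge/rightEdge/topEdge/bottomEdge: n values
// copied from the adjacent faces. Corners are left at 0 in this sketch.
function padWithBorder(face, n, leftEdge, rightEdge, topEdge, bottomEdge) {
  const p = n + 2;
  const out = new Float32Array(p * p);
  for (let y = 0; y < n; y++)
    for (let x = 0; x < n; x++)
      out[(y + 1) * p + (x + 1)] = face[y * n + x];   // interior: shifted by 1
  for (let i = 0; i < n; i++) {
    out[(i + 1) * p + 0]       = leftEdge[i];         // column from left neighbor
    out[(i + 1) * p + (p - 1)] = rightEdge[i];        // column from right neighbor
    out[0 * p + (i + 1)]       = topEdge[i];          // row from top neighbor
    out[(p - 1) * p + (i + 1)] = bottomEdge[i];       // row from bottom neighbor
  }
  return out;
}
```

In the shader, the UVs would then be remapped to the inner n×n region (e.g. `uv * n / (n + 2) + 1.0 / (n + 2)`), which is the same "gutter" trick texture atlases use to avoid bleeding.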

You have three different versions :question:

<script src="

import * as THREE from "https://cdn.skypack.dev/three@0.148.0";
import {OrbitControls} from "https://cdn.skypack.dev/three@0.136.0/examples/jsm/controls/OrbitControls";