Computing Smoothed Normals for Vertices Along the Edge of a Plane

I have been trying to create an ocean consisting of several adjacent planes that share a single geometry.

Everything works fine except that the normals for the vertices along the perimeter of each plane do not quite blend with each other. The initial computation of normals seems to be okay. However, the normals are further adjusted using “smoothing”. This part does not seem to be working for the segments at the edge of the plane. See this example where the column of segments left of center does not smooth with the column of segments right of center.

I assume that this is because smoothing considers the normals for the segments around the segment. And, on the edge of the plane, almost half of those segments are missing.

How can I fix this?

  1. Since the rows and columns are repeating, the problem could be solved if the smoothing program knew to use the normals from the opposite end of the plane. Is there a way to make it do so?

  2. Can I fix this by manually computing the smoothed normals along the perimeter and inserting those new values in the array of normals for the plane? (I have not found any programs showing how the smoothing is done, so I am not 100% sure how I would do that.)
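As I understand it, the smoothing in option 2 is just an average of the face normals touching each vertex, so the perimeter could be patched by averaging each border normal with its counterpart on the opposite edge (the geometry tiles, so they should agree). A minimal sketch, assuming a flat [x, y, z, ...] normals array for a (w+1) × (h+1) vertex grid stored row by row; the layout and the name `wrapEdgeNormals` are hypothetical, not a three.js API:

```javascript
// Sketch: make normals tileable by averaging each border vertex's normal
// with the one on the opposite edge, then renormalizing to unit length.
function wrapEdgeNormals(normals, w, h) {
  const idx = (ix, iz) => (iz * (w + 1) + ix) * 3;
  const average = (a, b) => {
    for (let c = 0; c < 3; c++) {
      const m = (normals[a + c] + normals[b + c]) / 2;
      normals[a + c] = m;
      normals[b + c] = m;
    }
  };
  // left/right seam, then top/bottom seam (corners get both)
  for (let iz = 0; iz <= h; iz++) average(idx(0, iz), idx(w, iz));
  for (let ix = 0; ix <= w; ix++) average(idx(ix, 0), idx(ix, h));
  // renormalize, since the average of two unit vectors is not unit length
  for (let i = 0; i < normals.length; i += 3) {
    const len = Math.hypot(normals[i], normals[i + 1], normals[i + 2]) || 1;
    normals[i] /= len;
    normals[i + 1] /= len;
    normals[i + 2] /= len;
  }
  return normals;
}
```

Running something like this after `.computeVertexNormals()` should make the border normals of adjacent copies of the geometry line up.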

Hi!
I see it this way:

  • don’t rely on .computeVertexNormals()
  • move vertices and compute normals in shaders

TBH, maybe I’m not that attentive, but why do you want to use several planes to simulate ocean surface, when you can use just one (big enough) plane, that you can move along with the airplane?


I had converted your example to use shaders, except that I had not figured out how to compute normals yet. I will try adding a routine for normals and post what I come up with. But I have not seen routines for smoothing the normals, so my result today may just have flat shading.

The jury is still out on whether to use moving grids or stationary grids to depict the ocean.

The nested moving grids are the best solution for showing land. I found that the same approach worked well with the ocean because you can use a single animated geometry for all of the nearby ocean grids. And they fit seamlessly together as long as you ensure that each wave has a set number of repetitions within the grid.
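That "set number of repetitions" condition can be made explicit: if the wavenumber works out to a whole number of cycles per grid width, the height at one edge equals the height at the opposite edge, so adjacent tiles meet seamlessly. A minimal sketch (`tileableWave` and its parameters are hypothetical):

```javascript
// Sketch: a wave that repeats a whole number of times per grid tiles
// seamlessly, because height(z) === height(z + gridSize) for any z.
function tileableWave(gridSize, cycles, amplitude) {
  const k = (2 * Math.PI * cycles) / gridSize; // wavenumber: whole cycles per tile
  return z => amplitude * Math.sin(k * z);
}

const wave = tileableWave(100, 4, 2); // 4 full waves across a 100-unit grid
```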

Stationary grids might also work with the ocean. I would need to create nested grids with different levels of detail and make sure that there is a smooth visual transition. The outer grid probably won’t even require animation. The middle grid could use some animation - because you can still see foam and flashes from a distance. The inner layer would have the most animation. But, even this layer could be phased out as you climb.

As you may have seen, I was hoping to use the variation of Ocean.js (which uses the Water.js shader) with Gerstner waves for this inner layer. But I ran into a significant problem: when you look straight down at the ocean, all the detail vanishes and you are left with only flat colors. There may be a quick fix. For example, I might be able to adjust the sun position, since it appears that Water.js uses a sun position that is independent of the actual sun position. Or it could be that Water.js only works well when the sun and your altitude are low.


An attempt to compute normals with noise: https://codepen.io/prisoner849/pen/rNvvXvM?editors=0010


That looks like it is doing a good job of smoothing the normals. So you are creating a couple of hypothetical future values to use in the computation? I didn’t realize that is how those numbers are computed. I will try that.

I have made this example to show how you can cover up the defects with noisy diffuse and normal textures. But I was hoping to strip away the textures so I could do things like change the wave color with height. Something like this, but much bigger:

Okay, I was able to get your example working. I kept getting errors until I found the large shader on the html page.

Here is a version of my program that uses shaders to compute the position and color. As you can see from the edges, the vertex shader is correctly displacing the mesh to create waves. But you can’t see the result on the plane because the shader is not computing the normals (yet).

I am hoping that the whole “noise” shader on the html page is not required to create smooth normals and that I only need to perform the precomputation shown on your js page.

BTW, it struck me that, if I were just doing something simple like showing a single row of waves, I could simply insert the correct normal value. In my example, each wave has only 5 faces, so I would only need to compute the 5 correct normal values and insert them in the correct place. But I will likely be creating waves of different size and speed moving in 3 different directions, so that could be a challenge.

One interesting thing I noticed when using the noise shader to create 4 adjacent meshes is that the meshes had shrunk to exactly 1/2 size. So it could be that the noise shader also moves the vertices in the z and x directions.

ADDITIONAL THOUGHTS

I may have been making this more complicated than it needs to be.

In my examples, I am varying the height (position.y) of each vertex based on the x and z positions of each vertex in the plane. In the shader, these positions are given by position.x and position.z. (Since these waves are only running north/south, I am only using position.z.)

So to generate the group of values to use in the normalization process, I merely need to compute what the height would be if x were advanced by 1 (pos2) or z were advanced by 1 (pos3). For example, starting at position x=0, z=0, I would have pos = (0,y1,0), pos2 = (1,y1,0) and pos3 = (0,y2,1), where y1 and y2 are the computed y values. I could then use those values to compute the vertex normal. For example:

vNormal = normalize(cross(normalize(pos2-pos), normalize(pos3-pos)));

However, I have tried that and see no change in appearance. Am I still missing something?
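(For reference, the finite-difference idea can be sanity-checked outside the shader with plain math. One easy thing to get wrong is the cross-product order, which silently flips the normal. A sketch with a hypothetical height function h(x, z); `waveNormal` and `eps` are made-up names:)

```javascript
// Sketch (plain JS, not GLSL): finite-difference normal for a height
// field y = h(x, z), mirroring the pos/pos2/pos3 construction above.
function waveNormal(h, x, z, eps = 0.01) {
  const y = h(x, z);
  const dx = [eps, h(x + eps, z) - y, 0]; // tangent along +x
  const dz = [0, h(x, z + eps) - y, eps]; // tangent along +z
  // cross(dz, dx) points upward for a height field;
  // cross(dx, dz) would point down (order matters!)
  const n = [
    dz[1] * dx[2] - dz[2] * dx[1],
    dz[2] * dx[0] - dz[0] * dx[2],
    dz[0] * dx[1] - dz[1] * dx[0],
  ];
  const len = Math.hypot(n[0], n[1], n[2]);
  return [n[0] / len, n[1] / len, n[2] / len];
}
```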

I’ve got this result, using just sine and cosine functions instead of noise: https://codepen.io/prisoner849/pen/ExLReMW?editors=0010


Just for fun, I eliminated the x portion of the calculation and ended up with my good old waves.
There was a visible split between the adjacent meshes, but that appears to be the result of some slight difference in the calculation of the y value. (I increased the pi value to see if that helped, but it did not.)

In any case, it appears that, with these few small steps, you have achieved a y-displacement, computation of normals and changes in color due to elevation - all things I am trying to do.

Regarding computation of normals, it appears that it is done with these commands:

    .replace(
        `#include <beginnormal_vertex>`,
        `#include <beginnormal_vertex>
        vec3 p = vec3(modelMatrix * vec4(position, 1.));
        vec3 pos = getNoised(p);
        vec2 shift = vec2(0.01, 0);
        vec3 pos2 = getNoised(p + shift.yyx);
        vec3 pos3 = getNoised(p + shift.xyy);
        objectNormal = normalize(cross(normalize(pos2 - pos), normalize(pos3 - pos)));
        `
    )

Since I have never constructed a shader using this method, can you tell me what is going on?

It appears that beginning with the “onBeforeCompile: shader => {” command, you are telling the program compiler to perform the specified steps.

What does the replace do? And the “#include <beginnormal_vertex>”? Are you telling it to replace certain sections of the existing shader?
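As a sketch of the mechanics (plain strings only, no three.js needed): before the shader is compiled, its source is still a plain string in which each `#include <...>` chunk is literal text, so `.replace()` can keep the chunk and splice extra GLSL in right after it. Here `vertexShader` is a stand-in for `shader.vertexShader`, and `myCustomNormal` is a made-up name:

```javascript
// Sketch: string-level patching of a shader, as done in onBeforeCompile.
let vertexShader = [
  'void main() {',
  '#include <beginnormal_vertex>', // chunk that declares objectNormal
  '#include <begin_vertex>',
  '}',
].join('\n');

// Keep the original chunk, then append our own normal computation after it,
// so objectNormal already exists when our line runs.
vertexShader = vertexShader.replace(
  '#include <beginnormal_vertex>',
  `#include <beginnormal_vertex>
  objectNormal = myCustomNormal(position);`
);
```

So yes, the idea is to replace one named section of the built-in shader with "that same section plus extra lines", rather than writing a whole shader from scratch.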

In my standalone vertex shader, I have copied the above steps and tried to change the normal values using both vNormal and objectNormal, and neither seems to make a difference. So I am a bit baffled.

Thanks for your patience and persistence!

ADDENDUM

I found an excellent article which discusses the practice of extending three.js shaders. This avoids the need to, as I have done, write a complete shader to handle everything. And, as luck would have it, I found an interesting example of a set of extenders, one of which creates ocean waves similar to what I was looking for. Interestingly, only the first option, the extender to MeshPhysicalMaterial, seems to generate good-looking waves. So there are many ways to go wrong. And, even if I duplicate the result, it is not clear whether I can tile it or create a huge stationary plane, as discussed here.

SUCCESS !?!?!

Relying on my old friends, trial and error, I appear to have come up with something that works. Here is the program. Let me know if this makes any sense. The only drawback is that the colors seem a bit too dark. But the problem I complained about in my OP appears to have gone away. (Or maybe it is just too dark to see it!)

Looks nice to me :)
I made some slight modifications to your js code and shaders, mostly for shortness and simplification.
Something related: LearnOpenGL - Basic Lighting


Thanks. The changes look great.

One final basic improvement I could make is to create an array to store all the values I am pre-computing. Then the shader would not have to compute those values a second time. That would also eliminate potential inconsistencies between the pre-computed and actual values, and it would allow me to add some randomness to the values. For example, I could add some of the small random changes that you demonstrated in your other examples.

And I could use these same computations if I decide to switch to a version that is fixed to the aircraft. I would simply have to add an X and Z displacement to reflect the movement of the aircraft.
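One way to sketch that combination (precomputed wave parameters plus an aircraft X/Z offset); all names and constants here are hypothetical:

```javascript
// Sketch: precompute per-wave parameters once (they could be uploaded as
// uniforms), then sample heights with the aircraft's travel as an offset.
function makeWaves(count) {
  const waves = [];
  for (let i = 0; i < count; i++) {
    waves.push({
      amplitude: 1 + i,             // could be randomized instead
      wavelength: 20 * (i + 1),
      direction: (i * Math.PI) / 3, // e.g. three directions, 60° apart
    });
  }
  return waves;
}

function heightAt(waves, x, z, offsetX = 0, offsetZ = 0) {
  // Shifting the sample point by the aircraft's travel makes the ocean
  // appear to move under a plane that stays centered on the camera.
  let y = 0;
  for (const w of waves) {
    const k = (2 * Math.PI) / w.wavelength;
    const d =
      (x + offsetX) * Math.cos(w.direction) +
      (z + offsetZ) * Math.sin(w.direction);
    y += w.amplitude * Math.sin(k * d);
  }
  return y;
}
```

Sampling at (x, z) with an offset gives the same height as sampling at (x + offsetX, z + offsetZ) with no offset, which is exactly the "ocean fixed to the aircraft" behavior.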

I have modified my example to include your changes along with 2 sets of waves, with the primary set flowing diagonally. With 4 adjacent planes, the result looks seamless.

@phil_crowther
Looking at the result of your example, I’ve noticed that seam between planes:

Minor modification in the main function eliminates the seam:

    void main() {
        vUv = uv;
        p = vec3(modelMatrix * vec4(position, 1.));
        vec2 move = vec2(1, 0);
        vec3 pos = moveWave(p);
        vec3 pos2 = moveWave(p + move.xyy);
        vec3 pos3 = moveWave(p + move.yyx);
        //
        vec3 _p = vec3(position.x, pos.y, position.z);
        vec4 mvPosition = modelViewMatrix * vec4(_p, 1.0);
        gl_Position = projectionMatrix * mvPosition;
        vNormal = normalize(cross(normalize(pos3 - pos), normalize(pos2 - pos)));
    }

You are absolutely right. I should have zoomed the view in by 100%, where it becomes obvious.
Thanks for the fix.

I am now going to try to break up those nice straight lines by adding some noise. I can try an equation that creates non-random noise that looks random. Or I can try adding a normal map.
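A cheap way to get "non-random noise that looks random" is a sum of sines at mismatched frequencies. It is fully deterministic, but note that with arbitrary constants like the ones below it will not tile seamlessly; for tiling, each frequency would need a whole number of cycles per tile, as before. A hypothetical sketch:

```javascript
// Sketch: deterministic pseudo-noise from summed sines.
// The frequencies and weights here are arbitrary, chosen only so the
// result looks irregular; output stays within [-1, 1].
function fakeNoise(x, z) {
  return (
    Math.sin(x * 1.7 + z * 0.8) * 0.5 +
    Math.sin(x * 0.6 - z * 1.3) * 0.3 +
    Math.sin((x + z) * 2.9) * 0.2
  );
}
```

The same expression could be pasted into the vertex shader (GLSL has the same sin function), which keeps the CPU and GPU values consistent.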

ONE LAST QUESTION

In the shader version, I am no longer seeing a reflection of the DirLight on the water. Do I need to add something to the shader to make this happen, such as a computation involving EyeDirection?

ANSWERING MY OWN LAST QUESTION

After some more research, it appears that the best way to solve this problem is to use the approach you have been suggesting all along - creating an “extender” to whichever existing material has the characteristics I am looking for. I have never done that, but I found a good article about extenders which I have inserted as a response to my prior questions about your approach.