iFFT Ocean Wave Generator Module

My problem is solved, thanks to prisoner849. I also created a repository on GitHub for this.

OK, I dug into this further and it appears that the suggested code works fine when the texture being cloned is a loaded texture. In that case, the following three instructions are all that is required:

let txtB = txtL.clone();   // txtL = loaded texture
txtB.wrapS = txtB.wrapT = THREE.RepeatWrapping;
txtB.repeat.set(2,2);

But, if I make TxtB a clone of the animated texture (TxtA), it doesn’t work.

I even tried using the loaded texture (TxtL) as a “dummy” texture and transferring the source address from the animated texture to see if anything would display (even without repeating):

txtL.source = txtA.source;  // txtL = loaded, txtA = animated

(I tried different-sized source images to see if that made a difference.)

There should be a pretty easy answer since it merely involves having the new texture use the same data that was successfully used by the original texture.

Prisoner849 is the master of animated meshes and has helped me many times. Is there an address where I can view your respective solutions?

I feel like this means that they half-assed the source implementation. I raised it on GitHub; let’s see if they fix it or just say it is “as designed” :man_shrugging:

I am beginning to suspect that the problem has to do with the structure created by WebGLRenderTarget (which I will refer to as the RenderTarget).

When I started updating this program, I originally expected that the RenderTarget would be nothing more than a destination for data generated by the GPU graphics routines. However, in the original program, the Framebuffer was defined in much the same way as a texture. Here are the definitions for the displacement and normal RenderTargets:

this.Res = wav_.Res; // this value is 512

let BaseParams = {
   format: THREE.RGBAFormat,
   stencilBuffer: false,
   depthBuffer: false,
   premultiplyAlpha: false,
   type: THREE.FloatType
};

let LinearRepeatParams = JSON.parse(JSON.stringify(BaseParams));
LinearRepeatParams.minFilter = THREE.LinearMipMapLinearFilter;
LinearRepeatParams.generateMipmaps = true;
LinearRepeatParams.magFilter = THREE.LinearFilter;
LinearRepeatParams.wrapS = LinearRepeatParams.wrapT = THREE.RepeatWrapping;

this.displacementMapFramebuffer = new THREE.WebGLRenderTarget(this.Res, this.Res, LinearRepeatParams);
this.normalMapFramebuffer = new THREE.WebGLRenderTarget(this.Res, this.Res, LinearRepeatParams);

Thus, the RenderTarget looks very much like a texture. However, as I was updating the program, one of the required changes was to refer to the RenderTarget with a .texture suffix:

this.displacementMapFramebuffer.texture;
this.normalMapFramebuffer.texture;

This seems a bit odd, referencing what appears to be a texture through a sub-property. [EDIT: based on the documentation, it appears that, in this case, texture does not refer to a Texture object, but simply to the result of the render operation, which can be used as a texture.]

In any case, the definitions above are the values that I am exporting as the displacementMap and normalMap and using in my program.
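To make sure I am not misreading the intended usage, here is a minimal sketch of how a render target’s .texture is normally consumed, based on the documentation (rtScene, rtCamera and material are placeholder names, not names from my program):

// Minimal sketch: render into the target, then sample its .texture like any map.
let rt = new THREE.WebGLRenderTarget(512, 512, { type: THREE.FloatType });
renderer.setRenderTarget(rt);
renderer.render(rtScene, rtCamera);    // the GPU writes the result into rt.texture
renderer.setRenderTarget(null);        // resume rendering to the canvas
material.displacementMap = rt.texture; // rt.texture can then be sampled like a texture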

What I am doing is a bit different from what the program was originally designed to do. The original design involved creating a single large plane and repeating the textures. However, I wanted to be able to display the results in more than one plane (which the program allows). In addition, I want the texture data to be usable in a separate texture with different settings, specifically one where the texture is repeated. And this is where I am getting stuck (see the sketch below for one approach I am considering).
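The sketch: rather than cloning anything, leave the render-target texture alone and scale the UVs of the receiving plane, relying on the RepeatWrapping already set in LinearRepeatParams. This is untested on my end, and size and the segment counts are placeholders:

// Hypothetical sketch: tile the animated maps 2x2 by scaling the plane's UVs,
// relying on the RepeatWrapping already set on the render targets above.
let geo = new THREE.PlaneGeometry(size, size, 128, 128);   // size is a placeholder
let uv = geo.attributes.uv;
for (let i = 0; i < uv.count; i++) {
   uv.setXY(i, uv.getX(i) * 2, uv.getY(i) * 2);
}
uv.needsUpdate = true;
let mat = new THREE.MeshPhysicalMaterial({
   displacementMap: this.displacementMapFramebuffer.texture,
   normalMap: this.normalMapFramebuffer.texture
});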

In the original program, the displacement and normal RenderTargets were intermediate results that were used by a final buffer to create a custom material. So perhaps I need to make these RenderTargets more like the final buffer?

Or the problem may be simply that I need to find the referent to the RenderTarget data. I have tried referencing Framebuffer.source, Framebuffer.texture.source as well as Framebuffer.data, Framebuffer.texture.data.

EDIT
I mistakenly referred to the three.js structure as the Framebuffer. But Framebuffer is just the name the original program gives to the structure created by WebGLRenderTarget. I have tried to clarify by generally referring to the Framebuffer as the RenderTarget.

ONE MORE THOUGHT
I defined the Framebuffers with generateMipmaps set, which I understand causes three.js to create mipmaps automatically. Is there a way I can access those mipmaps directly and use them as separate textures?

Oh yes, the link

You can enter an index in the text field below and then see the vertex being raised at that point. The shader has the indices as an attribute. Using the current index, the shader fetches the associated vertex coordinates, which are exactly the same as those in the current position attribute. However, the shader now has access to all of the vertex coordinates, which means I can interpolate between several vertices; that is not possible with the vertex from the position attribute, because it only ever contains the current vertex.


Thanks.

This may be the better approach since I saw some disparaging remarks from the developers about cloning and copying WebGLRenderTarget. So, even if I figure out what to do, they might change things overnight.

I am hoping to solve this problem since it is pretty basic to creating a simple multi-resolution grid with animated textures - like the ocean. And it may create a link with what you are doing.

MORE
Here is a solution that is a bit of a kludge, but I got it to work by using Blender to create the 2x2 grid on the right. And here is another version that uses the displacement map as a texture so you can see the patterns more clearly.

This answers the main question as to whether this is possible with animated textures and indicates that the solution has to do with the UV map. So ideally, I will figure out how to do this in Three.js without having to use Blender.

Note that I am NOT using the displacement map with the larger, more distant squares since they are far enough away that you don’t see the waves. So the more distant squares only need enough faces to display a single normal map. For example, if the more distant square is 3x larger, I only need 9 (3x3) faces. The most distant square will need 81 (9x9) faces. Those squares may not even need an animated normal map, so I can eliminate the faces and simply repeat a static normal map.

That’s exactly what I do. So that the seams of the large chunk and those of the small chunks fit together exactly, I need the vertex data of my current vertex’s neighbors in the shader. Hence the DataTexture, which stores the coordinates of all of the indices of a surface. I really need all of the indices, not just the ones on the edge, because the surface movement changes the normal vectors, and with the neighboring vertices I can then recalculate all of them (a rough sketch of the idea follows below).
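Roughly, building that DataTexture looks something like this (just a sketch; the resolution, posArray and the ShaderMaterial are assumptions, not code from my app):

// Sketch: pack every vertex position of a res x res grid into a DataTexture
// so the vertex shader can look up any neighbor, not just the current vertex.
let res = 256;                                // assumed grid resolution
let data = new Float32Array(res * res * 4);
for (let i = 0; i < res * res; i++) {
   data[i * 4 + 0] = posArray[i * 3 + 0];     // x
   data[i * 4 + 1] = posArray[i * 3 + 1];     // y
   data[i * 4 + 2] = posArray[i * 3 + 2];     // z
   data[i * 4 + 3] = 1.0;
}
let posTex = new THREE.DataTexture(data, res, res, THREE.RGBAFormat, THREE.FloatType);
posTex.needsUpdate = true;
shaderMaterial.uniforms.tPositions = { value: posTex };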

The area subdivision works with quadtrees. I already have all of that. But that part is a very large, complex system in my app that I can’t simply copy quickly into a small example, but I have exactly the right link for you.

His tutorials have helped me a lot in general. Here is the link to his repositories.


Yes, I have watched all of his videos and that is certainly the way to go if you are trying to generate a land-based terrain.

However, in the case of the ocean, I am still using an old-fashioned nested scrolling grid method. With land, you are using static meshes (of varying resolutions) that you only have to create once, in a single frame. However, in the case of the ocean, you are generating an animated terrain that changes with every frame. In terms of processing power, even using the iFFT method, you can only really afford to generate one 512x512 animated square.

The flip side is that, because the ocean is relatively flat, you can start using pretty large unsegmented squares just a few miles out. You only need the animated displacement map for the nearby squares. For mid-range squares, an animated normal map is sufficient. For the furthest squares you can use static textures.
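To illustrate the idea (just a sketch; the ring numbering and staticNrm are placeholders, while wav_.Dsp and wav_.Nrm are the exported displacement and normal maps from the wave generator):

// Sketch of the ring idea: nearby squares get the animated displacement and
// normal maps, mid-range squares only the animated normal map, and the far
// squares a static normal map (staticNrm is a placeholder for a loaded texture).
function ringMaterial(ring) {
   if (ring === 0) return new THREE.MeshPhysicalMaterial({
      color: WtrCol, displacementMap: wav_.Dsp, normalMap: wav_.Nrm });
   if (ring === 1) return new THREE.MeshPhysicalMaterial({
      color: WtrCol, normalMap: wav_.Nrm });
   return new THREE.MeshPhysicalMaterial({ color: WtrCol, normalMap: staticNrm });
}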

I create islands using Blender, using real-world heightmaps and textures from Google Maps. Here is my webpage which discusses the process. To avoid interference, I have them floating above the ocean by an amount that increases with altitude.

Creating the grid system has turned out to be more of a challenge than I had expected - because of the need to repeat the animated normal map in the mid-range squares. Having updated the iFFT wave generator, I am hoping to be able to provide programmers with a module that allows them to use this generator as part of a large-scale animated ocean.

It seems that I am almost there. I may have to use Blender squares until I can figure out how to do the same thing with Three.js.


I have already gotten the quadtree ocean to the point where everything is in motion and the individual chunks are largely adjacent to each other. I do the movement and the edge correction in the shader; that’s why I needed the DataTexture. This is just a screenshot, but it’s all in motion. It’s just a simple Perlin noise at the moment. Here you can see the chunks and how they fit together. The quadtree only generates the chunks for me; the vertex shader does the movement.

I have to be careful that I don’t go down the wrong path. I’ll read through your method with the moving grid. After all, you’ve been doing the ocean for a lot longer than I have and I’m always ready to learn. In any case, the thing with the moving grid sounds interesting, because I’m not familiar with it yet.

I think it’s great that you have already done that. I hope you aren’t having any trouble using the iFFT wave generator with your mesh.

I finally found a solution within three.js. I created 4 planes of the same size as the smaller planes, positioned them adjacent to each other and then merged them using mergeGeometries from the BufferGeometryUtils. When I applied the texture, the 4 planes retained their separate identities. Here is a textured example and here is one where all of the squares are using the normal and displacement maps.

Here is the code I used, in case anyone else is facing the same challenge:

// Plane B: four adjacent planes merged into a single geometry.
// BufferGeometryUtils comes from the three.js addons/examples folder.
import * as BufferGeometryUtils from 'three/addons/utils/BufferGeometryUtils.js';

s = grd_.Siz/2;
let MatB = new THREE.MeshPhysicalMaterial({
	color: WtrCol,
	normalMap: wav_.Nrm,
	displacementMap: wav_.Dsp,
});
let GeoB1 = new THREE.PlaneGeometry(grd_.Siz, grd_.Siz, GrdSeg, GrdSeg);
GeoB1.rotateX(-Math.PI * 0.5);
GeoB1.translate(s,0,s);
let GeoB2 = new THREE.PlaneGeometry(grd_.Siz, grd_.Siz, GrdSeg, GrdSeg);
GeoB2.rotateX(-Math.PI * 0.5);
GeoB2.translate(s,0,-s);
let GeoB3 = new THREE.PlaneGeometry(grd_.Siz, grd_.Siz, GrdSeg, GrdSeg);
GeoB3.rotateX(-Math.PI * 0.5);
GeoB3.translate(-s,0,s);
let GeoB4 = new THREE.PlaneGeometry(grd_.Siz, grd_.Siz, GrdSeg, GrdSeg);
GeoB4.rotateX(-Math.PI * 0.5);
GeoB4.translate(-s,0,-s);
let GeoB = BufferGeometryUtils.mergeGeometries([GeoB1, GeoB2, GeoB3, GeoB4], false);
let MshB = new THREE.Mesh(GeoB, MatB);
scene.add(MshB);

After we get our respective approaches working we can compare results!

Your approach is certainly the more modern approach. Mine has been around since the early days of flight simulations. Here is a demo which illustrates how my approach works. And here is one with randomized farmland and an island serving as a mountain. This might have been “state of the art” 20 years ago. I am trying to see if that approach is still useful today.

My webpage on the creation of islands cited in the prior message has a link to a place where you can get a height map for just about anywhere on earth. That was pretty handy. Before that, I had to search the internet for DEMs and turn them into height maps. If you were lucky, you could find 10m or 30m DEMs.


Thanks for the info about your code. It doesn’t matter to me whether it’s old or new. What matters is that it looks good, and your version does. I still have a long way to go with my version, and if I can save time with your version, why not? I have plenty of other things in my project to work on.

Maybe it makes sense to link our sites. I have a NASA account and have 14,250 elevation maps from NASA. This is the best NASA has from 60°N to 60°S. The NASA data is in NASA’s own format (.hgt), but I have converted them all to 16-bit greyscale PNG format. Sites quickly charge a lot of money just to convert one or a few .hgt files. I wrote a converter in Python and converted all of the maps. That’s quite a lot of GB, but I have the entire landmass at a resolution of 1 arc-second (30 m). That looks very good. I’ll ask NASA if I can provide it for free download.

I just asked NASA. Let’s see what they say. Actually, I imagine there is no problem, but with what I’ve already experienced in life, I don’t want to risk any trouble. In any case, I would be happy if NASA said no problem, because the height maps would certainly be of interest to many.
My app, for example, creates landscapes from such maps directly in three.js.

You might also check with the USGS (United States Geological Survey). That is where I got most of my maps “back in the day”. It looks like they have started a “3D Elevation Program” which will have data about the United States. If you are looking for data for a specific place, you might be able to search for it directly. For example, I was studying the topography of Tinian Island and found 10m DEMs by searching for “Tinian Island 10m dems”. That level of detail on a Pacific island is unusual. In general, the more isolated the location, the lower the level of detail (both in DEMs and picture data).


I didn’t know about that site, thank you. I got the idea that if I’m satisfied with the geometry of my quadtree version of the ocean, then I’ll make a repository out of it. You have more experience with the iFFT. Then maybe you could offer two ocean solutions, because you’ve been dealing with the ocean for a long time. I’ve been interested in the landscapes for a long time. As mentioned, I do not load any GLB landscape models into my software; my landscapes are also created dynamically using quadtrees from the map material in three.js. This has the advantage that the landscapes are only generated in detail where I am, and the performance is great. With an ocean and clouds I would have the whole earth.
That would be nice for flight simulations. The disadvantage is that my database is very large because of all the map data, well over 100 GB.

I forgot about NOAA (the National Oceanic and Atmospheric Administration), but they now appear to be limiting themselves to DEMs for coastlines and American island territories (like Tinian).

I think a little demo of the quadtree method would be extremely helpful to Three.js programmers, even if the terrain data is limited or randomly generated. Other than the challenge of trying to create a flight simulation on an Apple ][+, one of the events that got me interested in 3D modeling was a National Geographic article on the eruption of Mount Vesuvius. I thought it would be much more educational to see a 3D model of the eruption that you could fly around. With the right tools (perhaps Blender?), that kind of project could easily be done these days.

I am busy updating my ocean and grid mods. There are many improvements that could be made, but I am interested to see what I can do with your clouds. :face_in_clouds:

I’m done with the stitching; it always fits. I added skirts so that I can be sure that nothing ever shines through. The skirts don’t have to go that deep. I only made them deep enough to be able to see that everything fits. Later it is sufficient if the skirts are just a hundredth of the chunk width deep. What I am now dealing with are the normal vectors. These must be independent of the chunk size, otherwise it would look bad if chunks of different sizes are next to each other. I need to somehow generate the normal vectors from the wave function in the fragment shader. The normal vectors would thus be independent of the geometry resolution. Of course, the wave function in the vertex and fragment shaders must be the same so that the normal vectors generated in the fragment shader match the course of the geometry.
Here is a picture where you can see what the stitching looks like. The intermediate points of the smaller high-resolution chunks are always linearly interpolated so that they always lie on the edges of the larger neighboring chunks.
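The core of that normal calculation would be something like the following (shown here in JavaScript only for clarity; the real version would live in the fragment shader, and waveHeight stands for whatever wave function both shaders share):

// Sketch: derive the normal from the wave height function alone, using central
// differences, so it is independent of the chunk's geometry resolution.
function waveNormal(x, z, eps = 0.01) {
   let hL = waveHeight(x - eps, z);
   let hR = waveHeight(x + eps, z);
   let hD = waveHeight(x, z - eps);
   let hU = waveHeight(x, z + eps);
   return new THREE.Vector3(hL - hR, 2 * eps, hD - hU).normalize();   // y is up
}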

The colors are dynamic and change depending on the nature of the neighboring chunks, so I can also check from further away that everything is correct, and it is. The quadtree water surface is now complete. The wave function can be easily exchanged; at the moment I only use a simple Perlin noise. That’s enough for development.


Let’s see if I can build a quadtree demo next weekend. My current variant is far more complex than necessary for a demo because I have a planet-wide ocean with the radius of the earth. The screenshots are from my tablet and the surface moves smoothly as far as you can see. Quadtrees are a very powerful thing.
The demo should really only contain the necessary part, because quadtrees are complex enough on their own.

What is the smallest sized square in your quadtree?

It seems that you could use the trick I learned about, using smaller squares to create larger squares. People use these iFFT wave generators to create waves at a really small scale, so you could create several squares of each size to use as needed.

The great thing about iFFT waves is that they are tileable. The downside is that this results in a tiling appearance. But, if I can figure out how to cascade them, we could eliminate the tiling problem in both of our grids.

That is how the game developers are currently handling this problem. Some are splitting them into as many as 4 cascades and combining them together. But those developers are mostly working on games where you are only a few hundred feet above the waves. Our common challenge is that we will be much higher.
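To make the cascade idea concrete, here is a rough sketch (the scales and weights are made-up numbers, and sampleDisp stands in for a lookup into a tileable iFFT displacement map):

// Rough sketch: sample the same tileable map at a few non-integer scales and
// blend the results, which breaks up the visible tiling pattern.
function cascadedHeight(u, v) {
   return sampleDisp(u, v) * 1.00 +                 // base tile
          sampleDisp(u * 3.1, v * 3.1) * 0.35 +     // mid cascade
          sampleDisp(u * 9.7, v * 9.7) * 0.10;      // fine cascade
}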

Regarding stitching, one trick you might be able to use is to simply lower the height of the grids as distance increases. That is what I do when transitioning from the inner grids, which use a Displacement Map, to the outer grids, which don’t. I merely have to lower the outer grid so that it is below the lowest trough. That transition takes place several miles out and is not visible. But you are stitching between grids with different resolutions, so the grids should almost match and you would not have to lower them by as much. That probably won’t work for terrain, which is not random and changing like the ocean. But it should work fine for the ocean.
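In code it amounts to little more than this (MaxWaveHt is a placeholder for the largest displacement the inner grids can produce):

// Sketch: sink the non-displaced outer grid below the deepest possible trough
// so the seam between displaced and flat water stays hidden.
outerMesh.position.y = -(MaxWaveHt + 0.1);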