Displaying the Contents of FrameBuffers

I have been deconstructing an ocean wave generator that uses framebuffers to store interim calculations. The last two items the program creates are a displacement map and a version of a normal map, which it uses internally to create a custom ShaderMaterial.

I have been trying to modify the program to generate a displacement map and a normal map that can be used as standard three.js textures. Here is a CodePen showing what I have accomplished so far.

I have three questions:

1st. The last 2 frames at the top show the contents of the displacement and normal map buffers. I would also like to be able to display the interim contents of the first three buffers (the Init Spectrum, the New Phase and the New Spectrum). How can I do that? I have tried creating display textures using DataTexture, but that does not seem to work. Can I use the HTML canvas drawing routines to draw the buffers?
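For reference, this is roughly what I had in mind: reading the float render target back to the CPU and painting it into a 2D canvas. The helper names are mine, and the normalization is only a guess at the value range:

```javascript
// Hypothetical debug helper: read a float render target back to the CPU
// and draw it into a 2D canvas so the buffer contents become visible.
// `renderer` is the THREE.WebGLRenderer and `target` one of the float
// WebGLRenderTargets (e.g. the Init Spectrum buffer).
function drawTargetToCanvas(renderer, target, canvas) {
  const { width, height } = target;
  const floats = new Float32Array(width * height * 4);
  renderer.readRenderTargetPixels(target, 0, 0, width, height, floats);

  const ctx = canvas.getContext('2d');
  const image = ctx.createImageData(width, height);
  for (let i = 0; i < floats.length; i++) {
    image.data[i] = floatToByte(floats[i]);
  }
  ctx.putImageData(image, 0, 0);
}

// Scale an arbitrary float into the 0-255 range expected by ImageData,
// clamping anything outside. Assumes the interesting values sit in 0..1;
// the spectrum buffers may need a different scale factor.
function floatToByte(v) {
  return Math.max(0, Math.min(255, Math.round(v * 255)));
}
```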

2nd. This may be related to the first. During initialization, the program (lines 231-232) creates a small mesh called screenQuad:

this.screenQuad = new THREE.Mesh(new THREE.PlaneGeometry(2, 2));

The only other reference to this mesh is during the rendering process, where the program assigns a material to this mesh (line 275):

this.screenQuad.material = this.materialPhase;

I have no idea what this is for. However, removing this mesh is fatal and prevents creation of the displacement and normal maps.
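From what I can tell, the render loop re-uses this one quad for every pass, swapping its material each time and rendering into a different target. A simplified sketch of the pattern as I understand it (all names here are illustrative, not the program's own):

```javascript
// Full-screen-quad GPGPU pattern: one 2x2 quad, re-used for every pass
// by swapping its material, rendered into an offscreen render target so
// the fragment shader runs once per texel of the buffer.
function runPass(renderer, scene, camera, quad, material, target) {
  quad.material = material;          // e.g. this.materialPhase
  renderer.setRenderTarget(target);  // write into the float buffer
  renderer.render(scene, camera);    // shader fills the target
  renderer.setRenderTarget(null);    // restore the default framebuffer
}

// Ping-pong helper: two targets alternate as read/write each frame,
// since a shader cannot read from the target it is writing into.
function makePingPong(a, b) {
  let read = a, write = b;
  return {
    get read() { return read; },
    get write() { return write; },
    swap() { [read, write] = [write, read]; }
  };
}
```

If that reading is right, it would explain why removing the quad is fatal: it is the geometry every compute pass is drawn with.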

3rd. The program creates a version of a normal map (lines 573-607). However, it appears that the results are normalized values (0 to 1) and not the values that you would use to create a normal map image (0 to 255). I have made some modifications to the program to create an image, but am not 100% sure that what I have done is correct.


It turns out that solving the Normal Map issue was the most important question. I can live without answers to the first 2 questions since they do not affect the operation of the program. I have also figured out how to vary wave color based on height (by using the displacement map as a texture and, within the shader, using the blue channel of the resulting diffuseColor as an index to pick the proper diffuseColor). With that final improvement, the project is essentially complete and my questions answered.
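For anyone trying the same thing, the height-to-color idea reduces to a simple interpolation. This sketch mimics the GLSL mix() I ended up using in the shader; the function name and colors are mine, not the program's:

```javascript
// The blue channel of the sampled displacement texel (0..1) selects
// between a deep and a shallow water color. The GLSL equivalent inside
// the fragment shader is roughly:
//   diffuseColor.rgb = mix(deepColor, shallowColor, diffuseColor.b);
function heightToColor(b, deep, shallow) {
  // Linear interpolation per channel, same as GLSL mix().
  return deep.map((d, i) => d + (shallow[i] - d) * b);
}
```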

You need to check what values these buffers generate and convert them into a reasonable normalized range, as anything outside will get clamped to black or white when displayed. Since the render targets are floats, they're likely not suitable to display straight away. If you don't know the range, just try to get the values visualized to get an idea, e.g. gl_FragColor = vec4(vec3(abs(fract(value))), 1.0); scaling up or down as needed. The code is a bit messy to read, honestly. What's the purpose of this JSON conversion, even?

Also, for correct tangent-space normal map output, at line 604: gl_FragColor = vec4(temp * 0.5 + 0.5, 1.0);
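In image terms that remap just takes each normal component from [-1, 1] into [0, 1], which an 8-bit texture channel then stores as 0-255. A quick JS sketch of the same math (helper name is illustrative):

```javascript
// The * 0.5 + 0.5 remap in the shader moves a unit normal component
// from [-1, 1] into [0, 1]; an 8-bit image channel then stores that as
// 0-255. A flat surface normal (0, 0, 1) becomes the familiar
// normal-map blue (128, 128, 255).
function normalToByte(n) {
  return Math.round((n * 0.5 + 0.5) * 255);
}
```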

This might be helpful


Thanks, I will try those suggestions.

The purpose of the JSON script appears to be a quick way to copy all of the Base Params (lines 111-117) into each of the parameters that follow (although I suspect that not all of those Base Params are needed). Just FYI, the first buffer is static and contains a bunch of random numbers generated at the end of the initialization. The next 5 buffers (ping/pong and spectrum) are working buffers that will change over time. The last 2 buffers contain the displacement and normal maps.

The change to the normal map shader makes the resulting normal map look perfect! At first the result seemed a lot flatter (my earlier values went all the way up to 255), but once I removed the modifiers I had added to the MeshStandardMaterial, it looked great again. Having an animated normal map is very helpful. I am using this in a flight simulation where, at altitude, the actual displacement becomes less visible and you can use the normal map alone.