How to use three.js for GPGPU?

Hi everyone, about a month ago I started using Three.js to tinker around with GLSL shaders. I’m currently working on a fluid dynamics simulation based on the classic GPU Gems chapter and I’m having a bit of trouble understanding how to reliably propagate the results of one render pass through to the next.

At the minute I have two ‘fields’:

function Field(name){
/* Class definition for Fields
	- Fields are vectors or scalars to be computed
	- The quantity that the field represents is stored as a texture, i.e. discretised onto a grid of texels
	- Two textures are used to achieve result propagation through successive shader renders
*/
	this.name = name;
	this.texA = new THREE.WebGLRenderTarget(w(), h(), {minFilter : THREE.LinearFilter, magFilter : THREE.NearestFilter});
	this.texB = new THREE.WebGLRenderTarget(w(), h(), {minFilter : THREE.LinearFilter, magFilter : THREE.NearestFilter});

	this.swap = function(){
		var temp = this.texA;
		this.texA = this.texB;
		this.texB = temp;
	};
}

One for the velocity field of the fluid (‘u’) and one for a quantity (‘x’) such as ink to be moved by the fluid. x.texA is the output texture that is rendered to the screen.

To get to the final render, there are several buffer renders, which compute various terms in the Navier-Stokes equations such as advection, diffusion, etc. Each buffer has a shader material which takes an input texture as a uniform, so a render looks like:

forceBuffer.material.uniforms.texInput.value = x.texA;
renderer.render(forceBuffer.scene, camera, x.texB, true);

// Swap A and B textures: take output of buffer renders, set up input for next buffer renders
x.swap();
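The ping-pong pattern itself can be sketched without any WebGL at all. In this sketch, plain objects stand in for the WebGLRenderTargets and `runPass` is a hypothetical stand-in for one shader pass, just to show how each swap turns the previous output into the next input:

```javascript
// Plain-object model of the ping-pong pattern: objects with a
// data property stand in for WebGLRenderTargets.
function Field(name) {
  this.name = name;
  this.texA = { data: null }; // read from this one...
  this.texB = { data: null }; // ...write into this one
  this.swap = function () {
    var temp = this.texA;
    this.texA = this.texB;
    this.texB = temp;
  };
}

// Hypothetical stand-in for one shader pass: read input, write output.
function runPass(input, output, transform) {
  output.data = transform(input.data);
}

var x = new Field('x');
x.texA.data = 1;

// Each pass reads texA and writes texB; the swap afterwards makes
// the result the input of the next pass.
runPass(x.texA, x.texB, function (v) { return v + 1; });
x.swap();
runPass(x.texA, x.texB, function (v) { return v * 10; });
x.swap();

console.log(x.texA.data); // after two passes: (1 + 1) * 10 = 20
```

The key invariant is that, after every swap, texA always holds the most recent result, so whichever texture you display or feed forward should always be texA.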

This works fine most of the time, but sometimes there are problems that I can neither explain nor solve; I just tinker with things until they work, which is what I’m hoping to get help with!


The most recent example involves the force and brush shaders: brush draws a white blob on the output texture x.texB underneath a mouse click, while force does the exact same thing to u.texB but with a red blob (this is not the end goal: eventually the blob’s color will encode the direction and magnitude of mouse drags).

The two shaders have identical code except that where brush has gl_FragColor.rgb += pos.z * max(radius - dist, 0.0);, force has gl_FragColor.r += pos.z * max(radius - dist, 0.0);.

At first I used the same shader (brush) for both the u and x renders, with a bool controlling whether the output went to .rgb or just .r, and this worked fine. I then copy-pasted the same code into force and switched the u render to use that shader instead, but this broke the result propagation: as soon as I released the mouse, the u texture went blank, whereas before the red blob would persist and be affected by the other shaders in the pipeline.

Obviously, without dumping the whole code it will be impossible to properly diagnose this particular issue, but it’s not the only time something like this has happened. Previously I found that I needed an even number of texture swaps in my render pipeline, otherwise nothing was propagated, but that no longer seems to be the case. I feel like I’m stumbling around in the dark on these topics.

The question(s)

So, with the goal of better understanding how to achieve shader result propagation and build a proper fragment shader rendering pipeline:

  • What is the best way to do this with Three.js?

  • Am I on the right track in having two textures for each field and swapping them after each render?

  • How do you debug problems like this, where a simple change results in unexpected behaviour? Currently I just change things around blindly until something works, so I must be missing something fundamental about how rendering, shaders, textures, etc. work (or maybe even about the basics of how JS handles copies vs. deep copies?)

Thanks so much for reading this and for any help!

Have you already seen GPUComputationRenderer? I think it uses a similar approach to the one you’ve described in this thread.

The following example is based on this class and uses it to simulate flocking (steering) behavior. Maybe you can use GPUComputationRenderer as a basis for your own project.
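For reference, the basic flow of GPUComputationRenderer looks roughly like this. This is only a sketch: the class lives in the three.js examples directory rather than the core build, and `velocityFragmentShader`, `displayMaterial`, `renderer`, and the texture name 'textureVelocity' are placeholders you would swap for your own fields and shaders:

```javascript
// Assumes GPUComputationRenderer from three.js examples
// (examples/jsm/misc/GPUComputationRenderer.js) and an existing
// THREE.WebGLRenderer instance named `renderer`.
import { GPUComputationRenderer } from 'three/examples/jsm/misc/GPUComputationRenderer.js';

const SIZE = 256; // simulation grid resolution (texels per side)
const gpuCompute = new GPUComputationRenderer(SIZE, SIZE, renderer);

// Create a DataTexture and fill in the initial state of the field.
const velocity0 = gpuCompute.createTexture();
// ...write initial values into velocity0.image.data here...

// Each "variable" gets its own pair of ping-pong render targets.
const velocityVariable = gpuCompute.addVariable(
  'textureVelocity', velocityFragmentShader, velocity0);

// Declare which textures each shader reads; the class then wires up
// the sampler uniforms and handles the A/B swapping for you.
gpuCompute.setVariableDependencies(velocityVariable, [velocityVariable]);

const error = gpuCompute.init();
if (error !== null) console.error(error);

// Per frame: run all compute passes, then read the latest result.
function update() {
  gpuCompute.compute();
  displayMaterial.uniforms.texInput.value =
    gpuCompute.getCurrentRenderTarget(velocityVariable).texture;
}
```

getCurrentRenderTarget always returns the target holding the most recent result, which is exactly the role texA plays in your Field class.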


Thanks, this is very helpful! I’ve got a lot of reading to do.