Approach for Chaining Shaders

I am working on a series of procedural tools, and I am curious how I might best approach chaining shaders together. For example, I'd like a flexible system that begins with an FBM noise shader, feeds that through a fractal warp shader, then through an invert shader, and so on. Previously I've just written single shaders for these tasks, but I'd like to make things more modular, kind of like EffectComposer or a shader graph.

I have been playing with glslify, but I've had issues getting it to work inline, which seems necessary if I'm going to have some kind of interface for adding and removing shaders.

Any advice or direction would be appreciated!

What's wrong with using multiple EffectComposers with your own custom shader code? EffectComposer is already built to be easily chained, and the last one can be rendered to the canvas.

Alternatively, you could render your scene to a WebGLRenderTarget and then feed the resulting .texture to a new material as a uniform so you can use it in a subsequent shader.
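At its core, this render-target approach is a pipeline: each pass reads the previous pass's output texture and writes to a new target. Here is a minimal, library-free sketch of that data flow in plain JavaScript (the pass functions below are illustrative stand-ins for ShaderPasses and WebGLRenderTargets, not three.js API):

```javascript
// Sketch of chained passes: each "pass" is a function that takes the
// previous output (standing in for an input texture uniform) and
// returns a new output (standing in for a render target's texture).
function runChain(passes, initialInput) {
  let input = initialInput;
  for (const pass of passes) {
    input = pass(input); // the output of one pass feeds the next
  }
  return input;
}

// Illustrative stand-ins for an fbm -> warp -> invert chain.
const fbm = () => 0.25;     // generator pass: ignores its input
const warp = (x) => x * 2;  // transform pass
const invert = (x) => 1 - x; // transform pass

const result = runChain([fbm, warp, invert], null);
console.log(result); // → 0.5
```

In three.js terms, `runChain` is what EffectComposer does internally with its ping-pong read/write buffers, and `initialInput` is the scene render (or nothing, if the first pass is a pure generator).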

See this example: https://threejs.org/examples/?q=gpgp#webgl_gpgpu_water. They use one render pass to calculate the ripple animation, then pass the resulting texture to the final scene to be used as a heightmap. The same can be done in your case. Just make sure you use a separate scene per render pass, or you'll get recursion errors.


Here’s an example by Felix Turner that looks like what you have in mind, maybe you’ll find it useful: https://www.airtightinteractive.com/demos/js/shaders/preview/


What's wrong with using multiple EffectComposers with your own custom shader code?

Built-in anti-aliasing stops working, meaning you need to include a final FXAA or SSAA pass, which is much slower (very noticeable on mobile) and doesn't look as good.

Also, I think you might lose correct tone mapping (the linear -> gamma color space conversion) too, although most people just do this step before post-processing, which I don't think is correct.


Those caveats are both good to know!


Are you thinking about chaining calculations inside a single shader, or only chaining the outputs through multiple render stages? Macros can be useful for making reusable modules and chaining them within a shader. Also, perhaps consider NodeMaterial. (I hope I am pointing you in good directions here.)
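To make the single-shader option concrete: one lightweight way to chain calculations inside one shader is to keep each operation as a reusable GLSL chunk and concatenate them into a single fragment shader at build time (glslify and `#include`-style macros serve the same role). A hedged sketch, with illustrative function and chunk names:

```javascript
// Reusable GLSL "modules" kept as plain strings. The bodies here are
// placeholders; real chunks would contain the actual noise/warp math.
const fbmChunk = /* glsl */ `
float fbm(vec2 p) { /* ...noise octaves... */ return 0.0; }
`;

const warpChunk = /* glsl */ `
vec2 warp(vec2 p, float n) { return p + vec2(n); }
`;

// Compose a full fragment shader from a list of chunks plus a main body
// that chains their functions together.
function composeFragmentShader(chunks, mainBody) {
  return chunks.join('\n') + '\nvoid main() {\n' + mainBody + '\n}';
}

const fragmentShader = composeFragmentShader(
  [fbmChunk, warpChunk],
  '  vec2 p = warp(gl_FragCoord.xy, fbm(gl_FragCoord.xy));\n' +
  '  gl_FragColor = vec4(vec3(fbm(p)), 1.0);'
);

// The composed source can then be handed to a ShaderMaterial.
```

This trades the per-pass render-target cost for a single draw, at the price of recompiling the material whenever the chain changes.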


So far I've been trying to chain outputs, which is where I might be hitting a snag. Thank you for the recommendations; NodeMaterial looks great, and I'm working out which route is better.

With EffectComposer I am able to generate the fbm just fine, but when I go to pass it through a warp ShaderPass, only the fbm is displayed.

this.target = new THREE.WebGLRenderTarget(this.width, this.height, {
  format: THREE.RGBAFormat,
  type: THREE.FloatType
});

this.composer = new THREE.EffectComposer(this.manager.renderer, this.target);
this.composer.setSize(this.width, this.height);

// this shader generates the fbm
this.fbm_shader = new FractalNoise(4);
// this shader warps its texture input (tex0)
this.warp_shader = new FractalWarp(4);

// note: the second argument to ShaderPass is the name of the uniform that
// receives the previous pass's output texture (it defaults to "tDiffuse"),
// so it must match the input uniform the shader actually reads from
const fbmPass = new THREE.ShaderPass(this.fbm_shader.shaderMaterial, "fbm");
const warpPass = new THREE.ShaderPass(this.warp_shader.shaderMaterial, "warp");

this.composer.addPass(fbmPass);
this.composer.addPass(warpPass);

Am I actually able to chain outputs in this way using EffectComposer, or do I need to render the fbm to a texture first?

UPDATE: I managed to get this working. It really came down to some confusion around the renderer and my own system for managing it.