State of Node-based Shaders in three.js? Any good tutorials?

I’ve been writing text-based shaders with something like ShaderMaterial for years now, and like many who have been doing the same I’ve felt for a while that they’re not as extensible or reusable as I want them to be. So I’m curious to hear about people’s experiences with three.js’ node-based shaders now.

Is it production ready? Have people built large apps with them? I always expect some occasional breakage when upgrading versions, but have the node shaders been fairly consistent between versions so far?

How are the ergonomics of node shaders? Is it reasonable to write them entirely in code? Or do you need a shader-graph-like UI for building them?

And lastly can anyone recommend tools or tutorials for getting started with them?

Any help appreciated! I’m looking forward to a future of more extensible and hopefully more shareable materials and shader nodes!

7 Likes

We tried at work and I found it quite hard to use. The code needed to render even simple things would quickly become unmanageable. My biggest gripe, as far as I remember, was the structural decision to nest nodes through the constructor, which makes it very inflexible and rigid: you can’t just reconnect this bit to that connector; the entire structure has to be recreated. It is also impossible to express declaratively, which means a UI will have a hard time since everything can only be done imperatively.

Imo it would help if nodes were flat instead of nested, connected to other nodes via reference, descriptor, or id. That would make a huge difference. Visual node editors would pop up around that pretty quickly.
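
To illustrate what I mean, here’s a minimal sketch of such a flat model (all names here are hypothetical, this is not a three.js API): nodes live in one map and point at each other by id, so rewiring is a single assignment rather than a rebuild.

```javascript
// Hypothetical flat node store: every node sits in one map and refers
// to its inputs by id, so nothing is nested inside anything else.
const graph = {
  nodes: {
    uv:     { type: 'uv', inputs: {} },
    tex:    { type: 'texture', inputs: { coord: 'uv' } },
    output: { type: 'output', inputs: { color: 'tex' } },
  },
};

// Reconnecting an input is a single assignment; no reconstruction needed.
function connect(g, nodeId, inputName, sourceId) {
  g.nodes[nodeId].inputs[inputName] = sourceId;
}

// Bypass the texture node and feed the uv node straight into the output.
connect(graph, 'output', 'color', 'uv');
console.log(graph.nodes.output.inputs.color); // 'uv'
```

A UI or serializer can walk that map directly, which is exactly what the constructor-nested form makes hard.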

How different is sharing node based shaders versus sharing regular shaders? I thought GLSL is already fairly shareable, being able to run on various devices and browsers and whatnot, or is there a gotcha?

@drcmda
Interesting – that’s good to know. Part of me is wondering if it’s worth diving in just to be able to give more feedback on the ergonomics and future of it all. It would be nice if there were a core group of people working with node materials, or a place to discuss and give feedback, because that has been my concern as well. As far as I understand, node materials will be the only way to build shaders in the future WebGPURenderer, so it would be great to get ahead of it.

Have you tried making an issue to give feedback on why it may or may not work for UIs or declarative-style code?

A couple of my hopes and expectations have been:

  • that nodes and connections will be easily traversable so shaders can be programmatically adjusted (redundant subgraphs merged, materials cloned and modified, the core shader automatically swapped, i.e. converting a phong material to a shadow material with all the necessary and possible connections retained)
  • that I’ll be able to make self-contained, reusable meta-nodes that can be added into a shader graph (and subsequently optimized if redundant) so common parts can be reused easily. I.e. if I have a group of nodes that makes topographic lines, it should be easy to make a class for it, connect it, and reuse it.
  • that I’ll be able to create a raw GLSL node with connectable inputs and outputs when it suits the situation or I have existing GLSL code, etc. In my brief experience with ShaderForge in Unity it was very easy (granted, through the graph UI) to add a block of raw GLSL with named input variables, and it seems like that should be possible to make easy in code.
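
For the last point, the shape I’m imagining is roughly this (purely a hypothetical sketch, not the actual node-material API – the class, function names, and varying name are all made up): a node that owns a chunk of raw GLSL plus a map of named inputs, and can emit the call expression into the generated shader.

```javascript
// Hypothetical "raw GLSL node": wraps a hand-written function and a map of
// named inputs, and can emit the call expression for the generated shader.
class RawGLSLNode {
  constructor(name, glslSource, inputs) {
    this.name = name;             // function name assumed to exist in glslSource
    this.glslSource = glslSource; // the raw GLSL definition to splice in
    this.inputs = inputs;         // map of parameter name -> upstream expression
  }

  // Emit the call with inputs bound in declaration order.
  emitCall(paramOrder) {
    const args = paramOrder.map((p) => this.inputs[p]).join(', ');
    return `${this.name}(${args})`;
  }
}

const topo = new RawGLSLNode(
  'topoLines',
  `float topoLines(vec3 pos, float spacing) {
  return step(0.95, fract(pos.y / spacing));
}`,
  { pos: 'vWorldPosition', spacing: '2.0' }
);

console.log(topo.emitCall(['pos', 'spacing'])); // topoLines(vWorldPosition, 2.0)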

Building a larger list of use cases and limitations seems like a good start.

@dubois

> How different is sharing node based shaders versus sharing regular shaders? I thought GLSL is already fairly shareable, being able to run on various devices and browsers and whatnot, or is there a gotcha?

I’m more focused on sharing and extending logical components of shaders such as reusable functions, surface shading treatment etc so people can use them all together and quickly build new materials. Once a material is written it’s reusable and shareable but not very flexible.

3 Likes

Thanks for clarifying. I’m still struggling to understand the differences, but I think it makes more sense now. I’ve never looked into glslify, but I always thought it would be easy to “import” a function – say, if you write a GLSL function and then save it to its own file.
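
Even without glslify, the simplest version of that idea is just keeping the GLSL function in its own JS module as a string and splicing it into the shader source (a sketch with made-up names – `topoLines` and the module layout are just illustrative):

```javascript
// topoLines.glsl.js — a GLSL function kept in its own module as a string
// (hypothetical example; the function and names are made up).
const topoLines = /* glsl */ `
float topoLines(float height, float spacing) {
  return step(0.95, fract(height / spacing));
}`;

// Splice the shared function into a fragment shader for ShaderMaterial.
const fragmentShader = `
${topoLines}
void main() {
  gl_FragColor = vec4(vec3(topoLines(1.9, 2.0)), 1.0);
}`;

console.log(fragmentShader.includes('float topoLines')); // true
```

This shares code, but it composes by string concatenation only – there is no graph to traverse or rewire, which is where the node system is supposed to help.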

i have very little experience, always wanted to dig into node based shaders more deeply. but given the following structure:

new A(
  new B(
    new C()
  )
)

if i want to re-use B, or re-plug C into A, reverse the order, etc, everything has to be re-created from scratch. not even sure about multi-connectors (A connected to B and … C?).

in blender nodes are just flat objects, they are not nested, and they connect to one another by reference. that makes it easier for the UI and i think generally that it may be a simpler model. this could be inexperience on my part though, are there any benefits in nesting via constructor?

i made a small test once trying out if it’s feasible to connect flat nodes and it looked OK to me: Bezier curves & nodes - CodeSandbox these nodes could easily be tied to a node editor, re-connected etc.

<Nodes dashed color="#ff1050" lineWidth={1}>
  <Node ref={a} name="a" position={[-2, 2.5, 0]} connectedTo={[b, c, e]} />
  <Node ref={b} name="b" position={[0, 1, 0]} connectedTo={[d, a]} />
  <Node ref={c} name="c" position={[-0.25, 0, 0]} />
  <Node ref={d} name="d" position={[2, 0.5, 0]} />
  <Node ref={e} name="e" position={[-0.5, -1, 0]} />
</Nodes>
5 Likes

What would happen if you try to connect incompatible nodes? Say a vec2 into a vec3?

i think the same that would happen when i put a vec2 into a constructor that expects a vec3. both the imperative use and the declarative could be typesafe imo, typescript would not allow a specific node to have mis-matches. in blender “type” is color-coded (green link only goes into green target etc), i think it does allow user-error though and it even interpolates.

Right now glsl would break, right? One would have to settle that vec2 to vec3 would always resolve as, say, xxy?

having type safety could prevent all mismatches imo, it wouldn’t compile. even the IDE would display a red squiggly line. VSC does this even for non-TS projects.
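
a tiny sketch of the two policies from above (hypothetical helper, not a three.js API): either reject a vec2 → vec3 connection outright, or auto-promote it with one fixed swizzle convention like xxy.

```javascript
// hypothetical connect helper: reject a type mismatch, or optionally
// auto-promote vec2 -> vec3 with a fixed swizzle convention (here .xxy).
function connect(source, targetType, { autoPromote = false } = {}) {
  if (source.type === targetType) return source.expr;
  if (autoPromote && source.type === 'vec2' && targetType === 'vec3') {
    return `${source.expr}.xxy`; // one possible convention, like blender's interpolation
  }
  throw new Error(`cannot connect ${source.type} to ${targetType}`);
}

const uv = { type: 'vec2', expr: 'vUv' };

console.log(connect(uv, 'vec3', { autoPromote: true })); // vUv.xxy
// connect(uv, 'vec3') would throw: cannot connect vec2 to vec3
```

with typescript the same mismatch could be rejected at compile time instead of at graph-build time.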

1 Like

It seems like good feedback to give if you feel like it’s going to make NodeMaterials easier to use and more amenable to adoption. These are the two issues I’ve seen around node materials that might be worth posting to:

Or maybe a new issue? It’s hard to track what the vision and direction for all of this is at the moment since they’ve been around so long.

I’ll try to dive into them in the next month or so and hopefully come back with some better experience and feedback.

1 Like

I think that many (all?) of the nodes also support reuse by re-assigning instance properties; you don’t have to reconstruct them. But I can see where allowing empty constructors would be useful. I find the nesting easier to read in code, but that’s a minor point.
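
Roughly what I mean (a sketch with made-up node classes, not the actual node API): if inputs are plain instance properties, nested construction still reads nicely but re-plugging doesn’t require rebuilding the graph.

```javascript
// Hypothetical node whose inputs are mutable instance properties,
// not state locked in at construction time.
class MixNode {
  constructor(a = null, b = null) {
    this.a = a;
    this.b = b;
  }
}

const noise = { type: 'noise' };
const color = { type: 'color' };

// Nested construction still reads well...
const mix = new MixNode(noise, color);

// ...but rewiring is just a property assignment, no reconstruction.
mix.b = noise;
console.log(mix.a === mix.b); // true
```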

IMHO, writing these node graphs imperatively in code is easy enough for some common use cases that aren’t possible with the built-in materials today. Things like:

  • assign UV sets to arbitrary texture slots
  • swizzle texture channels
  • override per-instance properties with InstancedMesh

That’s already valuable.

However, I wouldn’t want to write imperative code for the kinds of big node graphs people make in Unity, Unreal, or Blender, like a wind/grass shader. That’s much better served by a node-based UI, I think. I’d be glad to see people build those, but they don’t necessarily have to be built specifically for the three.js NodeMaterial system… better to support a common node graph standard like MaterialX (see NodeMaterial: Support MaterialX Standard Nodes · Issue #20541 · mrdoob/three.js · GitHub) and have robust support for loading that in and out of the three.js shader system. I had been working on a MaterialX loader for three.js for a while but haven’t had time for that project recently. The advantage here is interoperability; I expect it’s only a matter of time before you can create MaterialX node graphs in mainstream DCC tools.

In general – I think NodeMaterial would really benefit from more feedback and contributions, Sunag is friendly and doing great work on this, but it is a big project.

5 Likes

Hi guys, i love node editors a lot !!

from 2016 to 2021 I tried making a few cable-and-boxes systems:

loklok-shader-lab (glsl code gen)

Assisted Graphics Engineering (glsl code gen, generation 2)

https://legacy.effectnode.com/ (js based cable and boxes)

https://v2.effectnode.com/ (wrapper for node api in threejs)

https://docs.effectnode.com/ (generic cable and boxes system)

(image link)

i really really dream of making something like VFX integrated into a metaverse, so that people can tune their own objects’ visual effects directly <3

please let me know if you guys want the source code, i’m open to share <3

8 Likes

Is there a way to compile MaterialX into glsl so it could be used with RawShaderMaterial?

1 Like

The MaterialX library includes a GLSL compiler, yes. The harder part is getting that GLSL to work with everything else you probably use three.js for: lights, skinning, etc. I think that’s where it helps for three.js to have its own node-based shader system.

just tried https://v2.effectnode.com/ and I think it lacks one essential feature – I can’t grab an empty space and drag it. Since your default graph does not fit my window and there appears to be no zoom either, it can’t be edited :S

1 Like

Hi
I’m starting a node UI component. It’s really at an early stage, but I have planned these goals:

  • Blender like UI and interactions:
    • drag/zoom
    • groups
    • multi document viewer
    • lateral panels
  • Generic enough to be used as UI for any engine connected
    • API to query the graph
    • API to add engine specific nodes
    • API to “compile” nodes to the engine specific language
    • THREE.js is my 1st target
  • SVG format
    • It can be added to html DOM directly without parsing
    • The lib will add interaction and new node specs
    • The lib will offer a complete theming of UI via css (that can be attached to the SVG)
    • The graph can be viewed as a static image (like a frozen UI)
    • :new: SVG is XML, so it’s easy to import/export to MaterialX

I need to work a bit more on the project before it’s shareable, then it will get a libre licence…

Right now I have:

  • drag / zoom on SVG
  • basic draggable boxes with title and named inputs/outputs
  • content of boxes defined in html (to gain easy UI inputs, or custom elements)

Need to finish at least

  • Draw dynamic links (line or bezier), recalculated when connected boxes are dragged
  • Load/update/save a file from file system (aka open my saved graph)
  • Cook the 1st attempt of API to gather graph information and compile something
2 Likes

Thanks guys – it’s nice to see people putting time into tools for this sort of thing. I’ve seen others float around, as well. Once things are solidified a bit more it might be nice to list some of the options on the library and plugins page (and maybe add “tools” to the title of it :grin:) so they’re more discoverable.

@donmccurdy

However, I wouldn’t want to write imperative code for the kinds of big node graphs people make in Unity, Unreal, or Blender, like a wind/grass shader.

I haven’t done my deep dive into node materials yet, but pragmatically I’m worried about the inability to write these things in code directly. With external tools, how will I include and reuse my pre-made list of grouped or custom nodes in a separate editor? As a basic example, if I have a node that computes topographic lines for terrain in a shader in my project and I want to use it in multiple different node materials, how can I ensure they share it? Do I have to copy and paste it every time? Will the tools have to be able to access my project directory? In tools like Unreal / Unity / Blender these types of reusable nodes would be discoverable in the “project” context, which is a model three.js doesn’t really require. Do we know of any other tools that have a more ergonomic API for programmatic construction of these things?

In general – I think NodeMaterial would really benefit from more feedback and contributions, Sunag is friendly and doing great work on this, but it is a big project.

From what I’ve seen Sunag seems like a great and very receptive collaborator and I’m impressed to say the least with the effort that’s been put into the node materials project. I’m looking forward to finding the time to finally give it a try and give some feedback.

I’m worried about the inability to write these things in code directly…

You do have the option of putting hand-written GLSL into a custom node, I think this works pretty well for reusing common patterns across a large project. Here’s a simple example: Voronoi3DNode.js.

For cross-compatibility with other software you’ll need to break that down into something like the MaterialX “standard nodes”, which (IMO) are also what three.js should aim to implement. Some kind of “node grouping” mechanism could also make sense; MaterialX has something like this, and it’s really just an indicator that a group of nodes can be collapsed in a UI editor under a human-friendly name.

Do we know of any other tools that have a more ergonomic API for programmatic construction of these things?

Tangram’s shader “blocks” are the only example that comes to mind here. Not aware of any similar patterns with wide adoption…

You do have the option of putting hand-written GLSL into a custom node, I think this works pretty well for reusing common patterns across a large project.

I figured this would be the case. I think what I’m getting at is how an external tool, which might be needed to build complex node graphs, would know about those custom nodes. I guess for real or complex projects where you have a lot of custom nodes or materials to manage, a desktop tool of some kind or a VSCode plugin will be needed so it’s aware of what’s available in your project structure.

Tangram’s shader “blocks” are the only example that comes to mind here. Not aware of any similar patterns with wide adoption…

Interesting – I hadn’t heard of Tangram or its shader blocks. Unity also had the concept of a “Surface Shader”, which seems similar: it let you write a function that outputs simple surface properties (albedo, gloss, alpha, normal, etc.), which it would then wrap in its rendering pipeline to support shadows, lighting, etc. I haven’t used them a lot, but the idea seems fairly elegant if all you want is custom surface properties or vertex displacement with the built-in renderer lighting. I wonder how difficult that would be to add to the existing material system? :thinking: It seems even Unity has moved on from them, though, considering they’re not supported in their modern pipelines.