R164 - Nodes No Longer Working With WebGL/WebGL2?

Now that I had finally gotten my WebGL (by default WebGL2) programs to work successfully with Nodes, I got around to testing them with r164. But now the programs no longer work.

I checked the three.js examples for guidance, but the category related to Nodes has been removed. I can only find Node examples with WebGPU programs.

Is it no longer possible to use Nodes with WebGL/WebGL2?

From the responses I have received, that is apparently the case. In the future, Nodes will work only with WebGPU. But, in the meantime, we can still combine the latest version of three.js with the r163 version of the legacy “webgl-legacy/nodes/WebGLNodes.js” file. For example, this worked for me:

<script type="importmap">
	{
		"imports": {
			"three": "https://unpkg.com/three@0.164.0/build/three.module.js", 
			"three/addons/": "https://unpkg.com/three@0.164.0/examples/jsm/",
			"three/nodes": "https://unpkg.com/three@0.164.0/examples/jsm/nodes/Nodes.js"  // r163
		}
	}
</script>
<script type="module">
import * as THREE from "three";
import Stats from "three/addons/libs/stats.module.js";
import {
		color,
		texture,
		normalMap,
		float,
		vec2,
		attribute,
		positionLocal,
		MeshStandardNodeMaterial,
} from 'three/nodes';
import {nodeFrame} from 'https://unpkg.com/three@0.163.0/examples/jsm/renderers/webgl-legacy/nodes/WebGLNodes.js';	// r163

This may not work for everyone and may not work for long. Also, you will not be able to take advantage of future improvements to Nodes. In my case, I need to switch from custom shaders to compute shaders to work with WebGPU. But, overall, the changes should be worth it.

NOTE: The specific Node imports (e.g. color, texture, etc.) are the ones I used; yours will vary.


I was wondering why my app was crashing

That would be news to me! Assuming you are referring to this thread, I left a comment asking for clarification.


Don,

No, I had not seen that post, but the answer appears to be yes. In addition to what I was told, the “webgl-legacy/nodes/WebGLNodes.js” file has been removed, and if you look at the official examples, the Nodes category is gone as well. The only Nodes examples I could find were part of the WebGPU examples.

In my proposed solution above, it is possible that you can use the r164 version of Nodes and keep the last line to get the r163 version of “webgl-legacy/nodes/WebGLNodes.js”. I just tried it and it appears to work. So I am going to change my proposed solution.

By pure coincidence, I had finally figured out how to use Nodes with WebGL2 and the results were amazing. I had hoped to have at least a few more months to rest on my laurels.

I’ve commented on the situation in Add envMapIntensityNode for node materials in WebGLRenderer by TobiasNoell · Pull Request #27156 · mrdoob/three.js · GitHub.

To clarify: despite its name, WebGPURenderer supports both WebGPU and WebGL2. The latter is used as a fallback when WebGPU is not available on a system. We have decided to support the new node material only with WebGPURenderer, since the support in WebGLRenderer was very limited due to its legacy architecture.

So the node material does support WebGL 2 but only via WebGPURenderer.


In my particular case, that approach does not seem to be working. My program uses RawShaderMaterial, so I get this error notice:

Error: NodeMaterial: Material "RawShaderMaterial" is not compatible.
Need to update [program name] to use Compute Shaders

and the program does not kick back to WebGL2, but stops running.

[NOTE: I was also getting that notice in r163. But, technically, that statement was not true, because RawShaderMaterial WAS working with NodeMaterial - at least for a while.]

With the change in approach, is this one of the types of errors that should cause the renderer to fall back to WebGL2? Or does that create a risk that the set of fallback conditions will “grow like Topsy”? And perhaps you don’t want to keep trying to make Nodes work with WebGL2 features such as RawShaderMaterial.

For what it is worth, I realize that Compute Shaders are a much better option for what I am doing (computing numbers) and was disappointed when I learned that this feature was not available in three.js. So now that Compute Shaders are finally available, I really need to make the switch.

Major transitions, like this, are always a challenge. But they are generally worth the effort required to switch.


Custom materials based on ShaderMaterial, RawShaderMaterial and modifications via onBeforeCompile() won’t work with WebGPURenderer. This will be indeed the major migration task when moving to WebGPURenderer.

However, the new node material and TSL provide much better ways of creating custom materials. If you are reading about TSL for the first time: it is the three.js shading language, which allows you to implement shader code that can run with both WebGPU and WebGL. The renderer transpiles the TSL code to the supported backend, which enables a platform-independent way to write custom materials. In fact, the shaders of all built-in materials are also based on TSL. In this way, we can write materials or post-processing effects once and they run in both WebGPU and WebGL.

We expect the majority of future shader code in three.js and third-party libraries to be TSL-based, since it is more or less the only effective way to support WebGPU and WebGL with a single code base. Besides, TSL is JavaScript syntax, so module imports, tree-shaking, ESLint, and other existing development patterns work flawlessly with TSL.
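To make that concrete, here is a rough, unofficial sketch of a custom material written in TSL. It assumes the r164-era "three/nodes" entry point; the names `timerLocal`, `positionNode`, and `colorNode` are taken from that era's node material API and may differ in newer releases:

```javascript
import {
	MeshStandardNodeMaterial,
	positionLocal, timerLocal,
	vec3, sin,
} from "three/nodes"; // r164-era import path (assumption)

const material = new MeshStandardNodeMaterial();

// colorNode replaces the material's diffuse color computation.
material.colorNode = vec3( 0.2, 0.5, 1.0 );

// positionNode replaces the vertex position computation: here the mesh
// wobbles along Y with an animated sine wave. The renderer transpiles
// this node graph to WGSL (WebGPU backend) or GLSL (WebGL2 fallback).
material.positionNode = positionLocal.add(
	vec3( 0, sin( positionLocal.x.add( timerLocal() ) ).mul( 0.25 ), 0 )
);
```

The same material then renders through WebGPURenderer regardless of which backend ends up being used.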


Are there any examples showing how to “daisy chain” several compute shaders together?

For example, the structure that I am working with will involve:

  1. A starting map of values.
  2. A compute shader will take those map 1 values, modify them and save them to map 2.
  3. A ping-pong shader will take the map 2 values, modify them and save them to map 3 (a 3-way displacement map).
  4. A compute shader will take the map 3 values, and compute a normal map and save those values to map 4.

(Ideally map 3 and map 4 can be plugged into a standard Node texture, just as they are now.)

Is there a simple way that you can use TSL to initialize a series of defined maps and link them together with a series of defined shaders? I believe I have seen some TSL examples of 2 compute shaders being linked together, but those may be outdated and I wonder if there are some examples of more shaders being linked together.
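Not an authoritative answer, but a chain of two passes might be sketched like this, modeled loosely on the r164-era webgpu_compute_texture example. The StorageTexture import path, `.remainder()`, and reading one storage texture from the next pass are all assumptions on my part and may have changed since:

```javascript
import {
	tslFn, textureStore, instanceIndex,
	texture, float, vec2, uvec2, vec4,
} from "three/nodes"; // r164-era import path (assumption)
import StorageTexture from "three/addons/renderers/common/StorageTexture.js";

const size = 512;
const map2 = new StorageTexture( size, size ); // written by pass A
const map3 = new StorageTexture( size, size ); // written by pass B

// Pass A: derive map 2 from some input (placeholder computation here).
const passA = tslFn( () => {
	const x = instanceIndex.remainder( size );
	const y = instanceIndex.div( size );
	const value = float( x ).div( size ); // stand-in for the real update
	textureStore( map2, uvec2( x, y ), vec4( value, 0, 0, 1 ) ).toWriteOnly();
} )().compute( size * size );

// Pass B: read map 2, write map 3. Sampling the previous pass's output
// with an explicit uv inside a compute pass is assumed to work here.
const passB = tslFn( () => {
	const x = instanceIndex.remainder( size );
	const y = instanceIndex.div( size );
	const uv = vec2( float( x ).add( 0.5 ), float( y ).add( 0.5 ) ).div( size );
	const src = texture( map2, uv );
	textureStore( map3, uvec2( x, y ), vec4( src.x.mul( 2 ), 0, 0, 1 ) ).toWriteOnly();
} )().compute( size * size );

// renderer is a WebGPURenderer created elsewhere; dispatching the passes
// in order means map2 is complete before pass B reads it.
renderer.compute( passA );
renderer.compute( passB );
```

Longer chains (map 3 to a normal-map pass, and so on) would presumably just add more passes dispatched in the same order, with the final textures plugged into a node material as usual.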

This looks like it will be an interesting year, thanks to all of your efforts.

I can help you port your app to WebGPU, step by step. You know how much I struggled with error analysis over five releases, so that sunag could expand the compute shaders release after release until exactly this worked.

I had already presented my ocean repo in my chat about the progress there, but here it is again for those who are still a bit hesitant about WebGPU:

I use the node system very extensively. However, I only use WGSL, the shader language of WebGPU, via the wgslFn node, instead of TSL with tslFn, which is ultimately just transpiled into WGSL and, as far as I know, does not yet cover the same scope of functions. My ocean would not be possible with WebGL2. WebGPU is already very deeply integrated into three.js.
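For anyone curious what the wgslFn route looks like, here is a minimal sketch. The node names follow the r164-era three/nodes API, and `someTexture` is a placeholder for a texture loaded elsewhere; treat the details as assumptions:

```javascript
import {
	wgslFn, texture, uv,
	MeshBasicNodeMaterial,
} from "three/nodes"; // r164-era import path (assumption)

// Raw WGSL wrapped as a node; the WGSL function's parameter names define
// the named arguments the node is called with.
const desaturate = wgslFn( `
	fn desaturate( color: vec4<f32> ) -> vec4<f32> {
		let l = dot( color.rgb, vec3<f32>( 0.299, 0.587, 0.114 ) );
		return vec4<f32>( l, l, l, color.a );
	}
` );

const material = new MeshBasicNodeMaterial();
// someTexture is a THREE.Texture created elsewhere (placeholder name).
material.colorNode = desaturate( { color: texture( someTexture, uv() ) } );
```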


Yes, I am still working with the earlier version of the wave generator that you helped update.
The module is the file named Ocean3.js and is in the jsm directory on my GitHub page.

I think the basic concepts are the same: you create data sets and use shaders to link them together. But I expect it will also be like the way they turn a stock car into a NASCAR racer. You lift up the radiator cap and build an entirely new car underneath.

MORE
It looks like the “webgpu_compute_texture” example provides a guide for a simple first step. It shows how to define the destination (using the storageTexture function), how to define the computations (using tslFn), and how to combine the two in a computeNode. Using the storageTexture function would not be a bad first step, since that should be similar in operation to the WebGL2 fragment shaders. And it sounds like you know how to replace the tslFn definitions with a WGSL shader (which I assume is similar to my existing WebGL2 shaders). The only thing that example does not show is how to use source data from a preceding computation, but perhaps the WGSL shader can handle that. The final challenge will be animating it all. So it may not be as horrible as I thought.
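On the animation question, the pattern in the WebGPU examples appears to be simply dispatching the compute passes inside the render loop. A sketch, where `computePassA`/`computePassB` are placeholder names for compute nodes built with tslFn(...).compute(...) as in the webgpu_compute_texture example:

```javascript
// renderer is a WebGPURenderer; scene and camera are set up elsewhere.
renderer.setAnimationLoop( () => {
	renderer.compute( computePassA ); // update map 2
	renderer.compute( computePassB ); // update map 3 from map 2
	renderer.render( scene, camera ); // materials sample the updated maps
} );
```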