Saw these examples where there are node counterparts to some glTF scenes and was wondering what's new here.
In this sheen example, the only difference is a nodeFrame.update() call in the render loop.
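For reference, the render loop in the node version is basically this (my paraphrase; `renderer`, `scene`, and `camera` are set up as in the example, and the `nodeFrame` import path has moved between three.js releases):

```js
// Paraphrased from the node-based sheen example. Check the import path against
// your three.js revision; it has changed between releases.
import { nodeFrame } from 'three/addons/renderers/webgl-legacy/nodes/WebGLNodes.js';

function animate() {

	requestAnimationFrame( animate );

	// The one extra line compared to the plain glTF sheen example:
	// it advances time-driven nodes (timers, oscillators) each frame.
	nodeFrame.update();

	renderer.render( scene, camera );

}
```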
And since the sheen values are coming from the glTF file, how do nodes contribute?
In the MaterialX case there's noise being animated and layered to get a nice effect, but these glTF node examples look the same as their non-node counterparts.
Hm, I wonder if that sheen example is a mistake, and the author meant to do something like this to update the material:
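(Rough sketch only; the exact `three/nodes` imports, the `sheenNode` property name, and the `gltf` variable from the loader callback are my guesses and differ between three.js releases.)

```js
import { MeshPhysicalNodeMaterial, color, timerLocal, oscSine } from 'three/nodes';

// After GLTFLoader finishes: swap the chair's MeshPhysicalMaterial for a node
// material and drive the sheen color from a timer, so nodeFrame.update()
// actually has something to animate each frame.
gltf.scene.traverse( ( object ) => {

	if ( object.isMesh && object.material.sheen > 0 ) {

		const loaded = object.material;

		const nodeMaterial = new MeshPhysicalNodeMaterial();
		nodeMaterial.map = loaded.map; // carry over whatever maps/values you need
		nodeMaterial.sheen = loaded.sheen;
		nodeMaterial.sheenRoughness = loaded.sheenRoughness;

		// Pulse the sheen color over time (sheenNode is my guess at the name).
		nodeMaterial.sheenNode = color( loaded.sheenColor ).mul( oscSine( timerLocal( 0.5 ) ) );

		object.material = nodeMaterial;

	}

} );
```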
One use case here is that if you want to make a specific part of a glTF material procedural, like a wind/grass sway effect, a custom fade in/out effect, or a shimmering animation, you can add the procedural part to a NodeMaterial more easily than to the typical MeshStandardMaterial returned by GLTFLoader.
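For example, a very rough wind-sway sketch; `loadedMaterial` and `mesh` are placeholders for what GLTFLoader gave you, and the `three/nodes` imports vary by release:

```js
import { MeshStandardNodeMaterial, texture, positionLocal, timerLocal, sin, vec3, float } from 'three/nodes';

// `loadedMaterial` is the MeshStandardMaterial GLTFLoader produced for, say,
// a grass mesh; reuse its base color texture on the node material.
const material = new MeshStandardNodeMaterial();
material.colorNode = texture( loadedMaterial.map );

// Procedural part: sway vertices sideways with a sine wave, scaled by height
// so the base of the blade stays put. Tune the 0.1 amplitude to taste.
const sway = sin( timerLocal( 1 ).add( positionLocal.y ) ).mul( positionLocal.y ).mul( float( 0.1 ) );
material.positionNode = positionLocal.add( vec3( sway, 0, 0 ) );

mesh.material = material; // `mesh` being the glTF mesh you want to animate
```

The rest of the material (normals, roughness, and so on) can be wired up the same way, or just left at the defaults.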
So instead of going through the complex GLSL/shader stuff to get some effects, the node counterparts can be used to replicate the same look, make time-based effects, or stack multiple effects like in Blender?
Is there any performance loss with these nodes?
And I'm guessing that in a future where everyone uses WebXR, these will work in AR too?
Yeah, it would be great if you could file a bug about that sheen example, thanks!
A lot of work is still going on toward NodeMaterial and WebGPU, and nodes are probably the best option we have for supporting interoperable shaders across WebGL's and WebGPU's different shading languages. It should be fine to use NodeMaterial in WebXR, but I don't know how the performance stacks up at this point. Compile time is probably a bit higher, and maybe that can come down, but I think the performance of the resulting shader should be pretty much the same.