GLSL shader as map source for MeshStandardMaterial

Hi,
first time posting, threejs newb here, complicated naive question…

I want to use a Shadertoy GLSL shader as a map source in a MeshStandardMaterial, so I can get lighting, PBR juiciness, etc.
I am well aware of ShaderMaterial, and how to use it to implement Shadertoy generative shaders as a texture in a material, and have made a simple demo that does this. (Yes, it uses A-Frame, but it’s three.js underneath.)
The problem with ShaderMaterial is that it does not have lighting or reflections, and it renders looking emissive.

I’m just wondering if there is some hack to get the output of a GLSL shader and use it as a texture in a MeshStandardMaterial. I kind of already know this is not possible, but I just wanted to ask in case someone has actually done this.

I also understand that the serious way to do this is to start with the fragment and vertex shaders of MeshStandardMaterial, which I found here

and weave in the Shadertoy code somewhere in the middle, feeding it through the lighting functions, etc. But that is a rather daunting task, given the length of the shader (about 600 lines of GLSL code).

I come from the world of Touch Designer, which is a 3D game engine for performance (not web based).
It has a built-in tool that exports a (PBR) material to a GLSL shader with simplified code, which is much more manageable (all of the lighting functions are accessible as calls, but not exposed in full verbosity in the exported shader).
Touch is a node-based programming environment that makes it easy to create a GLSL texture and import it into a material. Just wondering (hoping) if anyone has made, or is making, anything close to that for three.js.

I have seen and played with ShaderFrog, which is a very cool node-based online GLSL editor that exports to three.js. It does not support lighting or vertex normal reflections either, but it does support environment map reflections.
If ShaderFrog evolved into something like Substance Designer, with the ability to export a PBR material with the power of Shadertoy, that would be exactly what I am hoping for.
Just wondering if anyone else has craved for, or worked on, tools like this.

thanks
Thomas

I think you’re looking for something like the GPGPU water example, which uses custom shaders to generate a heightMap. Here’s a simplified explanation of the code in that example:

  1. Create a plane Mesh with custom shaders to calculate water height (red & green channels).
  2. Set up an OrthographicCamera and a Scene to render this plane head-on.
  3. Render this Scene to a THREE.WebGLRenderTarget so the render result will get stored in a texture.
  4. Create a second plane Mesh with its heightmap assigned to the result of the RenderTarget: heightmap = renderTarget.texture
  5. Set up a PerspectiveCamera and a secondary Scene.
  6. Render your second scene as usual with the WebGLRenderer to output to canvas.

Keep in mind that you’ll need to do the rendering in two separate scenes to avoid infinite recursion problems.

You can do something similar yourself.

  • In step 1, you could render your custom ShaderToy shader.
  • In step 4, assign the custom render texture as the color map:
    new THREE.MeshStandardMaterial({ map: renderTarget.texture });
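
Putting those steps together, here is a minimal sketch of the idea (not the actual water example code; names like rtScene and rtCamera are just illustrative, and the generative fragment shader is a placeholder for your Shadertoy code):

```js
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Steps 1-2: an offscreen scene with a full-screen quad running the generative shader.
const rtScene = new THREE.Scene();
const rtCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
const quadMaterial = new THREE.ShaderMaterial({
  uniforms: { iTime: { value: 0 } },
  vertexShader: `
    varying vec2 vUv;
    void main() { vUv = uv; gl_Position = vec4(position.xy, 0.0, 1.0); }`,
  fragmentShader: `
    varying vec2 vUv;
    uniform float iTime;
    void main() {
      // Drop your Shadertoy code in here.
      gl_FragColor = vec4(0.5 + 0.5 * sin(iTime + vUv.xyx * 6.2831), 1.0);
    }`,
});
rtScene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), quadMaterial));

// Step 3: a render target whose texture stores the shader output.
const renderTarget = new THREE.WebGLRenderTarget(512, 512);

// Steps 4-5: the main scene, using the render-target texture as the color map
// of a lit MeshStandardMaterial.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.z = 3;
scene.add(new THREE.DirectionalLight(0xffffff, 1));
scene.add(new THREE.Mesh(
  new THREE.BoxGeometry(),
  new THREE.MeshStandardMaterial({ map: renderTarget.texture })
));

// Step 6: each frame, render the offscreen scene into the target,
// then render the main scene to the canvas as usual.
renderer.setAnimationLoop((time) => {
  quadMaterial.uniforms.iTime.value = time / 1000;
  renderer.setRenderTarget(renderTarget);
  renderer.render(rtScene, rtCamera);
  renderer.setRenderTarget(null);
  renderer.render(scene, camera);
});
```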

Wow!!! That is so cool and so inspiring, and it performs so well. Thank you for the detailed explanation of how to achieve this. Complex, yes, but this is the hack I’ve been looking for.
Thank you so much!!


Thomas,

If you are looking for a web-based version of Touch Designer, it’s possible that Polygonjs could fit your needs.

While I’ve never used Touch Designer, I’ve used Houdini - which is related - for many years, and Polygonjs is inspired by it. It’s fully node based (and web based), from geometry manipulation to shader creation, as well as particle sims and event handling.

I’ve mentioned it on this forum here and will keep updating when new features are released.

You can also watch some tutorials, and you’re also very welcome to shoot me any questions you may have.

A short video of a custom (and psychedelic) shader being created in 2min:


Gui, thanks for sharing this amazing tool! Yes, this is very much like what I’m used to with Touch, Substance, and other apps that use the immense power of node-based programming. I’m blown away!!!
I’m going through it now, and just wow, where have you been all of my life?! This is seriously great.

Just noticed that this app exports GLSL shaders. I haven’t played with that yet, but do these shaders include lighting and reflections? Also, looking at the psychedelic car example, the shader does not appear to use lighting, but on closer inspection it does have reflections. Is that correct? It looks like the GLSL exporter does export lighting and reflection functions.
My question for you: has Polygonjs solved the problem of integrating generative shaders into materials?
Or would one have to do a kind of hack, as marquizzo demonstrated with the GPGPU water example, where the shader is rendered and the resulting texture is then fed into another scene to be used in a separate PBR material?

In Touch, I commonly did things like this, feeding one scene render into another to create a complex effect. But in Touch, you could do this with only one renderer. To use Houdini lingo, you could use a GLSL TOP (texture) and feed that directly into a material. Can Polygonjs do something similar?
Are you using a similar kind of node grouping system (TOPs, CHOPs, SOPs, etc.)?

Again, I’m blown away by Polygonjs. I can’t believe you can do all of this in a browser.

Thanks a lot for the kind words, I really appreciate it!

And the shaders absolutely do preserve lighting and reflections. I made sure to only insert code and keep what three.js already offers. I really had to stand on the shoulders of giants for this, as I definitely could not do better.

If you look at the car, it may be confusing as the colors are a little intense, but you can see the area lights from the wall behind that are being reflected.

In this short example, for a freezing effect, I actually override the roughness with a noise, and the lighting is more obvious.

And here is a step-by-step guide on how to override the roughness. But you can inject code that will impact position, color, alpha, UV, metalness, and point size for point shaders. There is probably more that could be overridden; I’m still thinking about it.

You can also try the scene here: https://polygonjs.com/gui/custom_shader_get_started/edit

In the part of the doc explaining how GLSL shaders can be exported, which you may have just seen, I explain where the code is injected, which is right before the skinning and lighting #includes. The example uses a MeshBasicMaterial, so there are not many light-related includes there, but that’s the idea. I should add a second example with a MeshPhysicalMaterial; that may be clearer.
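
If it helps to see a bare-bones three.js version of that idea: onBeforeCompile lets you splice extra GLSL into MeshStandardMaterial right before the lighting chunks run, so whatever you inject still goes through the full PBR pipeline. This is only a rough sketch of the mechanism, not how Polygonjs does it internally, and the noise overrides are just placeholders:

```js
import * as THREE from 'three';

const material = new THREE.MeshStandardMaterial({ color: 0x88ccff });

material.onBeforeCompile = (shader) => {
  shader.fragmentShader = shader.fragmentShader
    // Override the base color right after the map chunk, before any lighting includes.
    .replace(
      '#include <map_fragment>',
      `#include <map_fragment>
      diffuseColor.rgb *= 0.5 + 0.5 * sin(vViewPosition.xyx * 4.0);`
    )
    // Override roughness with a cheap hash noise, similar to the freezing effect above.
    .replace(
      '#include <roughnessmap_fragment>',
      `#include <roughnessmap_fragment>
      roughnessFactor = fract(sin(dot(gl_FragCoord.xy, vec2(12.9898, 78.233))) * 43758.5453);`
    );
};
```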

(Note that Vimeo decided to block my account for some reason just as I was writing this, so some videos in the doc won’t work for now - bear with me while I try to get it back.)


phenomenal~!

Ah, apologies, I realise I replied to only half of your questions.

This is definitely something I am looking into. I’d have to take a closer look before guaranteeing anything, but there are already some pieces in place, such as the particle system, the render texture, and the GLSL texture. I’m working on a way to combine those so that you could get a similar effect. So it wouldn’t be a hack at all, just a good way to do it for real-time purposes. I’ll get back to you on this when I have something to show.

Yes, it can :) There is an example using the render node, also linked above. You basically give it a camera in and it gives you a texture out, so you can plug it into any material. It’s fairly fresh, so there might be side effects I haven’t stumbled upon yet, but for now I just make sure that the objects it renders are in a specific layer which the other cameras cannot see.
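
For anyone reading along, that layer trick looks roughly like this in plain three.js (Polygonjs wires this up for you; the object and camera names here are made up for illustration):

```js
import * as THREE from 'three';

const OFFSCREEN_LAYER = 1;

// The object that generates the texture lives only on the offscreen layer.
const shaderQuad = new THREE.Mesh(new THREE.PlaneGeometry(2, 2), new THREE.MeshBasicMaterial());
shaderQuad.layers.set(OFFSCREEN_LAYER);

// The camera used for the render-to-texture pass only sees that layer...
const rtCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
rtCamera.layers.set(OFFSCREEN_LAYER);

// ...while the main camera stays on the default layer 0, so it never draws the quad.
const mainCamera = new THREE.PerspectiveCamera(50, 1, 0.1, 100);
mainCamera.layers.set(0);
```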

If you have anything specific that you’d like to achieve, don’t hesitate to let me know, I’d be curious to help and battle test Polygonjs with it.

Well, what’s amazing is three.js. All I’m doing is putting a UI on top (with thanks to many other open source projects).

(And my Vimeo videos are back up, by the way, so the videos in the doc should work just fine.)

Yes, of course, three.js is astounding, and it’s totally changing how we experience the web.
Tools like A-Frame (which I love) and Polygonjs are also a very important part of the landscape. I am deeply grateful for people like you who take on these challenging tool-making tasks to empower artists. I am especially grateful that your tool both solves the generative-shader-in-a-material problem AND brings my favorite interface (visual node programming that actually feels like Touch Designer) to this challenging brave new world. It’s going to take a moment to absorb all of this and start making stuff before I can give useful feedback. But I am very taken with what you have made. It’s beautiful.
Tools like Aframe (which I love) and polygonjs are also a very important part of the landscape. I am deeply grateful for people like you who take on these challenging tool making tasks to empower artists. I am especially grateful that your tool is both solving the generative shader in a material problem AND bringing my favorite interface (visual node programming, that actually feels like Touch designer) to this challenging brave new world. It’s going to take a moment to absorb all of this and start making stuff before I can give useful feedback. But I am very taken with what you have made. It’s beautiful.