Adjusting position of vertices in the vertex shader based on the fragment color in the fragment shader

I have a plane that uses a 2D fragment shader to create a simple animated pattern with some color diversity. I want to create a dynamic displacement based on the fragment colors, such that the vertices are raised or lowered according to the current color, essentially creating a dynamic 3D surface representation of the plane. What would be the best approach? I'm looking at how to get the current frag color into the vertex shader so the vertex can be adjusted based on that color, but not having much luck. Thanks!

Just out of curiosity: what do you have now, and what result do you want to get? Any reference pictures?
Usually, information goes from vertex shader to fragment shader (varyings), not the other way round.
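For illustration, here is the usual direction of data flow in GLSL; a minimal sketch with placeholder names:

// vertex shader: writes a varying for the fragment stage
varying vec2 vUv;
void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

// fragment shader: reads the varying; it cannot write back to the vertex stage
varying vec2 vUv;
void main() {
    gl_FragColor = vec4(vUv, 0.0, 1.0);
}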

Right. I do understand the shader pipeline, and that I am trying to somewhat reverse it.
But perhaps I could use uniforms somehow to query the color attribute array of the plane geometry and access it in the vertex shader to adjust the vertex positions in real time?

This is a beamline simulation in a particle accelerator. The beam moves across the sample (plane), which has the animated plasma effect. What I am trying to produce is a dynamic displacement on another, associated plane: essentially a 3D surface map in which the vertices' heights are controlled by the changing colors of the affected plane.

Is this the effect you’re looking for?

Both the plasma plane and the wireframe plane use the same WebGLRenderTarget's texture (passed as a uniform into the ShaderMaterial of each).
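A minimal sketch of that wiring; the uniform and shader names here are assumptions, not the exact code from the video:

// one render-target texture shared by both in-scene planes
const sharedUniforms = { uPlasma: { value: renderTarget.texture } };

// plane that displays the plasma colors
const plasmaMaterial = new THREE.ShaderMaterial({
    uniforms: sharedUniforms,
    vertexShader: passthroughVert,  // plain position transform
    fragmentShader: plasmaFrag      // samples uPlasma for color
});

// plane whose vertices are displaced by the same texture
const wireframeMaterial = new THREE.ShaderMaterial({
    uniforms: sharedUniforms,
    vertexShader: displaceVert,     // samples uPlasma to raise/lower vertices
    fragmentShader: wireFrag,
    wireframe: true
});

Because both materials reference the same uniforms object, the two planes can never drift out of sync.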


It is. However, I'm looking for an animated scene whereby the wireframe reacts to the animated plasma (due to the wireframe's frag shader). The wireframe should then be rippling as its vertices are updated per the changing colors of the plasma.

Okay, I admit that I'm not that smart, but to me it looks like an attempt to find a backdoor when the main entrance is wide open.

Having a single data source (a render target’s texture), it’s easy to keep things in sync. :thinking:

const textureCamera = new THREE.Camera();
const screenQuad = new THREE.Mesh(new THREE.PlaneGeometry(2, 2));

const size = 1024; // or whatever resolution you need
const framebuffer = new THREE.WebGLRenderTarget(size, size);

screenQuad.material = material; // assign your plasma material

// place this in your update loop
renderer.setRenderTarget(framebuffer);
renderer.render(screenQuad, textureCamera);
renderer.setRenderTarget(null);

Assign framebuffer.texture to the material of the plane you want to deform. This way the texture is available to both your vertex and fragment shaders.
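For example, the deforming plane's vertex shader could sample that texture like this; a sketch, where uPlasma and uScale are assumed uniform names:

uniform sampler2D uPlasma;  // framebuffer.texture
uniform float uScale;       // displacement strength
varying vec2 vUv;

void main() {
    vUv = uv;
    // use the plasma color's luminance as the height at this vertex
    vec3 c = texture2D(uPlasma, uv).rgb;
    float height = dot(c, vec3(0.299, 0.587, 0.114));
    vec3 displaced = position + normal * height * uScale;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(displaced, 1.0);
}

Sampling a texture in the vertex shader (vertex texture fetch) is supported in WebGL2 and on most WebGL1 hardware as well.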


This is what I do to get visuals in my previous posts :handshake:


Do you mean that's what you did in your video above?
Then that's solved, because the plasma manipulation is always rendered into the target and is therefore available to the vertex shader of the mesh being deformed.


Definitely sounds like a good DataTexture use case.

Thanks everybody for the great responses. Appreciate your input!


Whoops, meant to say render target use case, but also potentially DataTexture if you're doing the simulation in JS. :smiley:
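For the JS-simulation case, a minimal DataTexture sketch; the size and uniform name are just examples:

const size = 64;
const data = new Uint8Array(size * size * 4); // RGBA bytes, filled by your JS simulation

const dataTexture = new THREE.DataTexture(data, size, size, THREE.RGBAFormat);
dataTexture.needsUpdate = true; // set this again whenever you rewrite `data`

// then use it exactly like the render target's texture
wireframeMaterial.uniforms.uPlasma.value = dataTexture;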
