Shader to create an offset, inward-growing stroke?

I’ve just started using three.js, and I’m having trouble finding resources that can help me solve this problem. I’ve been digging through The Book of Shaders, and while it’s immensely educational, I haven’t been able to wrap my mind around it enough to solve this on my own yet.

For some context, I’m building an application that visualizes data with a 3d scatterplot. Here’s a quick clip of an early iteration of the scatterplot portion of it:

I’m currently using InstancedBufferGeometry to keep draw calls to a minimum, as there could potentially be tens of thousands of nodes on display at any given time, and I’d like these nodes to be able to animate between states when the user filters or changes settings. The nodes use a RawShaderMaterial with basic vertex and fragment shaders that factor in some fog for color adjustment.
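In case it helps frame the question, here’s a rough sketch of the kind of shaders I mean. This isn’t my exact code, and the names (offset, uFogColor, uFogNear, uFogFar) are just illustrative:

// --- vertex shader ---
precision highp float;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;

attribute vec3 position; // the node geometry (ring / X shape)
attribute vec3 offset;   // per-instance position of the node
attribute vec3 color;    // per-instance color

varying vec3 vColor;
varying float vDepth;

void main() {
  vColor = color;
  vec4 mvPosition = modelViewMatrix * vec4(position + offset, 1.0);
  vDepth = -mvPosition.z; // distance along the view direction, used for fog
  gl_Position = projectionMatrix * mvPosition;
}

// --- fragment shader ---
precision highp float;

uniform vec3 uFogColor;
uniform float uFogNear;
uniform float uFogFar;

varying vec3 vColor;
varying float vDepth;

void main() {
  // Fade the node color toward the fog color with distance.
  float fogFactor = smoothstep(uFogNear, uFogFar, vDepth);
  gl_FragColor = vec4(mix(vColor, uFogColor, fogFactor), 1.0);
}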

Currently, each node is just its own colored geometry (RingBufferGeometry for the rings, and a custom ShapeBufferGeometry path to create the X’s). There’s no background to these, and in the case of the rings, no filling. It’s very difficult to discern depth, especially when zoomed in to the plot. My thought is for every node to be solid (black) and then have the node color “painted” onto it via the shader. Here’s a video example of what I’m looking to do:

As shown in the above video, I’d like to be able to pass something into the shader to determine the thickness of the inner stroke. Because I’m going to be using multiple shapes, having it work regardless of shape would be fantastic. Essentially, it’s a stroke that sits “inside” the geometry, offset a bit so there’s always a black outline, and that grows inward.

Here’s a link to a single circle codepen that is pretty much using the same structure as my full scatterplot, just cut down to only deal with one node for the ease of trying to solve this problem:

Really appreciate any assistance!


Hi!
You can pass UV coordinates from the vertex shader to the fragment shader and draw a circle using those coordinates:

I used uniforms for values of inner and outer radii, but you can use attributes for each instance to have different thickness :slight_smile:
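Roughly, the fragment shader looks something like this (the uniform names here are just for illustration, and vUv is assumed to be the geometry’s UV passed through from the vertex shader):

precision highp float;

uniform float uInnerRadius; // e.g. 0.35
uniform float uOuterRadius; // e.g. 0.45
uniform vec3 uColor;

varying vec2 vUv;

void main() {
  // Distance from the center of the UV square (UVs run 0..1, so the center is 0.5).
  float d = length(vUv - 0.5);
  // 1.0 inside the ring, 0.0 outside; smoothstep softens the edges a little.
  float ring = smoothstep(uInnerRadius - 0.01, uInnerRadius, d)
             - smoothstep(uOuterRadius, uOuterRadius + 0.01, d);
  // Paint the node color onto a black base, only where the ring is.
  gl_FragColor = vec4(uColor * ring, 1.0);
}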

Just out of curiosity, why do you use instanced buffer geometry instead of points with custom textures for shapes?


Thank you so much for helping, @prisoner849! :grin:

Is it possible to make the shader work regardless of shape, though? Right now it will only work for a circle, but because I’m going to be using several shapes to designate different statuses, having a “one shader that rules them all” would be amazing. Is that not possible? Would it require multiple shaders (one per unique shape)?

I was reading up on normals last night, and while I can’t get it to do what I want, could that be part of the answer? (following a contour, rather than always being a circle)

Here’s a video clip of how your example looks when I change the number of segments for the CircleGeometry to 3 and 4 (triangle and square, respectively):

And here’s what I’d like a single shader to do (without branching / conditional logic):

To your question about instanced buffer geometry vs points… I think I settled on Instanced Geometry because points weren’t as flexible with what you could do with them visually maybe? I don’t remember my thought process entirely at the time.

Essentially, these nodes will have several properties that can be changed by the user (color, thickness, opacity, x, y and z positions, etc.) and as the user adjusts which variable is used for each property, I want the nodes to quickly animate to their new states. You mentioned that I could be using points with custom textures, but wouldn’t the textures get blurry when zoomed in close? Not only that, but if the line thicknesses can be changed, how would that be achieved with a static texture?

One of the big selling points for me in switching from pixi.js to three.js (aside from being able to visualize data in 3D, which is awesome) was that everything scales and stays crisp so much more easily in three.js.

Really appreciate all the help! :smile:

It was an interesting task :slight_smile:
Had that idea for a long time, but only tried it just now :smile:

Here is THREE.Points() with modified THREE.PointsMaterial() shaders. Each point has its own shape index, and depending on its value we choose a shape for the point. There are 10,000 points now :slight_smile:
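The gist of the fragment shader is something like this (a simplified sketch, not the exact code from the pen; vShapeIndex is an illustrative name for the per-point attribute passed in as a varying):

precision highp float;

uniform vec3 uColor;
varying float vShapeIndex; // set per point from an attribute in the vertex shader

float circleMask(vec2 uv) {
  return 1.0 - smoothstep(0.45, 0.5, length(uv));
}

float squareMask(vec2 uv) {
  vec2 d = abs(uv);
  return 1.0 - smoothstep(0.45, 0.5, max(d.x, d.y));
}

void main() {
  vec2 uv = gl_PointCoord - 0.5; // local coordinates centered on the point
  float mask = vShapeIndex < 0.5 ? circleMask(uv) : squareMask(uv);
  if (mask < 0.5) discard;       // throw away fragments outside the shape
  gl_FragColor = vec4(uColor, 1.0);
}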

The triangle shape was the toughest one to create, so I just stole the code from shadertoy :sweat_smile:


You are awesome! Thank you so much!!! :smile:

@lunch
You’re welcome :beers:
I hope it will be useful :slight_smile:

@Mugen87 @looeee
Can you move this thread from “Questions” to “Resources”, if it’s worth being there? :slight_smile:

Well, it is a question that got answered so I think it’s in the right place.

Nice answer btw :smile_cat:

Sorry to bump this one again… I’m trying to figure out another part to this! :joy:

The results from @prisoner849 are awesome, but currently I’m having an issue with aliasing, especially on smaller nodes or nodes farther away from the camera. When zoomed up close to a point, the smoothstep() function does a fine job of keeping the edges from looking jagged, but it gets pretty rough the farther away the particles are… Like this:

Any thoughts? Antialiasing is enabled in the renderer, but I’m assuming this is because the output of the fragment shader doesn’t receive antialiasing, and only geometry edges would benefit from it?

Here’s the link to the pen that @prisoner849 created:

Ah, okay :slight_smile::beers:


Maybe this thread will be helpful:

I’ve tried it with FXAA:

Can’t say there is much visual improvement. :thinking:

Hmm… Just made this :slight_smile:


Hmmm, interesting… I saw a few links which tackle antialiasing through the shader, but they all use standard derivatives:

https://www.desultoryquest.com/blog/drawing-anti-aliased-circular-points-using-opengl-slash-webgl/

http://madebyevan.com/shaders/grid/

I had trouble trying to enable the standard derivatives extension in three.js, which led me to this:

Still no luck :thinking:

Hey @prisoner849 or anyone else coming to this thread looking for answers; I figured out the antialiasing bit! (well, mostly)

The key was enabling the standard derivatives extension in the fragment shader and then applying the fwidth() function to a few of the spots in the return statement of the getShape function.

First, to enable standard derivatives, you need to be using a RawShaderMaterial, and then at the top of the shader, you put this statement:

#extension GL_OES_standard_derivatives : enable

Secondly, here’s the getShape() function in its entirety. Take a look at the return statement to see where I used fwidth().

float getShape(float thickness, float outer, vec2 uv) {
    // Scale up the local UV coordinates.
    uv *= 2.0;
    // Angle of this fragment around the origin, shifted into the 0..TWO_PI range.
    float a = atan(uv.x,-uv.y) + PI;
    // Angular width of one edge of the n-gon (vSides comes in as a varying).
    float r = TWO_PI / vSides;
    // Signed-distance-style value for a regular polygon with vSides sides.
    float d = cos( floor( .5 + a / r ) * r - a ) * length( uv );
    // Antialiased band between `thickness` and `outer`; fwidth() scales the smoothing to roughly one pixel.
    return smoothstep(thickness - fwidth(d), thickness + fwidth(d), d) - smoothstep(outer - fwidth(d), outer + fwidth(d), d);
}

Another cool thing about this getShape function is that I don’t need separate functions (or conditional branching) to make different shapes! The same function that makes a triangle will work for any n-sided shape. In the above function, the variable vSides is just a varying that I’ve pulled into the shader from an attribute (e.g. 3.0 for a triangle, 4.0 for a square, 50.0+ for a circle, etc.).
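Here’s a rough sketch of how the pieces get wired together. The attribute name sides, the point size, and the thickness/outer values are just examples, not necessarily what’s in my pen:

// --- vertex shader ---
precision highp float;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;

attribute vec3 position;
attribute float sides;   // per point: 3.0 = triangle, 4.0 = square, 50.0+ ~ circle

varying float vSides;

void main() {
  vSides = sides;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  gl_PointSize = 30.0;   // example size; in practice this is scaled per point
}

// --- fragment shader ---
#extension GL_OES_standard_derivatives : enable
precision highp float;

#define PI 3.141592653589793
#define TWO_PI 6.283185307179586

uniform vec3 uColor;     // illustrative; per-point colors work the same way via a varying

varying float vSides;

// getShape() exactly as shown above:
float getShape(float thickness, float outer, vec2 uv) {
    uv *= 2.0;
    float a = atan(uv.x,-uv.y) + PI;
    float r = TWO_PI / vSides;
    float d = cos( floor( .5 + a / r ) * r - a ) * length( uv );
    return smoothstep(thickness - fwidth(d), thickness + fwidth(d), d) - smoothstep(outer - fwidth(d), outer + fwidth(d), d);
}

void main() {
  vec2 uv = gl_PointCoord - 0.5;        // local coordinates centered on the point
  float shape = getShape(0.5, 0.9, uv); // example inner/outer distances
  if (shape < 0.5) discard;
  gl_FragColor = vec4(uColor, 1.0);
}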

Anyways, I mentioned above that I mostly solved the anti-aliasing issue. If you look closely at the codepen below, anywhere a shape overlaps another you can see the aliasing on the outer edge of the black outline; I believe it’s because of the discard that’s being done to remove the extra pixels around the shape, and I couldn’t seem to antialias that using fwidth(). Not sure if there’s another method to get around it, but honestly I’m super happy with the results and will perhaps revisit it down the line. This is the line I’m referring to:

if (shape < 0.5) discard;

Anyways, thank you so much again, @prisoner849! I’m making great progress now :smile:

Here’s the final codepen which has some extra goodies like attributes controlling color saturation, brightness, thickness, scale, etc. They’re all randomly generated at runtime in the codepen, but it’s cool to see it all working :slight_smile:


Yeah, I had the same thought, but I was too lazy to change the shader :smile:

Now everything looks really cool :+1:

Have you considered altering opacity smoothly instead of discarding? Something like:

gl_FragColor.a = smoothstep(0.0, 1.0, shape);

(or, with a parameterized sharpness around the threshold value, something like smoothstep(0.5 - sharpness, 0.5 + sharpness, shape))


A-ha! I still work on this project from time to time and will look at this tonight :slight_smile: Thank you for this! :+1:

Note that altering opacity may turn out to be more expensive than discarding, though. The bright side is that you can avoid branching. I had a fragment shader recently that used nested conditionals to color pixels differently, and it kept my GPU at 100% load all the time. After I replaced all the hard boundaries with sharpened sigmoid functions (https://en.wikipedia.org/wiki/Sigmoid_function, https://en.wikipedia.org/wiki/Logistic_function), the load dropped to 30% or so, and it looked better.
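As a generic illustration (not my actual shader), the idea is to replace a per-fragment branch with a logistic “soft step”, so every fragment runs the same arithmetic. The names uThreshold and vValue are just placeholders:

precision highp float;

uniform float uThreshold;
varying float vValue;

// Logistic "soft step" centered on the threshold; k controls how sharp it is.
float softStep(float x, float threshold, float k) {
  return 1.0 / (1.0 + exp(-k * (x - threshold)));
}

void main() {
  vec3 coldColor = vec3(0.1, 0.2, 0.8);
  vec3 hotColor  = vec3(0.9, 0.3, 0.1);
  // Instead of: if (vValue > uThreshold) color = hotColor; else color = coldColor;
  vec3 color = mix(coldColor, hotColor, softStep(vValue, uThreshold, 40.0));
  gl_FragColor = vec4(color, 1.0);
}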


Yes! That is the only branching logic I have in the shader, and it has given me an eye twitch :slight_smile: If this solution produces a nicer look with better performance, I will be very happy!


No rest for crazy people! :smile:
Just tried that shaping function with THREE.PointsMaterial() + .onBeforeCompile (simplified version):

Transparency still isn’t enabled; it just uses alphaTest: 0.5


Nice and simple. But the edges still look like stairs. You need smoothing on the alpha channel too, over the outer contour. I tried to make it work just now, but I must admit that I don’t understand the code well enough (and don’t really have time).


Yeah. Here is a working example:

Simply setting transparent to true (with no alphaTest parameter) gives an odd-looking result, as farther points can be drawn over closer ones. To solve this, we need to sort the particles by distance from the camera. I found an implementation of this in one of the three.js examples and applied it to my points.
Added Stats to both the current and the previous codepens.
