I have some particles and I would like to change their shape when they get too far from the camera. For example, when you are fully zoomed out, the particles would be solid small dots, but as you zoom in, the few planes close to the camera (viewer) would take a square or maybe triangular shape, and all the rest would still be solid small dots.
How can this be done, please?
I have this plunker, but it has the typical behavior. I would like to introduce this “zoom-dependent shape” functionality.
Hi!
Not sure if this is what you’re looking for; this is the idea I got from reading your question.
Shapes (and color) change depending on the depth:
That is really fantastic. You got the idea 100% correct. It is way above my level; I need to understand it and then adapt/change it for what I want to develop. Can you please elaborate a bit on your code? Just the basic idea. I didn’t know what a clipping_plane was until this moment. If I understand well, you define a plane, and then you tell WebGL to render what is above the plane in a different manner from the space beneath it.
In any case, many thanks for this. Brilliant stuff!
Shaders are composed of shader chunks.
#include <clipping_planes_fragment>
will internally be substituted with the code from here: three.js/clipping_planes_fragment.glsl.js at b968f4817cefe2266f338012d8e1b4861d0fa0f0 · mrdoob/three.js · GitHub
And this chunk plays no role in my scenario.
I used that string just to find the place where I need to inject my functionality, keeping the rest of the shader intact.
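To make that concrete, here is a minimal sketch of the injection pattern (not the exact code from the demo), assuming a plain THREE.PointsMaterial: the include string only serves as an anchor for a string replace inside onBeforeCompile, and the chunk itself still ends up in the compiled shader.

```js
// Minimal sketch: use the include string as an anchor to inject extra GLSL,
// leaving the original chunk (and the rest of the shader) untouched.
const material = new THREE.PointsMaterial( { size: 0.3, color: 0xff8844 } );

material.onBeforeCompile = ( shader ) => {
  shader.fragmentShader = shader.fragmentShader.replace(
    '#include <clipping_planes_fragment>',
    `#include <clipping_planes_fragment>
    // ...custom fragment code gets injected here...`
  );
};
```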
Long story short.
Two things happen.
In the vertex shader, I compute distRatio for morphing the shapes, based on depth, which comes from mvPosition (the position in camera space).
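Roughly, that vertex-stage part could look like this (a sketch, not the demo’s code: the varying name vDistRatio and the 2.0–10.0 distance range are placeholders I made up):

```js
// Inside material.onBeforeCompile( shader ):
// declare a varying, then fill it after <clipping_planes_vertex>, where the
// built-in points vertex shader has already computed mvPosition (camera space).
shader.vertexShader = 'varying float vDistRatio;\n' +
  shader.vertexShader.replace(
    '#include <clipping_planes_vertex>',
    `#include <clipping_planes_vertex>
    // 0.0 close to the camera, 1.0 from ~10 units on; the range is arbitrary.
    vDistRatio = smoothstep( 2.0, 10.0, -mvPosition.z );`
  );
```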
In the fragment shader, I morph the shapes with the mix function, using distRatio.
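And a sketch of the fragment-stage morph, again with made-up shape masks: a small round dot blended toward a full square with mix(), driven by the interpolated ratio.

```js
// Inside the same onBeforeCompile( shader ):
// gl_PointCoord runs 0..1 across the point sprite; build a small dot mask and
// a square mask, blend them with mix(), and discard outside the blended shape.
shader.fragmentShader = 'varying float vDistRatio;\n' +
  shader.fragmentShader.replace(
    '#include <clipping_planes_fragment>',
    `#include <clipping_planes_fragment>
    vec2 p = gl_PointCoord * 2.0 - 1.0;                                   // -1..1
    float dotMask    = 1.0 - smoothstep( 0.2, 0.25, length( p ) );        // small round dot
    float squareMask = 1.0 - step( 0.95, max( abs( p.x ), abs( p.y ) ) ); // near-full square
    float shape = mix( squareMask, dotMask, vDistRatio );                 // near -> square, far -> dot
    if ( shape < 0.5 ) discard;`
  );
```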
No postprocessing.