I’d like to take something like the blobs/metaballs in this demo:
and intersect them with regular meshes. Never done that before.
I’m guessing that if I use gl_FragDepth, I can specify the depth of each fragment, so the raymarched pixels end up correctly in front of or behind regular meshes.
If so, is there a demo of this?
EDIT: similar question: How I can combine a shader based rendering and three.js scene graph
You can do this. I’ve done it before… and IIRC yes, it involves using gl_FragDepth if you want correct depth values. I don’t have any samples I can point you to, though. I think when I did it, I used a cube and did the raymarching in the cube’s shader.
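I don’t have that code anymore, but here’s a minimal sketch of the idea (all of the names like `uProjectionMatrix` and `sceneSDF` are made up, not from a real demo): render a box, raymarch your SDF in its fragment shader, discard on a miss, and on a hit project the hit point and write its depth to gl_FragDepth so the blob surface (not the box) is what depth-tests against regular meshes. This assumes a recent three.js on WebGL2 and the default depth setup (no logarithmic depth buffer):

```js
import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera( 50, innerWidth / innerHeight, 0.1, 100 );
camera.position.set( 0, 0, 4 );

// Box that encloses the blob volume; the raymarching happens in its fragment shader.
const material = new THREE.ShaderMaterial( {
	uniforms: {
		// passed by reference, so it stays in sync with the camera
		uProjectionMatrix: { value: camera.projectionMatrix }
	},
	vertexShader: /* glsl */`
		varying vec3 vWorldPosition;
		void main() {
			vWorldPosition = ( modelMatrix * vec4( position, 1.0 ) ).xyz;
			gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
		}
	`,
	fragmentShader: /* glsl */`
		uniform mat4 uProjectionMatrix;
		varying vec3 vWorldPosition;

		// Placeholder SDF (a single sphere) -- swap in your metaball field here.
		float sceneSDF( vec3 p ) {
			return length( p ) - 0.4;
		}

		void main() {
			// cameraPosition and viewMatrix are uniforms three.js provides in fragment shaders
			vec3 rayDir = normalize( vWorldPosition - cameraPosition );
			vec3 p = vWorldPosition; // start marching at the box surface
			bool hit = false;

			for ( int i = 0; i < 128; i ++ ) {
				float d = sceneSDF( p );
				if ( d < 0.001 ) { hit = true; break; }
				p += rayDir * d;
				if ( distance( p, vWorldPosition ) > 4.0 ) break; // left the volume
			}

			if ( ! hit ) discard;

			// Project the world-space hit point and write its depth, so the blob surface
			// is what the depth buffer sees when sorting against regular meshes.
			vec4 clipPos = uProjectionMatrix * viewMatrix * vec4( p, 1.0 );
			gl_FragDepth = ( clipPos.z / clipPos.w ) * 0.5 + 0.5; // NDC [-1,1] -> depth [0,1]

			gl_FragColor = vec4( 0.2, 0.6, 1.0, 1.0 );
		}
	`
} );

scene.add( new THREE.Mesh( new THREE.BoxGeometry( 2, 2, 2 ), material ) );
```

With the depth write in place, you can add ordinary meshes to the same scene and they’ll correctly cut into or occlude the blobs; without it, the depth buffer only ever sees the box faces.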
Here’s a project that uses SDFs for modelling and is quite nice:
Another approach is to extract the SDF into a BufferGeometry via marching cubes. I haven’t messed with this much but it looks promising?
https://threejs.org/examples/?q=SDF#webgl_geometry_sdf
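That example goes through the SDFGeometryGenerator addon. Roughly, from memory (so treat the exact signatures and the `dist()` naming as assumptions and double-check the example source), you hand it a GLSL distance function and it gives you back a plain BufferGeometry, which then intersects and sorts against the rest of the scene with no gl_FragDepth tricks at all:

```js
import * as THREE from 'three';
import { SDFGeometryGenerator } from 'three/addons/geometries/SDFGeometryGenerator.js';

const renderer = new THREE.WebGLRenderer();

// GLSL snippet defining the distance field -- a single sphere as a placeholder,
// swap in your metaball field. The expected function name/signature follows the
// linked example and may differ between three.js versions.
const sdf = /* glsl */`
	float dist( vec3 position ) {
		return length( position ) - 0.5;
	}
`;

const resolution = 64; // voxel grid resolution; higher = smoother but slower
const bounds = 1;      // half-extent of the sampled volume (assumption)

const generator = new SDFGeometryGenerator( renderer );
const geometry = generator.generate( resolution, sdf, bounds ); // BufferGeometry
geometry.computeVertexNormals();

// From here on it's just a regular mesh.
const mesh = new THREE.Mesh( geometry, new THREE.MeshNormalMaterial() );
```

The trade-off versus the raymarched cube is that you only get the surface at the chosen grid resolution, and you’d have to regenerate the geometry whenever the field animates.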