I’m trying to find a more efficient way to achieve this burnt toast effect that works with any geometry, regardless of vertex count. I’m new to shaders and have invested a bit of time recently trying to work this out. The method pictured here passes edge line buffers as textures, then computes the distance from points to those lines in the fragment shader. The lines are derived from an algorithm similar to EdgesGeometry. Obviously this can be expensive and slow, especially as you zoom in. There are games that can be played with clustering edge lines by proximity to faces, at the cost of much more complex texture encoding. But even this has limits.
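For reference, the per-fragment cost of this approach boils down to one point-to-segment distance test per edge line. Here is that math sketched on the CPU in Python (a sketch only, not the actual shader; `dist_point_to_segment` and `edge_factor` are illustrative names):

```python
def dist_point_to_segment(p, a, b):
    """Distance from point p to segment a-b (works in 2D or 3D tuples)."""
    ab = [bi - ai for ai, bi in zip(a, b)]
    ap = [pi - ai for ai, pi in zip(a, p)]
    denom = sum(c * c for c in ab)
    # Degenerate segment (a == b): clamp t to 0, i.e. distance to a.
    t = 0.0 if denom == 0 else max(
        0.0, min(1.0, sum(x * y for x, y in zip(ap, ab)) / denom))
    closest = [ai + t * c for ai, c in zip(a, ab)]
    return sum((pi - ci) ** 2 for pi, ci in zip(p, closest)) ** 0.5

def edge_factor(p, edges, radius):
    """0 at an edge, 1 at >= radius away: the darkening mask.
    The fragment shader would loop over every edge the same way,
    which is why the cost grows with edge count."""
    d = min(dist_point_to_segment(p, a, b) for a, b in edges)
    return max(0.0, min(1.0, d / radius))
```

The min-over-all-edges loop is exactly what makes the naive version expensive as geometry gets denser.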
I’ve tried using color maps with pre-processed values (based on normal line distances) so that the values interpolated across a face can be read reasonably accurately as distances from edges, but I haven’t been able to work out all the difficulties with more complex geometries. Even then, it doesn’t look as nice/clean as this, where you get a proper radius around sharp points.
Advice on other approaches? Or has this been solved already and I just haven’t found it (not sure what this is called).
Yes, but you would have to bind the adjacency information in a separate vertex data channel. Or, pack everything into a DataTexture (positions and vertex adjacency information), and then you can freely sample/walk around the surface of a mesh.
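A minimal sketch of the addressing math behind that DataTexture idea (Python, CPU-side; the function names are illustrative, not a specific three.js API): vertex `i` lives at texel `(i % width, i // width)`, and the shader can recompute that texel-center UV from the index to fetch any vertex's packed data.

```python
import math

def texture_size_for(count):
    """Smallest square texture that holds `count` texels
    (one RGBA texel per vertex, say xyz + one adjacency pointer)."""
    side = math.ceil(math.sqrt(count))
    return side, side

def index_to_uv(i, width, height):
    """Texel-center UV for index i -- the same arithmetic the shader
    would do before sampling the DataTexture with texture2D."""
    x = i % width
    y = i // width
    return ((x + 0.5) / width, (y + 0.5) / height)
```

Sampling at texel centers (the `+ 0.5`) matters: with nearest filtering it guarantees you read back exactly the packed value rather than a blend of neighbors.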
Thanks. Can you point to an example of how to create/pass a separate vertex data channel (or is this just another BufferAttribute)? It’s my understanding that all buffer attributes associated with vertices are interpolated when passed to the fragment shader. I’d like to avoid that interpolation without having to triple the data just so the interpolated values come out constant across a face.
Is it correct that DataTexture sizes are limited and wouldn’t be suitable for per-vertex or per-face information?
Apologies, but I’m only 3 days into shader coding.
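On the size question above, a quick back-of-the-envelope check suggests DataTexture capacity is rarely the bottleneck for per-vertex data. This assumes a common WebGL `MAX_TEXTURE_SIZE` of 4096 (the real limit varies by GPU; query it with `gl.getParameter(gl.MAX_TEXTURE_SIZE)`):

```python
# Assumption: 4096 is a widely supported WebGL texture dimension limit.
MAX_TEXTURE_SIZE = 4096

# One RGBA float texel can hold one vec4 of per-vertex data.
texels = MAX_TEXTURE_SIZE * MAX_TEXTURE_SIZE
print(texels)  # 16777216 -- over 16M vertices' worth in a single texture
```

So a single 4096x4096 RGBA float texture covers meshes in the tens of millions of vertices; per-face data (one texel per face) fits even more easily.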