You can see that when the arcs are segmented, the normals are flat face normals and do not capture the curvature of the original shape. However, it should be possible (at least in theory) to use the original shape definition to exactly determine the normals that each vertex should have. For example, if segments are created from an arc, the normal of each vertex would point directly outward from the center of the arc. For lines, it would be perpendicular to the line.
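To make the idea concrete, here is a minimal plain-JS sketch (function names are mine, not a three.js API): for an arc-derived vertex the exact normal is the normalized direction from the arc's center, and for a line-derived vertex it is the segment direction rotated 90°.

```javascript
// Exact vertex normals derived from the original path primitives,
// not from the tessellated faces.

// Vertex that came from an arc: normal points radially outward
// from the arc's center.
function arcNormal(cx, cy, vx, vy) {
  const dx = vx - cx, dy = vy - cy;
  const len = Math.hypot(dx, dy);
  return [dx / len, dy / len];
}

// Vertex on a straight segment: normal is the segment direction
// rotated 90 degrees (flip the sign for the opposite winding).
function lineNormal(x1, y1, x2, y2) {
  const dx = x2 - x1, dy = y2 - y1;
  const len = Math.hypot(dx, dy);
  return [dy / len, -dx / len];
}

// Unit circle centered at the origin: the vertex at (0, 1) gets normal (0, 1).
console.log(arcNormal(0, 0, 0, 1)); // → [0, 1]
// Horizontal segment from (0,0) to (1,0): normal (0, -1) for this winding.
console.log(lineNormal(0, 0, 1, 0)); // → [0, -1]
```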
I am aware of geometry.computeVertexNormals(), but this will not work in my case. If two lines meet at a sharp corner, I would still want their normals to be perpendicular to the original line segments, not the average of the two face normals.
Ah, that could possibly work, thank you for the suggestion!
Still, it seems like a shame that we have all the exact geometry in the form of a Path, discard it all when the path is segmented, and then later recompute the normals in an attempt to recover the original shape of the path. If there were a way to set the normals from the beginning, it would save a lot of needless computation.
I will give toCreasedNormals a try, or perhaps give a go at my own Extrude implementation and see where that takes me.
You may be on to something here. However, it sounds like quite a nuanced use case, in that you’d still have to go to the effort of somehow distributing the calculated normals of the shape path onto the eventual segmented geometry faces… If you can see a way to make these projections, that’d be great!
If you look at the original PR of toCreasedNormals you’ll quite quickly see the viability of a more general approach to generating tight face normals for a greater subset of imported / generative geometries, e.g…
To do this, I did not project my desired normals onto the segments created by Three.js (as you say, that would be somewhat tricky), but rather I did my own segmentation into vertex/normal pairs, and generated the position and normal attributes from that.
In order to ensure that my segmentation of the wall matched up with the segmentation of the top and bottom caps, I had to pass my own segmented vertices into ShapeUtils.triangulateShape to create the top and bottom caps. Then I used mergeGeometries to package it all up nicely.
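A minimal sketch of the wall-building step described above (plain JS, names are mine, not the actual implementation): each (vertex, normal) pair from the segmented path becomes a bottom and a top vertex, and each quad between consecutive pairs becomes two triangles, carrying the exact path normal rather than a face-averaged one.

```javascript
// pairs: closed loop of { x, y, nx, ny } — position plus the exact
// 2D normal computed from the original path primitive.
// depth: extrusion depth along z.
function buildWall(pairs, depth) {
  const positions = [];
  const normals = [];
  for (let i = 0; i < pairs.length; i++) {
    const a = pairs[i];
    const b = pairs[(i + 1) % pairs.length]; // wrap around: closed loop
    // Two triangles per quad: (a0, b0, b1) and (a0, b1, a1),
    // where 0 = bottom (z = 0) and 1 = top (z = depth).
    const quad = [
      [a, 0], [b, 0], [b, 1],
      [a, 0], [b, 1], [a, 1],
    ];
    for (const [p, top] of quad) {
      positions.push(p.x, p.y, top ? depth : 0);
      // Wall is vertical, so the normal's z component is 0.
      normals.push(p.nx, p.ny, 0);
    }
  }
  return { positions, normals };
}
```

In three.js these flat arrays would then be wrapped as `new THREE.Float32BufferAttribute(positions, 3)` (and likewise for the normals) and set on a BufferGeometry, with the caps from ShapeUtils.triangulateShape merged in via mergeGeometries as described above.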
If Three.js’s shape utilities could return the normal vector of each segment from the call to shape.extractPoints( curveSegments ) in the ExtrudeGeometry constructor, perhaps it would be possible to do this. But whether that’s going to improve performance significantly for enough people to justify the work, I couldn’t say.
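For what it’s worth, a sampler like that seems feasible in principle: three.js curves already expose getPoint(t) and getTangent(t), and in 2D the normal is just the unit tangent rotated 90°. A hedged sketch (sampleWithNormals is my own name; `curve` is anything with those two methods):

```javascript
// Sample a 2D curve into { x, y, nx, ny } entries, deriving each
// normal from the curve's tangent instead of from the faces later.
function sampleWithNormals(curve, divisions) {
  const samples = [];
  for (let i = 0; i <= divisions; i++) {
    const t = i / divisions;
    const p = curve.getPoint(t);
    const tan = curve.getTangent(t);
    // Rotate the unit tangent by -90 degrees; for a counter-clockwise
    // path this points outward (flip the sign for the other winding).
    samples.push({ x: p.x, y: p.y, nx: tan.y, ny: -tan.x });
  }
  return samples;
}

// Toy stand-in for a curve: a counter-clockwise unit circle.
const TAU = 2 * Math.PI;
const circle = {
  getPoint: (t) => ({ x: Math.cos(TAU * t), y: Math.sin(TAU * t) }),
  getTangent: (t) => ({ x: -Math.sin(TAU * t), y: Math.cos(TAU * t) }),
};
// At t = 0 the sample sits at (1, 0) with outward normal (1, 0).
console.log(sampleWithNormals(circle, 4)[0]);
```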
I’d be happy to share my implementation if that would help clear up any details.
It doesn’t. Nor does it do the custom extrusion path.
How is each step of the bevel calculated? Does it use a polygon offset algorithm such as clipper, or something else? If it is something like clipper, then it’s going to be hard to transfer the normals to each step. To say nothing of calculating them for a custom extrusion path…
So probably this isn’t going to be generally useful for any applications other than my own, and that’s fine.
What I would try is: break up the individual curves (arcs, lines, beziers, whatever) and extrude each one. Calculate the normals, smoothing every segment. A line won’t have anything to smooth; an arc will. Those are your walls.
Then combine them all into a closed shape and make your floor and ceiling.
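The per-curve smoothing step above could look something like this plain-JS sketch (my own naming, not a three.js API): within one curve’s segment strip, each vertex normal is the average of the two adjacent face normals, and because each curve is smoothed separately, corners between two curves keep their hard edge.

```javascript
// points: open polyline sampled from ONE curve, as { x, y } entries.
// Returns one [nx, ny] unit normal per point.
function smoothedNormals(points) {
  // Per-segment (face) normals: segment direction rotated 90 degrees.
  const face = [];
  for (let i = 0; i < points.length - 1; i++) {
    const dx = points[i + 1].x - points[i].x;
    const dy = points[i + 1].y - points[i].y;
    const len = Math.hypot(dx, dy);
    face.push([dy / len, -dx / len]);
  }
  // Per-vertex normals: average the two adjacent face normals,
  // falling back to the single face normal at the strip's ends.
  return points.map((_, i) => {
    const a = face[i - 1] ?? face[i];
    const b = face[i] ?? face[i - 1];
    const nx = a[0] + b[0], ny = a[1] + b[1];
    const len = Math.hypot(nx, ny);
    return [nx / len, ny / len];
  });
}
```

A straight strip comes back with identical normals (nothing to smooth, as noted above), while the samples of an arc get normals that converge toward the true radial directions as the segment count grows.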