LatheBufferGeometry face normals, seam issues

Hi,
I’m venturing back into familiar-ish territory, using LatheBufferGeometry.
I’ve got a rectangle (loaded from an SVG) as the source object (shape) for a lathe geometry operation.

My results are as follows (top images are the top view, bottom images the underside):

Here’s a different texture and a different view:

As you can see, I’ve got a nasty seam going down the radius, and the faces on the outer and inner diameters are inverted.

Here is the code that I’m using…

objInput = scene.getObjectByName(objInput);

            const points = [];

            let positions = objInput.geometry.attributes.position.array;

            let ptCount = positions.length / 3;

            for (let i = 0; i < ptCount; i++)
            {
                let px = positions[i * 3];
                let pxN = px * -1;

                let py = positions[i * 3 + 1];
                let pyN = py * -1;

                points.push( new THREE.Vector2( pxN, pyN ) );
            }


            const geometry = new v3d.LatheBufferGeometry(points,sidesInput);


            // ATTEMPT TO SMOOTH SURFACE (NOT WORKING)
            geometry.computeVertexNormals(true);

            geometry.computeFaceNormals(true);

            geometry.computeBoundingBox(true);



            const geoMergedVerts = THREE.BufferGeometryUtils.mergeVertices( geometry, 1 );


            const material = objInput.material;

            material.flatShading = false;

            material.smoothShading = true;

            material.depthWrite = true;

            material.name =  nameInput;


            lathe = new v3d.Mesh(geoMergedVerts, material);

            scene.add(lathe);  

I’m thinking this has something to do with point order (possibly from the loaded SVG?).

Any help in getting these little issues resolved would be fantastic!
Thanks!

EDIT: I’ve fixed the faces’ direction issue; however, no matter what shape I use, ONE side of the resulting object’s geometry never gets created. This seems to be the biggest issue.

I see several compounding issues here:

  • For a closed rectangular path, you need five vertices, not four. The last vertex needs to replicate the first, to get the appearance of a closed shape. It looks like you are using only four vertices, so the top “lid” is missing and you’re looking “inside” the mesh.
  • The code of LatheGeometry() already includes a call to computeVertexNormals(), so there’s no point in calling that again.
  • In a procedural geometry like LatheGeometry(), we have shared vertices, which are used by all neighbouring faces. The way computeVertexNormals() works is by iterating over all faces, computing the face normal for each face, adding that face’s normal to whatever the normal at a particular shared vertex was before, then averaging (normalising) the result. This yields correct results when a vertex is shared by only two faces, but produces unintended (i.e. wrong) results when a vertex is shared by more than two faces. (I’m currently working on a proposal for improvement.)
  • Vertex normals generally lead to smoothing along edges. If you want “hard” edges along the lathing circumference, I have achieved good results by duplicating the vertex where I want a hard edge (see the sketch after the code example below). That way you can have both smooth transitions and sharp edges in the same geometry.
  • The sequence of vertices (CW, CCW) along a path is significant, as it affects the orientation of normals (pointing inward or outward) and also the mapping coordinates.

Try it:

		let ri = 5;		// inner radius
		let ra = 20;	// outer radius
		let h = 5;		// height

/*
// CCW

    points.push( new THREE.Vector2( ra, h ) );
    points.push( new THREE.Vector2( ri, h ) );
    points.push( new THREE.Vector2( ri, 0 ) );
    points.push( new THREE.Vector2( ra, 0 ) );
    points.push( new THREE.Vector2( ra, h ) );

*/
// CW

    points.push( new THREE.Vector2( ra, h ) );
    points.push( new THREE.Vector2( ra, 0 ) );
    points.push( new THREE.Vector2( ri, 0 ) );
    points.push( new THREE.Vector2( ri, h ) );
    points.push( new THREE.Vector2( ra, h ) );
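
To illustrate the hard-edge bullet above: a minimal sketch, assuming the same ri / ra / h values as above and a three.js revision where LatheBufferGeometry still derives its normals via computeVertexNormals(), as discussed. Every interior corner of the washer profile is listed twice; the duplicated rows form zero-area quads that contribute nothing to the normal averaging, so each side of a corner keeps its own face normal and you get a sharp crease:

    // washer profile with every corner duplicated -> hard edges at all corners
    const points = [];

    points.push( new THREE.Vector2( ra, h ) );    // top outer corner (start of the closed path)
    points.push( new THREE.Vector2( ra, 0 ) );    // bottom outer corner
    points.push( new THREE.Vector2( ra, 0 ) );    // duplicate -> hard edge
    points.push( new THREE.Vector2( ri, 0 ) );    // bottom inner corner
    points.push( new THREE.Vector2( ri, 0 ) );    // duplicate -> hard edge
    points.push( new THREE.Vector2( ri, h ) );    // top inner corner
    points.push( new THREE.Vector2( ri, h ) );    // duplicate -> hard edge
    points.push( new THREE.Vector2( ra, h ) );    // back to the start (closes the path)

    const geometry = new THREE.LatheBufferGeometry( points, 12 );  // low segment count, see the note below

Note that a later mergeVertices() pass may merge the duplicates (and with them the hard edges) away, depending on the tolerance.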

On a general note, issues like yours are easier to analyse if you keep the number of vertices (along a path) and segments for a Lathe low, as high vertex counts and segment counts tend to smooth any errors away.

Also, I recommend using the VertexNormalsHelper() function in debugging such issues:

     // import { VertexNormalsHelper } from 'three/examples/jsm/helpers/VertexNormalsHelper.js';

     if ( helper ) {
         helper.geometry.dispose();
         scene.remove( helper );
     }

     helper = new VertexNormalsHelper( yourMesh, 2, 0xff0000, 1 );
     scene.add( helper );

For r125 and later it’s of no use, as that is a method of the old Geometry class.

After creating the LatheGeometry with the full circle, I would try to compute the average of the normals at the corresponding start and end points, and apply the result to both.
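
For what it’s worth, here is a minimal sketch of that averaging idea on an indexed BufferGeometry. It assumes the vertex layout LatheGeometry uses (one copy of the profile per sweep step, so vertex index = i * points.length + j); the function name is just for illustration:

    // Average the normals of the first and last sweep ring of a full-circle lathe
    // and write the result back to both rings.
    function averageSeamNormals( geometry, pointCount, segments ) {

        const normals = geometry.attributes.normal;
        const n1 = new THREE.Vector3();
        const n2 = new THREE.Vector3();

        for ( let j = 0; j < pointCount; j ++ ) {

            const a = j;                         // vertex on the first ring (i = 0)
            const b = segments * pointCount + j; // corresponding vertex on the last ring

            n1.fromBufferAttribute( normals, a );
            n2.fromBufferAttribute( normals, b );

            const n = n1.add( n2 ).normalize();

            normals.setXYZ( a, n.x, n.y, n.z );
            normals.setXYZ( b, n.x, n.y, n.z );

        }

        normals.needsUpdate = true;

    }

    // e.g. averageSeamNormals( geometry, points.length, sidesInput );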

1 Like

The seam handling you describe is already built into the regular LatheGeometry() code, following line 103:

		// if the geometry is closed, we need to average the normals along the seam.
		// because the corresponding vertices are identical (but still have different UVs).
        ...
2 Likes

There is no such seam in this solution: Clipping Planes Problem - Radial reveal only half way - #16 by prisoner849
It’s done with shaders, but it’s doable on the JS side as well.

1 Like

Sounds like an “off by one” error. Have you considered the 1st bullet point of my previous answer?

BTW: updating an original post after it has already received answers is likely to go unnoticed.

2 Likes

Hey! This post is an extension of the project you helped me with back in that post you just linked!
I’m using your shader code alongside this lathe operation to produce both the interactive animation of the object revealing radially AND the finished, seamless (watertight) object.
Can I actually use your shader (upon a button click at completion of the 360° reveal) to produce a solid object without a seam, removing the end caps from the interior?

While I still want to solve this long-standing problem of ‘lathed geometry from a closed shape (not a curve)’, if your shader can provide the above, then it’d be all the more efficient.

-Thanks for all the suggestions guys!
Will be experimenting with them.

That’s the first thing I’m trying.

Something like this was in comments in my code (it seems I had attempted this solution the last time I was working with this code),

i.e. recording the first entry in the points and then replicating it…

             if (i == 0)   {
                 firstEntryX = pxN;
                 firstEntryY = pyN;
             }

then later…

 points.push( new THREE.Vector2( firstEntryX, firstEntryY ) );

This code fixes the missing-side problem…

objInput = app.scene.getObjectByName(objInput);

            const points = [];

            let firstEntryY;
            let firstEntryX;
            
            let positions = objInput.geometry.attributes.position.array;

            let ptCount = positions.length / 3;

            for (let i = 0; i < ptCount ; i++)
            {
                let px = positions[i * 3];
                let pxN = px * -1;
                let py = positions[i * 3 + 1];
                let pyN = py * -1;

                points.push( new THREE.Vector2( pxN,pyN ) );

                if (i == 0)   
                {
                    firstEntryX = pxN;
                    firstEntryY = pyN;
                }
            }

            points.push( new THREE.Vector2( firstEntryX,firstEntryY ) );


            const geometry = new v3d.LatheBufferGeometry(points,sidesInput);

            geometry.computeBoundingBox(true); // is this necessary, guys?

            const geoMergedVerts = THREE.BufferGeometryUtils.mergeVertices( geometry, 1 );

            const material = objInput.material;

            material.flatShading = false;

            material.smoothShading = true;

            material.depthWrite = true;

            material.side = THREE.DoubleSide;

            lathe = new v3d.Mesh(geoMergedVerts, material);

            scene.add(lathe);

I am still interested in removing the seam that still occurs on the material. Obviously the points have not merged.

Possibly I should also record the firstEntry AND the lastEntry, then somehow average and combine them…

i.e.:

if (i == 0)
{
    firstEntryX = pxN;
    firstEntryY = pyN;
}

if (i == ptCount - 1)   // the last index inside the loop is ptCount - 1, not ptCount
{
    lastEntryX = pxN;
    lastEntryY = pyN;
}

Of course I wouldn’t RE-add the last entry to the points array, but how would I go about taking those two recorded values and averaging / merging them? This seems to be the final step.

I’m wondering if

const geoMergedVerts = THREE.BufferGeometryUtils.mergeVertices( geometry, 1 );

is still relevant after doing so?

Be careful not to mix up the closed path (in the points[] array) with the full circle ( 0 to 2 * PI ) lathing “sweep”. The closed path is where you replicated the first point as the last.

The visible seam is happening in the lathing sweep. The averaging of normals is already built in the LatheGeometry() code itself and usually prevents any visible seams. If you want to see how the authors of that piece of code did the averaging: here’s the link for you to study.

On a related note, a “path” as defined in the points[] array is a 2D line structure, which may be closed but doesn’t have to be. I’m a little suspicious of the filled rectangle which keeps appearing in the screenshots you provided.

Does the seam also occur, when you fill the points[] array for testing purposes manually, instead of deriving it from a shape?

let ri = 5;		// inner radius
let ra = 20;	// outer radius
let h = 5;		// height

// CCW

points.push( new THREE.Vector2( ra, h ) );    //  moveto
points.push( new THREE.Vector2( ri, h ) );    //  lineto
points.push( new THREE.Vector2( ri, 0 ) );    //  lineto
points.push( new THREE.Vector2( ra, 0 ) );    //  lineto
points.push( new THREE.Vector2( ra, h ) );    //  lineto (back to start)
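
As for the mergeVertices() call you were wondering about: the second argument is a tolerance in world units and defaults to a tiny value (1e-4), so passing 1 merges any vertices that land within one unit of each other, which can easily collapse parts of the geometry you wanted to keep. As far as I can tell, it also compares all vertex attributes (position, normal, uv), so the seam vertices, which share a position but carry different UVs, won’t be merged at a small tolerance anyway. A minimal sketch of the more usual call (the tolerance is just an example value):

// merge only (nearly) coincident vertices; the default tolerance is 1e-4
const geoMergedVerts = THREE.BufferGeometryUtils.mergeVertices( geometry, 1e-4 );

lathe = new v3d.Mesh( geoMergedVerts, material );
scene.add( lathe );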

I used a set of points that defines a simple rectangle, and got that “seam” in the result: Edit fiddle - JSFiddle - Code Playground

I am not sure if there is a connection.

In some of my geometric constructions, I have had to recalculate the normals at the seam points when the parts meet, because they were different due to the edge position.
I took the average.

:thinking:

1 Like

Made an example with the same approach as the bending of ExtrudeGeometry, but on the JS side instead: Edit fiddle - JSFiddle - Code Playground

1 Like

prisoner849, Nice!
I’m going to try that.
Side Question:
Any offhand knowledge of why I’m getting errors when supplying a pre-made MeshNodeMaterial for the sides material?
Other material types work fine.

I believe it’s this (obvious-ish) issue…
I’m creating the extruded mesh with two materials…
const extrudedMesh = new THREE.Mesh(geometry, [materialLids, materialSide]);
while the shader only uses one material??
function patchMaterial(material, uniforms){

Actually, I’d strike through that last paragraph, as I see I have…

patchMaterial(materialSide, uniforms);
patchMaterial(materialLids, uniforms);

This is from your example from earlier this year. It’s (happily) using dat.gui, which I have, to animate the radial extrusion.

Hofk,
That’s an additional portion of code I’d been dealing with, for when the profile of the lathe shape is something more complex, like a crescent, say, and the sides show a faceting effect.
I had some code that member manthrax on GitHub had kindly supplied, which smoothed the surface, but it was for Geometry, not BufferGeometry, and I haven’t been able to successfully update it since the new version of three.js said goodbye to Geometry.

I’ll add that to this when back at my desktop…
EDIT:
Here it is…

/// HERE'S WHERE UPDATES NEED TO START I BELIEVE..
// THREE.BufferGeometry.prototype.computeAngleVertexNormals = function(angle){
THREE.Geometry.prototype.computeAngleVertexNormals = function(angle){
function weightedNormal( normals, vector ) {

        var normal = new THREE.Vector3();
        for ( var i = 0, l = normals.length; i < l; i ++ ) {
            if ( normals[ i ].angleTo( vector ) < angle ) {
                normal.add( normals[ i ] );
            }
        }
        return normal.normalize();
    }

    this.computeFaceNormals();
    var vertexNormals = [];

    for ( var i = 0, l = this.vertices.length; i < l; i ++ ) {
        vertexNormals[ i ] = [];
    }

    for ( var i = 0, fl = this.faces.length; i < fl; i ++ ) {
        var face = this.faces[ i ];

        vertexNormals[ face.a ].push( face.normal );
        vertexNormals[ face.b ].push( face.normal );
        vertexNormals[ face.c ].push( face.normal );
    }

    for ( var i = 0, fl = this.faces.length; i < fl; i ++ ) {
        var face = this.faces[ i ];

        face.vertexNormals[ 0 ] = weightedNormal( vertexNormals[ face.a ], face.normal );
        face.vertexNormals[ 1 ] = weightedNormal( vertexNormals[ face.b ], face.normal );
        face.vertexNormals[ 2 ] = weightedNormal( vertexNormals[ face.c ], face.normal );
    }

    if ( this.faces.length > 0 ) {
        this.normalsNeedUpdate = true;

    }
}
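
For reference, here is the rough direction I imagine a BufferGeometry port could take (an untested sketch; the function name is my own, and degenerate triangles aren’t handled). It groups triangle corners by rounded position, since a non-indexed BufferGeometry has no shared vertices, and then applies the same angle test per corner. Corrections welcome:

// Angle-threshold vertex normals for BufferGeometry: triangle corners that share
// a position are smoothed together only if their face normals differ by less
// than `angle` (in radians).
function computeAngleVertexNormalsBuffer( geometry, angle ) {

    const geo = geometry.index !== null ? geometry.toNonIndexed() : geometry;
    const pos = geo.attributes.position;
    const faceCount = pos.count / 3;

    const pA = new THREE.Vector3(), pB = new THREE.Vector3(), pC = new THREE.Vector3();
    const cb = new THREE.Vector3(), ab = new THREE.Vector3();

    // 1. one normal per triangle
    const faceNormals = [];
    for ( let f = 0; f < faceCount; f ++ ) {
        pA.fromBufferAttribute( pos, f * 3 );
        pB.fromBufferAttribute( pos, f * 3 + 1 );
        pC.fromBufferAttribute( pos, f * 3 + 2 );
        cb.subVectors( pC, pB );
        ab.subVectors( pA, pB );
        faceNormals.push( cb.cross( ab ).normalize().clone() );
    }

    // 2. collect the face normals meeting at each (rounded) position
    const keyOf = ( i ) => pos.getX( i ).toFixed( 4 ) + ',' + pos.getY( i ).toFixed( 4 ) + ',' + pos.getZ( i ).toFixed( 4 );
    const normalsAtPosition = {};
    for ( let i = 0; i < pos.count; i ++ ) {
        const key = keyOf( i );
        ( normalsAtPosition[ key ] = normalsAtPosition[ key ] || [] ).push( faceNormals[ Math.floor( i / 3 ) ] );
    }

    // 3. per corner, sum only the neighbouring face normals within `angle` of its own
    const normals = new Float32Array( pos.count * 3 );
    const n = new THREE.Vector3();
    for ( let i = 0; i < pos.count; i ++ ) {
        const own = faceNormals[ Math.floor( i / 3 ) ];
        n.set( 0, 0, 0 );
        for ( const candidate of normalsAtPosition[ keyOf( i ) ] ) {
            if ( candidate.angleTo( own ) < angle ) n.add( candidate );
        }
        n.normalize().toArray( normals, i * 3 );
    }

    geo.setAttribute( 'normal', new THREE.BufferAttribute( normals, 3 ) );
    return geo;

}

Usage would be something like mesh.geometry = computeAngleVertexNormalsBuffer( mesh.geometry, THREE.MathUtils.degToRad( 30 ) ); the toFixed() rounding in keyOf() stands in for the vertex sharing that the old Geometry class gave for free.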

I changed materials to MeshNormalMaterial and got that result with no errors :thinking:

1 Like

Terribly sorry, I meant MeshNodeMaterial. I created it in Blender; it’s essentially an outline material using edge emission, which is a property of a GLTF object I import just to hold the material to input into your shader.

Trying to get the same-ish type of effect as shared here in Omar’s outline pass…
Link - How to render full outlines as a post process - tutorial - #13 by Omar_Shehata