Point Cloud with Illumination Model. How do I make it performant?

I have a data viz app that renders point clouds:

I am using PointsMaterial - which is unlit (not affected by lights) - and I use a texture to shape each point into a circular dot:

const materialConfig = {
    size: pointSize,
    vertexColors: true,
    map: new THREE.TextureLoader().load( "texture/dot.png" ),
    sizeAttenuation: true,
    alphaTest: 0.5,
    transparent: true,
    depthTest: true
};

this.material = new THREE.PointsMaterial( materialConfig )
this.material.side = THREE.DoubleSide

The geometry is a list of meshes. Each mesh - a cluster of points with a single color - has a vertex list and a color list:

this.meshList = trace
    .map(({ xyz, rgb, color, drawUsage }) => {

        const geometry = new THREE.BufferGeometry()

        const positionAttribute = new THREE.Float32BufferAttribute(xyz, 3)
        geometry.setAttribute('position', positionAttribute)

        // e.g. drawUsage = THREE.DynamicDrawUsage when the colors are updated at runtime
        const colorAttribute = new THREE.Float32BufferAttribute(rgb, 3)
        colorAttribute.setUsage(drawUsage)
        geometry.setAttribute('color', colorAttribute)

        geometry.userData.color = color

        // every cluster shares the same PointsMaterial
        const mesh = new THREE.Points( geometry, this.material )
        return mesh
    })

This is reasonably performant - I can render 500,000 points and still get decent frame rates.

Question: I now want to use an illumination model - Phong, physically-based, etc. I am unclear on how to do this. Can it be done while keeping the app performant?

Thanks.

Phong, Lambert, standard, physical … all the more advanced illumination models require normal vectors. Individual points have no normal vectors, unless you figure out how to calculate (emulate) them based on the distribution of points in the cloud.

One possibility is to cluster the points and, for each point, assume its normal vector points from the center of mass of the cluster to the point itself.
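For example, a rough (untested) sketch of that idea - the helper name is mine, and it assumes each cluster is its own BufferGeometry with a 'position' attribute already set, as in your code above:

// Emulate per-point normals: direction from the cluster's center of mass to each point.
function addCentroidNormals( geometry ) {

    const position = geometry.getAttribute( 'position' );
    const count = position.count;

    // center of mass of the cluster
    const centroid = new THREE.Vector3();
    const p = new THREE.Vector3();
    for ( let i = 0; i < count; i ++ ) {
        centroid.add( p.fromBufferAttribute( position, i ) );
    }
    centroid.divideScalar( count );

    // normal = normalize( point - centroid )
    const normals = new Float32Array( count * 3 );
    for ( let i = 0; i < count; i ++ ) {
        p.fromBufferAttribute( position, i ).sub( centroid ).normalize();
        normals[ 3 * i ] = p.x;
        normals[ 3 * i + 1 ] = p.y;
        normals[ 3 * i + 2 ] = p.z;
    }

    geometry.setAttribute( 'normal', new THREE.Float32BufferAttribute( normals, 3 ) );
}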

Another possibility is to wrap the cloud with a mesh and use its normal vectors.


Yah, I like the idea of deriving the associated normal vector using the “centroid of point cluster” approach.

Ok, assuming I go this route:

  1. how do I associate the surface normal with a point?
  2. how do I get the material to make use of it?

Most likely such a material for points will require tuning the shaders.
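As a rough illustration (not tested, just to show the idea), a custom ShaderMaterial could read the emulated 'normal' attribute and apply a simple Lambert term per point - the light-direction uniform, the point-size value and the attenuation constant below are placeholders I made up:

const litPointsMaterial = new THREE.ShaderMaterial( {

    vertexColors: true, // makes the built-in 'color' attribute available

    uniforms: {
        pointSize: { value: 8.0 },
        // light direction, assumed to be given in view space - placeholder value
        lightDirection: { value: new THREE.Vector3( 0.5, 1.0, 0.75 ).normalize() }
    },

    vertexShader: `
        uniform float pointSize;
        varying vec3 vColor;
        varying vec3 vNormal;

        void main() {
            vColor = color;
            vNormal = normalize( normalMatrix * normal ); // the emulated per-point normal
            vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
            gl_PointSize = pointSize * ( 300.0 / - mvPosition.z ); // crude size attenuation
            gl_Position = projectionMatrix * mvPosition;
        }
    `,

    fragmentShader: `
        uniform vec3 lightDirection;
        varying vec3 vColor;
        varying vec3 vNormal;

        void main() {
            // shape the point into a circular dot (replaces the dot texture)
            vec2 uv = gl_PointCoord * 2.0 - 1.0;
            if ( dot( uv, uv ) > 1.0 ) discard;

            // simple Lambert term using the emulated normal
            float diffuse = max( dot( normalize( vNormal ), lightDirection ), 0.0 );
            gl_FragColor = vec4( vColor * ( 0.3 + 0.7 * diffuse ), 1.0 );
        }
    `
} );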

Let me propose an alternative - instancing. Instead of a point, you can have a normal low-poly mesh with any material. I tried it with 500,000 points and it works fine on my 1.5-year-old laptop; 1,000,000 points is also fine.
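In essence it is something like this (a simplified sketch, not my actual demo code; the sphere geometry, the instance size and the flat xyz/rgb arrays are placeholders matching your data layout):

// One low-poly mesh per data point, drawn in a single InstancedMesh,
// so any lit material (Lambert, Standard, ...) and shadows can be used.
const count = xyz.length / 3;
const geometry = new THREE.IcosahedronGeometry( 0.5, 1 ); // low-poly stand-in for a "point"
const material = new THREE.MeshLambertMaterial();          // needs at least one light in the scene
const points = new THREE.InstancedMesh( geometry, material, count );

const dummy = new THREE.Object3D();
const color = new THREE.Color();

for ( let i = 0; i < count; i ++ ) {

    dummy.position.set( xyz[ 3 * i ], xyz[ 3 * i + 1 ], xyz[ 3 * i + 2 ] );
    dummy.updateMatrix();
    points.setMatrixAt( i, dummy.matrix );

    points.setColorAt( i, color.setRGB( rgb[ 3 * i ], rgb[ 3 * i + 1 ], rgb[ 3 * i + 2 ] ) );
}

points.instanceMatrix.needsUpdate = true;
if ( points.instanceColor ) points.instanceColor.needsUpdate = true;

scene.add( points ); // 'scene' is assumed to exist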

Click the image to run it online:

[image: screenshot of the demo]


Interesting, so instancing will keep things performant. I had no idea it was that efficient. Nice. Can you send me your sample code? I assume this would also allow fancier effects like shadowing, etc.

Thanks so much for the assist.

Here it is: https://codepen.io/boytchev/full/ExpzXEe
