Using textured planes as particles in a point cloud

Hi There.

I’m trying to place a plane, which I can texture, at every location of a point cloud.
I have a PLY file with 22k vertices (no normals, faces, UVs, etc. …very clean). I would like to add a 2D plane pointing towards the camera at every location, then apply a randomised texture from an array of 3 textures and rotate it around the y axis.
All of this is fairly simple, but I was hoping to find a more efficient way of doing it, as creating 22k separate animated textured planes in a loop is horrendous for performance.
I know I can use THREE.Points() with a sprite image, but this won’t allow me to do rotation or multiple different sprites. Is there any way to extend THREE.Points to use a THREE.PlaneGeometry instance instead of a single image?
Thanks very much

Hi!

I didn’t quite get it: what rotation do you mean?
And “multiple different sprites” means a texture atlas. For both points and instanced planes, you’ll need to patch the material so it can use the atlas.
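To make the atlas idea concrete, here is the UV arithmetic involved, as a plain-JS sketch (the function name is my own, not part of three.js). In the patched shader the same math runs per fragment on gl_PointCoord, with the tile index coming from a custom per-point attribute:

```javascript
// Sketch: map a point-local coordinate (gl_PointCoord in GLSL) into the
// sub-rectangle of tile `index` inside a `cols` x `rows` texture atlas.
// Hypothetical helper for illustration only.
function tileUV(pointCoord, index, cols, rows) {
  const col = index % cols;             // tile column in the atlas
  const row = Math.floor(index / cols); // tile row in the atlas
  return {
    u: (col + pointCoord.u) / cols,
    v: (row + pointCoord.v) / rows
  };
}
```

So for a 2x2 atlas, tile 3 with a local coordinate of (0.5, 0.5) samples the centre of the bottom-right quadrant, (0.75, 0.75).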

Hi thanks for the response,
Sorry I wasn’t very clear.

1. So let’s say I have a PLY file with just vertices and no faces or anything else like this:

ply
format ascii 1.0
element vertex 10
property float x
property float y
property float z
property list uchar int vertex_indices
end_header
-1.096243 -34.75982 -9.593866 
-1.026878 -34.69783 -9.696553 
-1.026878 -34.64236 -9.569579 
-3.354581 -32.12786 5.258512 
-3.272709 -32.05115 5.177201 
-3.272709 -32.03098 5.314286 
-1.264096 -34.68545 -15.23503 
-1.167952 -34.59552 -15.27826 
-1.167952 -34.63902 -15.14671 
-2.080832 -36.91751 -12.30752 

(The real file is many thousands of vertices)

2. and then let’s say I want to make a point cloud out of it using Three.Points like so:

// imports & definitions omitted
const modelLoader = new PLYLoader();
modelLoader.load(myPLYFile, function (plyGeom) {
    const modelGeometry = plyGeom;
    const modelMaterial = new THREE.PointsMaterial({ color: 0xFFFFFF, size: 0.5 });
    const model = new THREE.Points(modelGeometry, modelMaterial);
    scene.add(model);
}, undefined, function (error) {
    // oh no!
});

So far so good.
However, instead of the standard square that THREE.PointsMaterial gives me, I would like each point to show any one of an array of images, and I would also like the z rotation to be addressable/controllable.

I know that I can use an image instead of the square, and even set an alpha map (sorry, in my initial post I called this a ‘sprite’, which is legacy terminology from other software/libraries), like so:


const particleSprite = new THREE.TextureLoader().load('particlesprite.png');
const particleAlpha = new THREE.TextureLoader().load('particlealpha.png');
const pointSpriteMaterial = new THREE.PointsMaterial({
    size: 0.5,
    map: particleSprite,
    alphaMap: particleAlpha,
    transparent: true // alphaMap has no visible effect without transparency (or an alphaTest)
});

The problem:

I need much more control over the points/particles than this. For instance, I would like to programmatically set the image/sprite/texture and rotate the particles, as mentioned before (and maybe set other geometric properties too).
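To sketch what I mean by “programmatically set the image”: I imagine giving each of the N points its own sprite index via a plain typed array that could back a custom buffer attribute (the attribute name aTile below is just an illustrative name I made up):

```javascript
// Sketch: one random sprite index per point (3 textures here), as a
// typed array suitable for backing a custom buffer attribute.
function randomTileIndices(count, tiles) {
  const indices = new Float32Array(count);
  for (let i = 0; i < count; i++) {
    indices[i] = Math.floor(Math.random() * tiles); // 0 .. tiles-1
  }
  return indices;
}

// e.g. geometry.setAttribute('aTile',
//   new THREE.BufferAttribute(randomTileIndices(22000, 3), 1));
```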

I only ever need the particles to appear in 2D and pointed at the camera, exactly as the basic THREE.Points() and PointsMaterial() already behave. I also love that these classes are so lightweight on resources. Perhaps as a trade-off for that efficiency, though, they don’t seem to give me any finer control over individual particles.

I know that I can simply create a huge mass of PlaneGeometry instances in the scene to achieve what I am after, but the resource overhead is not acceptable for the project, so my question is:

Question:
Is there a way to extend Points/PointsMaterial so that each point can be controlled a bit more like a textured plane?

Thanks very much!

Have a look at this: Rotating and scaling a shape - #5 by prisoner849
and this: Texturing THREE.Points - #2 by prisoner849

This is a huge eye-opener for me. My head was stuck in OOP JS land, where I thought I should somehow be extending the Points or PointsMaterial class, but instead you went the other way around and did shader injection at the other end of the pipeline, which is so much simpler and more effective!

I’d never even seen the onBeforeCompile hook, and I certainly didn’t know you could bind to it from the material constructor.
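For anyone finding this later, the pattern as I understand it looks roughly like this (a sketch only: uRotation and the spliced GLSL are my own illustration, not the exact code from the linked posts). The hook hands you the shader as plain strings plus a uniforms object, before compilation, so you can splice your own GLSL in:

```javascript
// Rough sketch of the onBeforeCompile pattern for PointsMaterial:
// replace the map_particle_fragment chunk with GLSL that rotates the
// sprite coordinate before sampling. Illustrative, not battle-tested.
function addRotationToPointsShader(shader) {
  shader.uniforms.uRotation = { value: 0 };
  shader.fragmentShader = ('uniform float uRotation;\n' + shader.fragmentShader)
    .replace(
      '#include <map_particle_fragment>',
      [
        '// rotate the sprite coordinate around its centre before sampling',
        'vec2 c = gl_PointCoord - 0.5;',
        'float s = sin(uRotation), cs = cos(uRotation);',
        'vec2 rUv = vec2(c.x * cs - c.y * s, c.x * s + c.y * cs) + 0.5;',
        'diffuseColor *= texture2D(map, rUv);'
      ].join('\n')
    );
}

// Wiring it up (assuming a three.js scene is already set up):
// const mat = new THREE.PointsMaterial({ map: tex, size: 0.5,
//   onBeforeCompile: addRotationToPointsShader });
```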

This is fantastic! Out of curiosity, on your example with the “X”/crosses, how would you render an image with mipmaps, so that a different mip sprite is used the further the point is from the camera?

I’m so thankful for this; I think I could have tried about 20 things and failed every time because I wasn’t thinking about it right.

Hmm, when I try to use the texturing example you provided above on a newer version of three.js, it dies with the error:

three.module.js:18910 THREE.WebGLProgram: Shader Error 0 - VALIDATE_STATUS false

Program Info Log: Fragment shader is not compiled.


FRAGMENT

ERROR: 0:161: 'mapTexelToLinear' : no matching overloaded function found


156: #if defined( USE_LOGDEPTHBUF ) && defined( USE_LOGDEPTHBUF_EXT )
157: 	gl_FragDepthEXT = vIsPerspective == 0.0 ? gl_FragCoord.z : log2( vFragDepth ) * logDepthBufFC * 0.5;
158: #endif
159: 		
160:       	vec4 mapTexel = texture2D( map, vUv );
161: 				diffuseColor *= mapTexelToLinear( mapTexel );
162:       
163: #if defined( USE_COLOR_ALPHA )
164: 	diffuseColor *= vColor;
165: #elif defined( USE_COLOR )
166: 	diffuseColor.rgb *= vColor;
167: #endif

It seems to be related to the removal of mapTexelToLinear from the shader chunks.
It works great on older versions, though. Any thoughts on how to solve this?

And if you try it this way: diffuseColor *= mapTexel; ?
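In other words, a one-line patch on the injected GLSL (a sketch; newer three.js revisions removed mapTexelToLinear from the shader chunks, so we multiply by the texel directly):

```javascript
// Swap the removed mapTexelToLinear() call for a direct multiply.
// Operates on the fragment shader string inside onBeforeCompile.
function fixTexelLine(fragmentShader) {
  return fragmentShader.replace(
    'diffuseColor *= mapTexelToLinear( mapTexel );',
    'diffuseColor *= mapTexel;'
  );
}
```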

:zap: :man_dancing:

You, sir, are a gentleman and a scholar!

Thank you again!

Can I please ask a question about your onBeforeCompile injection above:

I want to inject a uTime/u_time uniform into the PointsMaterial shader, but it seems to be very unhappy about that.
From all of my past experience and the official documentation, passing uniforms is supported only by the ShaderMaterial class. I have been successful at injecting u_time as a one-time value (using string replacement) when the material is instantiated, but I’m guessing the performance overhead of that is quite heavy, as I assume it recompiles the shader over and over again?
I can’t find much information on the onBeforeCompile trick and hoped you might know how this could work?

In this example https://codepen.io/prisoner849/pen/OJmxGOg?editors=0010 you can see the approach I use: lines 25-27, 32 and 45 in the JS section.
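The gist of the pattern (a minimal sketch with illustrative names, not the exact codepen code): register the uniform once inside onBeforeCompile, keep a reference to the uniform object, and just update its .value every frame. The shader compiles only once, so there is no repeated recompilation:

```javascript
// Shared uniform object: onBeforeCompile stores a reference to it in
// shader.uniforms, so updating .value each frame reaches the shader
// with no recompilation.
const timeUniform = { value: 0 };

function injectTime(shader) {
  shader.uniforms.uTime = timeUniform; // same object, shared by reference
  shader.vertexShader = 'uniform float uTime;\n' + shader.vertexShader;
}

// Usage with three.js (sketch):
// material.onBeforeCompile = injectTime;
// ...and in the render loop:
// timeUniform.value = clock.getElapsedTime();
```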

Thank you so much!