Hi,
I have a cloud of points with a custom shader material. I have some paths that I feed to those points, which move along them. How can I pass the position of each point in the global scene to the shader? I don't know if the shader can handle each point individually in this situation.
You can write the positions into geometry.attributes.position every frame and set .needsUpdate = true on the attribute.
This helps reduce draw calls… but the CPU is still doing all the animation.
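A minimal sketch of that CPU-side update, assuming a hypothetical getPathPoint(t) helper that samples your path at a parameter t in [0, 1] (the array layout matches what a three.js BufferAttribute stores internally):

```javascript
// Sketch of a per-frame CPU update writing into the Float32Array that
// backs geometry.attributes.position. getPathPoint(t) is a hypothetical
// helper returning { x, y, z } for a path parameter t in [0, 1].

function updatePositions(positionArray, count, time, getPathPoint) {
  for (let i = 0; i < count; i++) {
    // Offset each particle along the path so they don't overlap.
    const t = (time + i / count) % 1;
    const p = getPathPoint(t);
    positionArray[i * 3 + 0] = p.x;
    positionArray[i * 3 + 1] = p.y;
    positionArray[i * 3 + 2] = p.z;
  }
  // In three.js you would then flag the attribute for re-upload:
  // geometry.attributes.position.needsUpdate = true;
}
```

Called once per frame from your render loop, this keeps the GPU buffer in sync with the CPU-side animation.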
The next level up from that is to store your particle positions in a floating-point texture (an FBO, framebuffer object), store your paths in a float DataTexture, and then read the path data and transform the particles entirely in the shader. Since you can't read from and write to the same FBO simultaneously, you read from one FBO and write to a copy of it: read from one, write to the other, and swap them for the next frame (also called ping-pong rendering).
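The ping-pong bookkeeping itself is just two references that trade places each frame. A sketch, assuming two render-target objects (in three.js these would be WebGLRenderTarget instances holding the floating-point position textures):

```javascript
// Ping-pong bookkeeping: each frame, sample from `read` in the
// simulation shader, render into `write`, then swap the references.

function createPingPong(targetA, targetB) {
  let read = targetA;   // texture sampled by the simulation shader
  let write = targetB;  // framebuffer currently being rendered to
  return {
    get read() { return read; },
    get write() { return write; },
    swap() {
      const tmp = read;
      read = write;
      write = tmp;
    },
  };
}
```

Usage per frame: bind pp.write as the render target, sample pp.read in the shader, render, then call pp.swap() so the freshly written positions become next frame's input.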
I managed to get this working: FBO particles – Youpi ! Do you know how I could change the color of the points according to their position? I don't think the solution is specific to this case; it should be the same for any FBO setup with three.js.
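One common approach is to derive the color from the position in the vertex shader that renders the particles: read the particle's position from the FBO texture, normalize it against the scene bounds, and pass the result on as a color varying. The mapping is plain arithmetic; here it is as a standalone function (a sketch with assumed bounds, with the equivalent GLSL shown in the comment):

```javascript
// Map a world-space position into an RGB color in [0, 1], assuming
// known scene bounds. The same math in the vertex shader would be e.g.:
//   vec3 pos = texture2D(positionsTexture, uv).xyz;
//   vColor = (pos - boundsMin) / (boundsMax - boundsMin);
// with vColor passed as a varying to the fragment shader.

function positionToColor(pos, boundsMin, boundsMax) {
  const norm = (v, lo, hi) => Math.min(1, Math.max(0, (v - lo) / (hi - lo)));
  return {
    r: norm(pos.x, boundsMin.x, boundsMax.x),
    g: norm(pos.y, boundsMin.y, boundsMax.y),
    b: norm(pos.z, boundsMin.z, boundsMax.z),
  };
}
```

Any other mapping (height to hue, distance from center to brightness, …) follows the same pattern: it's just a function of the position you already read from the FBO.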
In the renderMaterial shader, uv represents only the coordinates inside the particle, not the position of the particle on the screen, am I right? I don't understand.