Transmit point position to its own shader

Hi,
I have a cloud of points with a custom shader material. I have some paths that I feed to those points, which move along them. How can I pass the position of each point in the global scene to the shader? I don't know if the shader can exist individually in this situation.


const BrainParticleMaterial = shaderMaterial(
...
..
)

return(
<points>
        <bufferGeometry attach="geometry" ref={brainGeo}>
          <bufferAttribute
            attach="attributes-position"
            count={positions.length / 3}
            array={positions}
            itemSize={3}
          />
          <bufferAttribute
            attach="attributes-randoms"
            count={randoms.length}
            array={randoms}
            itemSize={1}
          />
        </bufferGeometry>
        <brainParticleMaterial
          positionx={ref.current?.position}
          attach="material"
          depthTest={false}
          transparent={true}
          depthWrite={false}
          blending={THREE.AdditiveBlending}
        />
      </points>
)

Thank you.

You can write the positions to geometry.attributes.position every frame and set .needsUpdate = true on the attribute.

This helps reduce drawcalls… but the CPU is still doing all the animation.
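
For example, a minimal sketch of that per-frame update, assuming @react-three/fiber's useFrame hook, that brainGeo is the geometry ref from the snippet above, and that paths / samplePath stand in for your own path data and interpolation:

import { useFrame } from '@react-three/fiber'

useFrame(({ clock }) => {
  const geo = brainGeo.current
  if (!geo) return
  const pos = geo.attributes.position // THREE.BufferAttribute
  const t = clock.getElapsedTime()
  for (let i = 0; i < pos.count; i++) {
    // samplePath is a placeholder for however you evaluate a path at time t
    const p = samplePath(paths[i], t) // returns { x, y, z }
    pos.setXYZ(i, p.x, p.y, p.z)
  }
  pos.needsUpdate = true // re-upload the attribute to the GPU this frame
})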

The next level up from that is to store your positions in a floating-point texture (an FBO, framebuffer object) and your paths in a float DataTexture, then read the path data and transform the particles in the shader. Since you can't read from and write to the same FBO simultaneously, you read from one FBO and write to a copy of it, then swap them for the next frame (also called ping-pong rendering).
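
Roughly, the ping-pong part looks like this: a sketch with plain three.js, where SIZE, simScene, simCamera, simMaterial and the uPositions uniform are illustrative names rather than anything from your code:

import * as THREE from 'three';

const SIZE = 128; // SIZE x SIZE texels, one texel per particle

// Two float render targets: one to read last frame's positions from, one to write into.
const makeTarget = (size) =>
  new THREE.WebGLRenderTarget(size, size, {
    type: THREE.FloatType,
    minFilter: THREE.NearestFilter,
    magFilter: THREE.NearestFilter,
  });

let read = makeTarget(SIZE);
let write = makeTarget(SIZE);

function step(renderer, simScene, simCamera, simMaterial) {
  simMaterial.uniforms.uPositions.value = read.texture; // read the old positions
  renderer.setRenderTarget(write);                      // write the new positions
  renderer.render(simScene, simCamera);
  renderer.setRenderTarget(null);
  [read, write] = [write, read];                        // swap for the next frame
}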

I succeeded in making this work: FBO particles – Youpi !. Do you know how I could change the color of the points according to their position? I think the solution is not specific to this case; it should be the same for any FBO setup with three.js.

Yeah, you would have to modify the renderMaterial where it outputs gl_FragColor.
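
As a sketch of what that could look like with drei's shaderMaterial (the uniform and varying names here are illustrative, not from the tutorial): pass the particle's position from the vertex shader to the fragment shader in a varying and derive the color from it.

import { shaderMaterial } from '@react-three/drei'

const RenderMaterial = shaderMaterial(
  { uPositions: null }, // the FBO texture holding the current particle positions
  /* vertex shader */ `
    uniform sampler2D uPositions;
    varying vec3 vPos;
    void main() {
      // here the position attribute is assumed to hold the lookup UV into the data texture
      vPos = texture2D(uPositions, position.xy).xyz;
      gl_PointSize = 2.0;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(vPos, 1.0);
    }
  `,
  /* fragment shader */ `
    varying vec3 vPos;
    void main() {
      // placeholder mapping: blue for low points, red for high points
      vec3 color = mix(vec3(0.2, 0.4, 1.0), vec3(1.0, 0.3, 0.2), smoothstep(-1.0, 1.0, vPos.y));
      gl_FragColor = vec4(color, 1.0);
    }
  `
)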

In the renderMaterial shader, the uv represents only the coordinates inside the particle, not the position of the particle on the screen, am I right? I don't understand.

I think that's right. In the render shader, it's the coordinate of the pixel of the sprite texture at the current fragment being rendered.

In the simulation shader, the UV is the coordinate of the particle data in the particle data texture.
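
For example, a typical simulation fragment shader looks roughly like this (illustrative names again): each fragment corresponds to one particle's texel, and vUv just addresses that texel in the data texture.

// Hypothetical simulation fragment shader (GLSL in a JS template string).
const simulationFragmentShader = /* glsl */ `
  uniform sampler2D uPositions; // last frame's positions (the "read" target)
  uniform float uTime;
  varying vec2 vUv;             // which particle's texel this fragment writes
  void main() {
    vec3 pos = texture2D(uPositions, vUv).xyz; // this particle's previous position
    pos.y += 0.001 * sin(uTime + pos.x);       // placeholder motion
    gl_FragColor = vec4(pos, 1.0);             // write the new position
  }
`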