I have created a point cloud (in this case a simple large sphere around the camera), using a PointsMaterial for the material and a BufferGeometry for the mesh. Mostly this works fine, but when the points get further away than a certain distance (roughly 1000 units), an odd "blind spot" forms in the center of the camera's view in which none of the points are visible anymore. This blind spot moves around with the camera and always stays in the center of the screen.
The following is the code.
Code Sandbox: https://codesandbox.io/p/sandbox/selective-bloom-forked-9ztltr
import React from 'react'
import { Canvas } from '@react-three/fiber'
import { OrbitControls } from '@react-three/drei'
import { PointsMaterial } from 'three'

function PointCloudSphere() {
  // Build a flat array of XYZ positions on a sphere of radius RADIUS.
  const pointCloud = React.useMemo(() => {
    const pointCloud = []
    const RADIUS = 1000
    const N = 400
    for (let theta = 0; theta < 2.0 * Math.PI; theta += (2.0 * Math.PI) / N) {
      for (let phi = -Math.PI / 2; phi < Math.PI / 2; phi += (2.0 * Math.PI) / N) {
        pointCloud.push(RADIUS * Math.sin(theta) * Math.cos(phi))
        pointCloud.push(RADIUS * Math.sin(theta) * Math.sin(phi))
        pointCloud.push(RADIUS * Math.cos(theta))
      }
    }
    return new Float32Array(pointCloud)
  }, [])

  // Plain red points, size 2, with size attenuation and no depth writes.
  const material = React.useMemo(() => {
    return new PointsMaterial({
      sizeAttenuation: true,
      vertexColors: false,
      depthWrite: false,
      size: 2,
      color: '#FF0000'
    })
  }, [])

  return (
    <points material={material}>
      <bufferGeometry attach="geometry">
        <bufferAttribute attach="attributes-position" args={[pointCloud, 3]} />
      </bufferGeometry>
    </points>
  )
}

export default function App() {
  return (
    <>
      <Canvas>
        <OrbitControls />
        <PointCloudSphere />
      </Canvas>
    </>
  )
}
Once you decrease RADIUS to a bit less than 1000, the problem disappears. I would understand if there were something like a "max distance" type setting for the shader, but I don't think there is, and it also doesn't make sense that only the points in the center of the screen are affected.
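One thing I'm wondering about is the camera rather than the shader: as far as I know, the default @react-three/fiber camera uses far: 1000, which is suspiciously close to the radius where the problem starts. As a rough sketch (this isn't in the sandbox above), I could pass a larger far plane to the Canvas like this, but I'm not sure whether that would be a proper fix or just hide the real problem:

// Sketch only: override the default camera's far clipping plane.
// Assumes the r3f default camera is roughly { fov: 75, near: 0.1, far: 1000 }.
<Canvas camera={{ near: 0.1, far: 5000 }}>
  <OrbitControls />
  <PointCloudSphere />
</Canvas>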
Does anyone have any idea what's going wrong here? Is it a bug, or am I doing something wrong?