I want to compute the screen-space size of a bounding box. Here is my code (heightOnScreen means the number of pixels the box occupies vertically in screen space; I compute it from the box's diagonals so that the projection is taken into account):
let center = new THREE.Vector3((box.min.x + box.max.x) / 2, (box.min.y + box.max.y) / 2, (box.min.z + box.max.z) / 2);
let extents = new THREE.Vector3((box.max.x - box.min.x) / 2, (box.max.y - box.min.y) / 2, (box.max.z - box.min.z) / 2);
let a = center
.clone()
.add(new THREE.Vector3(-extents.x, -extents.y, -extents.z))
.project(this._mainCamera);
let b = center
.clone()
.add(new THREE.Vector3(extents.x, -extents.y, extents.z))
.project(this._mainCamera);
let c = center
.clone()
.add(new THREE.Vector3(extents.x, extents.y, extents.z))
.project(this._mainCamera);
let d = center
.clone()
.add(new THREE.Vector3(-extents.x, extents.y, -extents.z))
.project(this._mainCamera);
a.x = 0;
b.x = 0;
c.x = 0;
d.x = 0;
let heightOnScreen = Math.max(a.distanceTo(c), b.distanceTo(d));
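For context, project() returns normalized device coordinates in [-1, 1], so the distance above is in NDC units rather than pixels; my understanding is that scaling by half the viewport height converts it (a small sketch, no three.js needed):

```javascript
// project() yields normalized device coordinates (NDC) in [-1, 1], so a
// distance measured in NDC spans at most 2. Half the viewport height per
// NDC unit converts it to pixels.
function ndcDistanceToPixels(ndcDistance, viewportHeightPx) {
  return (ndcDistance * viewportHeightPx) / 2;
}

// A box spanning the full NDC height (distance 2) covers the whole viewport:
console.log(ndcDistanceToPixels(2, 1080)); // 1080
console.log(ndcDistanceToPixels(0.5, 800)); // 200
```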
I found that for the same box, a's value doesn't change no matter where the camera is, so my heightOnScreen (which of course doesn't change either) is meaningless. Any kind of help is welcome.
I made a simpler experiment:
this._mainCamera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 1, 200);
this._mainCamera.position.copy(new THREE.Vector3(0, 0, 50)); // init pos
this._mainCamera.lookAt(new THREE.Vector3(0, 0, 0));
let zero = new THREE.Vector3(0, 0, 0);
console.log("zero", zero.project(this._mainCamera));
And this gives me Vector3 {x: NaN, y: NaN, z: -Infinity}. I think it should be (0, 0, 0).
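My best guess (an assumption on my part, since the code above never renders): the camera's matrixWorldInverse is only filled in once updateMatrixWorld() runs, so here it is still the identity, the camera effectively sits at the origin, and projecting (0, 0, 0) divides by w = 0. The perspective-divide arithmetic alone reproduces the exact output, no three.js required:

```javascript
// Perspective divide for the point (0, 0, 0), mirroring what
// Vector3.applyMatrix4(projectionMatrix) computes when the view
// transform is still the identity (camera matrices never updated).
const near = 1, far = 200;

const invW = 1 / 0;  // w' = -z_view = 0 at the camera position, so 1/w' = Infinity
const x = 0 * invW;  // 0 * Infinity -> NaN
const y = 0 * invW;  // 0 * Infinity -> NaN
const z = (-(2 * far * near) / (far - near)) * invW; // negative * Infinity -> -Infinity

console.log({ x, y, z }); // { x: NaN, y: NaN, z: -Infinity }
```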