Some weird ratio between units and pixels

Hi, I’m trying to create a measure tool for my objects. I have a predefined constant, pixelsToSIUnit, and I need to know how many pixels long the displayed line is so I can calculate its real-world size.
The elements are 2D, so I’m using an orthographic camera.

I’m able to calculate it in units; for example, 100 px comes out to around 222 units.
Before I added camera.clearViewOffset() it was 256 units.

Do you have any ideas on how to solve this? I went through all the answers on this topic, and either none of them work for me or I’m missing something.

I would suggest not measuring in pixels. Instead, use native three.js units, since those stay constant even when your orthographic camera properties change, such as the left, right, top, and bottom frustum planes; changing those alters the number of pixels per given distance, similar to the camera zooming in and out.

When you import 3D models from a content creation package in .gltf, .obj, or another format, they will have some unit of scale. It’s often 1 meter in the CC package = 1 unit in three.js, but it could intentionally be set differently. I know you mentioned using 2D elements, but the principle is the same: there will be a conversion from SVG units to three.js units which should be constant. Just avoid using screen-space pixels, since those are not constant.
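For the record, a minimal sketch of what measuring in world units could look like (the unitsToMeters constant and the endpoints are assumed values for illustration, not from the original post):

```ts
import * as THREE from 'three';

// Assumption: 1 three.js unit = 1 mm in the source file. Use whatever
// scale your SVG/model import actually defines.
const unitsToMeters = 0.001;

function measureRealWorldLength(start: THREE.Vector3, end: THREE.Vector3): number {
  // distanceTo() works in world units, so the result is independent of
  // camera frustum, zoom, and canvas size.
  const lengthInUnits = start.distanceTo(end);
  return lengthInUnits * unitsToMeters; // real-world length in meters
}

// Usage: two endpoints of the measured line in world coordinates.
const a = new THREE.Vector3(0, 0, 0);
const b = new THREE.Vector3(222, 0, 0);
console.log(measureRealWorldLength(a, b)); // 0.222 m under the assumed scale
```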

Thanks for your reply. In the meantime I found the solution to my issue: the missing proportion was the ratio between the camera size and the render size.

The camera was set to, for example, 2000 units wide, but the canvas was 750 px, so dividing those two gave me my “magic” number. Quite easy, but I was focusing so hard that I overlooked it :wink:
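In case it helps anyone else, here’s a rough sketch of that fix (the camera and canvas values are the example numbers above, and pixelsToSIUnit is just an assumed value for my constant):

```ts
import * as THREE from 'three';

// Example values from above; substitute your own camera and canvas.
const camera = new THREE.OrthographicCamera(-1000, 1000, 500, -500, 0.1, 100);
const canvasWidthPx = 750;
const pixelsToSIUnit = 0.001; // assumed value for the predefined constant

// Visible world width; three.js divides the frustum extents by camera.zoom.
const worldWidth = (camera.right - camera.left) / camera.zoom; // 2000 units here

// The "magic" number: world units represented by one screen pixel.
const unitsPerPixel = worldWidth / canvasWidthPx; // ≈ 2.67 units/px

function pixelsToRealWorld(lengthPx: number): number {
  // Screen pixels -> world units -> real-world size.
  return lengthPx * unitsPerPixel * pixelsToSIUnit;
}

console.log(pixelsToRealWorld(100)); // 100 px ≈ 267 world units, then scaled to SI
```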