Detect changes of camera properties via OrbitControls

Hello. I have an app using three.js with an orthographic camera and OrbitControls. I want to fire an event whenever the zoom or position of the camera changes. What is the best way to do this? Should I constantly check these parameters in the animate loop?

You can use this approach:

controls.addEventListener( 'change', render );

This pattern is used many times in the examples, for instance:

https://threejs.org/examples/webgl_loader_gltf
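A minimal sketch of this pattern with an orthographic camera (the import path for OrbitControls depends on your three.js version and build setup):

import * as THREE from 'three';
import { OrbitControls } from 'three/addons/controls/OrbitControls.js';

const renderer = new THREE.WebGLRenderer();
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );

const scene = new THREE.Scene();
scene.add( new THREE.Mesh( new THREE.BoxGeometry(), new THREE.MeshNormalMaterial() ) );

const aspect = window.innerWidth / window.innerHeight;
const camera = new THREE.OrthographicCamera( - aspect, aspect, 1, - 1, 0.1, 100 );
camera.position.z = 10;

const controls = new OrbitControls( camera, renderer.domElement );

function render() {

	// runs only when the controls actually changed the camera (rotate, pan or zoom),
	// so there is no need to poll camera.position or camera.zoom in an animation loop
	renderer.render( scene, camera );

}

controls.addEventListener( 'change', render );

render();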


Thank you so much ;)

Hi Mugen87,

I have a similar problem. Is there a way to only listen to the zoom and not the pan/move?
I need this in order to scale some objects when the camera is zooming in/out.

Thanks man,

Unfortunately, this is only possible if you modify OrbitControls. I guess it should work if you change the following code section in update() from:

if ( zoomChanged ||
	lastPosition.distanceToSquared( scope.object.position ) > EPS ||
	8 * ( 1 - lastQuaternion.dot( scope.object.quaternion ) ) > EPS ) {

	scope.dispatchEvent( changeEvent );

	lastPosition.copy( scope.object.position );
	lastQuaternion.copy( scope.object.quaternion );
	zoomChanged = false;

	return true;

}

to

if ( zoomChanged ) scope.dispatchEvent( changeEvent );

if ( zoomChanged ||
	lastPosition.distanceToSquared( scope.object.position ) > EPS ||
	8 * ( 1 - lastQuaternion.dot( scope.object.quaternion ) ) > EPS ) {

	lastPosition.copy( scope.object.position );
	lastQuaternion.copy( scope.object.quaternion );
	zoomChanged = false;

	return true;

}

The change event will then only fire when the zoom changes. You could also consider introducing a new event type (e.g. zoomChanged) for this use case.
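A minimal sketch of that custom event variant; the 'zoom' event name and the rescaleObjects() helper are just placeholders for illustration, not part of the OrbitControls API:

// in the patched update() of OrbitControls:
if ( zoomChanged ) scope.dispatchEvent( { type: 'zoom' } );

// in application code, with an OrthographicCamera:
controls.addEventListener( 'zoom', function () {

	// camera.zoom holds the current orthographic zoom factor
	rescaleObjects( camera.zoom );

} );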


Thanks for replying. That’s a good point; the only problem I see is that I’ll have to change OrbitControls.js again whenever I update the library.

Is there a way to intercept the mouse and touch presses that trigger a zoom from a component? If so, do you have an idea how?

Um, I’m not sure what this would look like…

For example, I could emit an event when the third mouse button is pressed or the mouse wheel is used, both of which tell OrbitControls to start zooming, and use that event to scale my objects. What do you think?

I did it by detecting the inputs that trigger the zooming/dollying, namely the 2nd mouse button, the wheel, and touch events.
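For reference, a rough sketch of that input-based approach, assuming the default OrbitControls mappings (mouse wheel and middle mouse button for dollying, two-finger pinch on touch); adjust the checks if you have remapped controls.mouseButtons or controls.touches:

const dom = renderer.domElement;

function onZoomInput() {

	// react here, e.g. read camera.zoom on the next 'change' event and rescale the objects

}

// the mouse wheel dollies/zooms by default
dom.addEventListener( 'wheel', onZoomInput );

// the middle mouse button (button index 1) dollies by default
dom.addEventListener( 'pointerdown', function ( event ) {

	if ( event.button === 1 ) onZoomInput();

} );

// a two-finger touch gesture pinch-zooms by default
dom.addEventListener( 'touchstart', function ( event ) {

	if ( event.touches.length === 2 ) onZoomInput();

} );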