Raycaster intersections after TransformControls rotation

I have a scene with a bunch of meshes in a mesh group. When moving the mouse, I can get the intersection points and objects from a raycaster in the usual way, and everything works just fine. But if I perform a rotation with TransformControls, the intersection points returned by Raycaster.intersectObjects() appear not to have followed the rotation.

```js
onMouseMove(event) {
  raycaster.setFromCamera(mouse, camera);
  const intersections = raycaster.intersectObjects(meshgroup.children, false);
  if (intersections.length === 0) return;
  const point = intersections[0].point;
  const ob = intersections[0].object;
  // ...
}
```

Before a TransformControls rotation, all works just fine. When I point the mouse at a location on the drawing, the point coordinates returned and the object intersected have the values I expect. But after a simple TransformControls rotation about the Z axis, for instance, the point coordinates returned are no longer what I think they should be. I.e., if I point to the same location on the drawing as before the rotation, the coordinates returned are not the same. This must surely be another confusion between local and world space that keeps tripping me up.

But oddly enough, the object returned is correct. Just the point coordinates are wrong.

Can someone explain what is going on and how to get the “correct” point coordinates, probably via some quaternion rotation?


For anyone interested, I’ve partly solved this problem. If I read the TransformControls worldQuaternionStart and worldQuaternion, compute a change quaternion from them, and apply that change quaternion to the raycaster intersection point, the transformed point comes out correct.
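For reference, here is roughly what that correction looks like. This is a plain-JavaScript sketch, not three.js code: the quaternion helpers stand in for THREE.Quaternion so it runs standalone, the function names are mine, and the object is assumed to sit at the world origin (otherwise subtract its position before rotating and add it back after).

```javascript
// Quaternions are plain {x, y, z, w} objects, assumed unit length.

function qMultiply(a, b) {
  // Hamilton product a * b
  return {
    x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
    y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
    z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
    w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
  };
}

function qConjugate(q) {
  // For unit quaternions the conjugate is the inverse
  return { x: -q.x, y: -q.y, z: -q.z, w: q.w };
}

function qRotate(q, v) {
  // Rotate vector v by q: v' = q * v * q^-1
  const p = { x: v.x, y: v.y, z: v.z, w: 0 };
  const r = qMultiply(qMultiply(q, p), qConjugate(q));
  return { x: r.x, y: r.y, z: r.z };
}

// The change quaternion: the rotation taking the orientation at the start
// of the drag (worldQuaternionStart) to the current orientation
// (worldQuaternion), both read from TransformControls as described above.
function changeQuaternion(worldQuaternionStart, worldQuaternion) {
  return qMultiply(worldQuaternion, qConjugate(worldQuaternionStart));
}

// "Correct" an intersection point by undoing that single rotation.
function correctPoint(point, change) {
  return qRotate(qConjugate(change), point);
}
```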

That works fine as long as there is only a single rotation about a single axis, but if I start adding rotations about other axes, or even additional rotations about the same axis, this approach breaks down.

Can anyone explain how to generalize the approach for arbitrary rotations?
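One way the per-rotation approach generalizes, sketched under the same assumptions as above (plain-JS helpers standing in for THREE.Quaternion, object at the world origin, function names mine): rather than composing a change quaternion for each drag, undo the object's total current world rotation in one step.

```javascript
// Same standalone quaternion helpers as before ({x, y, z, w}, unit length).
function qMultiply(a, b) {
  return {
    x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
    y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
    z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
    w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
  };
}
function qConjugate(q) {
  return { x: -q.x, y: -q.y, z: -q.z, w: q.w };
}
function qRotate(q, v) {
  const p = { x: v.x, y: v.y, z: v.z, w: 0 };
  const r = qMultiply(qMultiply(q, p), qConjugate(q));
  return { x: r.x, y: r.y, z: r.z };
}

// Generalization: however many rotations have been applied, in whatever
// order, the object's current world quaternion already encodes their
// composition. Inverting it maps a world-space point into the object's
// local frame, where a point on the surface never moves.
function worldPointToLocal(objectWorldQuaternion, worldPoint) {
  return qRotate(qConjugate(objectWorldQuaternion), worldPoint);
}
```

In three.js itself the equivalent (which also accounts for the object's position and scale) is `mesh.worldToLocal(intersection.point.clone())`: the raycaster reports `intersection.point` in world space, so mapping it into the object's local frame gives coordinates pinned to the same spot on the surface, no matter how many rotations have accumulated.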