Applying sensor data to a human model

I'm not so good at math problems :persevere:, and I currently have an issue with a small motion capture project.

I have a list of sensor data coming from the real world (five sensors in total, simply strapped to the arms, legs, and chest). The data has already been calibrated using two poses (anatomical standing and slanted sitting). From my point of view, the calibration step converts each sensor's rotation data from its own coordinate system into a fixed world coordinate system (body up as the Y-axis, body right as the X-axis, and the body's facing direction as the Z-axis). So after calibration, each sensor's value is as if the sensor sat at the origin of that fixed coordinate system and rotated there, which means we can tell how the person's arm actually moved instead of just getting a list of meaningless raw sensor readings.
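In Three.js terms, I picture the calibration roughly like this (just a sketch with made-up names and a placeholder reference reading; the multiplication order is my guess about the sensor's convention):

```js
import * as THREE from 'three';

// Raw reading captured while holding a known calibration pose
// (placeholder value; in practice this comes from the sensor during
// the anatomical standing pose)
const qReferenceRaw = new THREE.Quaternion(0.1, 0.2, 0.3, 0.927).normalize();

// Offset that maps raw sensor readings into the fixed body/world frame
const sensorToWorld = qReferenceRaw.clone().invert();

// Applied to every subsequent raw reading
function calibrate(qRaw) {
  // world = sensorToWorld * raw; the order may need to be flipped
  // depending on how the sensor reports its orientation
  return qRaw.clone().premultiply(sensorToWorld);
}
```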

Now for the model: I built and rigged it in Blender, and on export it was already converted to Y-axis up, so it matches the fixed coordinate system described above. But when I load the data in Three.js and use `bone.quaternion.copy(sensorData)`, the bone's movement is totally different from the real one.

After some searching, it seems this might be because the sensor's rotation data is expressed in the fixed world coordinate system, while a bone's quaternion is relative to its parent bone, so I need some way to convert the sensor data into each bone's local coordinate system.
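If that's right, I think the conversion would look something like the sketch below (my own untested attempt; `worldQuat` is the calibrated sensor rotation, and I'm ignoring any rest/bind-pose offset of the bone):

```js
import * as THREE from 'three';

// bone: a bone from the loaded skeleton
// worldQuat: calibrated sensor rotation in the fixed world frame
function applyWorldRotationToBone(bone, worldQuat) {
  // Get the parent bone's current orientation in world space
  const parentWorldQuat = new THREE.Quaternion();
  bone.parent.getWorldQuaternion(parentWorldQuat);

  // world = parentWorld * local  =>  local = parentWorld⁻¹ * world
  bone.quaternion.copy(parentWorldQuat.invert().multiply(worldQuat));
}
```

Does that look correct, or do I also need to account for the bone's rest orientation from the rig?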

I'm not sure whether what I'm thinking is correct. Please help; it's been really hard to find a working sample for this problem :slightly_smiling_face: