Testing the MediaPipe library for hand positioning in a 3D scene.

Demo: Vorodis2
The main issue was scaling a 3D scene with perspective to match the 2D coordinates returned by MediaPipe. Yes, it would be possible to use a different camera setup or even ignore the Z axis completely, but for this test I wanted to keep the scene closer to a real perspective setup.
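
As a rough sketch of the idea (the helper name and the fixed depth are only illustrative, not the demo's code), MediaPipe's normalized x/y can be treated as NDC and unprojected along the camera ray at an assumed distance:

```js
import * as THREE from 'three';

// Illustrative helper: map a MediaPipe landmark (x, y normalized to [0, 1],
// z relative to the wrist) into world space for a given perspective camera,
// at an assumed distance from the camera.
function landmarkToWorld(landmark, camera, depth = 1.5) {
  // Convert normalized image coords to NDC (x: -1..1, y flipped).
  const point = new THREE.Vector3(
    landmark.x * 2 - 1,
    -(landmark.y * 2 - 1),
    0.5 // any NDC z works here; it only picks a point on the same camera ray
  );
  // Unproject into world space, then walk a fixed distance along that ray.
  point.unproject(camera);
  const dir = point.sub(camera.position).normalize();
  return camera.position.clone().addScaledVector(dir, depth);
}
```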

Another part was placing a nail on the finger. It is not just about positioning it, but also scaling and rotating it correctly along the finger phalanx. I didn’t fully finalize the rotation axis relative to finger bending yet, but it is enough to validate the general approach.
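
A simplified sketch of that placement step, assuming the DIP and TIP landmarks have already been mapped to world space (the helper and the scale factor are illustrative):

```js
import * as THREE from 'three';

// Align a nail mesh with the distal phalanx of the index finger, given the
// DIP (landmark 7) and TIP (landmark 8) positions already in world space.
function placeNail(nailMesh, dipWorld, tipWorld) {
  const segment = new THREE.Vector3().subVectors(tipWorld, dipWorld);
  const length = segment.length();

  // Put the nail at the middle of the phalanx.
  nailMesh.position.copy(dipWorld).addScaledVector(segment, 0.5);

  // Rotate the nail's local +Y axis onto the phalanx direction.
  // Roll around that axis (relative to finger bending) is left unresolved, as noted above.
  nailMesh.quaternion.setFromUnitVectors(
    new THREE.Vector3(0, 1, 0),
    segment.clone().normalize()
  );

  // Scale relative to the phalanx length; the 0.5 factor is arbitrary.
  nailMesh.scale.setScalar(length * 0.5);
}
```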

I also ran into some unexpected issues with the camera and library imports. Things that used to be simple tend to become more complicated over time: dependencies, browser limitations, and small breaking changes. Even a basic test now requires extra attention.

Thanks for sharing the demo and your insights. Regarding the Z axis from MediaPipe, it cannot be fully accurate because it only provides an approximation based on the 2D input and learned depth priors. Some inconsistencies are expected when trying to perfectly match a 3D perspective.

Your approach to scaling and placing the nail along the finger phalanx makes sense, and even without the final rotation alignment, it is a solid validation of the general workflow. The camera and dependency issues you mentioned are familiar; these small complications often appear as projects grow and browsers evolve.

Overall, it looks like you have managed the main challenges well, and the approximations from MediaPipe are just a known limitation of the pipeline rather than a flaw in your setup.

Take, for example, this wrist AR demo: it can only approximate the Z, and the 3D hand position cannot be relied on because MediaPipe's Z axis is not accurate. Instead, we can estimate the wrist or hand distance from the camera by measuring the average size of the fingers.
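
In code, that distance estimate boils down to the pinhole relation distance = focal * realSize / apparentSize. A rough sketch; both constants are calibration guesses, not measured values:

```js
// With a pinhole model, apparent size shrinks linearly with distance,
// so distance ≈ focalLengthPx * realSizeM / apparentSizePx.
const REAL_PALM_WIDTH_M = 0.08; // approx. index-MCP to pinky-MCP distance
const FOCAL_LENGTH_PX = 900;    // depends on the webcam; calibrate once

function estimateHandDistance(landmarks, videoWidth, videoHeight) {
  // Landmarks 5 (index MCP) and 17 (pinky MCP) span the palm.
  const a = landmarks[5];
  const b = landmarks[17];
  const dx = (a.x - b.x) * videoWidth;
  const dy = (a.y - b.y) * videoHeight;
  const palmWidthPx = Math.hypot(dx, dy);
  return (FOCAL_LENGTH_PX * REAL_PALM_WIDTH_M) / palmWidthPx; // meters
}
```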

Nice!
Very good start, MediaPipe is magic - I was playing with it some time ago, processing hands and face: https://shaderpipe.yesbird.online/

And yes, it’s very capricious and needs a lot of patience.
Sources are here, maybe they will help: GitHub - syanenko/ShaderPipe: Sandbox for experiments with GLSL and Google's Mediapipe, based on Three.js and React.js with Material-UI.
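
For anyone starting from scratch, a minimal browser setup with the legacy @mediapipe/hands solution looks roughly like the sketch below (the newer MediaPipe Tasks API is different); the CDN path and option values are common defaults, not taken from the repo above:

```js
import { Hands } from '@mediapipe/hands';
import { Camera } from '@mediapipe/camera_utils';

const videoElement = document.querySelector('video');

const hands = new Hands({
  // Load the model files from the official CDN.
  locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${file}`,
});
hands.setOptions({
  maxNumHands: 2,
  modelComplexity: 1,
  minDetectionConfidence: 0.5,
  minTrackingConfidence: 0.5,
});
hands.onResults((results) => {
  // results.multiHandLandmarks: one array of 21 landmarks per detected hand.
  if (results.multiHandLandmarks) {
    // Feed the landmarks into the Three.js scene here.
  }
});

// Pump webcam frames into the tracker.
const camera = new Camera(videoElement, {
  onFrame: async () => {
    await hands.send({ image: videoElement });
  },
  width: 640,
  height: 480,
});
camera.start();
```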
