Hello, I am planning to build a 3D talking head with animated lips (i.e. lip sync) driven by user input. My idea is to find a head model on a website such as https://sketchfab.com/ and then customize it in Blender. Once it's complete, I would export it in glTF format and load it into three.js. From there I could morph the vertices on the lips using morphTargetInfluences. Is this a reasonable approach? Is there a better way?
Using morph targets is definitely a valid approach for facial animation. It also fits your pipeline well: shape keys authored in Blender are exported to glTF as morph targets, so after loading with GLTFLoader you can drive them directly through morphTargetInfluences.
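A minimal sketch of how that looks in practice. In three.js, a mesh with morph targets exposes `morphTargetDictionary` (shape key name to index) and `morphTargetInfluences` (one weight per target, 0 to 1). The shape key names below (`viseme_A` etc.) are assumptions; use whatever names you gave your shape keys in Blender. A plain object stands in for the GLTF-loaded mesh so the helper can be exercised on its own:

```javascript
// Set the influence of a morph target by shape key name.
// Works on any three.js Mesh that has morph targets.
function setViseme(mesh, name, weight) {
  const index = mesh.morphTargetDictionary[name];
  if (index === undefined) return false;      // unknown shape key name
  mesh.morphTargetInfluences[index] = weight; // 0 = rest pose, 1 = full pose
  return true;
}

// Stand-in for a mesh loaded via GLTFLoader (same two properties):
const mesh = {
  morphTargetDictionary: { viseme_A: 0, viseme_B: 1 },
  morphTargetInfluences: [0, 0],
};

setViseme(mesh, "viseme_A", 1);          // open the "A" mouth shape fully
console.log(mesh.morphTargetInfluences); // [1, 0]
```

In your render loop you would typically ease the weights toward their targets rather than snapping them, which makes the mouth motion look much less robotic.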
I am struggling to get lip sync working with the Rhubarb Lip Sync library: I can generate the JSON output, but I can't figure out how to drive the animation from that exported JSON. Can you please suggest something?
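Rhubarb's JSON output (`-f json`) contains a `mouthCues` array of timed cues, each with `start` and `end` times in seconds and a `value` that is a mouth-shape letter (A through F, plus the extended shapes G, H, and X for rest). The core of the playback side is just "which cue is active at the current audio time". A minimal sketch with hardcoded sample cues (the cue values here are made up for illustration):

```javascript
// Sample of Rhubarb's JSON structure (normally you'd fetch/parse the file).
const rhubarbOutput = {
  mouthCues: [
    { start: 0.0, end: 0.2, value: "X" }, // X = closed / rest
    { start: 0.2, end: 0.5, value: "B" },
    { start: 0.5, end: 0.9, value: "F" },
  ],
};

// Return the mouth-shape letter active at time t (seconds); fall back to rest.
function mouthShapeAt(cues, t) {
  for (const cue of cues) {
    if (t >= cue.start && t < cue.end) return cue.value;
  }
  return "X";
}

console.log(mouthShapeAt(rhubarbOutput.mouthCues, 0.3)); // "B"

// Map Rhubarb's letters to your own morph target names. This mapping is an
// assumption; name your Blender shape keys to match whatever you choose here.
const visemeToMorphTarget = {
  A: "viseme_A", B: "viseme_B", F: "viseme_F", X: "viseme_rest",
};
```

Then, in your animation loop, take the audio's `currentTime`, look up the active letter, zero out all entries of `morphTargetInfluences`, and set the influence for the mapped morph target to 1 (or blend toward it over a few frames for smoother transitions). The main thing to check first is that the names in your mapping actually exist in the mesh's `morphTargetDictionary`; a silent name mismatch is the most common reason "nothing moves".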