Advice for animating lips of 3D talking head

Hello, I am planning to build a 3D talking head with animated lips (i.e. lip sync) driven by user input. My idea is to find a model head on a website, customize it in Blender, export it in glTF format, and then load it into three.js. From there I could morph the vertices on the lips using morphTargetInfluences. Is this a reasonable approach? Is there a better way?
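To make the morphTargetInfluences idea concrete, here is a minimal sketch of how one viseme weight could be blended per frame. The `influences` array stands in for `mesh.morphTargetInfluences` on a glTF-loaded mesh; the morph-target name `viseme_AA` in the usage comment is a placeholder, not something three.js defines.

```javascript
// Sketch: drive a mesh's morphTargetInfluences toward a single viseme.
// `influences` is a plain array of weights (one per morph target),
// `target` is the index of the viseme shape to show, and `amount` is
// how far to blend this frame (0..1).
function blendToViseme(influences, target, amount) {
  for (let i = 0; i < influences.length; i++) {
    const goal = i === target ? 1 : 0;
    // linear interpolation toward the goal weight for smooth transitions
    influences[i] += (goal - influences[i]) * amount;
  }
  return influences;
}

// In a three.js render loop this might look like (names are placeholders):
//   blendToViseme(mesh.morphTargetInfluences,
//                 mesh.morphTargetDictionary['viseme_AA'], 0.3);
```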


Using morph targets is definitely a valid approach for facial animations.

I'm still waiting for that to be merged; it looks like a more optimized solution for morph targets.

Hi @shane,
I am struggling to make lip sync work using the Rhubarb Lip Sync library; I am unable to drive the animation from the exported JSON. Can you please suggest something?
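For reference, Rhubarb's JSON export contains a `mouthCues` array of timed mouth shapes (`{ "start": ..., "end": ..., "value": "A".."X" }`). A minimal sketch of wiring that to morph targets: look up the active cue for the current playback time, then map its value to one of your model's morph targets. The `viseme_` naming below is an assumption; use whatever names your Blender export actually has.

```javascript
// Find the Rhubarb mouth shape active at a given playback time.
// `mouthCues` is the "mouthCues" array from Rhubarb's JSON export.
function mouthShapeAt(mouthCues, time) {
  for (const cue of mouthCues) {
    if (time >= cue.start && time < cue.end) return cue.value;
  }
  return 'X'; // Rhubarb's rest shape when no cue covers this time
}

// Per frame, e.g. with an <audio> element driving the timing
// (morph-target names here are placeholders for your own):
//   const shape = mouthShapeAt(data.mouthCues, audio.currentTime);
//   const index = mesh.morphTargetDictionary['viseme_' + shape];
//   // then raise mesh.morphTargetInfluences[index] and lower the others
```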


Ya, just use getUserMedia to get the microphone, or preload an audio file, then look up how to analyze audio data in real time in your main update loop, reading as little data as you can get away with. Then just determine whether or not someone is talking by checking the volume level, and if they are, play some kind of sprite sheet of mouths moving. Unless you want more complicated mouth animations, basic movement should be good enough.
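The volume check above can be sketched as follows. With the Web Audio API you would connect the mic (or audio element) to an `AnalyserNode` and call `analyser.getByteTimeDomainData(buf)` each frame; `buf` is a `Uint8Array` centered on 128 for silence. This helper computes RMS loudness and compares it to a threshold (0.05 is an assumption; tune it for your source).

```javascript
// Decide whether the audio buffer is "talking" based on RMS volume.
// `buf` is byte time-domain data (0..255, 128 = silence) as produced
// by AnalyserNode.getByteTimeDomainData().
function isTalking(buf, threshold = 0.05) {
  let sumSquares = 0;
  for (let i = 0; i < buf.length; i++) {
    const v = (buf[i] - 128) / 128; // normalize to -1..1
    sumSquares += v * v;
  }
  return Math.sqrt(sumSquares / buf.length) > threshold;
}

// Usage in a browser update loop:
//   analyser.getByteTimeDomainData(dataArray);
//   if (isTalking(dataArray)) { /* advance the mouth sprite frame */ }
```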