I see that in the original model there are facial expressions:
But when I inspect the model JSON with a console.log, I can't find any morphTargetDictionary data in the face mesh.
I can't read the GLTF file directly.
Maybe this model has no morph targets, or they're stored elsewhere…?
Thank you for your attention.
Documentation and other resources on this topic are rare.
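In case it helps with debugging: this is a minimal sketch (plain Node.js, no three.js) of how one could inspect a parsed `.gltf` JSON for morph targets. The `sample` object is entirely made up for illustration, not taken from any real model. In the glTF 2.0 spec, morph targets live in `mesh.primitives[].targets`, and their (optional) names in `mesh.extras.targetNames`, which is what three.js uses to build `morphTargetDictionary` at load time:

```javascript
// Inspect a parsed glTF (.gltf JSON) object for morph targets.
function listMorphTargets(gltf) {
  const report = [];
  for (const mesh of gltf.meshes || []) {
    for (const prim of mesh.primitives || []) {
      const targets = prim.targets || [];
      if (targets.length > 0) {
        report.push({
          mesh: mesh.name || '(unnamed)',
          targetCount: targets.length,
          // targetNames is optional in the spec; if it is absent,
          // loaders fall back to numeric keys.
          targetNames: (mesh.extras && mesh.extras.targetNames) || null,
        });
      }
    }
  }
  return report;
}

// Tiny hand-made example (hypothetical, not from a real model):
const sample = {
  meshes: [
    {
      name: 'Face',
      extras: { targetNames: ['mouthOpen', 'eyeBlinkLeft'] },
      primitives: [{ targets: [{ POSITION: 1 }, { POSITION: 2 }] }],
    },
    { name: 'Body', primitives: [{}] },
  ],
};
console.log(listMorphTargets(sample));
```

If this prints an empty array for your model, the `.gltf` itself carries no morph targets, so they would have to come from somewhere else (a separate file, or baked animation).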
As I understand it, you can have different targets (e.g. mouth closed, mouth open) and morph between them… but what happens when you have more targets, e.g. left eye open/closed? Do you then have to prepare:
left eye open, mouth open
left eye closed, mouth open
left eye open, mouth closed
left eye closed, mouth closed
as predefined targets, so that you can "dynamically" do "any" animation? And so on for every additional face part…
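To frame what I mean (please correct me if this is wrong): my current understanding is that each morph target stores per-vertex *deltas* from the base mesh and has its own independent weight, so the result is `base + Σ weight_i * delta_i` and targets blend additively rather than needing every combination pre-baked. A minimal sketch with made-up numbers:

```javascript
// Additive morph blending: each target is a flat array of per-vertex
// deltas, each with an independent weight (influence).
function applyMorphs(base, targets, weights) {
  const out = base.slice();
  targets.forEach((delta, t) => {
    const w = weights[t] || 0;
    for (let i = 0; i < out.length; i++) out[i] += w * delta[i];
  });
  return out;
}

// Hypothetical single-vertex example with two independent targets:
const base = [0, 0, 0];
const eyeClosed = [0, -1, 0]; // delta that closes the eye
const mouthOpen = [0, 0, 2];  // delta that opens the mouth

// Eye fully closed AND mouth half open, from just two targets:
console.log(applyMorphs(base, [eyeClosed, mouthOpen], [1, 0.5]));
// → [0, -1, 1]
```

If that's right, "left eye closed, mouth open" is just weights `[1, 1]` on two separate targets, and the four combinations above never need to be authored explicitly.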