Streaming animation blendshapes as KeyframeTracks

Hello,

I am trying to stream blendshapes coming from a WebSocket into animation of an avatar.

I’m not sure what the best approach is here in terms of design. Is it possible to add a KeyframeTrack to an existing animation, so that I don’t have much work to do in the render loop?

Do you have any ideas for an approach that would keep the render loop simple and give the best performance overall?

Thank you very much.

Simon

By “stream blendshapes” you mean just streaming the weights for the blendshapes applied to the model yeah? That sounds pretty efficient. I don’t think you want to try to stream actual mesh blendshape vertex data itself since that data is pretty massive.

Yes, just the weights. I was more curious about how I could integrate this with the ThreeJS ecosystem. There seems to be no API to just “push” new animation frames to an existing animation. Any idea how one would do that?

You can make your own AnimationClips:

https://threejs.org/docs/#api/en/animation/AnimationClip

You create keyframe tracks:
https://threejs.org/docs/#api/en/animation/KeyframeTrack

Each track controls a single float value or a primitive datatype like a Vector3 position, a Quaternion, or an Euler.

You can even use them to drive your own custom properties on your objects… you basically embed the path to the property in the keyframe track, and the AnimationMixer will walk that path during animation and write the values to your properties.
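
Something like this, just as a rough sketch (the mesh variable, the morph target name ‘mouthOpen’, and the keyframe times/values are placeholder assumptions):

```js
import * as THREE from 'three';

// Assumes `mesh` is a THREE.Mesh whose geometry defines a morph target
// named 'mouthOpen' (the name is a placeholder).
const times = [0, 0.5, 1.0]; // seconds
const values = [0, 1, 0];    // influence weight at each keyframe

// The track name embeds the property path: node name + property +
// which morph target to drive.
const track = new THREE.NumberKeyframeTrack(
  mesh.name + '.morphTargetInfluences[mouthOpen]',
  times,
  values
);

// A duration of -1 lets the clip compute its length from its tracks.
const clip = new THREE.AnimationClip('streamedBlendshapes', -1, [track]);

const mixer = new THREE.AnimationMixer(mesh);
mixer.clipAction(clip).play();

// In the render loop: mixer.update(clock.getDelta());
```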

You can inspect the animations array on a loaded GLB in the debugger to get an idea of the internal structure… it’s not too complicated (though not trivial).
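
For example (the file path is a placeholder, and the import path for GLTFLoader depends on your setup):

```js
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';

const loader = new GLTFLoader();
loader.load('avatar.glb', (gltf) => {
  // Each entry is an AnimationClip; each clip holds an array of
  // KeyframeTracks with .name (the property path), .times, and .values.
  console.log(gltf.animations);
});
```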

And is there a way to push keyframe tracks into the same AnimationClip, or do I need to recreate one each time I get a new frame? I ask because I want to play the animation as soon as I get the first blendshape data.

It might be easier to start without using KeyframeTrack or Animation classes at all. Create a geometry with 2-3 blend shapes, change the morph target ‘influences’ to the active blend shape in the sequence, and replace blend shapes as you go. Better not to have too many blend shapes uploaded to the GPU if possible, unless they are very small, since each blend shape is essentially a copy of the geometry.
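
Roughly along these lines, as a sketch (assuming the WebSocket sends each frame as a name → weight map, and the mesh’s geometry already has morph targets with those names; `socket`, `avatarMesh`, `renderer`, `scene`, and `camera` are placeholders):

```js
// Keep only the most recent frame of weights from the stream.
let latestWeights = {};

socket.onmessage = (event) => {
  latestWeights = JSON.parse(event.data); // e.g. { mouthOpen: 0.8, browUp: 0.1 }
};

function applyWeights(mesh) {
  for (const [name, weight] of Object.entries(latestWeights)) {
    const index = mesh.morphTargetDictionary[name]; // morph target name -> index
    if (index !== undefined) mesh.morphTargetInfluences[index] = weight;
  }
}

function animate() {
  requestAnimationFrame(animate);
  applyWeights(avatarMesh); // write the streamed weights straight to the mesh
  renderer.render(scene, camera);
}
animate();
```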

So basically, updating this property in my render loop will suffice to update the blendshape values on the mesh? I don’t plan on replacing blend shapes, but rather on updating the weights to animate as I go. Does that make sense?

Yes – just changing that property in the render loop will activate/deactivate the blend shapes.

Alright, perfect, thank you very much. I was hoping there was some kind of higher-level solution, because this still requires me to do the scheduling of my animation “manually”, but I guess it’s better this way.