FBX: CreateFromMorphTargetSequence using BufferGeometry

The latest FBXLoader creates SkinnedMesh geometry as BufferGeometry. All the examples online are for creating morph target sequences from Geometry, or use a GUI to control the morph influences. I’m having a hard time figuring out how to get my baked blend shapes into a clip so they play back alongside my joint animations. There doesn’t seem to be an equivalent of the “morphTargets” array within the BufferGeometry data. So what is the proper way to get baked blend shape and joint animation into three.js?

I should mention that in my actual project I’m loading multiple animation files separately and adding them to the same mixer, but in this example only one animation file is loaded.

Here’s my fbx file: online_ex.fbx (141.4 KB)

And my relevant code so far. Since mesh.geometry.morphTargets doesn’t exist, what can I use to create the morph target sequence?

// Character
Promise.all([
    loadFbxPromise('./assets/animations/online_ex.fbx'),
])
    .then(allObjects => {
        _character = allObjects[0]
        _character.position.set(0, 0, 0);

        // Material settings for the skinned mesh file
        _character.traverse( function ( child ) {

            if (child instanceof THREE.SkinnedMesh) {
                mesh = child;
                mesh.material = material;

                mesh.material.skinning = true;
                mesh.material.morphTargets = true;

                mesh.castShadow = true;
                mesh.receiveShadow = true;

            }
        });

        allObjects.forEach(character=>{

            mixers.push( new THREE.AnimationMixer( character ) );

            // If this file contains animations, collect its joint clips
            if (character.animations !== undefined) {
                boneClips = boneClips.concat(character.animations);
            }

            // Morphs
            character.traverse( function ( child ) {

                if (child instanceof THREE.SkinnedMesh) {
                    mesh = child;
                    morphClips = THREE.AnimationClip.CreateFromMorphTargetSequence( 'morphs', mesh.geometry.morphTargets, 3 ); // I'm not sure what to do on this line

                }
            });
        })

        _character.mixer = mixers[0];

        scene.add(_character);

        // mesh = _character.children[0]; // this is a guess, you may have to change this
        initGUI();
    });

function loadFbxPromise(url) {

    return new Promise((resolve, reject)=>{

        let fbxLoader = new THREE.FBXLoader(manager);

        // onLoad, onProgress, onError
        fbxLoader.load(url,
            object=> {resolve(object)},
            ()=>{/* progress */},
            e=>reject(e));
    })
}

mesh.geometry.morphAttributes should look like:

mesh.geometry.morphAttributes = {
  position: [ BufferAttribute, BufferAttribute, ... ]
}

and then mesh.morphTargetInfluences (https://threejs.org/docs/#api/en/objects/Mesh.morphTargetInfluences) would contain weights, which may be animated:

mesh.morphTargetInfluences = [ 0, 0, ... ];

I’m not sure how FBX, FBXLoader, or various DCC tools handle all of that, or whether you’ll have to manually connect things after import, but hopefully it’s somewhat useful. Personally I’m using a Blender/glTF workflow for this.
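To make that layout concrete, here’s a minimal plain-JS mock (no three.js required; the names and counts are illustrative) showing how the morph attributes and influences line up, one influence per target:

```javascript
// Mock of the relevant parts of a SkinnedMesh with two baked blend shapes.
// In real three.js these entries would be THREE.BufferAttribute instances.
const mockMesh = {
  geometry: {
    morphAttributes: {
      // one attribute per blend shape target
      position: [
        { name: 'smile', itemSize: 3, count: 4 },
        { name: 'frown', itemSize: 3, count: 4 },
      ],
    },
  },
  // one weight per target, zero by default
  morphTargetInfluences: [0, 0],
};

// Animating a blend shape just means animating these numbers over time;
// the renderer blends: vertex = base + sum(influence[i] * (target[i] - base)).
mockMesh.morphTargetInfluences[0] = 0.75; // 75% of the first target

console.log(mockMesh.geometry.morphAttributes.position.length); // 2
console.log(mockMesh.morphTargetInfluences); // [ 0.75, 0 ]
```

So a morph animation clip ultimately just needs keyframe tracks that write into morphTargetInfluences.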


Great, thanks! I’ll give this a shot. Would you recommend glTF over FBX for future projects? Is there a good reason to choose one over the other?

Disclaimer: I’m part of the glTF format working group and an author of GLTFLoader. :innocent:

I believe you’ll generally find filesize and parse time to be better with glTF, which is optimized for transmission, does not assume an SDK for parsing, and has relatively few ways to represent the same thing. FBX has more complexity, for reasons that aren’t particularly helpful for runtime use on the web (more details).

An increasing number of tools have very good glTF support, and if you’re working with one of those I think you’ll find glTF to be more future-proof and better supported. Unfortunately Autodesk’s Maya and 3DS Max are still difficult to use with glTF. If you have a preference for particular tools, I’m glad to suggest a workflow (or point out current limitations).

The gist of what you’re trying to do here (a model containing multiple animations, where animations affect morph targets) is all supported in glTF. Here’s a glTF example with animated shape keys, but only one animation: three.js examples


Awesome, thanks! Yeah, this is what I’m looking for out of this FBX workflow. I feel like I’m so close, but I can’t seem to get the blend shape animation to follow its corresponding joint animation. So far I’ve only figured out how to change the weights with morphTargetInfluences. Like so:

[GIF: the animation in the browser]

It’s supposed to just do this:

[GIF: the animation in Maya]

I do all of my work in Maya/Houdini so any info on proper glTF workflows would be great.

I’m getting an error when I try this:

morphClips = morphClips.concat(THREE.AnimationClip.CreateClipsFromMorphTargetSequences('morphs', mesh.geometry.morphAttributes.position, 24, 3));

Uncaught (in promise) TypeError: Cannot read property ‘match’ of undefined

But not when I do this:

morphClips = morphClips.concat(THREE.AnimationClip.CreateFromMorphTargetSequence('morphs', mesh.geometry.morphAttributes.position, 24, 3));

Does CreateClipsFromMorphTargetSequences require a different set of data? The docs say it just needs an array.
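Digging into the three.js source, the difference seems to be that the plural CreateClipsFromMorphTargetSequences reads each target’s .name and groups targets whose names end in a frame number (e.g. blink000, blink001) into separate clips, while the singular CreateFromMorphTargetSequence just takes the array in order. Raw BufferAttributes in morphAttributes.position typically have no .name, which would explain the “Cannot read property ‘match’ of undefined” error. A simplified sketch of that grouping (the regex is copied from the three.js source; the target names here are illustrative):

```javascript
// Pattern used by three.js to split a morph target name into
// a sequence prefix and a trailing frame number.
const pattern = /^([\w-]*?)([\d]+)$/;

function groupBySequence(morphTargets) {
  const groups = {};
  for (const target of morphTargets) {
    // This is the access that throws "Cannot read property 'match' of
    // undefined" when a target has no .name (as with raw BufferAttributes).
    const parts = target.name.match(pattern);
    if (parts && parts.length > 1) {
      const name = parts[1];
      (groups[name] = groups[name] || []).push(target);
    }
  }
  return groups;
}

const groups = groupBySequence([
  { name: 'blink000' }, { name: 'blink001' },
  { name: 'jaw000' },
]);
console.log(Object.keys(groups)); // [ 'blink', 'jaw' ]
```

So unless the targets carry names in that prefix-plus-frame-number form, the plural version has nothing to group on.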

I’m a glTF convert. The workflow out of Maya isn’t that great at the moment with the current Maya2glTF plugin, but at least my thing is animating! Thanks for the push in the right direction. Used your glTF viewer for help.

My basic loader code:

var loader = new THREE.GLTFLoader();
loader.load('./assets/animations/gltf/cyl_test.gltf', function (gltf) {

    var content = gltf.scene;
    clips = gltf.animations;

    scene.add(content);

    clips.forEach((clip) => {
        if (clip.validate()) clip.optimize();
    });

    mixer = new THREE.AnimationMixer(scene);

    clips.forEach((clip) => {
        mixer.clipAction(clip).reset().play();
    });

});

The workflow out of Maya isn’t that great at the moment with the current Maya2glTF plugin

Have you tried FBX2glTF? I’ve had good results with this recently, although admittedly I haven’t tested it on many animated models yet.

Glad it is working! I do think FBX2glTF and Maya2glTF are the best current options for Maya workflows, and both do support animation. Neither is perfect, but the authors are at least responsive to bugs, so I would strongly encourage filing issues on GitHub if/when something doesn’t work as expected. If animation doesn’t play correctly on both http://gltf-viewer.donmccurdy.com/ and http://sandbox.babylonjs.com/, that’s probably enough to assume it’s an exporter bug. :slight_smile:

Thanks guys!

A few questions that might need a different thread. I just want to get some initial thoughts:

  1. I think the main thing that bothers me with Maya2glTF right now is that I’m not sure how to look-dev for it. I have a PBR Substance material assigned to my model, but I can’t see any SSS due to shader limitations. Any tips for look development from within Maya? Should I use the Maya Hardware renderer? How would I export the effects of a dome light on my model?

  2. I’ve also tested out Substance Painter to Sketchfab, and I love how it lets me quickly look-dev without much coding; I can then download a glTF file from that. The only problem is that it only works with static geometry. Is there a potential workflow like Substance Painter >>> Sketchfab >>> download the glTF model from Sketchfab >>> apply joint animation/blend shapes to that static mesh within three.js?

  3. How do I add an environment light to a scene if my object is glTF? Do I need to override the material completely if I just want to add an HDR environment light?

Here’s some progress. This is the Sketchfab model inside your viewer, Don.

[Screenshot: the model in the glTF viewer]

And here’s the model (and desired look) within Sketchfab.

[Screenshot: the model in Sketchfab]

  1. I’m probably not familiar enough with Maya to answer that unfortunately. I can say that glTF’s materials are metal/rough PBR, spec/gloss PBR, and unlit. So anything else is likely to get converted/approximated at export. Maybe worth filing an issue on the Maya2glTF exporter asking for best practices / documentation?
  2. Substance Painter also has direct export to glTF, although I don’t think it supports animation either. Perhaps, if it at least preserves the skeleton, it would be possible to load the materials and animations separately through an (untextured) file? Do you know if Substance Painter can export animation in any other format?
  3. I think an envMap is considered best practice with PBR materials like MeshStandardMaterial: practically it will be very necessary with metallic materials, and you can probably get by without it for non-metallic materials. glTF uses PBR unless the model is unlit/shadeless. But I think just setting material.envMap = texture is sufficient; you shouldn’t need to alter the material. You can swap out environments in my viewer to compare; one of the options is HDR.
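For 3, here’s a rough sketch of what I mean, mocked in plain JS so it runs without three.js (envTexture stands in for a loaded cube/HDR texture, and the mock nodes stand in for the meshes GLTFLoader creates with MeshStandardMaterial):

```javascript
const envTexture = { isTexture: true }; // stand-in for a loaded environment map
const gltfSceneMock = {
  children: [
    { material: { isMeshStandardMaterial: true, envMap: null }, children: [] },
    { material: { isMeshStandardMaterial: true, envMap: null }, children: [] },
  ],
};

// Equivalent of gltf.scene.traverse(child => { ... })
function traverse(node, callback) {
  callback(node);
  (node.children || []).forEach(child => traverse(child, callback));
}

// Assign the env map in place; the materials themselves are untouched.
traverse(gltfSceneMock, node => {
  if (node.material && node.material.isMeshStandardMaterial) {
    node.material.envMap = envTexture;
    node.material.needsUpdate = true; // triggers a one-time shader recompile
  }
});

console.log(gltfSceneMock.children.every(c => c.material.envMap === envTexture)); // true
```

In real code the traversal would be gltf.scene.traverse(...) with the same body.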