How to get all of the skinnedMesh vertices for each frame of an FBX animation

I’m new to three.js. I have an FBX animation (a human running) and it displays correctly in three.js. I know that three.js calculates the skinned vertices for each frame of the animation from the skeletal motion.
Now I want to know: how can I get all of the skinned vertices for each frame of the FBX animation?

PS: the reason is that I have a 3D LED cube (about 30 × 30 × 30) and want to show the human running animation on it, so I want to get all the skinned vertices for each frame so they can be shown on the 3D LED cube.

What’s your reason for wanting this?

I have a 3D LED cube (about 30 × 30 × 30) and want to show the human running animation, so I want to get all the skinned vertices for each frame so they can be shown on the 3D LED cube.

Could you explain this in more detail? I don’t understand.

Three.js computes the vertex positions for each frame on the GPU, and computing them on the CPU (so you can use them) is (1) not supported by the library, so you’d need to write some complex code, and (2) going to be very slow. If you can explain what you’re trying to do more specifically, there may be other options.
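To make the "complex code" concrete: what the skinning vertex shader does per vertex is linear blend skinning, a weighted sum of bone matrices applied to the bind-pose position. Below is a minimal CPU sketch of that math in plain JavaScript. The function names and data layout are illustrative, not a three.js API; the matrices are flat column-major arrays, like three.js stores them.

```javascript
// Linear blend skinning for a single vertex, done on the CPU.
// These names are illustrative -- this is just the math the skinning
// vertex shader performs for every vertex, not a three.js API.

// Apply a 4x4 column-major matrix to a point [x, y, z].
function applyMatrix4(m, [x, y, z]) {
  return [
    m[0] * x + m[4] * y + m[8] * z + m[12],
    m[1] * x + m[5] * y + m[9] * z + m[13],
    m[2] * x + m[6] * y + m[10] * z + m[14],
  ];
}

// position:     bind-pose vertex position [x, y, z]
// skinIndices:  up to 4 bone indices influencing this vertex
// skinWeights:  the matching 4 weights (summing to 1)
// boneMatrices: per-bone matrices (bone world matrix * inverse bind matrix)
function skinVertex(position, skinIndices, skinWeights, boneMatrices) {
  const out = [0, 0, 0];
  for (let i = 0; i < 4; i++) {
    const w = skinWeights[i];
    if (w === 0) continue; // skip unused influences
    const p = applyMatrix4(boneMatrices[skinIndices[i]], position);
    out[0] += p[0] * w;
    out[1] += p[1] * w;
    out[2] += p[2] * w;
  }
  return out;
}
```

Doing this for every vertex of every frame on the CPU is exactly the slow part: you would need the per-frame bone matrices and the mesh's skin index/weight attributes, then run this loop over all ~thousands of vertices per frame.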

Sorry for my poor English :(.
I made a simple LED display, a cube consisting of 30 × 30 × 30 LEDs. It will show 3D animation on the cube. I want to use an FBX exported by 3D software as the source of the animation.
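For the LED-cube side of this, once you do have the skinned vertex positions for a frame, driving a 30 × 30 × 30 display is just quantizing each position into a voxel grid. A hedged sketch in plain JavaScript, where the flat vertex layout and the fixed bounding box (so the figure doesn't jump between frames) are assumptions:

```javascript
// Quantize one frame's vertex positions into a 30x30x30 voxel grid.
// Layout and names are illustrative, not from any particular library.

const SIZE = 30;

// vertices: flat array [x0, y0, z0, x1, y1, z1, ...]
// bounds:   { min: [x, y, z], max: [x, y, z] }, fixed for the whole
//           animation so consecutive frames stay aligned
function voxelizeFrame(vertices, bounds) {
  const grid = new Uint8Array(SIZE * SIZE * SIZE); // 1 = LED on
  const span = [
    bounds.max[0] - bounds.min[0] || 1,
    bounds.max[1] - bounds.min[1] || 1,
    bounds.max[2] - bounds.min[2] || 1,
  ];
  for (let i = 0; i < vertices.length; i += 3) {
    const cell = [];
    for (let a = 0; a < 3; a++) {
      // Normalize into [0, 1], scale to the grid, clamp to valid cells.
      const t = (vertices[i + a] - bounds.min[a]) / span[a];
      cell[a] = Math.min(SIZE - 1, Math.max(0, Math.floor(t * SIZE)));
    }
    grid[cell[0] + cell[1] * SIZE + cell[2] * SIZE * SIZE] = 1;
  }
  return grid;
}
```

At 30³ resolution you lose most mesh detail anyway, so an approximate per-frame vertex dump is plenty.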

FBX export does the frame-by-frame animation automatically. It never calculates the skeleton. It just uses the final mesh’s vertices that have been skinned in another program.

So just use the FBXLoader to load your model. Your FBX should already contain the frame-by-frame information.


I don’t think that’s correct, what program(s) are you referring to? FBXLoader creates a THREE.SkinnedMesh, which animates bone positions with each frame. After animating the bone position (mixer.update(delta)), how would you get the position of the 8,000th vertex in the mesh?

But there could be some way to do this all in Blender, exporting one FBX file for each frame of the animation…

One option would be implementing GPU-side transform feedback, rendering each vertex per pixel to a render target; that might save a lot of memory without restricting FPS.

Edit: since you will use it for that use case and use FBX anyway, I approve of the FBX frame-by-frame option.

You don’t have to. Blender and most FBX exporters bake the information if you tell them to.

In the options for an FBX exporter there is an option to bake out frame-by-frame animation.



Ah, I think that’s different. It “bakes” frame-by-frame positions of bones, so that IK constraints on the bones and custom interpolation don’t have to be computed on the client. But it doesn’t bake the positions of vertices; those still have to be computed on the GPU. There’s probably a way to do this in Blender, but I think you’d need to export a separate mesh (or shape key) for each frame.


I downloaded the example’s model file and found that each frame has vertex arrays! So is it made by “baking”?

Nice! It’s different from the baking I’ve seen in other formats, but if there’s a morph target for every frame, that might be all you need. :+1:
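If the file really does store one full vertex array per frame (one morph target per frame), then reading back "all vertices at time t" becomes a simple lookup, with optional linear blending between the two nearest frames. A hedged sketch; the data layout here (an array of Float32Array frames, all the same length) is an assumption, not an FBXLoader API:

```javascript
// Sample per-frame vertex arrays at an arbitrary time, blending linearly
// between the two nearest frames. Layout is assumed, not an FBXLoader API.

// frames: array of Float32Array, each a full [x0, y0, z0, ...] vertex dump
// fps:    frame rate the arrays were baked at
// time:   time in seconds to sample
function verticesAtTime(frames, fps, time) {
  const f = time * fps;
  const i0 = Math.min(frames.length - 1, Math.max(0, Math.floor(f)));
  const i1 = Math.min(frames.length - 1, i0 + 1);
  const t = Math.min(1, Math.max(0, f - i0)); // blend factor in [0, 1]
  const a = frames[i0], b = frames[i1];
  const out = new Float32Array(a.length);
  for (let i = 0; i < a.length; i++) out[i] = a[i] * (1 - t) + b[i] * t;
  return out;
}
```

For an LED cube you could even skip the blending and just index the nearest frame.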

No man, they bake verts. That’s the purpose. Trust me, I’ve been doing 3D for a long time. That was one of the key elements back when FBX first came out: it baked vertices.

I believe you that baked vertices are possible (I think that’s called a Point Cache?), but I’ve written code exporting animation from Blender, and when you use its Bake Action (or bpy.ops.nla.bake()) function it does something different:

The Bake Action tool will apply interpolated frames into individual keyframes. This can be useful for adding deviation to a cyclic action like a walk cycle. It can also be useful for keyframe animations created from drivers or constraints.

This bakes keyframes of bone positions, not vertices. It is possible that the FBX exporter does something else, but from the source code it appears to do the same thing:

    currframe = f_start
    while currframe <= f_end:
        real_currframe = currframe - f_start if start_zero else currframe
        scene.frame_set(int(currframe), currframe - int(currframe))

        for ob_obj in animdata_ob:
            ob_obj.dupli_list_create(scene, 'RENDER')
        for ob_obj, (anim_loc, anim_rot, anim_scale) in animdata_ob.items():
            # We compute baked loc/rot/scale for all objects (rot being euler-compat with previous value!).
            p_rot = p_rots.get(ob_obj, None)
            loc, rot, scale, _m, _mr = ob_obj.fbx_object_tx(scene_data, rot_euler_compat=p_rot)
            p_rots[ob_obj] = rot
            anim_loc.add_keyframe(real_currframe, loc)
            anim_rot.add_keyframe(real_currframe, tuple(convert_rad_to_deg_iter(rot)))
            anim_scale.add_keyframe(real_currframe, scale)
        for ob_obj in objects:
            ob_obj.dupli_list_clear()
        for anim_shape, me, shape in animdata_shapes.values():
            anim_shape.add_keyframe(real_currframe, (shape.value * 100.0,))
        currframe += bake_step

No. The OBJ exporter and FBX exporter are completely different.

You are stuck on bones.

Animate something in Blender with no bones. Now try exporting that out of Blender with “bake keyframes” and uncheck any export option that includes bones. Make sure it is the FBX exporter, not the OBJ one.

Import it into 3JS and see if it animates.

Baking when exporting the FBX file (at least from 3DS Max) will create one position for each animated object per frame (or the interval you specify), but in the case of a skeleton animation, the vertices are not animated directly so there is no data directly related to vertex positions, only bone positions. Vertex positions are then calculated on the GPU as usual with skinning.

OP is talking about an animated human character, presumably animated using bones, so that seems reasonable.

The code above is copied from the source code of the Blender FBX exporter. But yes, I’m talking about bones, as that seems most relevant to this thread and to three.js, which supports skinning and keyframe animation but not (as far as I know) the FBX Point Cache feature. I couldn’t tell from the source whether the Blender exporter can create a point cache.

No, we don’t currently support this.