Hi!
I’m now familiar with TSL, but I’m still having issues retrieving morphTarget data with it.
For regular attributes, attribute('name') works. I can use it for position, normal, uv, skinIndex and skinWeight, which is useful for vertex shader manipulations. These are stored by default in buffers directly on the mesh geometry, under geometry.attributes.
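For context, this is roughly the kind of access that already works for me, sketched under the assumption of a recent three.js revision with the three/webgpu and three/tsl entry points (the normal offset is just an arbitrary example):

import { MeshStandardNodeMaterial } from 'three/webgpu';
import { Fn, attribute, positionGeometry } from 'three/tsl';

const material = new MeshStandardNodeMaterial();

// Regular geometry attributes resolve fine through attribute('name').
material.positionNode = Fn( () => {
  const normal = attribute( 'normal' );  // per-vertex normal
  const base = positionGeometry;         // same data as attribute( 'position' )
  return base.add( normal.mul( 0.01 ) ); // arbitrary vertex manipulation
} )();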
My problem is that I need the morphTarget attributes, which are stored separately on the geometry, inside geometry.morphAttributes instead. The morph influences are stored on the parent object (mesh.morphTargetInfluences), and I already managed to store that data in a uniform.
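Concretely, this is roughly what I mean, assuming a single morph target for simplicity (the updateInfluence helper name is just mine):

// Where the data lives on the three.js side (plain JS, not TSL yet):
//   geometry.morphAttributes.position  -> array of BufferAttributes, one per target
//   mesh.morphTargetInfluences         -> array of numbers, one weight per target
import { uniform } from 'three/tsl';

const influence0 = uniform( 0 ); // weight of the first morph target

// Illustrative helper: keep the uniform in sync with the mesh, e.g. once per frame.
function updateInfluence( mesh ) {
  influence0.value = mesh.morphTargetInfluences ? mesh.morphTargetInfluences[ 0 ] : 0;
}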
And in the documentation, morphs are only referenced vaguely once, in this sentence:
“The transformed term reflects the modifications applied by processes such as skinning, morphing, and similar techniques.”
I’ve tried multiple approaches. attribute('morphTarget<N>') doesn’t work, for example: on compilation I always get errors saying could not find attribute 'morphTarget<N>', with <N> being an integer index.
Is there more documentation for me to explore elsewhere?
Is there maybe already a documented vertex shader in TSL handling morphTargets?
Does anyone know how to properly retrieve morph attributes data in TSL?
My issue with that approach is that, since it’s not a TSL node, I would have to manually get the full array of morph data for every vertex. And since my shader is executed per vertex, I then need a way to align this data with the vertex currently being processed. I’m not sure there is a way, inside a TSL Fn() function, to ask for the “current vertex ID”?
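For reference, here is a rough sketch of what I’m after, assuming the TSL build exposes a vertexIndex node (recent revisions seem to export one, but I haven’t verified it); the parity offset is only a throwaway test:

import { MeshStandardNodeMaterial } from 'three/webgpu';
import { Fn, vertexIndex, positionGeometry, float, vec3 } from 'three/tsl';

const material = new MeshStandardNodeMaterial();

material.positionNode = Fn( () => {
  // If available, vertexIndex is the index of the vertex currently being processed.
  const parity = float( vertexIndex ).mod( 2 );                    // 0.0 or 1.0 per vertex
  return positionGeometry.add( vec3( 0, parity.mul( 0.02 ), 0 ) ); // throwaway visual check
} )();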
Well I definitely can’t do it manually, that’s too much data for uniforms.
Although, digging through the code again, I found morphReference(mesh), which returns a morphNode. There is no reference to this in the docs, so I’m trying to dig through the code and understand how it’s applied. So far the little I could do with it at least seemed to compile properly.
This is the code that inserts the default morphTargets behaviour into the vertexNode.
Now it seems to only be a matter of manipulating the node builder.
When you create a node using TSL, the Fn() callback receives the node builder as its first parameter by default.
vertexNode = Fn((builder) => {
...
})
I can access it this way. builder.addStack() seems to be a prerequisite.
Then, in theory, I would have thought morphReference(mesh).toStack() would do the trick. Unfortunately it does nothing. I will probably need to dig into the code again to understand the addStack() and .toStack() behaviours.
Has anyone already manipulated the node builder in a similar way? And does it sound like a good idea?
It feels like the solution lies close by.
This is purely theoretical, but what if you recreate the morph target inside your Fn? You can retrieve the original attribute and use the morph attribute as temporary storage. Then, compute the final value using mix(original, morph, weight).
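Something along these lines, as a purely illustrative sketch (single target, absolute morph data, and a made-up 'morphPosition0' attribute name):

import { Fn, attribute, positionGeometry, uniform, mix } from 'three/tsl';

// Given an existing mesh with morph targets:
const geometry = mesh.geometry;
const material = mesh.material;

// CPU side: expose the first morph target as an ordinary geometry attribute,
// so attribute() can find it in the shader.
geometry.setAttribute( 'morphPosition0', geometry.morphAttributes.position[ 0 ] );

const influence0 = uniform( 0 ); // updated from mesh.morphTargetInfluences[ 0 ] each frame

material.positionNode = Fn( () => {
  const base = positionGeometry;
  const target = attribute( 'morphPosition0' );
  // Assumes absolute morph data; for relative targets, base.add( target.mul( influence0 ) )
  // would be the equivalent.
  return mix( base, target, influence0 );
} )();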
In theory, yes, it would work to mix(geometryPosition, morphAttributes.position, morphInfluence), although maybe I just can’t see a good way to do it. When I tried implementing that kind of logic, I had to create a custom attribute to retrieve the current vertexID, store the morphAttributes in a uniform, and iterate over it using that vertexID. That’s when I discovered the size limit for uniforms, among other issues. And overall I don’t think I have the level for a full implementation of morph targets.
That’s one more reason why I’m trying to recover the default morph behaviour instead of reprogramming it.
Moments ago, after more digging, I found out that the default code was returning positionLocal, so I tried experimenting with it again, and it almost works. In my code I was using positionGeometry, which overrides everything; with positionLocal I’m getting the morph transforms back. The only issue I still haven’t managed to fix is that positionLocal is in local mesh space and my transforms are not in local space, so the mesh now collapses towards the origin the further you get from the rest pose. I’m not really an expert on this: I understand the principles but I’m not always great at putting them into practice. x)
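For anyone following along, this is my current understanding of the difference, as a rough sketch (morphSkinOffset is just a hypothetical debug value):

import { Fn, positionGeometry, positionLocal } from 'three/tsl';

// As far as I can tell:
//   positionGeometry -> the raw 'position' attribute, before any built-in transform
//   positionLocal    -> the "transformed" local position, i.e. after the default
//                       morph (and skinning) stages have been applied
const morphSkinOffset = Fn( () => {
  return positionLocal.sub( positionGeometry ); // accumulated built-in deformation
} )();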
I’ve tried more variations.
I’m convinced that I now understand what’s happening. When I use positionLocal instead of positionGeometry, I do retrieve the morph transforms, but I also retrieve the default skinning behaviour. So since the beginning I had no issue transferring between spaces; the problem is simply that the skinning is applied twice, once with the default matrix blend and once with my dual quaternions. So any transform ends up looking roughly twice as large as it should.
That means I need to find a way to tell the material to disable skinning while keeping morphs. The classic material.skinning = false doesn’t seem to affect anything, but I remember seeing clauses enabling/disabling this in the source code, so I think I can investigate and find a solution. If that works, the code should be compact and elegant!
The final solution was to use mesh.isSkinnedMesh = false, since material.skinning = false is ignored. It’s a little bit hacky, but if the mesh is no longer treated as a skinned mesh by default, that’s fine, since I override the vertexNode to skin it myself anyway.
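In short, the workaround looks roughly like this, where myDualQuaternionVertexNode stands in for my own skinning node (not shown here):

// material.skinning = false is ignored by the node material, so flag the mesh instead.
mesh.isSkinnedMesh = false; // the built-in skinning pass is skipped, while the
                            // default morph targets still apply
mesh.material.vertexNode = myDualQuaternionVertexNode; // my own skinning, applied on top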
Now I just need to finish updating the normals with similar code and that’s a wrap.