When converting a Matrix3 to a Matrix4 I’m seeing inconsistencies I wouldn’t expect: the converted Matrix4 is different from a Matrix4 created directly with the same transforms.
I’ve created a codepen at https://codepen.io/grokys/pen/eYVLBbK to demonstrate the difference (open the console at the bottom to see the output), but basically these two snippets produce different output:
var m3 = new THREE.Matrix3()
m3.scale(4, 5)
m3.translate(12, 14)
var m3to4 = new THREE.Matrix4().setFromMatrix3(m3)
// Result (m3to4.elements, column-major order): 4,0,0,0,0,5,0,0,12,14,1,0,0,0,0,1
var m4 = new THREE.Matrix4()
m4.makeScale(4, 5, 1)
m4.multiply(new THREE.Matrix4().makeTranslation(12, 14, 0))
// Result (m4.elements, column-major order): 4,0,0,0,0,5,0,0,0,0,1,0,48,70,0,1
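The 48 and 70 at least make sense for post-multiplication: m4 = S * T, so the translation column picks up the scale factors (4 * 12 = 48, 5 * 14 = 70). As a sanity check of my own (THREE in scope, as in the snippets above), getting the unscaled translation out of a Matrix4 requires pre-multiplying the translation instead:

var m4pre = new THREE.Matrix4()
m4pre.makeScale(4, 5, 1)
m4pre.premultiply(new THREE.Matrix4().makeTranslation(12, 14, 0))
// m4pre.elements: 4,0,0,0,0,5,0,0,0,0,1,0,12,14,0,1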
Things to note:
- In the Matrix4 the translation is multiplied by the scale; in the Matrix3 it’s not (is Matrix3 using pre-multiplication for translate? This is unexpected because multiply is specified as using post-multiplication, so I’d have thought that would be the default; see the check after this list)
- The translation values appear in different places
- Decomposing the 3x3->4x4 matrix gives strange results
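To test the pre-multiplication theory: building the translation as an explicit Matrix3 and pre-multiplying it reproduces m3 exactly (my own check, THREE in scope as above):

var t = new THREE.Matrix3()
t.set(
  1, 0, 12,
  0, 1, 14,
  0, 0, 1
)
var check = new THREE.Matrix3()
check.scale(4, 5)
check.premultiply(t) // check = T * S
// check.elements: 4,0,0,0,5,0,12,14,1, identical to m3 above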
What I think is happening:
- even though multiply on Matrix3 specifies post-multiplication, translate (and possibly scale and rotate) use pre-multiplication
- converting a Matrix3 to a Matrix4 doesn’t work as I’d expect: it just does a row-wise copy, meaning that what should appear in row 4 stays in row 3 (the conversion I expected is sketched below)
Is this intentional? Is converting a 3x3 matrix to a 4x4 matrix not supposed to be done like this?