Converting JSON to GLTF via legacythree2gltf issue

I tried to convert the Marine JSON model from prior library revisions using the legacythree2gltf converter:


marine.zip (2.1 MB)

The first issue I had is that while the JSON file is loaded via readFileSync, the texture is loaded via a regular Three.js TextureLoader. I think that is why running ...three.js\utils\converters>node .\legacythree2gltf.js .\marine.json just ended its work without any output. I added some console.log calls and traced the execution through LegacyJSONLoader: THREE.Loader.prototype.initMaterials -> Loader: this.createMaterial( materials[ i ] ... ) -> json.map = loadTexture(...) -> texture = _textureLoader.load( fullPath ) -> TextureLoader: loader.load( url ... ), where it stopped. I would have expected an error here, but nothing appeared in my console.
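One way to at least surface such silent failures (a sketch on my part, assuming the loader follows the usual Three.js load( url, onLoad, onProgress, onError ) signature; the legacy code path may not forward the error callback) is to wrap the call with an onError handler:

```javascript
// Sketch: wrap a Three.js-style loader so failures are logged instead of
// disappearing silently. Assumes load( url, onLoad, onProgress, onError ).
function loadWithLogging( loader, url, onLoad ) {

	loader.load(
		url,
		onLoad,
		undefined,
		( err ) => console.error( 'Failed to load ' + url + ':', err )
	);

}
```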

Then I commented out the materials loading from LegacyJSONLoader and the process completed with a GLTF file. Unfortunately, when I import it to Blender, all I have is the skeleton and the animations but not the mesh (or materials, which I expected not to be there).

With the Online GLTF converter I get the model, but no animations, skeleton or the materials.

How could I get the Marine into the new GLTF format such that it would have: mesh, textured material, skeleton and the animations (idle, walk and run)?

I don’t know why but the soldier’s material is transparent with an opacity of 0 (which results in an invisible mesh). Check out the related material definition in the JSON:

"materials" : [	{
	"DbgColor" : 15658734,
	"DbgIndex" : 0,
	"DbgName" : "MaleMarineC",
	"blending" : "NormalBlending",
	"colorDiffuse" : [0.5364704212020399, 0.5364704212020399, 0.5364704212020399],
	"colorSpecular" : [0.19372500479221344, 0.19372500479221344, 0.19372500479221344],
	"depthTest" : true,
	"depthWrite" : true,
	"mapDiffuse" : "MarineCv2_color.jpg",
	"mapDiffuseWrap" : ["repeat", "repeat"],
	"shading" : "Lambert",
	"specularCoef" : 9,
	"transparency" : 0.0,
	"transparent" : true,
	"vertexColors" : false
}],
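If I understand the legacy loader correctly (an assumption on my part), the old "transparency" key is remapped to opacity by THREE.Loader.createMaterial, so transparency: 0.0 together with transparent: true yields a fully invisible material. A minimal sketch of that remapping:

```javascript
// Sketch (assumption): the legacy "transparency" key actually meant opacity,
// so the loader copies it over — a value of 0.0 makes the mesh invisible.
function normalizeLegacyMaterial( json ) {

	const out = Object.assign( {}, json );

	if ( 'transparency' in out ) {

		out.opacity = out.transparency;
		delete out.transparency;

	}

	return out;

}
```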

After replacing this section with the following content, I was able to load the file via LegacyJSONLoader:

"materials" : [	{
	"DbgColor" : 15658734,
	"DbgIndex" : 0,
	"DbgName" : "MaleMarineC",
	"blending" : "NormalBlending",
	"colorDiffuse" : [0.5364704212020399, 0.5364704212020399, 0.5364704212020399],
	"colorSpecular" : [0.19372500479221344, 0.19372500479221344, 0.19372500479221344],
	"depthTest" : true,
	"depthWrite" : true,
	"mapDiffuse" : "MarineCv2_color.jpg",
	"mapDiffuseWrap" : ["repeat", "repeat"],
	"shading" : "Lambert",
	"specularCoef" : 9,
	"opacity" : 1.0,
	"transparent" : false,
	"vertexColors" : false
}]

Result: the model now loads via LegacyJSONLoader (screenshot omitted).

Could you please enhance the converter with this code section:

THREE.ImageLoader.prototype.load = function ( url, onLoad ) {

	if ( this.path !== undefined ) url = this.path + url;

	if ( ! fs.existsSync( url ) ) {

		// Missing file: hand back an empty buffer instead of crashing.
		onLoad( Buffer.alloc( 0 ) );
		return;

	}

	onLoad( fs.readFileSync( url ) );

};

Otherwise ImageLoader does not work in a Node environment.

I tried that, and also patched the converter by replacing the ImageLoader.load function to read from the filesystem, as recommended.

I’m still getting only the skeleton and the animations out. There seems to be something happening in the GLTFExporter. If I console.log things like mesh.geometry.attributes.position, mesh.geometry.index and mesh.material.map right before the exporter is called, then all of these now have some data in them. So actually loading the JSON file seems to work. But in the resulting GLTF file even the meshes key is missing.

So far I’ve found that there is this line in GLTFExporter's processMesh():

if ( isMultiMaterial && geometry.groups.length === 0 ) return null; 

For some reason this returns null for the Marine model, without any error message or notice. I guess there should be some message about what is wrong with the mesh and how to fix it.
Right now it seems that adding something like this gets past that part:

// A single-element material array is not really multi-material.
if ( isMultiMaterial && mesh.material.length === 1 ) {
	isMultiMaterial = false;
	mesh.material = mesh.material[ 0 ];
}
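For reference, the bail-out condition boils down to something like this (a standalone sketch of the check, not the exporter's exact code): GLTFExporter treats any Array material as multi-material, and the geometry produced by the converter has no groups, so the mesh gets skipped:

```javascript
// Sketch: why processMesh() drops the Marine. An Array material counts as
// multi-material, and without geometry groups the mesh is not exported.
function shouldSkipMesh( material, groups ) {

	const isMultiMaterial = Array.isArray( material );
	return isMultiMaterial && groups.length === 0;

}
```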

But the next problem is when the processMesh() calls processMaterial(), which calls processTexture(). In the latter there is this code:

if ( options.embedImages ) {

	var canvas = cachedCanvas = cachedCanvas || document.createElement( 'canvas' );
	... 

This fails in the Node environment (again without any error messages; I had to put console.logs everywhere to track it down), because there is no document object there.

So I configured the exporter not to embed the images in legacythree2gltf.js:

var exporter = new THREE.GLTFExporter();
exporter.parse( mesh, ( json ) => {

	var content = JSON.stringify( json );
	fs.writeFileSync( path.basename( file, '.json' ) + '.gltf', content, 'utf8' );

}, { binary: false, animations, embedImages: false } );

But then Blender gave some texture errors when I imported the GLTF. So from the GLTF file I also had to remove this part under the material's pbrMetallicRoughness key:

"baseColorTexture": {
	"index": 0
}
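That manual edit could also be scripted; a hypothetical helper (stripBaseColorTextures is my own name, not part of the converter) that removes the dangling texture references before writing the file:

```javascript
// Hypothetical helper: remove baseColorTexture references from a parsed
// .gltf object when the images were not actually embedded or written.
function stripBaseColorTextures( gltf ) {

	for ( const material of gltf.materials || [] ) {

		if ( material.pbrMetallicRoughness ) {

			delete material.pbrMetallicRoughness.baseColorTexture;

		}

	}

	return gltf;

}
```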

After this the import to Blender finally worked, and I got the mesh, skeleton, and animations.

I’m not really sure if the artifacts near the armpits and hips are an issue with the model or the importer/exporter. When I use the --optimize flag to merge the verts, then it is less noticeable.

Okay, I think I finally got it working.

First, to get the error messages I had to add a try-catch block around the entire legacythree2gltf.js code:

try {
	// the entire script contents
} catch(e) {
	console.log(e);
}

Then I could easily see the errors and where they came from, instead of the script just quietly stopping without any output.
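An alternative sketch, instead of wrapping the whole script: install process-level handlers so uncaught errors and rejected promises are printed before Node exits (this assumes the failures actually propagate that far rather than being swallowed inside a callback):

```javascript
// Sketch: print otherwise-silent failures at the process level.
process.on( 'uncaughtException', ( err ) => {

	console.error( 'Uncaught exception:', err );
	process.exit( 1 );

} );

process.on( 'unhandledRejection', ( reason ) => {

	console.error( 'Unhandled rejection:', reason );
	process.exit( 1 );

} );
```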

The only thing that was still missing was the texture. I decided to embed the texture in the GLTF file.

Two changes were necessary:

1) The change recommended earlier, with one correction: its last line was wrong. At the end of THREE.ImageLoader.prototype.load, instead of

      onLoad( fs.readFileSync( url ) );

there should be:

	// Canvas.Image comes from the node-canvas package.
	var image = new Canvas.Image();
	image.src = fs.readFileSync( url );
	onLoad( image );

2) In the part that overwrites the global.document to simulate the browser environment, instead of:

	const canvas = new Canvas( 256, 256 );

there should be

	const canvas = new Canvas.Canvas( 256, 256 );
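For context, the global.document stub can be factored like this (a sketch of the pattern, not the converter's exact code; with node-canvas 2.x one could also use its createCanvas() factory instead of new Canvas.Canvas()):

```javascript
// Sketch: a minimal document stub so GLTFExporter's
// document.createElement( 'canvas' ) call works under Node.
// The canvas factory is injected; node-canvas 2.x provides one.
function makeDocumentStub( createCanvas ) {

	return {

		createElement: ( nodeName ) => {

			if ( nodeName !== 'canvas' ) {

				throw new Error( 'Cannot create node ' + nodeName );

			}

			return createCanvas( 256, 256 );

		}

	};

}
```

Something like global.document = makeDocumentStub( ( w, h ) => new Canvas.Canvas( w, h ) ); would then reproduce the fixed behaviour.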

Now with embedImages: true the script seems to correctly pack the texture into the GLTF file. When I import it into Blender, everything seems to be there.

I no longer have to manually remove the

"baseColorTexture": {
	"index": 0
}

part from the file, since the exporter can now read the actual texture.