Can't apply material to loaded model. Nothing helps

I have a problem which I’ve been struggling with for 3 days in a row already.
I cannot apply MeshStandardMaterial to a loaded object mesh.

I’m loading a glTF model compressed with DRACO. I also tried it without compression, and I tried FBX as well. No difference.

But when I create a test cube dynamically, I can apply the materials without any problem.

Here is how it looks in Firefox and Edge (a black book model and the test cube):

And here is the same in Chrome (it shows only the .map channel, and it flickers):

I tried all the three.js materials except RawShaderMaterial, ShadowMaterial, and SpriteMaterial. Here are the results:

Materials which apply to the loaded model successfully:

  • LineBasicMaterial
  • LineDashedMaterial
  • MeshBasicMaterial
  • MeshDepthMaterial
  • MeshDistanceMaterial
  • MeshLambertMaterial
  • MeshMatcapMaterial

Materials which don’t work:

  • MeshNormalMaterial
  • MeshPhongMaterial
  • MeshPhysicalMaterial ⟵ the one I need
  • MeshStandardMaterial
  • MeshToonMaterial

And it doesn’t matter whether the material is textured or left at its default settings (just white).

I assume the problem is more global than just a mistake in my code; maybe something is wrong with the model. When I tried importing the model into the three.js editor the symptoms remained the same: the dynamic cube is textured correctly, while the imported model is black.

I’m getting my models (both glTF and FBX) by exporting them from Cinema 4D.

Finally, here is the part of the code responsible for texturing. Besides this, the scene contains a cubemap, ambient light, etc.:

// material
let bookMaterial = new THREE.MeshPhysicalMaterial({
    map: textureLoader.load('/images/book-cover_col.jpg', handleTexture),
    bumpMap: textureLoader.load('/images/book-cover_bump.jpg', handleTexture),
    envMap: cubemap,
    roughness: 0.05,
    metalness: 0.6,
    bumpScale: 0.001
});

// load mesh
gltfLoader.load('book-cover-d_v4.gltf', // file name from the attachments; adjust the path

    function(gltf) {

        let bookMesh = gltf.scene;
        bookMesh.traverse((child) => {
            if (child.isMesh) {
                child.material = bookMaterial;
            }
        });
        scene.add(bookMesh);
    }
);
Just in case, I’ve attached the 3D model and simplified source code.
glTF with DRACO: book-cover-d_v4.gltf (14.5 KB)
glTF uncompressed: book-cover_v4.gltf (37.7 KB)
JavaScript source: source.js (3.9 KB)

Thank you in advance for the help! After these 3 days I don’t know what else to try.

Can you also share your code as a git repository? Right now, it’s quite complicated to debug your issue.

BTW: It’s not correct to set Texture.encoding to THREE.sRGBEncoding for a bump map. It’s not a color texture.
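For context on why that matters: when a texture is tagged as sRGB, the renderer decodes it with a nonlinear transfer curve intended for color data, so the raw height values in a bump map get silently distorted. A plain-JS sketch of the standard sRGB-to-linear transfer function (per IEC 61966-2-1; shown here only to illustrate the distortion, this is not three.js API):

```javascript
// sRGB-to-linear transfer function: near-linear for dark values,
// a 2.4-exponent power curve everywhere else.
function srgbToLinear(c) {
    return c <= 0.04045
        ? c / 12.92
        : Math.pow((c + 0.055) / 1.055, 2.4);
}
```

A mid-gray bump value of 0.5 decodes to roughly 0.214, so every height in the map would be remapped nonlinearly — which is why data textures (bump, normal, roughness) must stay in linear encoding.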

@Mugen87, I’ve created the repository:
It runs on Node.js + Express.

Ok, thanks, fixed this.

I’ve found the problem!
Localizing the problem step by step over these days led me to a broken Cinema 4D export.
I was using this solution, which is still at the experimental stage.

I’ve installed Blender and used glTF export. And it works perfectly.
While playing around with Blender’s export I found that if I switch the normals off, the loaded model shows exactly the same errors as the one exported from Cinema 4D, and I cannot apply materials to it. But I was exporting the model from Cinema 4D with normals enabled. That’s why I think something is wrong with the normals in C4D’s glTF exporter.
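For anyone hitting the same black-model symptom: materials that compute lighting need valid surface normals, so missing or corrupted normals can render black, while unlit materials (e.g. MeshBasicMaterial) still work. In three.js you can rebuild them after loading with `geometry.computeVertexNormals()` on each mesh. Under the hood, a face normal is just the normalized cross product of two triangle edges; a plain-JS sketch of that math (illustration only, not the three.js implementation):

```javascript
// Face normal of triangle (a, b, c), each a [x, y, z] array:
// normalize( (b - a) × (c - a) )
function faceNormal(a, b, c) {
    // edge vectors from vertex a
    const e1 = [b[0] - a[0], b[1] - a[1], b[2] - a[2]];
    const e2 = [c[0] - a[0], c[1] - a[1], c[2] - a[2]];
    // cross product e1 × e2
    const n = [
        e1[1] * e2[2] - e1[2] * e2[1],
        e1[2] * e2[0] - e1[0] * e2[2],
        e1[0] * e2[1] - e1[1] * e2[0]
    ];
    // normalize to unit length
    const len = Math.hypot(n[0], n[1], n[2]);
    return n.map(v => v / len);
}
```

If an exporter writes zero-length or garbage normals, the lighting term goes to zero and the mesh shades black no matter which lit material you assign.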

Tip on DRACO compression:
While testing Blender’s glTF exporter I tried its built-in Draco compressor (visible in the screenshot above).
And I compressed the same (uncompressed) model with gltf-pipeline.

Here are the results:
The uncompressed model: 194 KB
Blender’s Draco compressor (default settings, see screenshot): 75 KB
gltf-pipeline (default settings): 19 KB

I’m sure that if I played with the compression settings in Blender I could achieve the same result as with gltf-pipeline. But for now I know nothing about those parameters and don’t know how to use them. So if you don’t want to bother, like me, just use gltf-pipeline for better compression.
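For reference, the gltf-pipeline invocation is a one-liner. A sketch of the commands (installation assumes Node.js/npm; the compression-level flag is from gltf-pipeline’s documented Draco options and the value shown is just an example to tune):

```shell
# install the CLI once
npm install -g gltf-pipeline

# Draco-compress a glTF with default settings (-d enables Draco)
gltf-pipeline -i book-cover_v4.gltf -o book-cover-draco.gltf -d

# optionally raise the compression level (0-10, default 7)
gltf-pipeline -i book-cover_v4.gltf -o book-cover-draco.gltf -d --draco.compressionLevel 10
```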


There’s an open bug about this; it might be an issue in Blender’s Draco compression implementation: