So far I have read responses from @donmccurdy & @Mugen87 on a couple of similar posts. From this I tried including:
physically correct lights
sRGBEncoding
gammaOutput: true
gammaFactor at 2.2
I currently do all of this at the top of my file like this:
var renderer = new THREE.WebGLRenderer( { antialias: true } );
renderer.physicallyCorrectLights = true;
renderer.gammaOutput = true; // pre-r112 gamma workflow
renderer.gammaFactor = 2.2;
// On r112+ the gamma flags are replaced by:
// renderer.outputEncoding = THREE.sRGBEncoding;

(Note: of those options, only antialias is a constructor parameter; the rest are properties that must be set on the renderer after construction, otherwise they are silently ignored.)
And here is my GLTFLoader code:
// Load a glTF resource
loader.load(
	// resource URL
	'assets/peachpeach2.gltf',
	// called when the resource is loaded
	function ( gltf ) {
		mesh = gltf.scene;
		mesh.scale.set( 20, 20, 20 );
		meshGroup.add( mesh );
		meshGroup.position.x = 0;
		meshGroup.position.y = 0;
		meshGroup.position.z = 0;
		scene.add( meshGroup );
		// scene.add( gltf.scene );
		// gltf.animations; // Array<THREE.AnimationClip>
		// gltf.scene;      // THREE.Scene
		// gltf.scenes;     // Array<THREE.Scene>
		// gltf.cameras;    // Array<THREE.Camera>
		// gltf.asset;      // Object
	},
	// called while loading is in progress
	function ( xhr ) {
		console.log( ( xhr.loaded / xhr.total * 100 ) + '% loaded' );
	},
	// called when loading has errors
	function ( error ) {
		console.error( 'An error happened', error );
	}
);
I think I also saw it mentioned that I should be using a traverse over the model inside my GLTFLoader callback to set things like sRGBEncoding, etc. But from the few explanations I've read that reference the traverse method, I struggle with the syntax and don't know how to fit it into the GLTFLoader code I have.
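For what it's worth, a minimal sketch of the traverse pattern dropped into the success callback above. This assumes an r105-era API where encoding is set per texture, and that the colour (albedo) map lives on material.map:

```javascript
loader.load( 'assets/peachpeach2.gltf', function ( gltf ) {

	// traverse() visits every descendant of the loaded scene graph,
	// so each mesh's material can be adjusted in one place
	gltf.scene.traverse( function ( child ) {

		if ( child.isMesh && child.material.map ) {

			// colour (albedo) maps should be tagged as sRGB
			child.material.map.encoding = THREE.sRGBEncoding;

			// the shader must recompile after an encoding change
			child.material.needsUpdate = true;

		}

	} );

	scene.add( gltf.scene );

} );
```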
I’m scouring my code trying to think what else it could be. I have an ambient light and a directional light, set at intensities of 1.1 and 0.1 respectively.
I’m not sure how I would create a live example on fiddle for this because of the custom asset hosting, but here is the full document (it’s not that long): index.js
That sounds likely to be the issue, yes. If you’re installing them from npm, then both files can be accessed through the npm installation. If you’ve downloaded the library from GitHub, you can use the branch/version dropdown to find r105, then navigate to the examples/js/loaders (or examples/jsm/loaders) directory:
Frustratingly the problem still persists with r105 being used for both three and gltfloader.
My only solution thus far is compensating for the effect inside Blender: I’ve been making Blender materials with a darker-than-usual colour, and it’s getting me closer to ‘normal’. I don’t think it’s an efficient way to work, but it will do as a last resort if I need the project live.
I’ll keep trying to solve this; if anyone in the meantime notices something in my code, please let me know.
Apologies, I was head-down in other work. From what I can see now, I think some of my lighting was still affecting the colour. The fix does work for the heavy over-saturation.
I also needed to get my lights in check to get the final exact replication.
I now have a perfectly represented model, thank you both.
I had this problem too. If I manually set a texture’s encoding to sRGBEncoding, it renders really saturated. The default value for a texture is LinearEncoding:
The GLTFLoader however defaults to using sRGBEncoding:
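To make the difference concrete, a sketch of the two behaviours side by side (property names from the pre-r152 encoding API; the file name is a placeholder):

```javascript
// A texture created through TextureLoader defaults to linear:
var texture = new THREE.TextureLoader().load( 'diffuse.png' );
console.log( texture.encoding === THREE.LinearEncoding ); // true

// GLTFLoader, by contrast, marks colour textures as sRGB while parsing,
// roughly equivalent to doing this yourself:
var material = new THREE.MeshStandardMaterial( { map: texture } );
material.map.encoding = THREE.sRGBEncoding;
material.needsUpdate = true; // shader must recompile after an encoding change
```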
It’s really confusing trying to get the right appearance, especially when a glTF shares a texture with another asset in the scene; which encoding ends up being used may depend on the load order. As mentioned in some posts above, it should just be a case of setting renderer.outputEncoding to sRGBEncoding and everything should look OK, but I’ve found it hard to get the output to match the source. If I use settings that fix the glTF objects, other textures look washed out.
I managed to get one of my scenes looking correct by setting the renderer outputEncoding to Linear as well as the glTF textures, but I don’t think that’s how it’s supposed to be done, and it didn’t work the same way on another scene.
The most problematic scenes were older ones that had been upgraded; newer scenes seemed to work correctly just with outputEncoding = sRGBEncoding. The way lights behave also seems to have changed since older versions, so that has definitely had an effect too.
GLTFLoader configures things with the goal of fitting correctly into that workflow. Other loaders may vary, especially since not all formats even contain enough information to know what colorspace the textures and vertex colors actually are using.
For an older project that you just want to freeze with an older gamma workflow, you could also set all the textures and renderer.outputEncoding to LinearEncoding.
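That suggestion can be sketched like this, assuming renderer and scene are your existing WebGLRenderer and Scene, and that colour maps sit on material.map:

```javascript
// Freeze an older project on the legacy linear workflow:
renderer.outputEncoding = THREE.LinearEncoding;

// force every colour map in the scene back to linear as well
scene.traverse( function ( child ) {

	if ( child.isMesh && child.material.map ) {

		child.material.map.encoding = THREE.LinearEncoding;
		child.material.needsUpdate = true; // recompile with the new encoding

	}

} );
```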
The best practices list is interesting. Given that the recommended encoding for colour maps is sRGB and TextureLoader defaults to Linear, all manually loaded textures used as colour maps should be explicitly set to sRGB. Perhaps the material classes could choose the encoding automatically unless the user has assigned one.
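A sketch of that rule for manually loaded textures (file names are placeholders):

```javascript
var loader = new THREE.TextureLoader();

// colour data: tag as sRGB explicitly, since TextureLoader defaults to linear
var albedo = loader.load( 'textures/albedo.jpg' );
albedo.encoding = THREE.sRGBEncoding;

// data maps (roughness, metalness, normal, ao) carry raw values,
// so they stay at the LinearEncoding default
var roughness = loader.load( 'textures/roughness.jpg' );

var material = new THREE.MeshStandardMaterial( {
	map: albedo,
	roughnessMap: roughness
} );
```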
The last suggestion, using LinearEncoding for older projects, makes sense. That seemed to work, but I wasn’t aware I was using a gamma workflow. Some textures I use are done in Photoshop, others are baked in 3D software. It’s hard to know whether baked maps are gamma-corrected or not; I guess baked albedo maps are linear?
This issue is mentioned here:
“The same thing happened to me when exporting a roughness map from Substance Painter. When importing a map that has a very specific value for something like roughness or an alpha map, you should uncheck the sRGB box inside the texture. The same way that normal maps uncheck it by default because the RGB values need to be precise and not gamma corrected like most textures.”
This will take a bit of trial and error in the workflow to get everything right but it will be much easier with this explanation.
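As a starting point for that trial and error, one possible helper; the list of colour-carrying maps here is an assumption and should be adjusted for the material types actually in use:

```javascript
// Hypothetical helper: tag only colour-carrying maps as sRGB and leave
// data maps (roughness, metalness, normal, ao, alpha) at the linear default.
function fixEncodings( root ) {

	root.traverse( function ( child ) {

		if ( ! child.isMesh ) return;

		var material = child.material;

		[ material.map, material.emissiveMap ].forEach( function ( tex ) {

			if ( tex ) tex.encoding = THREE.sRGBEncoding;

		} );

		material.needsUpdate = true; // pick up the encoding changes

	} );

}
```

It could be called as fixEncodings( gltf.scene ) inside the loader callback, or on any subtree whose textures were loaded manually.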