Threejs: Linear maps (like normal, roughness etc.) and gamma correction

I’ve activated in threejs the following options:

  renderer.gammaFactor = 2.2
  renderer.gammaOutput = true
  renderer.toneMapping = ReinhardToneMapping
  renderer.toneMappingExposure = 1.0

And on color maps the following encoding:

texture.encoding = sRGBEncoding

Now according to the manual (https://threejs.org/docs/index.html#api/en/constants/Textures) this option is only valid for color maps, envMaps and emissiveMaps.

I’ve downloaded from Poliigon (a texture site) some PBR materials which contain all the relevant maps (color, roughness (after inverting), metalness and so forth). But the maps are JPGs, which means they are in sRGB color space and therefore gamma-corrected. BUT the gamma value that is reported is 0.4545, which is 1/2.2. As far as I understand this, since the image is gamma-corrected with the inverse value, applying gamma correction again will bring the image back to linear space, and the information from the map can then be used correctly by the shader.
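To make that round trip concrete, here is a minimal plain-JavaScript sketch. It uses the common pure-2.2 approximation of the sRGB curve (the exact sRGB transfer function is piecewise and differs slightly); the function names are mine:

```javascript
// Approximate sRGB transfer functions using a pure 2.2 gamma
// (the real sRGB curve is piecewise; 2.2 is the usual approximation).

// Encoding: linear -> sRGB, i.e. raising to 1/2.2 (~0.4545)
function linearToSRGB(c) {
  return Math.pow(c, 1 / 2.2);
}

// Decoding: sRGB -> linear, i.e. raising to 2.2
function sRGBToLinear(c) {
  return Math.pow(c, 2.2);
}

// A stored sRGB texel value of 0.5 corresponds to a much darker
// linear intensity, which is what the shader must work with:
const linear = sRGBToLinear(0.5);        // ~0.218
const roundTrip = linearToSRGB(linear);  // back to ~0.5

console.log(linear.toFixed(3), roundTrip.toFixed(3));
```

So the 0.4545 reported by image tools is the *encoding* exponent (1/2.2); decoding with 2.2 undoes it, which is exactly the "new gamma correction brings the image to linear space" step described above.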

Is this thought process correct? And if so:

Is three.js performing this conversion, since the docs imply these maps should be in linear space when handed to the shader?

It could also be possible that I’m missing something – this gamma topic is new to me and I’m not sure I have all the puzzle pieces together…

(this post is a crosspost from https://stackoverflow.com/questions/58709117/threejs-linear-maps-like-normal-roughness-etc-and-gamma-correction)

If a texture is in sRGB color space (like your diffuse texture) then it’s necessary to set Texture.encoding to sRGBEncoding. This information is used by three.js to convert the respective texels into linear space when reading them in the shader. Why? Because lighting calculations are performed in linear space. In your case, the output encoding of the renderer is a gamma correction that is performed at the end of the fragment shader.
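The order of operations described above can be modeled with a toy plain-JavaScript pipeline. All names here are illustrative stand-ins, not three.js internals, and the "lighting" is just a scalar multiply:

```javascript
// Toy model of the linear-workflow pipeline: decode sRGB texel ->
// lighting in linear space -> gamma-encode the output.
// All function names are illustrative, not three.js internals.

const GAMMA = 2.2;

const decodeSRGB = (c) => Math.pow(c, GAMMA);       // texel -> linear (sRGBEncoding)
const encodeGamma = (c) => Math.pow(c, 1 / GAMMA);  // linear -> output (gammaOutput)

// Stand-in for lighting: scale the albedo by an incoming light intensity.
function shade(srgbTexel, lightIntensity) {
  const albedoLinear = decodeSRGB(srgbTexel);      // decode when reading the map
  const litLinear = albedoLinear * lightIntensity; // lighting math in linear space
  return encodeGamma(litLinear);                   // gamma correction at the end
}

console.log(shade(0.5, 1.0)); // ≈ 0.5: at full light, decode and encode cancel out
```

Note that with full light the decode and the output encode cancel, which is why a correctly configured pipeline does not visibly change unlit texture colors.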

Note that loaders like GLTFLoader automatically set Texture.encoding since they assume the above linear shader workflow. Besides, it was actually planned to remove gammaOutput and introduce outputEncoding instead, in order to make this encoding/decoding setup clearer for the user (and configurable).
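For context, a typical glTF loading setup under this workflow might look like the sketch below. The model path is hypothetical, and the commented-out line reflects the outputEncoding API that later releases introduced in place of gammaOutput:

```javascript
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const renderer = new THREE.WebGLRenderer();
renderer.gammaFactor = 2.2;
renderer.gammaOutput = true;
// later releases instead: renderer.outputEncoding = THREE.sRGBEncoding;

const scene = new THREE.Scene();

new GLTFLoader().load('model.glb', (gltf) => {
  // GLTFLoader has already set Texture.encoding on the color and
  // emissive maps, so no manual encoding setup is needed here.
  scene.add(gltf.scene);
});
```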

1 Like

@Mugen87 Thanks for your reply. My question is: is this performed for roughness, normal and metalness maps as well? I’m not sure, because the docs say LinearEncoding is the only valid option for these kinds of textures, but the maps I download are all JPGs with a gamma of 0.4545, so they would need some sort of conversion(?)

No, these maps are supposed to be already in linear space. I’m not sure how your textures were generated but normally exporters respect this encoding convention. For example the glTF spec says about normal textures:

The texture contains RGB components in linear space.
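Put together, a manual material setup along these lines would only tag the color map; the non-color maps keep the default LinearEncoding. This is a sketch under the API of the time, and the file names are assumptions:

```javascript
import * as THREE from 'three';

const loader = new THREE.TextureLoader();

// Only the color (albedo) map holds sRGB-encoded color data:
const colorMap = loader.load('wood_color.jpg');
colorMap.encoding = THREE.sRGBEncoding; // decoded to linear in the shader

// Data maps stay at the default LinearEncoding – no decode is applied:
const roughnessMap = loader.load('wood_roughness.jpg');
const normalMap = loader.load('wood_normal.jpg');
const metalnessMap = loader.load('wood_metalness.jpg');

const material = new THREE.MeshStandardMaterial({
  map: colorMap,
  roughnessMap,
  normalMap,
  metalnessMap,
});
```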

If I run identify -verbose <ROUGHNESS_MAP> (from ImageMagick) I get sRGB with gamma 0.4545 – that’s why I assumed they need to be gamma-corrected before processing. So I guess I have to convert them to linear space with ImageMagick and save them as PNG.

Thanks for your reply and hint about glTF, this is something I have to keep in mind going forward

From the [ImageMagick docs](https://imagemagick.org/script/command-line-options.php#colorspace):

ImageMagick assumes the sRGB colorspace if the image format does not indicate otherwise

I don’t think JPG stores color space information, so it will always return sRGB here.

@looeee That is true, but shouldn’t the gamma value be an indicator? For a linear image it should be 1, shouldn’t it? (I’m really not sure… (-: )

In general you can assume that your non-color maps (roughness, normals, etc.) will be in linear space if they’re coming from any software that knows about 3D data. Blender, Substance Painter, and Poliigon should certainly have this right. I don’t know much about the metadata on the JPG files themselves, or whether that’s reliable, but if the normals etc. look correct you’re probably fine. Sorry that’s not an especially satisfying answer. :wink:

1 Like