How to use a 32-bit HDR lightmap properly in ThreeJS

Hey guys, I have been stuck with this problem for a few days. I want to present my scene in three.js with baked lighting.
Here is my workflow.

  1. Prepare the whole scene in Blender (using the Filmic color space, with the display device set to sRGB).
  2. Export the scene as glTF with a .bin file. All materials were set up properly, all color textures were sRGB, and the property maps (roughness, normal) were greyscale.
  3. Bake the lightmap in Blender with these settings. To do this, I manually create a 32-bit 1024×1024 image, and after baking I save the baked lightmap in 32-bit HDR format.
  4. Now, in three.js, I load my HDR-format lightmap correctly (a rough loading sketch is shown below). However, there is a noticeable color discrepancy between Blender and three.js.
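
For reference, here is a minimal sketch of step 4, assuming the file names `scene.glb` and `lightmap.hdr`, an existing `scene`, and that every baked mesh receives the same lightmap (these names are placeholders, not the exact code from the thread):

```js
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { RGBELoader } from 'three/examples/jsm/loaders/RGBELoader.js';

// Load the exported glTF scene and the baked .hdr lightmap.
// (Top-level await: run this inside an ES module or an async function.)
const gltf = await new GLTFLoader().loadAsync('scene.glb');
const lightMap = await new RGBELoader().loadAsync('lightmap.hdr');
lightMap.flipY = false; // glTF convention: textures are not flipped vertically

gltf.scene.traverse((obj) => {
  if (obj.isMesh) {
    // Light maps are sampled with the second UV set, so each mesh needs a
    // uv2 attribute (or set lightMap.channel = 1 on newer three.js releases).
    obj.material.lightMap = lightMap;
    obj.material.lightMapIntensity = 1.0;
    obj.material.needsUpdate = true;
  }
});

scene.add(gltf.scene);
```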

I did some research and have some thoughts about my problem:
a. Is there any problem with layering a 32-bit lightmap on top of an 8-bit sRGB texture? From my understanding, when applying a 32-bit HDR (linear color) lightmap to a texture (8-bit sRGB, gamma corrected), three.js should automatically convert the sRGB texture to linear color space, so the lighting calculation should be correct. Hence the sRGB texture should be fine. But I am not sure how three.js treats the HDR (linear color space) lightmap. I looked into the GLSL shaders on GitHub. Here is how three.js handles the map (texture) and the lightmap in the shader:
(screenshots of the three.js shader chunks that sample the map and the lightmap)

From my understanding (sorry if I am wrong), the map is converted to linear and the lightmap is converted as well. Hence my lightmap, which is already linear, effectively gets converted twice, which leads to wrong results.
If my thinking is correct, then all I need to do is convert my linear HDR lightmap to sRGB and it should work? But I am afraid the data lost in that conversion will give a bad rendering result in three.js.
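
If that double conversion really is the issue, another option (a sketch, assuming the lightmap texture has already been loaded into a variable called `lightMap` and the color map into `baseColorTexture`) is to tell three.js that the lightmap is linear, so no sRGB decode is applied to it:

```js
import * as THREE from 'three';

// The .hdr data is already linear, so declare it as such and the renderer
// will not apply an sRGB-to-linear decode to it.
lightMap.colorSpace = THREE.LinearSRGBColorSpace; // r152+
// lightMap.encoding = THREE.LinearEncoding;      // pre-r152 equivalent

// The 8-bit color texture stays sRGB; the renderer decodes it to linear
// before the lighting math, so combining it with a linear lightmap is fine.
baseColorTexture.colorSpace = THREE.SRGBColorSpace; // r152+
// baseColorTexture.encoding = THREE.sRGBEncoding;  // pre-r152 equivalent
```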

  b. I am not 100% sure whether the 32-bit HDR lightmap I saved (from Blender) is actually in linear color space. Is there any way I could make sure of this?
    Any suggestions or directions would be appreciated! Thanks in advance!

That is correct. Assuming the encoding property of a texture is properly defined, the renderer will decode texels into linear color space for further computations.

Besides, light maps don’t have to be defined in linear color space. Similar to other textures, the shader will decode fetched texels.
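
For example, an 8-bit sRGB-encoded lightmap also works once its color space/encoding is declared, because the fetched texels are decoded to linear before they are used. A minimal sketch, with a hypothetical `lightmap.png` and an existing `material`:

```js
import * as THREE from 'three';

const pngLightMap = new THREE.TextureLoader().load('lightmap.png');
pngLightMap.colorSpace = THREE.SRGBColorSpace; // .encoding = THREE.sRGBEncoding on older releases
pngLightMap.flipY = false; // match the glTF UV convention

material.lightMap = pngLightMap;
material.needsUpdate = true;
```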

Thanks for responding. I know I could use an 8-bit PNG lightmap, but the result is far from good, so I think I have to stick with 32-bit HDR.
I don’t know why the final result has such a prominent color discrepancy compared to Blender, or where the mistake happened. The only reason I can think of is that maybe a 32-bit HDR lightmap and an 8-bit texture cannot be combined directly? Maybe I should convert the 32-bit lightmap to 8-bit so that the calculation can be technically correct.
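
One other place such a discrepancy often comes from is the renderer's output and tone-mapping setup, since Blender's Filmic view transform has no exact three.js equivalent. A sketch of the settings worth checking (values are assumptions for illustration, not a confirmed fix from the thread):

```js
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer({ antialias: true });

// The canvas expects sRGB output.
renderer.outputColorSpace = THREE.SRGBColorSpace; // r152+
// renderer.outputEncoding = THREE.sRGBEncoding;  // older releases

// ACESFilmic is the usual approximation of Blender's Filmic view transform;
// with NoToneMapping, HDR lightmap values get clipped and colors shift.
renderer.toneMapping = THREE.ACESFilmicToneMapping;
renderer.toneMappingExposure = 1.0;
```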


Hi, Richard.

Would you have the code snippet you used in order to load/apply the lightmap at runtime?

Trying to do the same thing, but I'm having trouble getting an HDR to load and respond to exposure values in the renderer. Did you pack the texture as a KTX2? I think RGBE texture loading is deprecated in the latest three.js.

No, it’s not. You can still use RGBELoader like in this demo: three.js webgl - materials - HDR texture loader

The only things that are deprecated are RGBEFormat and RGBEEncoding. RGBE textures are now loaded as half-float or float textures and decoded at load time (not in the shader).
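
A minimal sketch of loading a lightmap that way (the file name and `material` are placeholders):

```js
import * as THREE from 'three';
import { RGBELoader } from 'three/examples/jsm/loaders/RGBELoader.js';

// The RGBE (.hdr) data is unpacked to floating point while loading,
// so no RGBEFormat/RGBEEncoding is involved at render time.
new RGBELoader()
  .setDataType(THREE.HalfFloatType) // or THREE.FloatType
  .load('lightmap.hdr', (texture) => {
    // texture.type is HalfFloatType here: the texels are already linear floats.
    material.lightMap = texture;
    material.needsUpdate = true;
  });
```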
