So I appreciate this may be an in-depth topic, but I'm struggling to get to grips with what is required to use HDR files for lighting. I've gone through all the examples (specifically this one) I can find, but for someone new to it there seems to be a distinct lack of documentation (or at least I'm missing it).
I can see the use of an HDRCubeTextureLoader, but what are the pmremGenerator and pmremCubeUVPacker for, and do I really even need to care?
The question marks refer only to the gammaInput statement. This line is actually not necessary in this example since Texture.encoding determines how the texture is encoded and how the decoding should happen in the shader.
Basically yes, since these entities allow the correct usage of environment maps in the context of physically based rendering. MeshStandardMaterial is based on a metalness/roughness workflow. In order to implement the concept of roughness correctly with environment maps, you need to do some preparation. PMREMGenerator creates multiple versions of the map based on different roughness values and also creates the corresponding mipmaps. PMREMCubeUVPacker then takes all of this texture data and packs it into a single texture for rendering (you can see this texture for debugging purposes on the floor of the example). While rendering, the fragment shader picks an appropriate version of the environment map based on the respective roughness value of the material.
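To make that concrete, here is a minimal sketch of the prefiltering pipeline, following the structure of the old HDR example. It assumes a browser with a WebGL context and an existing `renderer`; the file names in `hdrUrls` are placeholders, not real assets.

```javascript
// Placeholder file names for the six cube faces (px, nx, py, ny, pz, nz).
var hdrUrls = [ 'px.hdr', 'nx.hdr', 'py.hdr', 'ny.hdr', 'pz.hdr', 'nz.hdr' ];

new THREE.HDRCubeTextureLoader().load( THREE.UnsignedByteType, hdrUrls, function ( hdrCubeMap ) {

	// Step 1: generate prefiltered versions of the cube map
	// for increasing roughness values (plus mipmaps).
	var pmremGenerator = new THREE.PMREMGenerator( hdrCubeMap );
	pmremGenerator.update( renderer );

	// Step 2: pack all of those prefiltered maps into a single
	// texture that the shader can sample from.
	var pmremCubeUVPacker = new THREE.PMREMCubeUVPacker( pmremGenerator.cubeLods );
	pmremCubeUVPacker.update( renderer );

	// The packed render target's texture is what the material uses.
	var hdrCubeRenderTarget = pmremCubeUVPacker.CubeUVRenderTarget;

} );
```

Both `update()` calls need the renderer because the prefiltering is done on the GPU.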
Whenever you use MeshStandardMaterial, you should also use prefiltered, mipmapped radiance environment maps.
I'm pretty new to the whole topic, just wondering: the prefiltered, mipmapped radiance environment maps are generated at runtime. Is it also possible to somehow load maps that were computed and saved beforehand?
Something like BabylonJS is doing with IBL Baker? https://doc.babylonjs.com/how_to/physically_based_rendering
I mean, generating them at runtime is actually really nice and makes everything dynamic. But what about the performance impact? Is this better than loading a precomputed map?
I had some trouble setting hdrCubeRenderTarget.texture as the envMap after running the pmremGenerator and pmremCubeUVPacker.
When loading a cube map texture, I can easily assign the result of CubeTextureLoader to the envMap of the material.
In the HDR example, this is not possible for me. In the example, the assignment is done in the render() function. Is this necessary? It only works for me in the render() function, like this:
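One thing to keep in mind is that the loading is asynchronous, so the texture only exists once the loader callback has run. A sketch of assigning the env map once inside that callback, instead of every frame in render(), could look like this (assuming `renderer` and `material` already exist; whether `needsUpdate` is strictly required may depend on your three.js revision):

```javascript
new THREE.HDRCubeTextureLoader().load( THREE.UnsignedByteType, hdrUrls, function ( hdrCubeMap ) {

	var pmremGenerator = new THREE.PMREMGenerator( hdrCubeMap );
	pmremGenerator.update( renderer );

	var pmremCubeUVPacker = new THREE.PMREMCubeUVPacker( pmremGenerator.cubeLods );
	pmremCubeUVPacker.update( renderer );

	// Assign the packed texture once, after both updates have run.
	material.envMap = pmremCubeUVPacker.CubeUVRenderTarget.texture;
	material.needsUpdate = true; // recompile the shader with the new map

} );
```

The key point is that the assignment happens after both `update()` calls, not before; doing it in render() works in the example simply because by then the async work has finished.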