HDR texture does not load in as an equirectangular map


I’m following the code from webgl_loader_gltf.html to recreate the same environment map in my own project. The HDR image does not get wrapped around the scene as it should; instead it gets displayed as a flat texture that covers the whole background.

This is my code.

  function loadEnvironment(url) {
    let rgbeLoader = new RGBELoader();

    rgbeLoader.load(url, (texture) => {
      let pmremGenerator = new THREE.PMREMGenerator(renderer);

      envMap = pmremGenerator.fromEquirectangular(texture).texture;

      texture.mapping = THREE.EquirectangularReflectionMapping;
      scene.background = texture;
      scene.environment = envMap;
    });
  }

What am I missing?


I set scene.background = envMap instead and now it works.
But I read somewhere that Mugen87 doesn’t recommend it. Why doesn’t it work when I set it to texture?

// Thankful noob

You created the texture successfully. Then you generated the envMap from that texture. You then set your scene background to the envMap and it worked.

Materials can also use an envMap property. So now, say you make a geometry, maybe a box. When you apply a material to the box you can set the material’s envMap the same way you did with the scene background, and your equirectangular map will work the same way on that material.
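As a rough sketch of that idea (assuming `envMap` was already produced from the PMREMGenerator as above, and that `scene` exists):

```javascript
import * as THREE from 'three';

// Hypothetical example: a reflective box that uses the environment map
// directly on its material instead of relying on scene.environment.
const geometry = new THREE.BoxGeometry(1, 1, 1);
const material = new THREE.MeshStandardMaterial({
  metalness: 1.0, // fully metallic so reflections are clearly visible
  roughness: 0.0, // mirror-like surface
  envMap: envMap  // per-material env map overrides scene.environment
});
const box = new THREE.Mesh(geometry, material);
scene.add(box);
```

Note that if `scene.environment` is already set, every physically based material picks it up automatically; setting `material.envMap` is only needed when a specific material should use a different map.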

Please always link to the resource before making such statements.

Using PMREMGenerator at the app level is not necessary anymore. Do it like in three.js webgl - GLTFloader. You normally want to assign the same texture to Scene.background and Scene.environment.