Hi everyone,
I’m trying to build an app that needs realistic rendering, so the first step is getting the HDR environment map right.
As far as I can see in the documentation, PMREMGenerator.fromEquirectangular() gives a 256x256 output,
so my background resolution is too low. That by itself is okay, but what I really wonder is whether this also affects the lighting quality via scene.environment, because that would hurt the realistic rendering.
I’m using the following code for the env map:
import { PMREMGenerator, UnsignedByteType } from 'three';
import { RGBELoader } from 'three/examples/jsm/loaders/RGBELoader.js';

function loadEnvironment(url) {
    let rgbeLoader = new RGBELoader();
    rgbeLoader.setDataType(UnsignedByteType);

    return new Promise(resolve => {
        rgbeLoader.load(url, function (texture) {
            // prefilter the equirectangular HDR into a PMREM texture for PBR lighting
            let pmremGenerator = new PMREMGenerator(renderer);
            pmremGenerator.compileEquirectangularShader();

            let envMap = pmremGenerator.fromEquirectangular(texture).texture;
            pmremGenerator.dispose();

            //scene.background = envMap;
            //scene.environment = envMap;
            resolve(envMap);
        });
    });
}
In the meantime, to deal with the resolution I tried the code below:
var envMap = new THREE.WebGLCubeRenderTarget( 4096, cubeRenderOpts2 ).fromEquirectangularTexture( renderer, texture );
It solves the resolution, but then scene.environment doesn’t contribute any light.
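My guess is that the cube render target is not prefiltered, so it can’t be used directly for PBR lighting. What I was thinking of trying next is roughly this (untested sketch, reusing the texture, renderer and cubeRenderOpts2 variables from above): feed the hi-res cube texture through PMREMGenerator.fromCubemap() for lighting, and keep the raw cube texture only for the background.

// untested sketch: prefilter the cube texture for lighting,
// keep the sharp 4096 cube texture for the visible background
const cubeRT = new THREE.WebGLCubeRenderTarget( 4096, cubeRenderOpts2 )
    .fromEquirectangularTexture( renderer, texture );

const pmremGenerator = new THREE.PMREMGenerator( renderer );
const envMap = pmremGenerator.fromCubemap( cubeRT.texture ).texture; // prefiltered, for lighting
pmremGenerator.dispose();

scene.background = cubeRT.texture; // sharp background
scene.environment = envMap;        // PBR lighting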
If you can help me with this issue, it will be a good clean start for me and will be appreciated.
Should I assign the env map to materials manually, like material.envMap = envMap, or is it applied automatically?
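From what I understand of the docs (please correct me if I’m wrong), scene.environment is applied automatically to every MeshStandardMaterial / MeshPhysicalMaterial that has no envMap of its own, so something like this should be enough:

// my current understanding, not verified:
scene.environment = envMap;      // lights every PBR material in the scene
// obj.material.envMap = envMap; // only needed to override the environment per material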
And about the sharp environment: yes, your solution fixed the sharpness. But now the environment doesn’t rotate; it stays fixed, so it looks like only the object is turning.
loadEnvironment('src/env/hdri/lilienstein_4k.hdr').then(function (envMap) {
    scene.background = envMap;
    scene.environment = envMap;

    gltfLoader.load(
        // resource URL
        'models/room/furnality-room.gltf',
        // called when the resource is loaded
        function (gltf) {
            gltf.scene.traverse(function (obj) {
                if (obj instanceof Mesh) {
                    let materialAO = "models/room/bakeAO.jpg";
                    let ao = LoadTextureCorrected(textureLoader, materialAO);
                    obj.material.aoMap = ao; // note: aoMap needs a second UV set (uv2) on the geometry

                    // let materialLight = "models/room/lightMap.png";
                    // let lightMap = LoadTextureCorrected(textureLoader, materialLight);
                    // obj.material.lightMap = lightMap;

                    // castShadow/receiveShadow are Object3D properties, not material properties
                    obj.receiveShadow = true;
                    obj.castShadow = true;

                    obj.material.envMap = envMap;

                    if (obj.material.name == "floor") {
                        obj.material.map.wrapS = obj.material.map.wrapT = RepeatWrapping;
                        obj.material.map.repeat.set(10, 10);
                        obj.material.roughness = 1;
                        obj.receiveShadow = true;
                        controls.ground.push(obj);
                    }
                    if (obj.material.name == "wall") {
                        obj.material.map.wrapS = obj.material.map.wrapT = RepeatWrapping;
                        obj.material.map.repeat.set(15, 15);
                        obj.material.roughness = 1;
                    }
                }
            });
            scene.add(gltf.scene);
        },
        // called while loading is progressing
        function (xhr) {
            console.log((xhr.loaded / xhr.total * 100) + '% loaded');
        },
        // called when loading has errors
        function (error) {
            console.log('An error happened');
        }
    );
});
Is there anything wrong with the envMap setup?
Regarding your second point:
I have a mouse-look control in my app. When I turn the camera, the environment stays still and you only ever see one view of it. To the user it feels like the house is turning, not the camera.
However, you should preprocess the environment map with PMREMGenerator if you assign it to Scene.environment (or envMap) and use a PBR material. You are not doing this in your code snippet.
function loadEnvironment(url) {
    let rgbeLoader = new RGBELoader();
    rgbeLoader.setDataType(UnsignedByteType);

    return new Promise(resolve => {
        rgbeLoader.load(url, function (texture) {
            let pmremGenerator = new PMREMGenerator(renderer);
            pmremGenerator.compileEquirectangularShader();

            let envMap = pmremGenerator.fromEquirectangular(texture).texture;
            pmremGenerator.dispose();

            resolve(envMap);
        });
    });
}
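For the background itself, another option is to keep the original hi-res equirectangular texture and give it an equirectangular mapping, so Scene.background renders it as a surrounding sky that follows the camera instead of a fixed full-screen image. Whether this works depends on your three.js revision, so treat the following as an untested sketch:

// untested sketch; requires a revision where Scene.background supports equirectangular textures
texture.mapping = THREE.EquirectangularReflectionMapping;
scene.background = texture;   // hi-res, rotates with the camera
scene.environment = envMap;   // PMREM-prefiltered texture for lighting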
The goal is to use an HDR equirectangular texture as the envMap to obtain physically correct lighting on a model, while also using the same HDR texture for scene.background, except that the background should be larger than 256x256 so it looks hi-res.
When using the original HDR texture as scene.background, the texture is hi-res, but it appears to be fixed to the viewport. So when you use orbit controls, it looks as if the model and scene are rotating while the camera and background stay fixed.
Code:
this.pmremGenerator = new THREE.PMREMGenerator( this.el.sceneEl.renderer );
this.pmremGenerator.compileEquirectangularShader();

new RGBELoader()
    .setDataType( UnsignedByteType )
    .load( '/textures/studio_2k.hdr', ( texture ) => {
        // generate envMap used for lighting
        const envMap = this.pmremGenerator.fromEquirectangular( texture ).texture;
        this.pmremGenerator.dispose();

        // set visual background
        this.el.sceneEl.object3D.background = texture; // OG texture

        // apply PMREM texture as envMap on all motorcycle materials
        this.el.sceneEl.object3D.environment = envMap;
    } );
Using the PMREM version of the texture as scene.background solves the issue of it seeming “fixed” to the viewport, but then it looks low-res since it’s 256x256, which you can really see around the light in my screenshot.
Code:
this.pmremGenerator = new THREE.PMREMGenerator( this.el.sceneEl.renderer );
this.pmremGenerator.compileEquirectangularShader();

new RGBELoader()
    .setDataType( UnsignedByteType )
    .load( '/textures/studio_2k.hdr', ( texture ) => {
        // generate envMap used for lighting
        const envMap = this.pmremGenerator.fromEquirectangular( texture ).texture;
        this.pmremGenerator.dispose();

        // set visual background
        this.el.sceneEl.object3D.background = envMap; // envMap cube texture

        // apply PMREM texture as envMap on all motorcycle materials
        this.el.sceneEl.object3D.environment = envMap;
    } );
What’s the best technique to have a scene.background image that matches the envMap but is also high-res to make the scene look realistic? I’d imagine using a hi-res cubemap version of the texture would do the trick, but how do I convert my equirectangular image to a cubemap and maintain resolution? Or is there another solution I’m not thinking of?
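For reference, the direction I’m experimenting with (untested sketch; the 2048 size and the cubeRT name are just placeholders) is to render the equirectangular texture into a hi-res cube render target for the background and keep the PMREM result for lighting:

new RGBELoader()
    .setDataType( UnsignedByteType )
    .load( '/textures/studio_2k.hdr', ( texture ) => {
        const renderer = this.el.sceneEl.renderer;

        // hi-res cubemap for the visible background (2048 is an arbitrary size)
        const cubeRT = new THREE.WebGLCubeRenderTarget( 2048 )
            .fromEquirectangularTexture( renderer, texture );

        // prefiltered PMREM texture for PBR lighting
        const pmremGenerator = new THREE.PMREMGenerator( renderer );
        const envMap = pmremGenerator.fromEquirectangular( texture ).texture;
        pmremGenerator.dispose();

        this.el.sceneEl.object3D.background = cubeRT.texture;
        this.el.sceneEl.object3D.environment = envMap;
    } );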
A good example of what I’m trying to accomplish is Virtual Studio. If you click on the environments you see this…