Different color output when rendering to WebGLRenderTarget

I noticed that I get a different color output depending on whether I render my scene to a WebGLRenderTarget or directly to the canvas framebuffer. Rendering into a WebGLRenderTarget looks somewhat dark and oversaturated compared to rendering directly to the canvas. Maybe you can help me figure out what I’m doing wrong. I modified the GLTFLoader example to illustrate the issue. You can change the render output by switching between “compositingScene” and “scene” in “renderer.render(compositingScene /* scene */, camera);”.

“scene”: expected normal rendering:

“compositingScene”: rendering via WebGLRenderTarget:

<!DOCTYPE html>
<html lang="en">
	<head>
		<title>three.js webgl - GLTFLoader</title>
		<meta charset="utf-8">
		<meta name="viewport" content="width=device-width, user-scalable=no, minimum-scale=1.0, maximum-scale=1.0">
		<link type="text/css" rel="stylesheet" href="main.css">
	</head>

	<body>
		<div id="info">
			<a href="https://threejs.org" target="_blank" rel="noopener">three.js</a> - GLTFLoader<br />
			Battle Damaged Sci-fi Helmet by
			<a href="https://sketchfab.com/theblueturtle_" target="_blank" rel="noopener">theblueturtle_</a><br />
			<a href="https://hdrihaven.com/hdri/?h=royal_esplanade" target="_blank" rel="noopener">Royal Esplanade</a> from <a href="https://hdrihaven.com/" target="_blank" rel="noopener">HDRI Haven</a>
		</div>

		<script type="importmap">
			{
				"imports": {
					"three": "../build/three.module.js",
					"three/addons/": "./jsm/"
				}
			}
		</script>

		<script type="module">

			import * as THREE from 'three';

			import { OrbitControls } from 'three/addons/controls/OrbitControls.js';
			import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';

			let camera, scene, renderer, surfaceBuffer, compositingScene;

			init();

			function init() {

				surfaceBuffer = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );

				const compositingShader = createCompositingShader(surfaceBuffer);
				compositingScene = createCompositingScene(compositingShader);

				const container = document.createElement( 'div' );
				document.body.appendChild( container );

				camera = new THREE.PerspectiveCamera( 45, window.innerWidth / window.innerHeight, 0.25, 20000 );
				camera.position.set( - 1.8, 3, 2.7 );

				scene = new THREE.Scene();

				const ambientLight = new THREE.AmbientLight( 0xffffff, Math.PI );
				scene.add(ambientLight);

				renderer = new THREE.WebGLRenderer( { antialias: true } );
				renderer.setPixelRatio( window.devicePixelRatio );
				renderer.setSize( window.innerWidth, window.innerHeight );
				container.appendChild( renderer.domElement );

				const controls = new OrbitControls( camera, renderer.domElement );
				controls.target.set( 0, 0, 0 );
				controls.update();

				window.addEventListener( 'resize', onWindowResize );

				const loader = new GLTFLoader().setPath( 'models/gltf/DamagedHelmet/glTF/' );
				loader.load( 'DamagedHelmet.gltf', async function ( gltf ) {
					const model = gltf.scene;
					await renderer.compileAsync( model, camera, scene );
					scene.add( model );
				} );

				renderLoop();
			}

			function onWindowResize() {

				camera.aspect = window.innerWidth / window.innerHeight;
				camera.updateProjectionMatrix();

				renderer.setSize( window.innerWidth, window.innerHeight );
				surfaceBuffer.setSize( window.innerWidth, window.innerHeight ); // keep the offscreen target in sync with the canvas
			}

			function renderLoop() {
				render();
				requestAnimationFrame(renderLoop);
			}

			function render() {

				renderer.setRenderTarget(surfaceBuffer);
				renderer.clear();
				renderer.render(scene, camera);

				renderer.setRenderTarget(null);
				renderer.clear();
				renderer.render(compositingScene /* scene */, camera);
			}

			function createCompositingScene(compositingMaterial) {
				const screenQuad = new THREE.Mesh(
					new THREE.PlaneGeometry(2, 2),
					compositingMaterial
				);
				screenQuad.frustumCulled = false;

				const compositingScene = new THREE.Scene();
				compositingScene.add(screenQuad);

				return compositingScene;
			}

			function createCompositingShader(surfaceBuffer) {
				const compositingVertexShader = `
					varying vec2 varUv;

					void main() {
						varUv = uv;
						gl_Position = vec4(position, 1.0);
					}
				`;

				const compositingFragmentShader = `
					uniform sampler2D surfaceBuffer;

					varying vec2 varUv;

					void main(void) {
						vec4 surfaceC = texture2D(surfaceBuffer, varUv);
						gl_FragColor = surfaceC;
					}
				`;

				const compositingUniforms = {
					surfaceBuffer: { value: surfaceBuffer.texture },
				};

				return new THREE.ShaderMaterial({
					side: THREE.FrontSide,
					uniforms: compositingUniforms,
					vertexShader: compositingVertexShader,
					fragmentShader: compositingFragmentShader,
					transparent: false,
					depthWrite: false,
					depthTest: false
				});
			}

		</script>

	</body>
</html>

In recent versions of three.js, drawing to a render target does not apply tone mapping or the Linear-sRGB to sRGB output transform. With EffectComposer you could use THREE.OutputPass as the last pass, or you could add this block to your compositing shader:
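
A sketch of that block, assuming the compositing fragment shader from the post above (ShaderMaterial resolves these #include chunks; tonemapping_fragment is a no-op unless the renderer’s toneMapping is enabled):

    void main(void) {
      vec4 surfaceC = texture2D(surfaceBuffer, varUv);

      gl_FragColor = surfaceC;

      // Apply tone mapping, then the Linear-sRGB -> sRGB output
      // transform, matching what a direct draw to the canvas does.
      #include <tonemapping_fragment>
      #include <colorspace_fragment>
    }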

A goal here is to preserve HDR data for use in the post-processing pipeline, better supporting various effects. Also, it isn’t possible to do the sRGB encoding fully correctly when drawing to a float16 or float32 buffer.

if we use render targets that draw onto a texture, what should we do to migrate? for instance, the glitch when you enter a portal here (Enter portals - CodeSandbox) wasn’t there before.

colorspaces and tonemapping … i never really understood them, but now i feel almost helpless with the new changes. should we add this #ifdef SRGB_TRANSFER blob to regular shaders?

So much in that demo, in such concise code! Really great.

I’m not sure I understand R3F’s current frame/portal configuration enough to answer that – is this a correct summary of the method? (A code sketch of these steps follows the list.)

  1. each ‘portal scene’ is a separate THREE.Scene
  2. the same renderer draws each ‘portal scene’ to a separate render target
  3. those render targets are used as base color textures on a planar mesh (with MeshBasicMaterial or similar ShaderMaterial) representing a portal in the ‘outer scene’
  4. the ‘outer scene’, with its portal meshes, is drawn to canvas
  5. when the user enters one of the portals, that becomes the default scene and is drawn to canvas instead
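
If so, a minimal sketch of steps 1–4 in plain three.js (portalScene, outerScene, and the sizes are illustrative names/values, not R3F/drei code):

    // Step 2: each portal scene gets its own render target.
    const portalTarget = new THREE.WebGLRenderTarget( width, height, {
      type: THREE.HalfFloatType // keep linear HDR data in the target
    } );

    // Step 3: the target's texture is the portal plane's base color map.
    const portalPlane = new THREE.Mesh(
      new THREE.PlaneGeometry( 1, 2 ),
      new THREE.MeshBasicMaterial( { map: portalTarget.texture } )
    );
    outerScene.add( portalPlane );

    function render() {
      // Step 2: draw the portal scene into its render target.
      renderer.setRenderTarget( portalTarget );
      renderer.render( portalScene, camera );

      // Step 4: draw the outer scene, with its portal plane, to the canvas.
      renderer.setRenderTarget( null );
      renderer.render( outerScene, camera );
    }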

yes. i think it’s 1:1 as you said. the basic portals are drawn here: https://github.com/pmndrs/drei/blob/4aa04c93e4e86711f0a5a6777482a92d52988253/src/core/MeshPortalMaterial.tsx#L183-L208

if, however, the blend property is not 0 (on enter), it will draw the main scene and the portal as a mix until blend reaches 1; then the portal takes over rendering, system rendering stops, and the portal just calls gl.render(myscene, systemcamera). this is when it snaps, indicating that tonemapping is now being applied, so imo the “wrong” colors are the ones seen from outside on the portal mesh planes.

Thanks @donmccurdy, using sRGBTransferOETF fixed my issues.

Ideally, an HDR pipeline for the portal effect would look something like this:

Each render target is float16 or float32, in Linear-sRGB space. Most HDR effects, like bloom, would typically go after the main render pass (to the third render target). OutputPass (or the postprocessing equivalent) would come after that, applying tone mapping and the sRGB OETF (“linear to sRGB”). You shouldn’t need to put tone mapping or the sRGB OETF into custom shaders unless they draw to the canvas, and you shouldn’t need to manually set toneMapped={false} on any materials.

When entering the portal, we’re changing what’s written to the third render target, but not changing the color space of that target, or the effects and output transform coming after it.
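
A sketch of that pipeline with the three.js addons (the bloom settings are placeholders; recent EffectComposer versions use HalfFloatType internal targets by default):

    import * as THREE from 'three';
    import { EffectComposer } from 'three/addons/postprocessing/EffectComposer.js';
    import { RenderPass } from 'three/addons/postprocessing/RenderPass.js';
    import { UnrealBloomPass } from 'three/addons/postprocessing/UnrealBloomPass.js';
    import { OutputPass } from 'three/addons/postprocessing/OutputPass.js';

    const composer = new EffectComposer( renderer );

    // Main render pass draws linear, open-domain color into the composer's target.
    composer.addPass( new RenderPass( scene, camera ) );

    // HDR effects like bloom operate on the linear data (placeholder settings).
    composer.addPass( new UnrealBloomPass( new THREE.Vector2( width, height ), 0.5, 0.4, 0.85 ) );

    // Final pass applies tone mapping and the sRGB OETF, drawing to the canvas.
    composer.addPass( new OutputPass() );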


That’s the ideal, and it gives you the most flexibility in using post-processing throughout the pipeline. But if your performance budget doesn’t allow this, it’s also reasonable to cut some corners. That third render target could instead be direct output to the display, and that would be fine, in which case you have something more like this:

Here, again, no materials require toneMapped={false}. Tone mapping and the sRGB OETF are applied in materials drawing to the canvas, and not when drawing to the two Linear-sRGB render targets. I used the terms “Display” and “Canvas” inconsistently in these illustrations, but both refer to the same thing: rendering into the HTMLCanvasElement’s drawing buffer.

tl;dr – We’re trying to preserve linear [0, ∞] color data as long as possible, tone mapping and encoding to sRGB only when drawing to the canvas. That rule can be broken, with care, but it will make things more complex.

thank you for the in-depth explanations! i hope we can collect this and maybe add it to the threejs docs someday.

portal doesn’t normally require postprocessing; it’s a plane with a texture that contains the contents of another scene. but without tonemapping that scene will look different.

i made a small test here with all code included: divine-resonance-gt32rp - CodeSandbox

the left is tonemapped; the right portal is not, which causes the difference. the plane uses a custom shader, but it includes the tonemapping fragment (MeshPortalMaterial.tsx:16-48).

      gl_FragColor = vec4(t.rgb, blur == 0.0 ? t.a : t.a * alpha);
      #include <tonemapping_fragment>
      #include <${version >= 154 ? 'colorspace_fragment' : 'encodings_fragment'}>
    }`

the texture into which the scene is rendered is a regular THREE.WebGLRenderTarget:

    const target = new THREE.WebGLRenderTarget(_width, _height, {
      minFilter: THREE.LinearFilter,
      magFilter: THREE.LinearFilter,
      type: THREE.HalfFloatType,
      ...targetSettings
    })

i would like to be able to render into an FBO → texture without losing tonemapping, without post processing if possible, even if i have to apply it myself in the shader, which would be OK.

ps, i tried to add #ifdef SRGB_TRANSFER into the shader but it didn’t have an effect.

I think you’ve effectively got it there – the only issue I’m seeing is that the <MeshPortalMaterial /> component has disabled tone mapping:

      <portalMaterialImpl
        ref={ref as any}
        blur={blur}
        blend={0}
        resolution={[size.width * viewport.dpr, size.height * viewport.dpr]}
        toneMapped={false}
        attach="material"

that’s it! i guess i had to switch it off previously because it would double-tonemap, given that the buffers already had it applied, which i’m guessing is the problem you wanted to avoid in threejs. i think/hope i’m slowly starting to understand it now.

ps, your slides – would it be possible to add them to threejs/docs somehow?

i guess i had to switch it off previously because it would double-tonemap, given that the buffers already had it applied, which i’m guessing is the problem you wanted to avoid in threejs…

Yes – there are some good use cases for tone-mapped sRGB render targets too, but they’re rarer, and a Chromium bug makes this less practical at the moment anyway.

I’d certainly like to put a post-processing section into the three.js color management docs… I’ll try to bump that up higher on the backlog. 🙂
