Effect Composer Gamma Output Difference

Hi! :raised_hand:
So basically I'm running into an issue where I'm getting an inferior result when rendering through EffectComposer.

Renderer + .outputEncoding = THREE.sRGBEncoding: (correct)

EffectComposer + RenderPass + GammaCorrectionPass: (ugly)

Is this the expected outcome with some textures or is there some way to improve the EffectComposer's quality?
PS. The texture used is a Scene.background CubeTexture with .encoding = THREE.sRGBEncoding
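For reference, a minimal sketch of the two paths being compared might look like this (the `scene`/`camera` names and the exact pass list are my assumptions, not the actual project code):

```javascript
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';
import { GammaCorrectionShader } from 'three/examples/jsm/shaders/GammaCorrectionShader.js';

const renderer = new THREE.WebGLRenderer();

// Path 1: direct rendering; output encoding handled by the renderer (correct)
renderer.outputEncoding = THREE.sRGBEncoding;
renderer.render( scene, camera );

// Path 2: composer with an explicit gamma pass at the end (the ugly result)
const composer = new EffectComposer( renderer );
composer.addPass( new RenderPass( scene, camera ) );
composer.addPass( new ShaderPass( GammaCorrectionShader ) );
composer.render();
```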

I’m not sure of the exact syntax since this was all updated in the last month.

Can you share the code you are using to set up the renderer and effect composer? And are you using the latest version of three with the updated syntax?

Try to set WebGLRenderer.gammaFactor to 2.2 and see if it helps (default is 2.0). GAMMA_FACTOR is a define in GLSL which is set by the renderer and used by GammaCorrectionShader.


@Mugen87 Oh, so that's where GAMMA_FACTOR comes from. That's good to know, but unfortunately setting it to 2.2 only strengthens the ugly effect, which I now believe to be gamma correction applied twice. I'm not entirely sure, however, as it's hard for me to trust the code at this point :face_with_raised_eyebrow:
Further in this post, I'll refer to the undesirable second output as DoubleGamma, for convenience's sake.

So after a bit of testing, I believe I’ve found some strange behaviors:

  1. Using only the renderer:
  • looks correct and respects renderer.outputEncoding = THREE.sRGBEncoding (changes its color space accordingly)
  2. Using EffectComposer with only one RenderPass:
  • again, looks correct and respects renderer.outputEncoding = THREE.sRGBEncoding
  3. Using EffectComposer + RenderPass + CopyPass:
  • the problem case: DoubleGamma appears and renderer.outputEncoding = THREE.sRGBEncoding is no longer respected. DoubleGamma simply occurs, regardless of whether you set the encoding to sRGB or not.

(note: in the 3rd case, CopyPass can be any pass, be it FXAA or Bloom; DoubleGamma still occurs. I've chosen to use ShaderPass( CopyShader ) since, as far as I understand, it's supposed to return an exact copy of the previous pass, and it doesn't, so it presents the issue clearly)

@looeee I'm using r112 - the newest and shiniest version of three.js :smiley:
My postprocessing setup is the simplest possible.
Please have a look at the issue through this repo. I've updated it to use postprocessing. Uncomment copyPass in the initPostProcessing function (line 147) to see the changes.

PS. You will have to apply a tiny change from this PR in order to be able to create a CubeTexture out of 6 compressed images.

Did something break during the last syntax update? Forgive my ignorance, but I feel like there is no way this behavior is intentional. The colorspace conversions are giving me a headache :sweat:

I’ve run into this problem before when dealing with post-processing, and I confess, even after several days of studying color spaces and the tone mapping code in three.js, the only way I’ve been able to solve it is by guesswork.

I do think that there are some bugs in the way tone mapping is handled in the post-processing code, or if not, then the workflow needs to be improved and have much better documentation because it’s very confusing at the moment.

In the end, I solved this by disabling all color correction in three.js and adding my own final pass which does tone mapping/brightness/contrast.

EDIT: looks like I should also add gamma correction (or rather, sRGB transform) into this, after tone mapping and before brightness/contrast.

/*
 * Combined post-processing pass
 *
 * ACESFilmic Tone mapping
 * Brightness
 * Contrast
 *
 */

const CombinedShader = {

  uniforms: {

    tDiffuse: { value: null },
    toneMappingExposure: { value: 1.0 },
    brightness: { value: 0 },
    contrast: { value: 0 },

  },

  vertexShader:

  /* glsl */`
    varying vec2 vUv;

    void main() {

      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );

    }`,

  fragmentShader:

  /* glsl */`
    #define saturate(a) clamp( a, 0.0, 1.0 )

    uniform sampler2D tDiffuse;

    uniform float toneMappingExposure;

    uniform float brightness;
    uniform float contrast;

    varying vec2 vUv;

    vec3 ACESFilmicToneMapping( vec3 color ) {

      color *= toneMappingExposure;
      return saturate( ( color * ( 2.51 * color + 0.03 ) ) / ( color * ( 2.43 * color + 0.59 ) + 0.14 ) );

    }

    void main() {

      gl_FragColor = texture2D( tDiffuse, vUv );
      gl_FragColor.rgb = ACESFilmicToneMapping( gl_FragColor.rgb );

      gl_FragColor.rgb += brightness;

      if (contrast > 0.0) {
        gl_FragColor.rgb = (gl_FragColor.rgb - 0.5) / (1.0 - contrast) + 0.5;
      } else {
        gl_FragColor.rgb = (gl_FragColor.rgb - 0.5) * (1.0 + contrast) + 0.5;
      }

    } `,

};

export default CombinedShader;
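If it helps, here is a hedged sketch of how this shader could be mounted as the final pass of a composer chain (`renderer`, `scene`, and `camera` are placeholders, and the import path for the shader file is an assumption):

```javascript
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';
import CombinedShader from './CombinedShader.js';

const composer = new EffectComposer( renderer );
composer.addPass( new RenderPass( scene, camera ) );

// Final pass: tone mapping, then brightness/contrast
const combinedPass = new ShaderPass( CombinedShader );
combinedPass.uniforms.toneMappingExposure.value = 1.0;
combinedPass.uniforms.brightness.value = 0.0;
combinedPass.uniforms.contrast.value = 0.0;
composer.addPass( combinedPass );
```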

I intend to add color correction to this as well; I've held off on that as I'm not sure whether it goes before or after tone mapping.


Some more considerations for post processing:

  • some passes need to be in linear space (before tone-mapping) and some need to be done after. Some can be done before or after but you’ll get different results (e.g. vignette).
  • Outline effect looks like it has tone mapping built in (probably incorrect?)
  • AdaptiveToneMappingPass and ToneMapShader have filmic tonemapping built in. We should change that to ACESFilmic.

Before tone mappings: bloom, lensflare etc
Then, in order: 1. tonemapping 2. gamma correction
After tone mapping: brightness, contrast, FXAA, SMAA, color correction
Order not important (or, choose what looks best to you): DOF, vignette, film grain etc.
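The ordering above could be sketched like this. The pass classes are from three's examples; `toneMappingAndGammaPass` stands in for a combined tone mapping + gamma pass (like the CombinedShader posted earlier) and is an assumption, not an existing three.js class:

```javascript
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';
import { UnrealBloomPass } from 'three/examples/jsm/postprocessing/UnrealBloomPass.js';
import { FXAAShader } from 'three/examples/jsm/shaders/FXAAShader.js';

const composer = new EffectComposer( renderer );
composer.addPass( new RenderPass( scene, camera ) ); // linear HDR values
// before tone mapping: bloom, lens flare, etc.
composer.addPass( new UnrealBloomPass( new THREE.Vector2( 1024, 1024 ), 1.0, 0.4, 0.85 ) );
// 1. tone mapping, 2. gamma correction
composer.addPass( toneMappingAndGammaPass );
// after tone mapping: brightness/contrast, FXAA, SMAA, color correction
composer.addPass( new ShaderPass( FXAAShader ) );
```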

I’m not sure where exposure comes into this - either before all passes, or just before tone mapping?

Thoughts:

  • We should remove all tone mapping except for ACESFilmic/NoToneMapping. ACES Filmic is the industry standard and as far as I know the only kind of tone mapping Unreal and Unity support. I have so far not seen any arguments for having other kinds of tone mapping.
  • It would be great if a final tone mapping pass was automatic when using the EffectComposer. Even better, combine color correction/tonemapping/brightness/contrast as the final pass (again, that’s what Unreal does I think) (see below).
  • have post processing built into the core so it’s a first class three.js citizen - see Takahirox’s PR here:

Wow, thanks @looeee! This is some great, useful info I haven’t seen anywhere else :eyes:

That would be the dream scenario :heart_eyes:
but removing the built-in color corrections inside some passes would be a great start. It's simple to just add tone mapping, gamma, etc. at the end of the composer chain; what complicates things is selfish passes with corrections built in that break the chain.
Would Takahirox’s PR fix that issue?

Also, I'm not sure I got an answer for this, but why would renderPass and renderPass + copyPass give renders in different color spaces? (or different results at all?) This actually seems to happen with one or more passes added after renderPass, regardless of the pass type (but a basic copyPass shows it clearly).
And why does it suddenly stop respecting renderer.outputEncoding? :thinking: :face_with_raised_eyebrow:


AFAIK, no. It’s just the idea to move post-processing into the core.

In any event, I don’t think I would add tonemapping or gamma correction automatically at the end of the pass chain in EffectComposer. At least there should be a flag that controls this behavior.

That happens because WebGLRenderer.outputEncoding is only respected when rendering to the screen (i.e. the default framebuffer). When you render to a render target, the encoding of its texture property is evaluated instead. The internal render targets of EffectComposer use the default setting LinearEncoding (which makes sense, since a render pass should always be in linear color space for further processing).
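To make the double-application concrete, here is a plain-JS sketch (no three.js) of the two sRGB transfer functions involved; the constants are the standard IEC 61966-2-1 piecewise curve:

```javascript
// Standard sRGB transfer functions, for illustration only.
function linearToSRGB( c ) {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow( c, 1.0 / 2.4 ) - 0.055;
}

function sRGBToLinear( c ) {
  return c <= 0.04045 ? c / 12.92 : Math.pow( ( c + 0.055 ) / 1.055, 2.4 );
}

const linear = 0.5;
const once = linearToSRGB( linear ); // correct sRGB output, ~0.735
const twice = linearToSRGB( once );  // double-encoded, ~0.873
```

Encoding once takes linear 0.5 to roughly 0.735; encoding the already-encoded value again pushes it to roughly 0.873. That lifted, washed-out midtone response is the DoubleGamma look.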


Hmm, yeah on further consideration this suggestion is contradictory with the need to do some passes before tone mapping and some after.

However, as far as I’m aware, you will always (at least on LDR screens) need to do tone mapping/gamma correction at some point in the post processing chain so it would make sense to have a combined pass that does these and clearly demonstrate in docs or examples how and where to use it.


Well then, what would be the reason for the ugly effect in the second image? Color-wise it seems to be converted well; it's just that those pixely (almost 8-bit looking?) artifacts are showing, almost as if precision was lost?

I'm not very knowledgeable about the internals of THREE.WebGLRenderer. Is there a difference between its gamma correction and post-processing's shader pass? If not, then I must be missing some additional conversion pass in my composer chain. :face_with_monocle:

So in other words: how can I achieve the first image's output using the EffectComposer? I'm particularly looking to do selective bloom, but I might be adding other effects later on as well.

Um, so far I don’t know why these strange artifacts occur. This is definitely worth investigating…

@DolphinIQ unfortunately I can’t get your example to run.

THREE.WebGLRenderer: WEBGL_compressed_texture_astc extension not supported.
THREE.WebGLRenderer: WEBGL_compressed_texture_etc1 extension not supported.
THREE.WebGLRenderer: WEBGL_compressed_texture_pvrtc extension not supported.
THREE.WebGLRenderer: WEBKIT_WEBGL_compressed_texture_pvrtc extension not supported.

Loading Complete!

CompressedTexture {uuid: "8E71EAFF-6413-44CB-980E-645891D3A45B", name: "", image: {…}, mipmaps: Array(1), mapping: 300, …}

three.module.js:20960 THREE.WebGLState: TypeError: Failed to execute 'texImage2D' on 'WebGLRenderingContext': No function was found that matched the signature provided.
    at Object.texImage2D (three.module.js:20956)
    at setTextureCube (three.module.js:21576)
    at WebGLTextures.safeSetTextureCube (three.module.js:22344)
    at SingleUniform.setValueT6 [as setValue] (three.module.js:17002)
    at Function.WebGLUniforms.upload (three.module.js:17421)
    at setProgram (three.module.js:25213)
    at WebGLRenderer.renderBufferDirect (three.module.js:23935)
    at renderObject (three.module.js:24687)
    at renderObjects (three.module.js:24657)
    at WebGLRenderer.render (three.module.js:24436)

Hi @looeee, could you please replace one line in three.module.js, as shown in this PR?

Mugen made it possible to build a CubeTexture out of compressed images. After that, it should work. Sorry for the confusion

It seems the issue can be fixed if the internal render targets of EffectComposer are created differently. If you enhance the following section:

by adding this type: FloatType, the result is as expected. The default type is UnsignedByteType, by the way.
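For illustration, a sketch of what that change amounts to when constructing the composer with your own render target (this is my approximation of the idea, not the exact patched line; the other parameters mirror common defaults):

```javascript
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';

// Float-typed target so the chain doesn't quantize to 8 bits between passes
const size = renderer.getDrawingBufferSize( new THREE.Vector2() );
const target = new THREE.WebGLRenderTarget( size.width, size.height, {
  minFilter: THREE.LinearFilter,
  magFilter: THREE.LinearFilter,
  format: THREE.RGBAFormat,
  type: THREE.FloatType // default would be UnsignedByteType
} );

const composer = new EffectComposer( renderer, target );
```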

I’m not yet sure why this removes the artifacts but this might be related to a precision issue.
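One possible mechanism, sketched in plain JS (this is my assumption, not verified against the renderer internals): with UnsignedByteType, linear-space values are quantized to 8 bits between passes, and the subsequent gamma step stretches the dark end, where 8-bit linear has very few levels:

```javascript
// Quantize a [0,1] value to 8 bits, as an UnsignedByteType target would.
function quantize8( v ) {
  return Math.round( v * 255 ) / 255;
}

// Simple 2.2 gamma encode, standing in for the gamma correction pass.
function linearToGamma( c ) {
  return Math.pow( c, 1.0 / 2.2 );
}

// Two dark linear values just one 8-bit step apart...
const a = linearToGamma( quantize8( 0.002 ) );
const b = linearToGamma( quantize8( 0.002 + 1 / 255 ) );
// ...land roughly 3% of the output range apart after gamma,
// which shows up as visible banding in dark areas.
const step = b - a;
```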


So it was related to precision after all :hushed:
It reminded me a little of the art from those old space games

Thanks a lot guys! This was a really instructive thread :pray: :bowing_man:
Managed to get selective bloom working for my game!

Interestingly enough, the precision still wasn’t as good as when using the renderer’s default encoding. Perhaps that one uses 64-bit floats? Anyway, I lowered renderer.gammaFactor to 1.7 to get the best of both worlds, and everything looks good now :+1:


Nope, double-precision floating-point is not supported as a data type for texel data.


Well then, I'm beat :sweat_smile:
