sRGB texture encoding gives color banding issue

Hello everybody,

I have an 8-bit sRGB PNG image that I need to render in my game. I haven’t done this before, but I found a three.js example that does sRGB encoding on a texture plus gamma correction. Here is what I’ve done in code:

For the renderer:
renderer.gammaInput = true;
renderer.gammaOutput = true;
renderer.gammaFactor = 2.2;
renderer.outputEncoding = THREE.sRGBEncoding;

For the texture:
lightTexture.encoding = THREE.sRGBEncoding;
lightTexture.color.convertSRGBToLinear();

And finally add the gamma correction pass to my composer:
const gammaCorrectionPass = new THREE.ShaderPass(GammaCorrectionShader);
effectComposer.addPass(gammaCorrectionPass);

But here is the result I get, compared to what I had before in linear color space:

The result is somewhat close to what I want, except that it gives me this weird color banding, similar to bad image compression. Do you have any idea why?
There has to be a step that I’m missing or overdoing somewhere, but after many hours of investigation I couldn’t get any further than this.

Thanks for your help,
Kromah


On my monitor I see banding on both images, although the banding on the right is more prominent.

Do you have a demo of this? Enabling material.dithering = true would be worth a try.

The original image is made to be rather small and lightweight, so that explains why it has banding even in linear color space. But why would it get so much worse in sRGB mode?

I’m not able to show a demo unfortunately. I’ll have to make a sand box project with the same configuration.
material.dithering = true didn’t change anything though…

Just a guess, but sRGB would use more precision for the green range (where human perception is better). This image is all blue, and therefore gets less precision. I’m surprised dither has no effect at all, though… that’s the most common fix for banding.

lightTexture.color.convertSRGBToLinear();

There shouldn’t be any texture.color property, I’m not sure what you mean here?

Do you see the same problem with post-processing disabled? It doesn’t seem correct to be using both renderer.outputEncoding = THREE.sRGBEncoding; and a post-processing gamma pass. See color space management in three.js.
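In other words, pick one place for the linear -> sRGB conversion, not two. A rough sketch (assuming the three.js API of that era; `composer` and `lightTexture` stand in for your own objects):

```javascript
// Option A: no post-processing. Let the renderer handle the conversion and
// skip the gamma correction pass entirely.
renderer.outputEncoding = THREE.sRGBEncoding;

// Option B: with EffectComposer. Leave renderer.outputEncoding at its default
// and make linear -> sRGB an explicit final pass instead.
composer.addPass( new THREE.ShaderPass( GammaCorrectionShader ) );

// In both cases the texture itself is still tagged as sRGB:
lightTexture.encoding = THREE.sRGBEncoding;
```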

So I have managed to recreate the issue in a codepen: https://codepen.io/Kromahtique/pen/bGdOGZz

You are correct @donmccurdy. I wasn’t considering it before, but the FXAA post process is the issue.
It seems to discard the rendering configurations (gammaFactor and outputEncoding) and renders everything with the default config.

To get around this, we do a gamma correction pass that does the linear to sRGB conversion, and that’s what creates this banding issue.

To be fully clear about what the code does: I’m using a custom formula to do the conversion, but the result is the same with the three.js version of the code.

So now that I know the problem is coming from this, the question is more about how to make FXAA and sRGB encoding work together…

I could’ve sworn there was a post-processing pass already written to do the linear->sRGB step somewhere. Having trouble finding it now though. :confused:

@donmccurdy I believe you might be thinking about GammaCorrectionShader? It was changed recently to use LinearTosRGB() instead of LinearToGamma().
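For reference, the pass order being discussed would look something like this (a sketch assuming the `three/examples/jsm/` modules of that era; `renderer`, `scene`, `camera`, `width`, and `height` are placeholders for your own objects):

```javascript
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';
import { FXAAShader } from 'three/examples/jsm/shaders/FXAAShader.js';
import { GammaCorrectionShader } from 'three/examples/jsm/shaders/GammaCorrectionShader.js';

const composer = new EffectComposer( renderer );
composer.addPass( new RenderPass( scene, camera ) );

// FXAA runs on the linear buffer.
const fxaaPass = new ShaderPass( FXAAShader );
fxaaPass.material.uniforms[ 'resolution' ].value.set( 1 / width, 1 / height );
composer.addPass( fxaaPass );

// Linear -> sRGB as the very last pass, instead of renderer.outputEncoding.
composer.addPass( new ShaderPass( GammaCorrectionShader ) );
```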


Yep, that’s it :+1:

@Mugen87 @donmccurdy @DolphinIQ

I see the same banding issue when modifying the example codepen to use the GammaCorrectionShader. I see the same problem in my projects and expected this to be due to a loss of precision (especially in dark regions) in the target buffer, because the color channels only have 8 bits. I would guess that using something like a FloatType or HalfFloatType buffer might alleviate the banding, but it doesn’t seem like a great solution. Maybe it’s the practical one, though. Do we know how other engines solve this?

This topic was already discussed here: Effect Composer Gamma Output Difference

@Mugen87

Thanks! That topic didn’t necessarily seem super definitive, though. Looking around online, it looks like the consensus is that if you’re doing post-process gamma correction you need to use higher-precision target buffers, like 16-bit floats or shorts. Do you know if this is documented anywhere in the three.js docs already?

It seems like this topic could get more complicated, though, right? Surely texture precision must play a role here, too? I don’t know what the typical DCC workflow is for linearized textures so maybe this isn’t an issue but if a texture is already represented in linear color space with 8 bits of precision then even when using a float render target to run the linear -> srgb conversion you’d get banding on that texture because the precision was never there in the first place.

Consider the process when post-processing is not involved. As far as I know that works well — sometimes you need dithering, but when necessary dithering reduces banding regardless of any tonemapping and encoding settings. Here are the last few steps of meshphysical_frag.glsl.js:

[Screenshot: the final lines of meshphysical_frag.glsl.js, ending with the tonemapping, encoding, fog, premultiplied alpha, and dithering fragment chunks]

Note that dithering is last, after any tonemapping and encoding steps. Precision is useful, too, but if you can reduce the appearance of banding without spending more of your performance budget on precision that’s a win.

I think the problem we’re seeing may be that when post-processing is added to the render pipeline, the linear->sRGB pass moves after the dither step, so material.dithering is no longer useful. This could be addressed with a dithering shader, if so.


@donmccurdy

Thanks, you’re right, it looks like dithering does help quite a bit. I put together a demo that demonstrates what I imagined was the worst case for banding (a point light over a plane in a black scene) to see what the effect is on everything. You can check it out here:

https://gkjohnson.github.io/threejs-sandbox/colorspace-exploration/index.html

It lets you change the render target type, dithering type (as post process after gamma correction or on material), and between using the native renderer or composer.

I found that, in my opinion, the combination of HalfFloatType and material dithering looks best. Unfortunately there are still dithering artifacts in the dark sections of the scene, which get exacerbated by the linear -> gamma color correction. Interestingly, that means applying dithering before gamma correction looks best – in both the postprocess and non-postprocess cases there is a smooth banding of sorts over the point light radius when dithering is applied after, but it seems smoother when applied before.

In Unity it looks like they apply dithering as a post processing step using rotating blue noise textures (as well as some kind of temporal AA on a higher precision buffer, presumably), which probably helps with the dithering artifacts in dark regions of the screen.

I wonder if it’s worth having a dither option baked into the gamma correction pass to enable people to use that workflow if they can afford a higher depth render target? Do you know what the performance impact of using a higher precision target is?


Hm. Yeah I see hardly any effect with the post-processing dither above, although the material dither certainly helps.

I wonder if it’s worth having a dither option baked into the gamma correction pass …

I haven’t looked at this too closely, but the concern mentioned by @DolphinIQ in Effect Composer Gamma Output Difference makes sense to me:

Its simple to just add toneMapping , gamma etc at the end of the composer chain, but what complicates things is selfish passes with corrections built in, that break the chain.

A standalone dither shader might be better, if we have a version that makes more difference than the example above appears to do. Even in a gamma workflow (no linear <-> sRGB conversion) dither can be useful.
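For illustration, a standalone dither pass could look something like this (a hypothetical sketch, not an existing three.js shader; the interleaved gradient noise function is from Jimenez’s 2014 SIGGRAPH talk, and `composer` is assumed to exist):

```javascript
// Hypothetical full-screen dither pass: adds sub-level noise so the final
// 8-bit quantization of the canvas breaks bands into noise instead of steps.
const DitherShader = {
	uniforms: { tDiffuse: { value: null } },
	vertexShader: /* glsl */`
		varying vec2 vUv;
		void main() {
			vUv = uv;
			gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
		}`,
	fragmentShader: /* glsl */`
		uniform sampler2D tDiffuse;
		varying vec2 vUv;
		// interleaved gradient noise (Jimenez 2014), a cheap screen-space dither
		float ign( vec2 p ) {
			return fract( 52.9829189 * fract( dot( p, vec2( 0.06711056, 0.00583715 ) ) ) );
		}
		void main() {
			vec4 color = texture2D( tDiffuse, vUv );
			// shift by up to +-0.5 of an 8-bit step before final quantization
			color.rgb += ( ign( gl_FragCoord.xy ) - 0.5 ) / 255.0;
			gl_FragColor = color;
		}`
};

// Added after the gamma correction pass, as the last step in the chain.
composer.addPass( new THREE.ShaderPass( DitherShader ) );
```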

Hm. Yeah I see hardly any effect with the post-processing dither above, although the material dither certainly helps.

Dither requires the values to have precision because it only shifts the RGB values up or down by at most half of an 8-bit step. If you set the target buffer to use HalfFloatType or FloatType you’ll see a difference. If you render to an UnsignedByteType buffer, it’s already too late to dither after that.

A standalone dither shader might be better, if we have a version that makes more difference than the example above appears to do. Even in a gamma workflow (no linear <-> sRGB conversion) dither can be useful.

Yeah I agree with the flexibility of separate shaders. If projects want to dump all the fixes into a single pass for performance that’s an option, as well. Again in order for dithering to have an effect in a postprocessing pass, though, the color buffer must be of at least a half float type.


@donmccurdy Also, it looks like the reason dithering didn’t have an effect on the example here is that dithering isn’t supported by MeshBasicMaterial, likely because it doesn’t use lighting. It would probably be worth adding for cases like this, though. I hacked it into the shader in the example and it does remove the banding entirely here, but it does add some barely noticeable dithering artifacts:


If you feel like testing out some blue noise, there’s a good pack of textures here: http://momentsingraphics.de/BlueNoise.html. Temporal AA is a lot of work, but the dither alone might be worthwhile. I’ve been meaning to try to get alpha hash/dither transparency implemented sometime.

Thank you everyone for your replies, all of your contributions are very helpful to me.
So if I understand correctly, my banding issue is due to precision but can be avoided by adding a dithering pass to the end of my pipeline?

As said above it looks like dithering isn’t supported on MeshBasicMaterials so I can’t test it on the go. @gkjohnson would you mind sharing how you hacked it into the shader?
My knowledge being quite limited, I’m also not sure how to use HalfFloatType in my effect composer, but it sounds like it would be best for performance if I could get away with using only dithering?

I’ve added a PR with the changes to MeshBasicMaterial here:

You can hack the change in by modifying the “basic” fragment shader in THREE.ShaderLib before the application begins:

THREE.ShaderLib.basic.fragmentShader =
   `
   ... basic material shader content with dithering here...
   `;
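Alternatively, `material.onBeforeCompile` can patch the chunks in per material instead of editing `THREE.ShaderLib` globally. A hypothetical sketch (untested against that release’s exact chunk layout, so the replacement anchors are assumptions):

```javascript
const material = new THREE.MeshBasicMaterial( { map: lightTexture } );
material.onBeforeCompile = shader => {
	shader.fragmentShader = shader.fragmentShader
		// declare the dithering() helper (it is compiled out without DITHERING)
		.replace( '#include <common>',
			'#define DITHERING\n#include <common>\n#include <dithering_pars_fragment>' )
		// apply it as the very last step, after encoding
		.replace( '#include <premultiplied_alpha_fragment>',
			'#include <premultiplied_alpha_fragment>\n#include <dithering_fragment>' );
};
```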

As for setting the render target type –

My knowledge being quite limited, I’m also not sure how to use HalfFloatType in my effect composer, but it sounds like it would be best for performance if I could get away with using only dithering?

You can pass a WebGLRenderTarget into the composer on instantiation with your desired type:

const target = new WebGLRenderTarget( width, height, {
   minFilter: LinearFilter,
   magFilter: LinearFilter,
   format: RGBAFormat,
   type: HalfFloatType,
   stencilBuffer: false
} );
const composer = new EffectComposer( renderer, target );

Regarding performance, I’m not sure what the impact of using a higher precision render target is, if any. I’ve heard claims in the past but never seen anything from a definitive source, and it’s always hard to find performance guidelines for graphics. Having said that, it’s probably safest to just use dithering if that works for your use case.
