[SOLVED] sRGB/HDR color values in postprocessing fragment shaders

Hey guys,

I’m a little unsure about the correct terminology here, and the answers in previous topics on this subject were a bit vague.

Long story short: I’m wondering if a material’s emissive property can carry through to a post-processing fragment shader as color values larger than 1.

I want to create a feature-generation shader for bloom, lens flares, and the like, but rather than working with thresholds just below 1, I’m looking for a solution where I can pick out any color currently on screen that is within the HDR range (above 1).

My scene is rendered to a couple of render targets first, after which their textures are used for post-processing; just the normal swap-chain logic that other post-processors use as well.

Can this be done without modifications to the renderer itself?

I’m not looking for complete code solutions, just a nudge in the right direction on how to approach this (e.g., can this be done through some combination of (renderer) settings?)

Thanks in advance!


You can set your render target’s texture data type to something that supports the HDR range, like FloatType or HalfFloatType. Something like:

renderTarget.type = FloatType;
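For instance, a render target configured this way might look like the following (a sketch, assuming three.js; the dimensions are placeholders):

```javascript
import { WebGLRenderTarget, FloatType } from 'three';

// Placeholder dimensions for illustration.
const width = 1024;
const height = 768;

// A render target whose texture stores floating-point values,
// so colors above 1.0 survive into later post-processing passes.
const renderTarget = new WebGLRenderTarget(width, height, {
  type: FloatType,
});
```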

I have a feeling that I might have been misunderstanding your question though…


I think that did the trick! Thanks for the hint! :smile:

To clarify the issue: I simply want “selective bloom” without having to render the scene multiple times. In my opinion, using emissiveIntensity on an object should produce “light”, and should therefore be affected by bloom.

So, any object with an emissiveIntensity of 0 will not be affected by bloom effects, while any object with an emissiveIntensity higher than 0 will have bloom (the effect naturally strengthens as the intensity value increases).
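As a sketch of the kind of object setup this refers to (assuming three.js; the values are only illustrative):

```javascript
import { MeshStandardMaterial } from 'three';

// An emissive material whose output exceeds 1.0 in a float
// render target, so it survives the bloom threshold pass.
const glowing = new MeshStandardMaterial({
  color: 0x222222,
  emissive: 0xffffff,
  emissiveIntensity: 2.0, // > 0, so this object should bloom
});

// A non-emissive material (emissiveIntensity effectively 0)
// stays below the threshold and gets no bloom.
const matte = new MeshStandardMaterial({ color: 0x808080 });
```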

For anyone stumbling across this:

The renderer’s output encoding is set to sRGBEncoding.
The framebuffer(s) in the swap chain are configured like so:

const rtOptions: WebGLRenderTargetOptions = {
    type:      FloatType,
    encoding:  sRGBEncoding,
    minFilter: NearestFilter,
    magFilter: NearestFilter,
};

If you then pass the scene’s ambient light intensity into your post-fx “feature generation” shader, it’s as simple as subtracting 1.0 + ambientLightIntensity from the final color; you’ll be left with only the pixels from objects whose emissiveIntensity is higher than 0.
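To illustrate the thresholding itself, here it is in plain JavaScript (a sketch only; in the real shader this runs per fragment in GLSL, with ambientLightIntensity passed in as a uniform):

```javascript
// Keep only the HDR part of a color: everything above
// 1.0 + ambientLightIntensity, clamped to zero below that.
function extractHdrFeatures([r, g, b], ambientLightIntensity) {
  const threshold = 1.0 + ambientLightIntensity;
  return [r, g, b].map((c) => Math.max(c - threshold, 0.0));
}

// An emissive pixel at 2.0 survives the threshold;
// a fully lit diffuse pixel at 1.0 or below does not.
console.log(extractHdrFeatures([2.0, 0.5, 1.0], 0.0)); // → [1, 0, 0]
```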