True HDR color support

Chrome has supported the “WebGPU Extended Range Brightness” proposal for some time now, and it’s also supported by the Safari Technology Preview. I’m wondering whether there is any interest in exploring support for this tech, since it lifts the requirement that the browser clamp colors to the 0–1 range.

To be clear, I am talking about support for Three to output “brighter than #ffffff” colors to displays that support it. I don’t mean traditional tonemapped “HDR” photos, where multiple exposures are composited into a regular sRGB image. This is also different from how the renderer’s linear HDR colors are tonemapped down to the SDR range (e.g. Reinhard, Cineon, ACESFilmic, AgX…).

If you’ve seen true HDR colors, you know: they’re stunning, rich, immersive, and lifelike. Since ThreeJS already calculates PBR shading internally with HDR values, simply displaying those extra-bright colors could yield really impressive and detailed renders. Currently, tonemapping discards (or carefully compresses) all of that highlight information, which seems like a waste when the monitor could otherwise display a lot of it.

The MacBook Pro supports true HDR colors, and many new monitors and TVs offer at least some degree of support as well.

As an initial test, perhaps ThreeJS would only need to call the WebGPU context.configure() function with a couple of extra parameters outlined in the webgpu-hdr explainer. It seems there’s a WebGL analog as well (a few extra properties on the context), so this could ultimately be supported by all renderers.
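For reference, here’s a minimal sketch of that configuration, going by the webgpu-hdr explainer (the `toneMapping` member is the proposal’s addition, so it only has an effect in browsers that implement it):

```js
const canvas = document.querySelector('canvas');
const context = canvas.getContext('webgpu');

const adapter = await navigator.gpu.requestAdapter();
const device = await adapter.requestDevice();

context.configure({
  device,
  // A float format is needed so values outside [0, 1] survive to the compositor.
  format: 'rgba16float',
  // From the webgpu-hdr proposal: ask the compositor not to clamp to SDR range.
  toneMapping: { mode: 'extended' },
});
```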

I’m not sure if ThreeJS fragment shaders currently clamp buffer colors to 0…1, but it’s possible that setting the renderer’s toneMapping property to NoToneMapping would allow the renderer’s unclamped linear RGB colors to “shine through” to the “extended” buffer. Longer term, implementing “extended” versions of the tonemapping algorithms would probably be important, to avoid harsh clipping at the display’s brightness limit.
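On the three.js side, that experiment would start with the existing API. Whether the unclamped values actually reach the canvas depends on the renderer’s internal buffers being float, which is an assumption on my part:

```js
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();

// Existing three.js API: skip the tone curve entirely, so linear HDR values
// are written out unclamped. Whether values > 1.0 then survive all the way to
// the display depends on internal render targets and canvas config (assumed).
renderer.toneMapping = THREE.NoToneMapping;
```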

I’m also not sure whether there is any way to query that limit. I imagine HDR headroom varies wildly between devices, so HDR tonemapping algorithms probably need to be able to adjust their “rolloff” threshold to a wide range of maximum brightnesses.
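To illustrate, the “extended” Reinhard curve already has such a knob: a white point above which highlights soft-clip. Here’s a sketch of how it might adapt to a headroom value; the first function is the standard extended Reinhard formula, while wiring it to a real headroom is hypothetical, since we can’t query one:

```js
// Extended Reinhard tone curve: maps luminance in [0, whitePoint] into [0, 1]
// with a soft rolloff instead of a hard clip at the top.
function extendedReinhard(L, whitePoint) {
  return (L * (1 + L / (whitePoint * whitePoint))) / (1 + L);
}

// Hypothetical HDR variant: rescale the curve so the scene's white point lands
// at `headroom` (display peak / SDR white) instead of at 1.0. Near-black stays
// essentially untouched; only highlights are compressed into the available range.
function toneMapHDR(L, whitePoint, headroom) {
  return headroom * extendedReinhard(L / headroom, whitePoint / headroom);
}
```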

We have some discussion going on the three.js side, though it does require the tone mapping updates to be of much use for lit rendering:

A word of caution about “true HDR colors”: there are a lot of badly-configured “SDR vs. HDR” comparisons floating around that just turn off the tone mapping for HDR, which is a pointless comparison. The HDR display tech is great and it’s worthwhile to make use of its range, but if you see a comparison where the colors are vastly different, something is wrong with that comparison.

Differences should have less to do with information being lost (with correct tone mapping, loss should be minimal) and more to do with perception phenomena like the Hunt Effect and Stevens Effect. Both should be considered, and contrast and saturation adjusted as needed — not just for HDR, but even if you expect your viewers to be in a darkened room.

You’re correct, and unfortunately there is not currently a way. Available HDR headroom varies with the device — and even as you increase or decrease the brightness on a MacBook display.

That’s a major problem in the HDR definitions in WebGPU today, which I hope will be addressed before too many broken color management implementations ship. I’d hope for, if not a way to query the HDR headroom, then at least some privacy-preserving alternative…
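For what it’s worth, CSS media queries do offer a coarse, privacy-friendlier signal today: you can detect that a display is HDR-capable, just not how much headroom it has:

```js
// CSS Media Queries Level 5: detects HDR capability, but not the headroom.
const supportsHDR = window.matchMedia('(dynamic-range: high)').matches;

// The query also fires a change event, e.g. when a window is dragged
// between an SDR and an HDR display.
window.matchMedia('(dynamic-range: high)').addEventListener('change', (e) => {
  console.log(e.matches ? 'HDR-capable display' : 'SDR display');
});
```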


Brilliant reply, thanks. Not sure how I missed that PR in my search. :+1:

Agreed on the cautionary point about taking care not to glorify HDR colors unduly. I’m not sure exactly which broken comparisons you’re referring to, but there’s a lot of snake oil out there, for sure. In particular, I find most of the oversaturated reels (of snakes, food, etc.) that major manufacturers use to sell their TVs unsavory at best.

I had meant “HDR colors” only in the luminance sense: the extra brightness range available. Though it’s worth noting that this API can also pair up with “wide color” support, like P3, which actually is significantly more “colorful” (although maybe not “vastly” so).
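For the curious: the same configure() call accepts a colorSpace member, so the two combine naturally. The `colorSpace: 'display-p3'` member is standard WebGPU; the `toneMapping` member is still the webgpu-hdr proposal:

```js
// Wide gamut and extended range together. (`context` and `device` as in the
// earlier configure sketch.)
context.configure({
  device,
  format: 'rgba16float',
  colorSpace: 'display-p3',
  toneMapping: { mode: 'extended' },
});
```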

Interested in your thoughts about HDR+P3. Match made in heaven?

If our shootout is between “traditional” and “modern” web color tech, I do feel like it’d be fair to pit SDR sRGB against HDR P3, even while allowing highlights to punch through to the maximum available luminance in each mode. This might mean the overall apparent brightness is significantly different, but I can’t think of a “fairer” way to compare modes, despite what Hunt and Stevens might say about it. Can you share any thoughts on how SDR and HDR colors should be fairly compared?

I think we must have some objective for what the viewer should see, and then “form an image” (by rendering, tone mapping, and color grading) aiming for that objective. The particular image formation settings that reach the objective will depend on the medium: HDR vs. SDR, bright vs. dark viewing environment, bright vs. dark surrounding context on the webpage, etc. We can’t control everything, but we should account for what we can.

When you see an HDR vs. SDR comparison and the SDR image is drastically “less colorful” than the HDR image, this suggests that the author either did not have any objective for the images to begin with, or made a technical mistake — that’s very easy to do, because adapting tonemapping for HDR and/or wide gamut displays is far outside of most developers’ experience.

I suspect that in practice (after letting your eyes adjust; glancing quickly between HDR and SDR images is a different story!), the perceived difference between the HDR and SDR mediums is much smaller than the effect of viewing a typical sRGB display in a darkened vs. a bright room. Which is definitely not nothing! Maybe there are some colorists around who could comment. But from the way people talk about HDR, it would be easy to assume otherwise. :slight_smile:


I’m 95% sure we’re on the same page. But I just want to double-check, because if I’m wrong, I really want to know!

My understanding is that HDR doesn’t seek to expand available saturation at all; it just gives significant additional luminance headroom. We do agree on that, don’t we?

I think we’re on the same page, yes. I’m using the term “HDR” to refer simply to an API providing more physical energy / “radiance” from the display. I’ll typically use the term “wide gamut” to refer to any display or color space with a gamut larger than sRGB.

Both technologies do have some bearing on the perception of saturation, though the relationship is not, I think, entirely clear-cut.
