How does HDR relate to color space and bit depth?

Answers from Video Production Stack Exchange:
https://video.stackexchange.com/questions/21676/how-does-hdr-relate-to-color-space-and-bit-depth


TV technology and video encoding are in principle outside the scope of this site. Still, as photographers we are presumably interested in new display standards that promise both more dynamic range and a wider color gamut. HDR computer monitors are arriving too, so it probably won’t be long before HDR displays matter to photographers as well.

HDR by itself just means High Dynamic Range. Dynamic range is the brightness ratio between the brightest and the darkest parts. “High” is relative to what the sensor can capture or to what a standard display can show.

HDR video comes in several encoding standards, with names like HDR10, Dolby Vision, and Hybrid Log-Gamma. They all support higher bit depth (10-12 bits), a wider color gamut (Rec. 2020 and/or DCI-P3), higher dynamic range, and 4K resolution. They are not compatible with each other, but some devices support more than one HDR standard.

“HDR TV” doesn’t necessarily mean much, except that the set will accept an HDR video signal and display an image. It does not necessarily mean that it can display the wider gamut and higher dynamic range, just that it can read the signal and show something, even if what it shows is plain sRGB with a mundane dynamic range.

“UHD Premium” is a certification standard for displays that can actually show HDR. Requirements include:
•minimum 10-bit color
•minimum 90% of the DCI-P3 color gamut
•use of an HDR transfer function (the transfer function translates bit values to brightness values; for sRGB it is called “gamma”, and HDR uses a different function so it can cover a larger dynamic range with only a small increase in the number of bits)
•either: minimum 1,000 nits peak brightness and maximum 0.05 nits black level, a contrast ratio of 20,000:1 or better (for LCD displays)
•or: minimum 540 nits peak brightness and maximum 0.0005 nits black level, a contrast ratio of 1,080,000:1 or better (for OLED displays, which can’t go as bright but have better black levels)
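The two contrast-ratio tiers above follow directly from the brightness and black-level figures. A quick sanity check in Python (all numbers are taken from the list above; nothing else is assumed):

```python
# Contrast ratio is simply peak brightness divided by black level, both in nits.
def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

print(f"LCD tier:  {contrast_ratio(1000, 0.05):,.0f}:1")    # 20,000:1
print(f"OLED tier: {contrast_ratio(540, 0.0005):,.0f}:1")   # 1,080,000:1
```

This also makes clear why the OLED tier allows lower peak brightness: its far deeper blacks give it a contrast ratio more than fifty times the LCD tier anyway.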

Standard displays are 200-400 nits and rarely go beyond a 1,000:1 contrast ratio. So a “Premium” HDR display has significantly higher peak brightness as well as at least an order of magnitude better contrast.

Brighter highlights, higher contrast ratios, and a different “bits to brightness” encoding/decoding all combine to enable the “high dynamic range” part of an HDR TV.
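As an illustration of that different “bits to brightness” encoding, here is a minimal sketch of the PQ transfer function (SMPTE ST 2084, the one used by HDR10), which maps a normalized code value to absolute brightness in nits; the constants are the ones defined in ST 2084:

```python
# PQ (SMPTE ST 2084) EOTF: normalized code value in [0, 1] -> brightness in nits.
# Constants as defined in the ST 2084 specification.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code):
    e = code ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(1.0))  # 10000 nits: the top of the PQ range
print(pq_eotf(0.5))  # ~92 nits: half the code values cover SDR-like brightness
```

Note how steep the curve is: the bottom half of the code values is spent on the 0-100 nit range where our eyes are most sensitive, leaving the rest for highlights up to 10,000 nits. That is how 10-12 bits can span such a large brightness range without visible banding.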

Color space and bit depth are only related to HDR in the sense that they are part of the same standards. Yes, you need more bits to represent a higher dynamic range without banding. But the larger color space is a separate decision; it’s more a matter of bundling enough improvements to make a new video standard worth the trouble. (4K is also part of the standard, and 4K is not related to HDR either.)

“The sun looks brighter on an HDR display” means that a) the display emits more light in the brightest parts of the scene, and b) it may look even brighter because the rest of the scene can stay dimmer without losing detail.

So it’s a real improvement; the wider color gamut and higher dynamic range can contribute to more vivid photos and video.


The caveat is that TVs/monitors may be sold as “HDR compatible” without any actual HDR capability in the display. The content needs to be made specifically for HDR to make use of the new capabilities. And for computer monitors, you need a graphics card that supports HDR output.


HDR doesn’t relate to color space at all; it stands for “High Dynamic Range”, i.e. brightness and contrast, not color.

Traditional TV signals had a rather low contrast ratio, as old TV sets had a contrast of only about 30:1. Modern displays can produce a lot more, 1,000:1 or higher with a dynamic backlight (local dimming).

“HDR display” is largely a marketing term for a display with a very high contrast ratio.

But using this high contrast requires a higher-bit-depth signal; otherwise you have only a few steps spread out over the whole range, which introduces banding in brightness just like in the color domain.

You can’t squeeze a 1,000:1 range into an 8-bit brightness signal, so you need a higher bit depth. 10 bits can encode 1,024 steps, which still isn’t that many, but it is better than 256.
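To put rough numbers on that, a back-of-the-envelope sketch, assuming a purely linear mapping of code values to brightness (which real video standards deliberately avoid for exactly this reason):

```python
import math

# With linear coding, the step from code 1 to code 2 doubles the brightness:
# a whole photographic stop, so the dark end bands badly at any bit depth.
# More codes help; a nonlinear transfer function (gamma, PQ, HLG) helps more.
for bits in (8, 10, 12):
    codes = 2 ** bits
    stops = math.log2(codes - 1)  # range from the darkest nonzero code to peak
    print(f"{bits}-bit linear: {codes} codes, {stops:.1f} stops of range")
```

So even 12 bits of linear code only spans about 12 stops, and the darkest steps are still a full stop apart; the nonlinear encoding is what makes those bits usable.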

Calling those displays “HDR” is, from a photography perspective, still laughable, since HDR photos are encoded as 32-bit floating-point values with a near-infinite number of steps. Still, the dynamic range is higher than that of a traditional CRT TV or the standard NTSC/PAL signal. (Note that you need an HDR signal source to actually enjoy that capability.)


HDR in the modern TV sense refers to one of several HDR specifications for encoding and displaying video. To understand what High Dynamic Range means, we first need to understand what dynamic range is.

Dynamic range is the difference between the highest and lowest values in a signal or display. A high dynamic range TV generally means there is a large number of brightness levels between the darkest and brightest portions the display can show at the same time.

The HDR specifications cover more than just the maximum and minimum brightness levels that must be displayable, though. They also require a relatively high bit depth. This allows a video to carry a greater number of brightness levels, which also means more subtle variations in color can be reproduced and thus better use made of the increased range of brightness. The higher the dynamic range, the bigger the change in brightness between adjacent values at a given bit depth; but you can have fairly low dynamic range displays or fairly high dynamic range displays that both have a high bit depth.

There are also requirements for contrast ratio. Contrast ratio is similar to dynamic range, but not exactly the same thing. It is the relationship, in absolute light level, between the darkest and brightest values the display can show simultaneously. It depends both on how bright the display can get and how well it can block light, and it determines how well bright and dark portions of a scene can coexist. Static contrast ratio is the meaningful figure, since it is what the display can really do; dynamic contrast ratio doesn’t mean much, because it relies on adjustments that aren’t made at the pixel level and don’t really produce ideal image quality.


To comment on two statements made in other answers here:

“You could also have a fairly low bit depth in a very wide color space where the steps between each gradation are much coarser.”

“Color space and bit depth is only related to HDR in the sense that they are part of the same standards. Yes, you need more bits to represent a higher dynamic range without banding.”

Neither statement takes dithering into account. Without dither, the statements are correct; with dither applied, a smooth gradient would in theory be possible with even a single bit, although the result is extremely noisy when applying spatial dithering.

If one accepts a lower SNR, there won’t be banding regardless of the bit depth.
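A minimal sketch of that noise-for-banding trade-off, quantizing a single brightness value to one bit (the input value 0.25 and the uniform dither noise are illustrative choices, not taken from any standard):

```python
import random

random.seed(0)

def quantize(x):
    """Plain 1-bit threshold: every input becomes 0.0 or 1.0 (hard banding)."""
    return 1.0 if x >= 0.5 else 0.0

def dithered(x):
    """Add uniform noise before thresholding: noisy, but unbiased on average."""
    return 1.0 if x + random.uniform(-0.5, 0.5) >= 0.5 else 0.0

x, n = 0.25, 100_000
plain = sum(quantize(x) for _ in range(n)) / n    # always 0.0: the band
noisy = sum(dithered(x) for _ in range(n)) / n    # averages out to ~0.25
print(plain, noisy)
```

The plain quantizer collapses a quarter-brightness input to black every time, which is what banding is; the dithered one recovers the correct average level, paying for it with per-sample noise, i.e. a lower SNR.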

If it is true that bit depth and colour space are not related (which makes sense, as any colour space could also exist in a purely analog domain, without any bits to consider at all), then the logical conclusion would be: assuming theoretically perfect dithering techniques, a given bit depth, by analogy to audio, only limits the achievable signal-to-noise ratio and thereby the dynamic range, since the noise masks the lower end. It does not limit the possible number of colors, cause banding, or cap the brightest whites.


Reposted from blog.csdn.net/august_zh/article/details/78629389