Why is HDR, HDR10, UHD Premium so important to a gaming TV?

ashley

Member
HDR stands for High Dynamic Range, as opposed to SDR (Standard Dynamic Range). The dynamic range of an image is the span of its brightness, from the darkest to the lightest area. Using a greater dynamic range when creating images or video makes it possible to capture more nuances of brightness (and therefore of color), especially in sensitive areas such as shadows and highlights, for a richer, more contrasted result with fewer crushed blacks and blown-out whites.

So why can't any type of screen support an HDR image?

As you probably know, a screen renders a color by combining three sub-pixels: red, green, and blue. The final color is obtained by controlling the light intensity, or more precisely the luminance (measured in candelas per square meter, cd/m², also called nits), of each sub-pixel, but this luminance is limited by the TV's on-board hardware. The TV must therefore be able to display the dynamic range that HDR requires. The UHD Alliance (a consortium of studios, TV channels, TV manufacturers, and other content providers) recommends a range of 0.05 cd/m² to 1000 cd/m² for LCD/LED technology and 0.0005 cd/m² to 540 cd/m² for OLED technology, but HDR is not standardized on this point, which lets manufacturers stamp their TVs "HDR compatible" without necessarily meeting these criteria, as long as the TV can process an HDR signal. In effect, an HDR signal separates the "classic" video part (SDR) from the HDR-specific data, carried as an additional luminance channel and metadata "surrounding" the video stream.
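
To put those figures in perspective, here is a back-of-the-envelope sketch (using only the UHD Alliance numbers quoted above) that expresses a panel's dynamic range as a contrast ratio and in photographic stops, where each stop is a doubling of luminance:

```python
from math import log2

def dynamic_range(black_nits: float, peak_nits: float) -> tuple[float, float]:
    """Return (contrast ratio, range in stops) for a panel's luminance span."""
    ratio = peak_nits / black_nits
    stops = log2(ratio)  # each stop is a doubling of luminance
    return ratio, stops

# UHD Alliance recommendations quoted above
for label, black, peak in [("LCD/LED", 0.05, 1000), ("OLED", 0.0005, 540)]:
    ratio, stops = dynamic_range(black, peak)
    print(f"{label}: {ratio:,.0f}:1 contrast, ~{stops:.1f} stops")
```

The LCD/LED target works out to 20,000:1 (about 14 stops) and the OLED target to over 1,000,000:1 (about 20 stops), which is why deep blacks matter as much as peak brightness.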


To be HDR compatible, a TV must therefore be able to process this signal, but once again, without necessarily being able to display all of the information. I see you are lost? Rest assured, you are not the only one...
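
To make the "metadata surrounding the video stream" idea more concrete, here is a minimal sketch of the kind of static metadata an HDR10 stream carries alongside the picture. The field names follow the usual mastering-display and content-light-level conventions; the class itself is purely illustrative, not a real library API:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Illustrative container for HDR10-style static metadata (hypothetical class)."""
    # Mastering display description: how dark/bright the grading monitor could go
    mastering_min_luminance_nits: float
    mastering_max_luminance_nits: float
    # Content light levels for the whole programme
    max_cll_nits: int    # MaxCLL: brightest single pixel in the content
    max_fall_nits: int   # MaxFALL: brightest average frame in the content

# A TV that "processes" HDR uses hints like these to tone-map the signal
# down to whatever its own panel can actually display.
example = HDR10StaticMetadata(0.0001, 1000, 1000, 400)
print(example)
```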

But how do you make sure that a TV fully supports HDR?

In the absence of a standard, the TV industry has fortunately acquired, once again thanks to the UHD Alliance, a label guaranteeing that a TV meets all the required criteria: UHD Premium. Affixed to a TV, this label guarantees several things: a UHD panel of 3840 x 2160 pixels, HDR compatibility with the contrast ranges seen previously, but also a 10-bit panel capable of displaying an extended color space (Wide Color Gamut) with Rec. 2020 input compatibility. Because brightness is not everything; you also need the right colors. Come on, hang in there!
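
Read together, those criteria amount to a simple checklist. The sketch below encodes only the figures quoted in this post (a hypothetical helper, not an official certification test):

```python
def meets_uhd_premium(width: int, height: int, bit_depth: int,
                      black_nits: float, peak_nits: float,
                      rec2020_input: bool) -> bool:
    """Rough UHD Premium checklist based only on the figures quoted above."""
    uhd = (width, height) == (3840, 2160)
    ten_bit = bit_depth >= 10
    # Either the LCD/LED-style range or the OLED-style range must be met
    lcd_range = peak_nits >= 1000 and black_nits <= 0.05
    oled_range = peak_nits >= 540 and black_nits <= 0.0005
    return uhd and ten_bit and rec2020_input and (lcd_range or oled_range)

print(meets_uhd_premium(3840, 2160, 10, 0.05, 1000, True))   # True
print(meets_uhd_premium(3840, 2160, 8, 0.05, 1000, True))    # False: 8-bit panel
```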


Indeed, most previous-generation LCD/LED or OLED screens use 8-bit panels, i.e. 8 bits per sub-pixel color (red, green, blue), or 256 shades per color, for a total of roughly 16 million colors, enough to display the entire Rec. 709 color space, which covers 35% of the colors perceptible to the human eye. With a 10-bit panel, we switch to 1024 shades per color, for a total of more than 1 billion colors! This allows better color gradation, covers almost the entire DCI-P3 space (45% of perceptible colors), and displays an extended color space, then called Wide Color Gamut (WCG), by tackling the Rec. 2020 space, which covers 76% of the colors perceptible to the human eye.
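
The shade counts above follow directly from the bit depth, since each extra bit doubles the number of levels per sub-pixel; a quick calculation:

```python
def colour_counts(bits_per_channel: int) -> tuple[int, int]:
    """Levels per sub-pixel and total RGB colours for a given bit depth."""
    levels = 2 ** bits_per_channel
    return levels, levels ** 3

for bits in (8, 10):
    levels, total = colour_counts(bits)
    print(f"{bits}-bit: {levels} shades per channel, {total:,} colours")

# 8-bit: 256 shades per channel, 16,777,216 colours
# 10-bit: 1024 shades per channel, 1,073,741,824 colours
```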

However, even the best TVs of the moment only manage to display around 70% of this space (75% for Samsung's Q9F), and there is no guarantee that sources encoded this way fully cover it either. Another gray area, then. The fact remains that the results with such panels are striking: the colors are richer and the gradations are finer and more precise, eliminating posterization (the visible demarcation between two shades of color). But once again, it all depends on the sources... Are you still there? Come on, last part!
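
Posterization is easy to reproduce numerically: quantizing the same smooth gradient at 8-bit and 10-bit precision shows how many distinct steps each panel has to work with, which is why gradations band more visibly on the former. A small sketch, assuming simple rounding with no dithering:

```python
def distinct_steps(lo: float, hi: float, bits: int, samples: int = 4000) -> int:
    """Quantize a smooth ramp from lo to hi (on a 0..1 scale) at the given
    bit depth and count how many distinct output levels survive."""
    levels = 2 ** bits - 1
    ramp = (lo + (hi - lo) * i / (samples - 1) for i in range(samples))
    return len({round(v * levels) for v in ramp})

# A subtle gradient covering only the darkest 5% of the signal range,
# where banding is most visible
for bits in (8, 10):
    print(f"{bits}-bit panel: {distinct_steps(0.0, 0.05, bits)} distinct steps")
```

On that dark gradient, the 8-bit panel has only 14 steps to play with where the 10-bit panel has 52, hence the smoother gradation.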

The signal capable of carrying this information is notably characterized by the HDR10 standard, guaranteeing an HDR stream with colors encoded on 10 bits in the Rec. 2020 space. A TV bearing this label is therefore supposed to guarantee a 10-bit panel. For information, Dolby has already gone further with its Dolby Vision standard, encoding colors on 12 bits and allowing, in theory (with a slight undersampling in the lightest tones), coverage of 100% of the color space visible to the human eye. Once again, beware: a screen's claimed compatibility with the Dolby Vision standard does not guarantee a 12-bit panel, long live marketing! Indeed, no TV to date offers a 12-bit panel. At most, there are models capable of processing a 12-bit signal, or equipped with color-management engines that oversample the signal to 12 or 14 bits for better gradation, but unable to display more colors than their panels allow, at best 10 bits. Not to mention that Dolby Vision sources still seem very rare for the moment.
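
That last point is worth illustrating: a TV can accept or internally process a 12-bit signal, yet a 10-bit panel still collapses it back to 1024 levels per channel. A minimal sketch of that requantization, assuming plain truncation with no dithering:

```python
def requantize(value_12bit: int, panel_bits: int = 10) -> int:
    """Map a 12-bit code value (0..4095) onto a panel with fewer bits by truncation."""
    return value_12bit >> (12 - panel_bits)

# Four neighbouring 12-bit shades...
codes = [2048, 2049, 2050, 2051]
# ...all land on the same 10-bit level, so the extra precision is never displayed
print([requantize(c) for c in codes])   # [512, 512, 512, 512]

# Shade counts: what a 12-bit signal carries vs what a 10-bit panel can show
print(f"12-bit signal: {4096 ** 3:,} colours; 10-bit panel: {1024 ** 3:,} colours")
```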

Finally (at last!), know that the HDR used in games is HDR10. You now have all the information you need to choose the TV that will make the most of this medium: either head for UHD Premium certified TVs, or dive into the specifications and reviews of each TV to find out its screen resolution, color depth, contrast ratio, and compatibility with HDR signals, for an à la carte choice based on your budget and your priorities.