Let us say you visit an electronics store and check out the TVs available for sale.
What would make you choose one TV over another?
For most people, it is the quality of the picture and visual experience.
What are the factors that can improve the picture quality and the visual experience?
1. Contrast Ratio - the ratio between the brightest and the darkest spots your TV can display
2. Refresh Rate - the rate at which the frames are refreshed
3. Color Gamut - the gamut of colors available for display
HDR technology aims to provide better contrast ratios and a larger color gamut.
This results in far better, more realistic picture quality.
It is one of the must-haves if you are planning to buy a new TV.
Still not convinced?
Have a look at the image below and you will know why HDR matters!
The left part of the image is displayed with HDR; the right part, without.
What is HDR?
HDR stands for High Dynamic Range.
In pictures and videos, dynamic range refers to the range of luminance (measured in nits) that can be captured or displayed.
The night sky has a luminance of about 0.001 nits, while the sun at noon has a luminance of about 1,600,000,000 nits.
A regular LED TV can output about 300 - 400 nits, while an HDR TV can reach up to 2,000 nits.
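To put those numbers in perspective, here is a quick back-of-the-envelope sketch using the figures above. The "stops" measure is simply the base-2 logarithm of the luminance ratio (each stop is a doubling of luminance); the peak-brightness values are the rough typical figures quoted in this article, not specifications of any particular TV.

```python
import math

def dynamic_range_stops(max_nits, min_nits):
    """Dynamic range expressed in photographic stops (doublings of luminance)."""
    return math.log2(max_nits / min_nits)

# Figures from the text above
night_sky = 0.001          # nits
noon_sun = 1_600_000_000   # nits
sdr_tv_peak = 400          # nits (typical LED TV)
hdr_tv_peak = 2000         # nits (high-end HDR TV)

print(f"Real world spans about {dynamic_range_stops(noon_sun, night_sky):.1f} stops")
print(f"An HDR TV peaks {hdr_tv_peak / sdr_tv_peak:.0f}x brighter than a typical SDR TV")
```

No display comes close to covering the full real-world range; HDR simply narrows the gap.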
How does HDR work?
HDR works by increasing the dynamic range of luminance the TV can reproduce.
This means the TV can show the details in the darker portions of the screen, which have lower luminance, as well as in the brighter portions, which have higher luminance (the dynamic range has increased).
To get an idea of how HDR works on a TV, let us first see how it works with static images.
Consider the images below
Image with low exposure
Image with high exposure
The first image is shot with low exposure, the second with high exposure.
The third image combines the two to produce an image with high dynamic range - one that is closer to what the human eye sees. This is how HDR works with still images.
The HDR processing in videos is a little different and depends on which HDR standard is being used.
Before we explore each of these standards and how they work, let us discuss some of the terminology associated with HDR.
Color Gamut -
If you have ever watched anything on a CRT TV, you would have noticed that the video never looked quite real (like real life). One of the reasons for this is the limited set of colors available for display. CRT TVs adhered to the Rec. 709 color space, which covered 35.6% of the colors visible to the human eye. A newer standard is now in place, namely Rec. 2020, which covers 75.8% of the colors visible to the human eye. The HDR standards require the TV to support the Rec. 2020 color space - meaning pictures look more realistic on an HDR TV.
Bit Depth -
Each pixel needs to emit a certain amount of luminance, as specified by the TV signal, to show the picture. To digitally encode 1000 levels of luminance, 10 bits are required; for 10,000 levels, 14 bits are required. The number of bits required to digitally encode the luminance levels is known as the bit depth. The human eye is less sensitive in brighter regions than in darker regions, and this fact is used to reduce the bit depth required to represent the luminance (a higher bit depth means a larger video file).
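The bit counts above follow directly from a base-2 logarithm: n bits can distinguish 2^n levels, so the minimum bit depth for a given number of levels is the log rounded up. A minimal sketch:

```python
import math

def bits_for_levels(levels):
    """Minimum number of bits needed to encode `levels` distinct luminance steps."""
    return math.ceil(math.log2(levels))

print(bits_for_levels(1000))   # 10 bits for 1000 luminance levels
print(bits_for_levels(10000))  # 14 bits for 10,000 luminance levels
```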
HDR Mastering -
Once the video is shot, technicians use reference monitors to render the video with the appropriate dynamic range and color settings. This information is stored as metadata in the video. When the video is played at home, the metadata is used to render it as the content creator intended - with all the effects needed to create the desired experience.
Transfer Function -
The human visual system is more sensitive to changes in luminance in darker regions than in brighter regions. Hence, the changes in darker regions need to be captured in more detail (more bits to encode), while the changes in brighter regions can be captured in less detail, as the human eye cannot make out much difference there (fewer bits to encode). The mathematical functions that achieve this encoding and decoding are known as transfer functions. There are two main transfer functions when it comes to HDR - the Perceptual Quantizer (by Dolby and SMPTE) and Hybrid Log-Gamma (by NHK and the BBC).
HDR Standards and how they work
1. HDR10 [ Also known as HDR10 Media Profile ] -
This standard uses the Perceptual Quantizer with a bit depth of 10 bits and the Rec. 2020 color space. The metadata it defines is static - there is one fixed set of metadata for the entire video, which is used to render the HDR content. This is not as effective as HDR10+ or Dolby Vision.
2. HDR10+ -
This standard also uses the Perceptual Quantizer with a bit depth of 10 bits and the Rec. 2020 color space. However, unlike HDR10, the metadata is both static and dynamic. The dynamic metadata allows customized rendering of each frame or scene, making it much more effective than HDR10 and capable of producing a much better visual experience. It is an open standard - there is no royalty fee for a manufacturer to use it in a TV. However, the manufacturer must obtain a certification before it can be used.
3. Dolby Vision -
This standard uses the Perceptual Quantizer with a bit depth of 12 bits and the Rec. 2020 color space. Like HDR10+, it has both static and dynamic metadata, and the dynamic metadata allows customized rendering of each frame or scene. It is more effective than HDR10+ (it has a bit depth of 12 bits compared to HDR10+'s 10 bits). It can theoretically support a TV with luminance of up to 10,000 nits (currently, TVs go up to about 4,000 nits), so the real difference will become evident when TVs with higher luminance enter the market. This is a proprietary solution - it requires a licensing fee of about $3 per TV set, paid by the manufacturer.
4. Hybrid Log Gamma -
This standard uses the Hybrid Log-Gamma transfer function with a bit depth of 10 bits and the Rec. 2020 color space. It was developed by NHK and the BBC. Its main advantage is that it is backward compatible with SDR: if you do not have an HDR TV, HLG content still displays fine. Its limitation is that it does not produce HDR as good as HDR10+ or Dolby Vision.
So, the bottom line is: unlike some fads and gimmicks, HDR is the real deal. Depending on the HDR standard used and the hardware capabilities of the television, it can really provide the ultimate visual experience. It is something you should definitely look for when you buy a TV!