High dynamic range (HDR) video is one of the biggest 4K TV feature bullet points. It can push video past the limitations that broadcast and other media standards adhered to for decades. It’s impressive to see on TVs that can handle it, but it’s also a fairly confusing technical feature, with several variations whose differences aren’t well established. That’s why we’re here to explain them to you.
Dynamic Range on TVs
TV contrast is the difference between how dark and bright it can get. Dynamic range describes the extremes in that difference, and how much detail can be shown in between. Essentially, dynamic range is display contrast, and HDR represents broadening that contrast. However, just expanding the range between bright and dark is insufficient to improve a picture’s detail. Whether a panel can reach 200 cd/m^2 (relatively dim) or 2,000 cd/m^2 (incredibly bright), and whether its black levels are 0.1 cd/m^2 (washed-out, nearly gray) or 0.005 cd/m^2 (incredibly dark), it can ultimately only show so much information based on the signal it’s receiving.
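The contrast figures above are just peak brightness divided by black level. A quick sketch makes the gap concrete (the luminance values are the illustrative ones from this section, not measurements of any specific TV):

```python
# Contrast ratio is peak luminance divided by black level, both in cd/m^2
# (nits). These example values mirror the ranges mentioned in the text.
def contrast_ratio(peak_nits, black_nits):
    """Return the contrast ratio as a single number (e.g. 2000.0 for 2000:1)."""
    return peak_nits / black_nits

dim = contrast_ratio(200, 0.1)        # a dim panel with mediocre blacks
bright = contrast_ratio(2000, 0.005)  # a bright panel with very deep blacks
print(f"{dim:,.0f}:1 vs {bright:,.0f}:1")
```

The second panel has 200 times the contrast of the first, yet without an HDR signal neither one receives enough information to use that range fully.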
Many popular video formats, including broadcast television and Blu-ray discs, are limited by standards built around the physical boundaries presented by older technologies. Black is set to only so black, because as Christopher Guest eloquently wrote, “it could get none more black.” Similarly, white could only get so bright within the limitations of display technology. Now, with organic LED (OLED) and local dimming LED backlighting systems on newer LCD panels, that range is increasing. They can reach further extremes, but video formats can’t take advantage of it. Only so much information is presented in the signal, and a TV capable of reaching beyond those limits still has to stretch and work with the information present.
What Is HDR?
That’s where HDR video comes in. It removes the limitations presented by older video signals and provides information about brightness and color across a much wider range. HDR-capable displays can read that information and show an image built from a wider gamut of color and brightness. Besides the wider range, HDR video simply contains more data to describe more steps in between the extremes. This means that very bright objects and very dark objects on the same screen can be shown very bright and very dark if the display supports it, with all of the necessary steps in between described in the signal and not synthesized by the image processor.
To put it more simply, HDR content on HDR-compatible TVs can get brighter, darker, and show more shades of gray in between (assuming the TVs have panels that can get bright and dark enough to do the signal justice; some budget TVs accept HDR signals but won’t show much of an improvement over non-HDR signals). Similarly, they can produce deeper and more vivid reds, greens, and blues, and show more shades in between. Deep shadows aren’t simply black voids; more details can be seen in the darkness, while the picture stays very dark. Bright shots aren’t simply sunny, vivid pictures; fine details in the brightest surfaces remain clear. Vivid objects aren’t simply saturated; more shades of colors can be seen.
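Those extra in-between shades come from bit depth: SDR video is typically 8-bit per color channel, while HDR10 uses 10 bits and Dolby Vision supports up to 12. A small sketch shows how quickly the number of available steps grows:

```python
# Each extra bit doubles the number of distinct steps between black and
# peak white, per color channel.
def shades_per_channel(bits):
    return 2 ** bits

for name, bits in [("SDR (8-bit)", 8), ("HDR10 (10-bit)", 10), ("Dolby Vision (12-bit)", 12)]:
    print(f"{name}: {shades_per_channel(bits)} steps per channel")
```

Going from 256 to 1,024 steps is what lets HDR describe the gradations between extremes rather than leaving the TV's image processor to synthesize them.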
This requires much more data, and standard Blu-ray discs can’t hold it, just as they can’t hold ultra-high-definition video. Fortunately, we now have Ultra HD Blu-ray, a disc type (distinct from Blu-ray, despite the name) that can hold more data and is built to contain 4K video, HDR video, and even object-based surround sound like Dolby Atmos. Just be aware that you can’t play them on regular Blu-ray players; you need a dedicated Ultra HD Blu-ray player or a relatively new game console.
Online streaming also offers HDR content, but you need a reliably fast connection to get it. Fortunately, if your bandwidth is high enough to get 4K video, it can get HDR; Amazon Video and Netflix’s recommended connection speeds for 4K content are respectively 15Mbps and 25Mbps, regardless of whether that content is in HDR or not.
What Is Color Gamut?
This is where HDR gets a bit more confusing. Wide color gamut is another feature high-end TVs have, and it’s even less defined than HDR. It’s also connected to HDR, but not directly. HDR deals with how much light a TV is told to put out, or luminance. The range and value of color, defined separately from light, is called chromaticity. They’re two separate values that interact with each other in several ways, but are still distinct.
Technically, HDR specifically only addresses luminance, because that’s what dynamic range is: the difference between light and dark on a screen. Color is a completely separate value based on absolute red, green, and blue levels regardless of the format of the video. However, they’re tied together by how we perceive light, and a greater range of light means we’ll perceive a greater range of color. Because of that, HDR-capable TVs can often show what’s called “wide color gamut,” or a range of color outside of the standard color values used in broadcast TV (called Rec.709).
This doesn’t mean HDR guarantees a wider range of colors, or that they’ll be consistent. That’s why we test every TV for both contrast and color. Most TVs today can hit Rec.709 values, but that leaves a lot of color that the eye can see but that those TVs can’t show. DCI-P3 is a standard color space for digital cinema, and it’s much wider. Rec.2020 is the ideal color space for 4K TVs, and it’s wider still (and we’ve yet to see any consumer TV that can reach those levels). And here’s the kicker: Rec.2020 applies to both SDR and HDR, because HDR doesn’t directly address color levels.
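One rough way to compare those three gamuts is the area of the triangle each one's red, green, and blue primaries form on the CIE 1931 chromaticity diagram. The coordinates below are the published primaries from the respective standards; triangle area is only a crude proxy for perceived gamut size, but it shows how each standard widens the previous one:

```python
# CIE 1931 xy chromaticity coordinates of each standard's RGB primaries.
PRIMARIES = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    # Shoelace formula for the area of a triangle.
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

base = triangle_area(PRIMARIES["Rec.709"])
for name, pts in PRIMARIES.items():
    print(f"{name}: {triangle_area(pts) / base:.2f}x the area of Rec.709")
```

By this measure DCI-P3 covers roughly 1.35 times the area of Rec.709, and Rec.2020 roughly 1.9 times, which is why no consumer TV has reached full Rec.2020 coverage yet.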
The above chart shows the range of color the human eye can detect as an arch, and the three color spaces we mentioned as triangles. As you can see, each expands pretty significantly on the previous one.
All of this might seem confusing, but it boils down to this: HDR doesn’t guarantee that you’ll get more color. Many HDR TVs have wide color gamuts, but not all of them. Our TV reviews tell you whether a TV is HDR-capable and what its full range of color looks like.
Types of HDR
HDR isn’t quite universal; it’s currently split into two major formats, with a few others gaining momentum.
Dolby Vision
Dolby Vision is Dolby’s own HDR format. Dolby requires certification for media and screens to call themselves Dolby Vision compatible, but the format’s light levels aren’t as fixed and absolute as HDR10’s, because Dolby Vision content uses dynamic metadata. Static metadata maintains specific levels of brightness across whatever content you watch. Dynamic metadata adjusts those levels for each scene or even each frame, preserving more detail in scenes that are very bright or very dark. By tweaking the maximum and minimum levels of light a TV is told to put out on the fly, the same amount of data that would be spread across the full range of light an entire movie or show uses can be concentrated on a much more specific, targeted span. Darker scenes can preserve more detail in shadows and lighter scenes can keep more detail in highlights, because they aren’t telling the TV to be ready to show opposite extremes that won’t even show up until the next scene.
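The per-scene idea can be sketched with a toy model. Real HDR formats quantize along a nonlinear curve rather than linearly, so this is a simplification purely to show why narrowing the range helps; the luminance numbers are made up for illustration:

```python
# Toy illustration (not any real format's math): quantizing a dark scene
# against the whole movie's peak brightness wastes code values, while
# quantizing against the scene's own peak spreads the same number of
# steps across a narrower range.
def quantize(nits, peak, steps=1024):
    """Map a luminance value to one of `steps` code values up to `peak`."""
    return round(min(nits, peak) / peak * (steps - 1))

movie_peak = 4000.0  # brightest highlight anywhere in the movie
scene_peak = 100.0   # brightest highlight in this particular dark scene

a, b = 2.0, 2.5      # two nearby shadow values, half a nit apart
print("static-style codes: ", quantize(a, movie_peak), quantize(b, movie_peak))
print("dynamic-style codes:", quantize(a, scene_peak), quantize(b, scene_peak))
```

Against the movie-wide peak, the two shadow values collapse into the same code; against the scene's own peak, they stay several steps apart, which is the detail dynamic metadata preserves.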
Dolby Vision also uses metadata that’s adjusted to the capabilities of your specific display, instead of dealing with absolute values based on how the video was mastered. This means that Dolby Vision video will tell your TV what light and color levels to use, based on values set between the TV manufacturer and Dolby that keep in mind the capabilities of your specific TV. It can potentially let TVs show more detail than HDR10, but that ultimately depends on how the content was mastered and what your TV can handle in terms of light and color. That mastering aspect is important, because Dolby Vision is a licensed standard and not an open one like HDR10. If Dolby Vision is available in the end video, that probably means that Dolby workflows were used all the way through.
HDR10
HDR10 is the standard pushed by the UHD Alliance. It’s a technical standard with specific, defined ranges and specifications that must be met for content and displays to qualify as using it. HDR10 uses static metadata that is consistent across all displays. This means HDR10 video sets light and color levels in absolute values, regardless of the screen it’s being shown on. It’s an open standard, so any content producer or distributor can use it freely.
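Those absolute values come from the SMPTE ST 2084 "perceptual quantizer" (PQ) curve, which maps luminance from 0 to 10,000 nits onto the 10-bit signal HDR10 uses. The constants below are the ones defined in ST 2084 itself; this sketch encodes a few luminance levels to show how the curve spends most of its code values on darker tones, where eyes are most sensitive:

```python
# SMPTE ST 2084 (PQ) inverse EOTF constants.
M1 = 1305 / 8192   # 0.1593...
M2 = 2523 / 32     # 78.84375
C1 = 107 / 128
C2 = 2413 / 128
C3 = 2392 / 128

def pq_encode(nits):
    """Return the normalized PQ signal (0.0-1.0) for a luminance in nits."""
    y = max(0.0, min(nits, 10_000)) / 10_000
    num = C1 + C2 * y ** M1
    den = 1 + C3 * y ** M1
    return (num / den) ** M2

for nits in (0.005, 100, 1000, 10_000):
    code = round(pq_encode(nits) * 1023)  # 10-bit code value
    print(f"{nits:>8} nits -> 10-bit code {code}")
```

Note that 100 nits, roughly the whole range of SDR, lands around the middle of the PQ signal, leaving the upper half of the code values for highlights SDR could never describe.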
HDR10+
HDR10+ is a standard developed by Samsung. It builds on HDR10 by adding dynamic metadata, like Dolby Vision. It doesn’t use individualized metadata for each screen, but it still adjusts the range of light it tells the TV to display for each scene or frame. It can potentially add more detail to your picture over what HDR10 shows, and like HDR10 it’s an open standard that doesn’t require licensing or a specific production workflow.
Hybrid Log-Gamma (HLG)
Hybrid Log-Gamma (HLG) isn’t as common as HDR10 or Dolby Vision, and there’s very little content for it yet outside of some BBC and DirecTV broadcasts, but it could make HDR much more widely available. That’s because it was developed by the BBC and Japan’s NHK to provide a video format that broadcasters could use to send HDR (and SDR; HLG is backwards-compatible). It’s technically much more universal because it doesn’t use metadata at all; instead it uses a combination of the gamma curve that TVs use to calculate brightness for SDR content and a logarithmic curve to calculate the much higher levels of brightness that HDR-capable TVs can produce (hence the name Hybrid Log-Gamma).
HLG can work with SDR and HDR TVs because it doesn’t use metadata at all, while still holding a much wider range of light data. The only issue with it is adoption. It’s developed for broadcasters, and we still don’t see many broadcasters showing 4K video over the airwaves, cable, or satellite services at all. HLG still has far to go in terms of content.
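The hybrid curve itself is short enough to write out. These are the HLG constants standardized in ITU-R BT.2100 (originally ARIB STD-B67): the lower half of the signal is a square-root "gamma" segment an SDR set can read as ordinary gamma, and the upper half is the logarithmic segment that carries HDR highlights:

```python
import math

# HLG OETF constants from ITU-R BT.2100 / ARIB STD-B67.
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e):
    """Map normalized scene light e (0.0-1.0) to an HLG signal (0.0-1.0)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)   # gamma-like segment (what SDR sets see)
    return A * math.log(12 * e - B) + C  # log segment (HDR highlights)

print(f"e=1/12 -> {hlg_oetf(1/12):.3f}")  # top of the gamma half, signal 0.5
print(f"e=1.0  -> {hlg_oetf(1.0):.3f}")   # peak scene light, signal 1.0
```

Because the curve itself carries all the information, no metadata has to travel with the broadcast, which is exactly what makes HLG attractive for over-the-air and satellite use.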
Each can offer significant improvements over standard dynamic range, but they have their own advantages and disadvantages. In terms of adoption, HDR10 and Dolby Vision are the only significant standards that have plenty of content and compatible TVs available. Dolby Vision potentially offers a better picture, but it’s also less common than HDR10 because it’s a licensed, workflow-based standard and not an open one. HDR10+ is open, but we’ll need to see more companies actually start to use it, and for more content to be available. HLG could become the most universal standard due to its metadata-less nature, but again we’ll need to actually find some things to watch in it, first.
What You Need for HDR
HDR is not simply 4K. A 4K TV might support HDR, but that doesn’t apply to all sets. If your TV doesn’t support HDR, it won’t take advantage of the additional information in the signal, and the panel isn’t calibrated to handle that information even if it was properly read. Even if the TV can handle the signal, it might not produce a particularly better picture, particularly if it’s a less-expensive TV.
Most major streaming services like Amazon, Google Play, iTunes, Netflix, Vudu, and YouTube now support HDR for some of their 4K content. HDR10 is fairly universal across these services, while some also have Dolby Vision or other formats. Of course, there are also UHD Blu-ray discs, which often feature HDR10 or occasionally Dolby Vision HDR.
If your TV supports HDR, it probably has access to at least some streaming services that support HDR. However, it might not have all of them, so you might want to get a separate media streamer. The Amazon Fire TV Cube and Fire TV Stick 4K support HDR10, HDR10+, Dolby Vision, and HLG. The Roku Ultra supports HDR10, Dolby Vision, and HLG, while the Premiere+ only supports HDR10.
The PlayStation 5 and Xbox Series X both support HDR10 for streaming apps as well as UHD Blu-ray playback, and the Xbox Series X adds Dolby Vision support for streaming apps and games. Of course, the all-digital versions of the consoles, lacking optical drives, can’t play UHD Blu-ray discs, but they can still stream 4K HDR content.
Is HDR Worth It?
4K is now the effective standard for TVs, and HDR is one of the most important features to consider when buying a new one. It still isn’t quite universal, but both HDR10 and Dolby Vision have proven to offer compelling improvements in contrast and color over standard dynamic range, and there’s plenty of content to watch in both. If you’re looking to make the jump to 4K and you have the budget for it, HDR is a must-have feature.
Now that you know all about HDR, make sure to read up on 8K to see why it isn’t as important as HDR, at least not yet.