Anyone with even a passing interest in TV and cinema will have heard of resolution buzzwords like 1080p and 4K. Resolution is a fairly easy marker of image quality: sharper images are always better, aren't they? After all, almost anyone can tell the difference between a 720x480 DVD and a 1080p Blu-ray disc. That's why TVs are often sold on these specs.
However, a new spec has been making the rounds in recent years, and it's becoming ever more popular to judge a TV by its ability to support it. HDR, or high dynamic range, is a trickier sell than a sharper image, but when done right, it can represent an even bigger improvement in image fidelity than a higher resolution.
What is HDR?
“High dynamic range” refers to the ability to capture detail in both bright and dark areas of an image. Similar concepts, also called HDR, can be found in a lot of visual media. In photography, it refers to combining multiple exposures to create an image with plenty of detail even in very bright and dark areas. In film CGI and video games, it refers to a particular way of representing colors that allows detail to be preserved in highlights and shadows.
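To make the photography idea concrete, here's a toy sketch of merging several exposures and tone-mapping the result back into a displayable range. Everything in it, from the exposure times to the simple Reinhard-style curve, is an illustrative assumption rather than any camera's actual pipeline.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Estimate scene radiance from several exposures of the same scene.

    Dividing each shot by its exposure time recovers (roughly) the true
    scene brightness; clipped pixels get near-zero weight so each region
    of the image is reconstructed from the shots that exposed it well.
    """
    estimates = np.stack([img / t for img, t in zip(images, exposure_times)])
    weights = np.stack([(img < 0.99).astype(float) + 1e-6 for img in images])
    return (weights * estimates).sum(axis=0) / weights.sum(axis=0)

def tone_map(radiance):
    """Compress unbounded radiance into [0, 1) with a Reinhard-style curve.

    Bright areas are squeezed gently instead of clipping to white, which
    is how highlight detail survives in the final displayable image.
    """
    return radiance / (1.0 + radiance)

# Three simulated exposures of the same scene (short, medium, long);
# the sensor clips anything above 1.0, losing detail in each shot.
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 8.0, size=(4, 4))             # "true" radiance
times = [0.25, 1.0, 4.0]
shots = [np.clip(scene * t, 0.0, 1.0) for t in times]

hdr = merge_exposures(shots, times)
print(tone_map(hdr))  # displayable image that keeps highlight detail
```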
For TVs, HDR is a technology that allows for a wider and deeper range of colors, better contrast, and improved detail preserved in bright and dark areas of the video stream.
Normal, non-HDR video signals and TVs are known as “SDR” or “standard dynamic range”, and will look comparatively muddy, less vibrant, and less natural overall.
HDR TVs provide a greater contrast ratio and can display a wider range of colors between the darkest and brightest tones, more closely resembling how colors look in real life.
The HDR Format War
Like so many multimedia technologies before it, HDR has its very own format war, with a slew of companies putting forth their own HDR format and trying to corner the market.
Let’s break down each of these main types of HDR.
HDR10
The most “basic” and accessible of all HDR formats, HDR10 was created by the Consumer Technology Association as an open standard that any manufacturer can use at no cost. Almost all HDR video streams and media include an HDR10 stream, so it will play on virtually every HDR TV.
HDR10 is certainly a big step up from SDR content, but it's not the absolute best that HDR can be. For one, it only has “static metadata,” meaning a single set of brightness values (such as the peak light level of the whole title) applies to an entire movie or TV episode, even when those values are a poor fit for a particular scene.
HDR10+
Faced with the limitations of HDR10, Samsung (together with Amazon) created the HDR10+ standard, which adds “dynamic metadata”: the ability to change the HDR settings from scene to scene within the same video stream, so that both dim and dazzling scenes are tone-mapped on their own terms.
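Conceptually, the difference looks something like the sketch below. The field names and scene structure are simplified illustrations, not the actual bitstream formats (real HDR10 static metadata, for instance, is carried as SMPTE ST 2086 mastering values plus MaxCLL/MaxFALL figures).

```python
from dataclasses import dataclass

@dataclass
class StaticMetadata:
    """HDR10-style: one set of values describes the whole title."""
    max_content_light_level: int   # brightest pixel anywhere in the film (nits)
    max_frame_average_level: int   # brightest average frame (nits)

@dataclass
class SceneMetadata:
    """HDR10+/Dolby Vision-style: values can change scene by scene."""
    start_frame: int
    peak_brightness: int           # nits for just this scene

# Static metadata: a dim candlelit scene is tone-mapped with the same
# assumptions as an explosion scene.
title = StaticMetadata(max_content_light_level=4000, max_frame_average_level=400)

# Dynamic metadata: the TV can treat each scene on its own terms.
scenes = [
    SceneMetadata(start_frame=0,    peak_brightness=120),   # candlelit dinner
    SceneMetadata(start_frame=4320, peak_brightness=4000),  # explosion
]

def choose_tone_map_peak(frame: int) -> int:
    """Pick the brightness target the TV should map to for this frame."""
    current = scenes[0]
    for scene in scenes:
        if scene.start_frame <= frame:
            current = scene
    return current.peak_brightness

print(choose_tone_map_peak(10))    # 120  -> gentle mapping, shadow detail kept
print(choose_tone_map_peak(5000))  # 4000 -> aggressive highlight compression
```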
Like HDR10, HDR10+ is free to use, but because it's driven by Samsung, some manufacturers are understandably skeptical about implementing it. For example, LG's top-tier OLED TVs don't support the standard, even though LG's OLEDs are among the most popular and best-reviewed high-end televisions on the market.
HLG
HLG, or hybrid log-gamma, was created by two public broadcasters, the UK's BBC and Japan's NHK, with the goal of an HDR standard for broadcast that would work even on SDR television sets. Many major sports events, such as the World Cup, have since been broadcast in 4K HDR using HLG.
Unlike other HDR formats, HLG doesn't use metadata. Instead, it encodes the picture with a hybrid transfer curve: a conventional gamma-like curve for the darker half of the signal range and a logarithmic curve for the brighter half. A television that doesn't support HDR simply interprets the signal as ordinary SDR gamma and is on its way. A set that does support HDR recognizes the HLG curve and uses the logarithmic portion to recover the extra highlight detail, creating a gorgeous HDR image.
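That hybrid curve is published in ITU-R BT.2100, so we can sketch the encoding side directly. Scene light below the 1/12 crossover gets the square-root (gamma-like) branch; brighter values switch to the logarithmic branch.

```python
import math

# Constants from the HLG transfer function (ITU-R BT.2100).
A = 0.17883277
B = 1.0 - 4.0 * A                 # 0.28466892
C = 0.5 - A * math.log(4.0 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Encode normalized scene light e in [0, 1] to an HLG signal value.

    The lower part of the range uses a square-root (gamma-like) curve,
    which is why SDR sets can display the signal directly; the upper
    part switches to a logarithmic curve that packs highlight detail
    into the remaining signal headroom.
    """
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

for e in (0.0, 0.05, 1.0 / 12.0, 0.25, 0.5, 1.0):
    print(f"scene light {e:.3f} -> signal {hlg_oetf(e):.3f}")
```

Note how the crossover at 1/12 maps to a signal value of exactly 0.5, and full scene brightness maps to 1.0: the bright half of the real-world range is squeezed into the top half of the signal.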
HLG isn’t as good as other more advanced formats, but it’s the most backward-compatible of them, filling an important niche.
Dolby Vision
With dynamic metadata, an impressive array of color and brightness specs, and the backing of major industry players, Dolby Vision looks like one of the best HDR formats on the market.
Unfortunately, there are royalty costs associated with using Dolby Vision, which may dissuade manufacturers from using it, and could encourage a shift to more open, free models. Nevertheless, Dolby Vision remains the second-most popular HDR format.
Advanced HDR by Technicolor (SL-HDR1, 2 and 3)
Advanced HDR features several HDR varieties that each fulfill separate niches. SL-HDR1 works similarly to HLG in that it’s composed of an SDR signal and an HDR layer that are transmitted concurrently, allowing it to work with SDR sets. SL-HDR2 is similar to HDR10+ and Dolby Vision with its use of dynamic metadata. SL-HDR3 starts with an HLG signal and then adds dynamic metadata to further improve image quality.
HDR10, HDR10+ and Dolby Vision are the most commonly used formats.
Which formats are supported by the greatest number of devices?
HDR10 and Dolby Vision represent the biggest players in the format war. Virtually every 4K HDR television set supports HDR10, and most of them, along with nearly all current video game consoles and streaming devices, also support Dolby Vision; Samsung's TVs, which back HDR10+ instead, are the notable exception.
What do you need to stream in 4K HDR?
Looking to venture into the world of HDR? Here’s everything that you need to get started.
1. 4K HDR-compatible TV
Many 4K TVs produced after 2015 have some kind of HDR support. Exactly which formats are supported varies between manufacturers and models, but nearly all of them work with HDR10, and many also support Dolby Vision.
Some budget TVs, though, only accept the HDR signal without actually being able to display it, because they lack a panel with a wide color gamut. Instead, these TVs convert the signal to an SDR image they can display. This is almost invariably inferior to true HDR, and depending on the conversion quality, it can even look worse than a native SDR stream.
Learning which TVs have a wide color gamut is somewhat more difficult because of all the marketing speak. Look up more technical TV reviews such as those on Rtings.com to find out whether a TV has “real” HDR support!
2. HDR-ready Streaming Media Player
The next step is to use a device that can actually deliver 4K HDR content. These include gaming consoles like the PS4 Pro and the Xbox One S and X. Streaming devices like the Roku Ultra, Apple TV 4K, NVIDIA Shield, and Amazon Fire TV Cube all explicitly support HDR as well.
Many modern PCs and laptops will also support HDR.
3. 4K HDR-compatible Content
What good is all that fancy HDR-ready equipment if you don’t have any HDR content to play on it? You’ll want a streaming service that provides HDR streams.
HDR streams are offered by major streaming services such as Amazon Prime Video, Netflix, YouTube, Disney+ and iTunes, as well as Vudu, FuboTV, and FandangoNOW. There's a growing library of HDR titles on these services, giving you plenty of options for a truly cinematic, immersive experience.
Note that Netflix only provides HDR on its Ultra HD plan. This is the most expensive Netflix tier, and it also lets you download and watch on up to four screens at the same time. The other tiers don't support HDR video.
4. Sufficient Bandwidth
4K HDR content requires quite a hefty connection. Netflix, for example, requires 25 Mbps or faster if you want to stream in 4K HDR. Make sure your internet service provider is fast and stable enough to sustain a solid stream. (Mbps, megabits per second, is the standard unit for internet connection speeds.)
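As a back-of-the-envelope check (assuming the stream actually sustains that bitrate throughout), 25 Mbps works out to roughly 11 GB of data per hour:

```python
# Rough data usage for a sustained 4K HDR stream. The 25 Mbps figure is
# Netflix's stated requirement; real bitrates vary from scene to scene.
bitrate_mbps = 25
seconds_per_hour = 3600

megabits_per_hour = bitrate_mbps * seconds_per_hour   # 90,000 megabits
gigabytes_per_hour = megabits_per_hour / 8 / 1000     # bits -> bytes -> GB

print(f"{gigabytes_per_hour:.2f} GB per hour")                  # ~11.25 GB
print(f"{gigabytes_per_hour * 2:.1f} GB for a two-hour movie")  # ~22.5 GB
```

Worth keeping in mind if your internet plan has a data cap.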
5. 4K HDR-compatible HDMI cable
To stream in 4K HDR, you need a cable that supports HDMI 2.0 bandwidth. Plenty of cut-rate manufacturers produce cables that claim HDMI 2.0 and 4K HDR support, but your experience may vary depending on how well the cable is built.
First, if a cheap HDMI cable is very long, there's a chance it won't sustain a 4K HDR signal over its entire length. In addition, very cheap or older, worn-out cables might not handle the high bandwidth that 4K HDR requires.
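To see why that bandwidth matters, here's a rough calculation of the raw video data in a 4K60 HDR signal. These figures ignore blanking intervals and link encoding overhead, which push the real wire rate higher still.

```python
# Why 4K HDR pushes a cable hard: a rough pixel-rate calculation.
width, height, fps = 3840, 2160, 60
pixels_per_second = width * height * fps

full_rgb = pixels_per_second * 10 * 3.0 / 1e9    # 10-bit RGB
subsampled = pixels_per_second * 10 * 1.5 / 1e9  # 10-bit 4:2:0 chroma subsampling

print(f"4K60 10-bit RGB:   {full_rgb:.1f} Gbps")    # ~14.9 Gbps
print(f"4K60 10-bit 4:2:0: {subsampled:.1f} Gbps")  # ~7.5 Gbps
# HDMI 2.0 tops out at 18 Gbps on the wire (~14.4 Gbps of actual data),
# which is why 4K60 HDR typically uses chroma subsampling over HDMI 2.0.
```

Even the subsampled signal leaves little room for a cable that degrades over distance.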
The safest way to get a good HDMI experience is to choose a Premium HDMI Certified Cable. This ensures that you’ll get a proper 4K HDR signal with any device you use.
What is HDCP and why is it important in choosing a cable?
Supported specification and bandwidth aren't the only considerations when choosing an HDMI cable. HDCP, or High-bandwidth Digital Content Protection, is an anti-piracy system built right into HDMI. It works like a handshake between the source device and the TV, confirming that both ends support the content protection.
4K HDR content requires HDCP 2.2, which means every device in the video chain, including the TV, set-top box, gaming console, PC, media player, and yes, the HDMI cable, must be HDCP 2.2-compliant. If any one device fails the “handshake,” the content may play back at a lower resolution, or not show up at all.
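The chain requirement can be pictured with a small sketch. This is purely illustrative: real HDCP is a cryptographic key exchange, not a boolean flag, and the device names here are made up.

```python
def negotiate_output(chain: list) -> str:
    """Toy model of the HDCP 2.2 'handshake' along a video chain.

    `chain` lists every hop from source to screen. A single
    non-compliant link is enough to block or degrade 4K HDR playback.
    """
    for device in chain:
        if not device["hdcp_2_2"]:
            # Depending on the source's policy, playback either drops
            # to a lower resolution or fails outright.
            return f"blocked or downgraded at: {device['name']}"
    return "4K HDR playback allowed"

chain = [
    {"name": "streaming box",   "hdcp_2_2": True},
    {"name": "HDMI cable",      "hdcp_2_2": True},
    {"name": "old AV receiver", "hdcp_2_2": False},  # the weakest link
    {"name": "4K HDR TV",       "hdcp_2_2": True},
]
print(negotiate_output(chain))  # blocked or downgraded at: old AV receiver
```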
If you use an HDMI cable that doesn't fully support HDCP 2.2, you won't be able to play back 4K HDR content, or most other copy-protected video for that matter.
How do you choose a good 4K HDR HDMI cable?
Stick with a brand that's known for premium-quality peripherals and accessories, and you'll never have to worry about a bad cable again.
What cable/adapter do you need?
You'll need a cable or adapter that can connect your source device to your display, so first identify which ports you have. The ports on your source and display will vary, and if you're lucky, you might even have a few options to choose from. The table below covers the common combinations.
Source | Display | Accessories needed
--- | --- | ---
Media Player Box | HDMI TV | Premium High-Speed Cable
Gaming Console | HDMI TV | Premium High-Speed Cable
HDMI Laptop | HDMI TV or monitor | Premium High-Speed Cable
USB-C Laptop | HDMI TV or monitor | USB-C to HDMI Cable
USB-C Laptop | USB-C Monitor | USB-C Monitor Cable
USB-C Laptop | DisplayPort Monitor | USB-C to DisplayPort Cable