Everything You Need To Know About HDR

If you followed the news out of CES closely, you probably heard the word HDR tossed around a lot. This coming year we'll see TVs for under $700 with the feature, and fancy monitors for over $1300. But what does HDR even mean?

A series of HDR televisions. HDR especially shines when it comes to things like stained glass windows, reflections on water and the intricacies of quilts. (Images: Gizmodo)

HDR stands for high dynamic range. Originally the term applied exclusively to a style of still photography that greatly diminished shadows and highlights in photos. That made it useful for architects and real estate agents, people who want to show the insides of their buildings without all the nasty glare of the sun and the darkness of shadowy corners. Yet HDR also found fans amongst really bad photographers with access to Photoshop, and consequently an unattractive HDR aesthetic emerged in still photography.

That aesthetic, thankfully, can't translate to moving pictures. But HDR in movies and TVs is still about revealing details in areas of extreme brightness and darkness.

A display accomplishes the feat by having a truly exceptional contrast ratio. The UHD Alliance, a consortium of TV makers, content creators and distributors, actually defines the peak brightness and darkness a TV needs to produce to be HDR-compliant.

Specifically, the UHD Alliance says a TV has to be able to put out 1000 nits (twice as bright as a Samsung Galaxy S7's screen in sunlight) and get as dark as 0.05 nits, something a lot of LED TVs can do. Or it has to get as bright as 540 nits and as dark as 0.0005 nits, something only an OLED display is really capable of. It also must be able to display content at a minimum 4K resolution and produce a wider colour gamut than the one used for the last 30 years, Rec. 709.
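Those two paths can be summarised in a few lines of code. This is purely an illustrative sketch; the function and its name are ours, not any official UHD Alliance tool:

```python
# Hypothetical sketch of the two UHD Alliance HDR compliance paths
# described above. Thresholds come from the article; everything else
# (names, structure) is our own illustration.

def is_hdr_compliant(peak_nits, black_nits):
    """Return True if a display meets either UHD Alliance HDR path:
    - LED/LCD path: at least 1000 nits peak and blacks of 0.05 nits or lower
    - OLED path:    at least  540 nits peak and blacks of 0.0005 nits or lower
    """
    lcd_path = peak_nits >= 1000 and black_nits <= 0.05
    oled_path = peak_nits >= 540 and black_nits <= 0.0005
    return lcd_path or oled_path

# A bright LED set qualifies via the first path...
print(is_hdr_compliant(1000, 0.05))   # True
# ...while an OLED qualifies via the second despite a lower peak.
print(is_hdr_compliant(600, 0.0005))  # True
# A typical SDR set (300 nits, 0.1-nit blacks) qualifies via neither.
print(is_hdr_compliant(300, 0.1))     # False
```

Note that the two paths trade peak brightness against black level, which is why an OLED can qualify at roughly half the brightness of an LED set.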

The new LG OLED W7 plays all three HDR formats as well as a new unreleased one from Technicolor.

Rec. 709, or sRGB, is the colour gamut that was used to master nearly every show, movie and video game until only recently. Your TVs, phones and even your computers were all calibrated to that specific set of colours, but Rec. 709 can only replicate about 34 per cent of what the human eye can see. That's why the movie mode on your TV or laptop always makes things look sort of brown and washed out.

Newer TVs and monitors are capable of replicating a lot more colours. Most high-end TV sets can reproduce the colour gamut found in the digital movie projectors at your local theatre: DCI P3. That's about 46 per cent of what the human eye can see. It's an extremely popular gamut: it's the gamut of choice for all 2016-and-newer Apple products and, as announced recently, Instagram.

But being able to display less than half of all colours a pristine human eye can see still feels like a waste. That is why, in 2012, a new, even wider colour gamut was introduced: Rec. 2020.

The Rec. 2020 gamut represents a whopping 67 per cent of what the human eye can see, so everything on your TV would be brighter and more vibrant than anything you'd seen on a TV before. Unfortunately, not a single consumer display is yet capable of reproducing the full Rec. 2020 gamut, so only a handful of HDR formats actually require it over the less impressive DCI P3.

Which brings us to the actual methods for distributing HDR content. There are several competing HDR formats; currently, the most popular is HDR10. Its popularity stems from the fact that it's cheap to use: there are no nasty licensing fees associated with HDR10, so any TV maker or content producer can use it. Right now every TV maker with an HDR set can play HDR10 content, and it's the format used by the Xbox One S and PlayStation 4 Pro as well. Content from Netflix and Amazon can also be played back on HDR10 sets.

But its affordability also creates limitations. Specifically, HDR10 is... sort of dumb. Other formats adjust depending on the quality of the TV in the hopes of providing the best possible picture on a given set. HDR10 doesn't do that. If your TV can perfectly match the colours, brightness and darkness posited by the HDR10 content, then it will look exactly as the director intended. If your TV cannot, then you may find instances where light sources are blown out, scenes appear too dark, or colours look too vivid.
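A toy example of the blown-out-highlights problem: if content is mastered for a 4000-nit display and your panel tops out lower, the brightest values simply clip together. (The naive `min()` below is for illustration only; real TVs apply smarter tone mapping.)

```python
# Illustrative sketch, not any real standard's algorithm: a display that
# naively clips everything above its own peak brightness.

def clip_to_display(scene_nits, display_peak):
    """Naive rendering: any pixel brighter than the panel's peak is lost."""
    return [min(n, display_peak) for n in scene_nits]

scene = [0.01, 120, 800, 3500]       # pixel luminances, mastered up to 4000 nits
print(clip_to_display(scene, 4000))  # [0.01, 120, 800, 3500]: high-end set keeps the detail
print(clip_to_display(scene, 700))   # [0.01, 120, 700, 700]: the two brightest values crush together
```

On the 700-nit panel, an 800-nit lamp and a 3500-nit explosion end up identical, which is exactly the "blown out" look described above.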

Also, HDR10 doesn't scale well. In fact, HDR10 content will look worse on future TVs as those sets' peak brightness, black levels and colour reproduction improve, sort of like trying to watch standard definition content on a 4K set.

This scaling problem is dealt with in some of the other HDR formats. Dolby Vision is especially popular at the moment. Introduced last year, Dolby Vision has quickly gained a following amongst content creators because of its dynamic metadata. Theoretically, a filmmaker can master their movie in Dolby Vision and never have to master it again for Dolby Vision sets. It's also mastered to reproduce more colours and better contrast — specifically, it can replicate up to Rec. 2020 and a theoretical 10,000 nits peak brightness. HDR10 just goes up to DCI P3 and 4000 nits peak brightness.
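One way to picture what dynamic metadata buys you, using a deliberately simplified linear tone mapper (not Dolby's actual algorithm): with a single film-wide peak, a dark scene gets squeezed as hard as the brightest one, while a per-scene peak lets it pass through untouched.

```python
# Simplified illustration of static vs dynamic metadata. The linear
# scaling here is our own toy model, not Dolby Vision's real processing.

def tone_map(scene_nits, content_peak, display_peak):
    """Linearly compress toward the display's peak; never brighten."""
    scale = min(1.0, display_peak / content_peak)
    return [round(n * scale, 1) for n in scene_nits]

dark_scene = [0.5, 4, 40]  # a night-time shot: nothing brighter than 40 nits

# Static, film-wide metadata: the whole movie peaks at 4000 nits, so this
# dark scene is squeezed by the same factor as the explosions elsewhere.
print(tone_map(dark_scene, 4000, 700))  # [0.1, 0.7, 7.0]

# Dynamic, per-scene metadata: the scene carries its own 40-nit peak,
# which already fits a 700-nit panel, so it passes through untouched.
print(tone_map(dark_scene, 40, 700))    # [0.5, 4.0, 40.0]
```

With the film-wide peak, the whole night scene is crushed into the bottom 7 nits of the panel; the per-scene peak leaves it exactly as mastered.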

Samsung's Q9 is the rare non-OLED that's optimised to handle all of the new HDR formats.

But Dolby Vision requires expensive monitors to master content on, and it has pricey licensing fees, too. So while it's the most futureproof HDR format and potentially produces the highest quality content, it hasn't seen wide adoption. Still, more and more films are being mastered in Dolby Vision, and nearly every major TV maker announced support for the format in their flagship TVs at CES this year. Crucially, both Netflix and Amazon Video opted to support the format halfway through last year.

The problem keeping Dolby Vision from being the one HDR format to rule them all is that its dynamic metadata takes up a lot of space. So while it's fine for people already streaming gigabytes worth of data over the internet, it's terrible for broadcasting content over the airwaves, something necessary if broadcast TV ever plans to adopt HDR.

This is where Hybrid Log-Gamma (HLG) comes in. It's the newest HDR format on the block. Developed in tandem by the BBC and NHK, HLG was created specifically with broadcast television in mind and ignores metadata completely, instead relying on the TV to know precisely what it's capable of reproducing from the signal it receives. This also makes it completely backwards compatible with standard dynamic range (SDR) television sets, something HDR10 and Dolby Vision content isn't.
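The no-metadata trick is that HLG bakes its brightness handling into the signal's transfer curve itself. As a sketch, here is the HLG opto-electrical transfer function from ITU-R BT.2100 (the constants are from the spec; the function name is ours):

```python
import math

# HLG opto-electrical transfer function per ITU-R BT.2100. The lower
# segment is a conventional gamma-style curve, which is what makes the
# signal watchable on an SDR set; the upper segment is logarithmic,
# which is where the extra highlight range lives.

A = 0.17883277
B = 1 - 4 * A                 # 0.28466892
C = 0.5 - A * math.log(4 * A) # 0.55991073

def hlg_oetf(e):
    """Map normalised scene light e in [0, 1] to an HLG signal in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # SDR-compatible square-root segment
    return A * math.log(12 * e - B) + C  # logarithmic highlight segment

print(round(hlg_oetf(1 / 12), 3))  # 0.5: the crossover between the two segments
print(round(hlg_oetf(1.0), 3))     # 1.0: peak scene light maps to full signal
```

Because the curve itself carries the HDR behaviour, an SDR set can simply treat the signal as a normal gamma curve, while an HDR set stretches the log segment out to its own peak brightness.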

HLG is the newest HDR format, but backing from the BBC and NHK (the largest broadcasters in the UK and Japan respectively), its open source nature and the low cost of implementing it in sets have sped adoption. Samsung, LG and Panasonic all announced support at CES.

Sony's Bravia XBR-A1E OLED can also play all three new formats.

And others could follow suit. The beauty of HDR is that implementation of new formats can in some cases be as simple as a software update. So future formats like the Technicolor-backed SL-HDR1 could be added to your $6000 set with the press of a button.

But HDR does have one major hardware limitation that will leave you frustrated when setting up your home theatre. It requires an HDMI 2.0 port with HDCP 2.2. HDMI 2.0 looks identical to the more common HDMI 1.4, but the ports are more expensive to produce and the cables much pricier, so most TVs have a mix of 1.4 and 2.0 HDMI ports, and none of them actually label the damn things or the provided HDMI cables.
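As a rough illustration of why the port matters, the back-of-the-envelope arithmetic below compares a raw 4K HDR pixel stream against the nominal maximum throughput of each HDMI spec. It ignores blanking intervals and encoding overhead, so treat the numbers as ballpark figures only:

```python
# Nominal maximum data rates for each HDMI spec (approximate published
# figures; real usable bandwidth is lower once overhead is counted).
HDMI_1_4_GBPS = 10.2
HDMI_2_0_GBPS = 18.0

def video_gbps(width, height, fps, bits_per_pixel):
    """Raw pixel data rate in gigabits per second (no blanking/overhead)."""
    return width * height * fps * bits_per_pixel / 1e9

# 4K at 60 Hz with 10-bit colour (30 bits per pixel, full 4:4:4 sampling):
rate = video_gbps(3840, 2160, 60, 30)
print(round(rate, 1))        # 14.9 Gbps of raw pixel data
print(rate > HDMI_1_4_GBPS)  # True: too much for an HDMI 1.4 port
print(rate < HDMI_2_0_GBPS)  # True: fits within HDMI 2.0
```

Even this simplified sum shows why plugging an HDR source into the wrong port silently falls back to a lower-quality signal.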

Which means getting HDR content on your TV can be a nightmare of testing ports, fiddling with the television's settings and haunting home theatre enthusiast forums. In fact, double-check: did you get an Xbox One S or PlayStation 4 Pro for Christmas? There's a good chance it isn't actually displaying the HDR content promised.

Confusing formats and unlabelled ports aside, when HDR is working it's a stunning improvement to the films and TV shows you've been watching. Shows steeped in shadows no longer seem too dark to watch, explosions appear more realistic, and the shine of a car or a football player's helmet is almost extraordinary in how true to life it appears. As the technical challenges to broadly implementing HDR are overcome, we're quickly finding out that the bright future was worth the wait.

This story originally appeared on Gizmodo


    The bright future is always worth the wait, of course.

    But we've seen just as many missteps as breakthroughs when it comes to the mass market and adoption rates.

    For Australians in particular, the move from SD to HD took too long collectively; surely 4K is starting to break through now?

    You don't need an HDMI 2.0 cable to run a 4K signal over HDMI; that's just marketing BS to try to sell more cables. As long as your cable isn't ancient or crap (I'm talking years and years old, v1.0-1.3, and even then it might still work) it will work fine.
    Pretty much all HDMI cables sold in the last five years or so have been "high speed", even the crap generic ones.
    The ones included with the Xbone and PS4 Pro will work fine too.
    It's a bit like the difference between Cat 5/5e and Cat 6 Ethernet (Cat 5 being HDMI v1-1.3, Cat 5e being HDMI 1.4 high speed, and Cat 6 being HDMI 2.0): Cat 5 will do gigabit speeds, it's just not rated for it, so it depends on your cable.
    If you do need a new cable, please don't waste your cash on Monster cables. Normal brand cables will work fine; digital is digital, it'll either work or it won't, and you're not going to get a better "signal" with Monster.

    As for the different HDR formats, what this article doesn't explain is that Dolby Vision requires a dedicated chip in both the screen and the input device (e.g. a 4K Blu-ray player): one chip for decoding and one for display mapping. Unfortunately, neither the PS4 Pro nor the Xbone S has this chip built in; in fact, there aren't even Blu-ray players available with it yet.
    So even if your TV supports Dolby Vision HDR, only the inbuilt apps (Netflix) will use it.
    4K Dolby Vision gaming will not happen at all (unless Scorpio has this chip, maybe?) and 4K Dolby Vision Blu-ray playback will only work on certain supported titles once someone actually releases a player capable of playing them. The rest of the time they will just revert to HDR10.

    Always important to remember that the superior format does not always win: Betamax was better quality than VHS, vinyl was better quality than cassette, HD-DVD was (arguably) the better format... I'm sure there are more examples.

      Not sure what arguments you were following, but I never saw a single argument for HD-DVD besides prejudice against the alliance that developed it. Lol

      I'm not 100% sure on the claims of 1.4 cables being HDR capable. 4K, yes, but that doesn't necessarily mean HDR. Especially with Dolby Vision and what sounds like an enormous amount of additional metadata, you may actually require a 2.0 cable.

        They were cheaper to manufacture (and therefore cheaper to buy), were region free and offered the same quality as Blu-ray. Also, a majority of the discs could be played in regular DVD players by flipping them over. So from a consumer standpoint they were better, especially for people transitioning (hence the "arguably", and the format war). Lol

        And 1.4 cables work fine with HDR. I'm using my old HDMI 1.4 cable from my original day-one Xbone with my One S plugged into my Series 9 Samsung, no worries at all. Like I said, it depends on the cable; if it works it works. Just plug it in and run the test.

