
HDR10 vs. Dolby Vision vs. HLG: HDR Formats Compared


Nearly all modern TVs can decode and, with varying degrees of success, display high dynamic range content. The best TVs can make HDR shows, movies and even some video games pop with color and contrast that non-HDR TVs could never accomplish. In addition to a TV that can display HDR, you also need HDR content. Most HDR content is available in multiple HDR formats, and each format is slightly different. Confused yet? I hope not, because it gets trickier: not every TV can decode every HDR format.

Right now the two main formats are HDR10 and Dolby Vision. There's also hybrid log gamma, or HLG, and HDR10+. The good news is that every HDR TV can decode at least HDR10, which is available with most HDR shows, movies and some games. There are several differences between the formats, so they're worth discussing in detail.
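As a quick reference before we dig in, here's the gist of those four formats as a small Python dictionary. It's only a summary of the characteristics discussed in this article, not a spec:

```python
# Quick comparison of the four HDR formats named above.
# "metadata" describes how grading info travels with the video:
# dynamic metadata can change scene by scene, static is fixed per
# title, and HLG carries none at all (it was designed for broadcast).
HDR_FORMATS = {
    "HDR10":        {"metadata": "static",  "licensing": "open"},
    "HDR10+":       {"metadata": "dynamic", "licensing": "open"},
    "Dolby Vision": {"metadata": "dynamic", "licensing": "fee paid to Dolby"},
    "HLG":          {"metadata": "none",    "licensing": "open"},
}

for name, spec in HDR_FORMATS.items():
    print(f"{name:13}  metadata: {spec['metadata']:8}  licensing: {spec['licensing']}")
```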

Image quality


Winner: Dolby Vision (and the upcoming Dolby Vision 2)

This is a broad generalization, and in many cases the best picture quality will come down to the specific content and the specific display. That said, Dolby Vision, and the upcoming Dolby Vision 2, can potentially look better for a few reasons. For one thing, unlike HDR10, DV has dynamic metadata. This means the brightness levels of HDR content can vary on a scene-by-scene basis, giving filmmakers finer control over how the image looks. HDR10 (more on HDR10+ in a moment) has static metadata, so the HDR "look" can only be set once for an entire movie or show.
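To make that difference concrete, here's a minimal sketch in plain Python. Every number is made up for illustration, the single-figure "gain" is a crude stand-in for a full tone-mapping curve, and none of this is Dolby's or any TV maker's actual algorithm:

```python
# Static vs. dynamic HDR metadata, reduced to its core idea:
# static metadata gives the display one peak-brightness figure for the
# whole title; dynamic metadata gives one per scene.

DISPLAY_PEAK_NITS = 800  # what this hypothetical TV can actually output

# Peak luminance each scene was mastered at (hypothetical values).
scene_peaks_nits = [250, 4000, 900]  # dark scene, bright scene, mid scene

# Static metadata (HDR10-style): one title-wide value covers everything.
title_peak = max(scene_peaks_nits)

def tone_map_gain(content_peak, display_peak):
    """Crude single-number stand-in for a tone-mapping curve:
    how much the display must compress the content's highlights."""
    return min(1.0, display_peak / content_peak)

for scene, peak in enumerate(scene_peaks_nits, start=1):
    static_gain = tone_map_gain(title_peak, DISPLAY_PEAK_NITS)
    dynamic_gain = tone_map_gain(peak, DISPLAY_PEAK_NITS)
    print(f"Scene {scene}: static gain {static_gain:.2f}, "
          f"dynamic gain {dynamic_gain:.2f}")

# With static metadata, the 250-nit dark scene is squeezed by the same
# 0.20 gain as the 4,000-nit scene; with dynamic metadata it passes
# through untouched (gain 1.00), preserving its shadow detail.
```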

The other major reason DV could look better than HDR10 is Dolby itself. TV manufacturers must pay Dolby a fee for DV compatibility, but for that fee Dolby will also make sure the TV looks as perfect as possible with Dolby Vision content. It's basically an end-to-end format, with Dolby ensuring all the steps look right, so the result at home looks as good as that content and that display possibly can.

HDR10 is an open format. Each manufacturer is left to their own devices, pun intended. This presumes the manufacturer has engineers who know enough about HDR and TVs to get HDR looking correct on their sets. That's a big presumption. Most manufacturers' 4K Blu-ray players still have the chroma upsampling error, something that should have been solved in the DVD era (yes, that's a link to an article from 2001 and yes, it's amazingly still relevant). So you'd hope that TVs would read the HDR data correctly and look great, but that's not necessarily the case. For example, here are two HDR projectors running the same video. The right one isn't processing the HDR data correctly:

Two projectors, side by side. Notice how there are three individual lights in the left image, but a single blob of light on the right. Geoffrey Morrison/CNET
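That chroma upsampling error is worth a quick illustration, because it shows how basic these processing steps are and how visible a lazy one can be. This is a simplified sketch in plain Python with hypothetical sample values; it shows the general idea of chroma upsampling, not the specific decoder bug from the linked article:

```python
# 4:2:0 video stores one chroma (color) sample per 2x2 block of luma
# pixels, so the player must reconstruct the missing samples on playback.
# Here's one row of stored chroma samples (hypothetical, 0-255 scale):
chroma_420 = [100, 140, 180]

def upsample_replicate(samples):
    """Lazy approach: duplicate each sample. Cheap, but it produces
    hard steps that show up as jagged edges on saturated colors."""
    out = []
    for s in samples:
        out += [s, s]
    return out

def upsample_interpolate(samples):
    """Better approach: insert the halfway value between neighbors,
    giving a smooth ramp instead of a staircase."""
    out = [samples[0]]
    for a, b in zip(samples, samples[1:]):
        out += [(a + b) // 2, b]
    return out

print(upsample_replicate(chroma_420))    # [100, 100, 140, 140, 180, 180]
print(upsample_interpolate(chroma_420))  # [100, 120, 140, 160, 180]
```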

Other differences matter less. Dolby Vision can potentially be 12-bit versus HDR10's 10-bit, but whether you'd see any difference depends more on the content and the display. Both formats have wide color gamut too, so that's not an issue.
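The bit-depth gap is easy to put numbers on, though. A quick back-of-the-envelope calculation:

```python
# Levels per color channel at each bit depth, and the total colors
# that implies across three channels (R, G, B).
for bits in (8, 10, 12):  # SDR, HDR10, Dolby Vision's maximum
    levels = 2 ** bits
    colors = levels ** 3
    print(f"{bits}-bit: {levels:,} levels per channel, {colors:,} total colors")

# 8-bit:  256 levels,   ~16.8 million colors
# 10-bit: 1,024 levels, ~1.07 billion colors
# 12-bit: 4,096 levels, ~68.7 billion colors
# Extra levels mean finer gradations (less visible banding), but only
# if the content, the connection and the panel all preserve them.
```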
