Dolby Vision Vs HDR10 (Complete Guide 2021)

Do you want to know the difference between Dolby Vision and HDR10? This guide is for you. You'll find all the information you need right here, so relax and keep reading.

Unsure about whether you should pick an HDR10 or Dolby Vision display? Let’s dive deeper into the similarities and differences between these two technologies.

Key Differences [Dolby Vision Vs HDR10]

Peak Brightness

Most Dolby Vision content is currently mastered at 4,000 cd/m²; HDR10 and HDR10+ content is mastered at a variety of levels from 1,000 to 4,000 cd/m², depending on the title.

All three standards cater for images of up to 10,000 cd/m², although no display can currently reach that level. In practice, then, there is no real difference in the formats' maximum brightness, as content for all of them currently tops out at 4,000 cd/m².

Winner: Dolby Vision. Dolby Vision content will have more consistent mastering since HDR10 and HDR10+ are not as specific in their requirements. Keep in mind, though, that few TVs even go above 1,000 cd/m², so it doesn't matter much for now.

Tone Mapping

If you have a TV with a maximum brightness of 1,400 cd/m², how does it handle highlights in a film that was mastered at 4,000 cd/m²? How a TV with relatively low peak brightness deals with content mastered at a much higher peak brightness is crucial.

The easiest way is clipping. On our example 1,400 cd/m² TV, everything from 1,400 to 4,000 cd/m² would be clipped. What does this mean? It means there would be no visible detail and no discernible color in that brightness range, simply because the TV cannot reproduce anything above its maximum output. Thankfully, most decent TVs no longer use this method.

The alternative is tone mapping. On a 1,400 cd/m² TV, the highlights from 1,400 to 4,000 cd/m² are remapped to fall below 1,400 cd/m². In practice, this means there is a gentle roll-off of color in the highlights starting at around 1,000 cd/m².

This would mean that some highlights on a TV that uses tone mapping would appear slightly dimmer than if the same TV used clipping. While this is inevitable, a tone-mapped picture shows a lot more detail in the highlights than one which is clipped.
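To make the difference concrete, here is a minimal Python sketch of the two approaches, using the 1,400 cd/m² TV and 4,000 cd/m² master from the example above. The knee point and the simple linear remap are illustrative assumptions; real TVs use manufacturer-specific roll-off curves.

```python
# Toy illustration of clipping vs. tone mapping (values in cd/m²).
# The knee point and the linear remap are illustrative assumptions;
# real TVs use manufacturer-specific curves.

DISPLAY_PEAK = 1400   # what the TV can actually output
CONTENT_PEAK = 4000   # what the film was mastered at
KNEE = 1000           # where the gentle roll-off starts

def clip(luminance):
    """Clipping: everything above the display's peak is simply lost."""
    return min(luminance, DISPLAY_PEAK)

def tone_map(luminance):
    """Tone mapping: compress the range above the knee so highlights
    up to 4,000 cd/m² still land below the display's 1,400 cd/m² peak."""
    if luminance <= KNEE:
        return luminance
    # Map KNEE..CONTENT_PEAK onto KNEE..DISPLAY_PEAK proportionally.
    fraction = (luminance - KNEE) / (CONTENT_PEAK - KNEE)
    return KNEE + fraction * (DISPLAY_PEAK - KNEE)

for value in (500, 1200, 2000, 4000):
    print(value, "->", clip(value), "clipped |", round(tone_map(value)), "tone-mapped")
```

Note how the tone-mapped values are slightly dimmer than the clipped ones below 1,400 cd/m², but everything between 1,400 and 4,000 cd/m² keeps distinct values instead of collapsing to a single brightness.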

With Dolby Vision, the first devices that supported the format needed a proprietary Dolby chip that checks the TV's model and applies tone mapping using the TV's limitations as a reference. Dolby has since relaxed that requirement, and some TVs, including Sony models, do this in software instead. With HDR10 and HDR10+, tone mapping is entirely the manufacturer's choice, which can lead to inconsistency.

HDR10 and Dolby Vision are the two main HDR formats. The difference is that HDR10 is an open, non-proprietary standard, whereas Dolby Vision requires a license and a fee paid to Dolby.

And while Dolby Vision is technically capable of producing better image quality, there are currently no TVs that can take full advantage of what it offers over HDR10.

However, Dolby Vision does offer better picture quality in practice, mainly due to its dynamic metadata.

HDR, or High Dynamic Range, is the new buzzword when it comes to modern TVs and monitors. But what exactly is it all about?

In short, an HDR TV or monitor can display more life-like colors as well as higher contrast and brightness levels while playing HDR-compatible content.

In this article, we'll focus on the differences between the two main HDR formats: HDR10 and Dolby Vision (DV).

Metadata

Metadata can be thought of as a sort of instruction manual that describes various facets of the content; it is carried alongside the series or film and helps the display handle the content in the most effective way.

One of the ways the three formats differ is in their use of metadata. HDR10 only asks for static metadata. With static metadata, the boundaries in brightness are set once, for the entire movie/show, and are determined by taking the brightness range of the brightest scene. Dolby Vision and HDR10+ improve on this by using dynamic metadata, which allows them to tell the TV how to apply tone-mapping on a scene-by-scene, or even a frame-by-frame basis.

Winner: Dolby Vision and HDR10+. They are better at adapting to scenes that have very different lighting.
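As a rough illustration of what this means in practice, here is a small Python sketch contrasting the two approaches. The field names and values are simplified stand-ins, not the actual HDR10 or Dolby Vision metadata structures.

```python
# Conceptual sketch of static vs. dynamic HDR metadata.
# Field names and values are illustrative, not the real HDR10 / Dolby Vision formats.

# Static metadata (HDR10): one set of values for the entire title,
# derived from the brightest scene, so the TV tone-maps every scene the same way.
static_metadata = {
    "mastering_peak_cd_m2": 4000,
    "max_content_light_level": 4000,
    "max_frame_average_light_level": 400,
}

# Dynamic metadata (Dolby Vision / HDR10+): per-scene (or even per-frame) values,
# so the TV can tone-map a dim scene differently from a bright one.
dynamic_metadata = [
    {"scene": 1, "peak_cd_m2": 120,  "average_cd_m2": 30},   # dim interior
    {"scene": 2, "peak_cd_m2": 4000, "average_cd_m2": 600},  # bright exterior
    {"scene": 3, "peak_cd_m2": 800,  "average_cd_m2": 150},  # sunset
]

def choose_tone_map(scene_number):
    """With dynamic metadata the TV can pick a curve per scene; with static
    metadata it has to use the same worst-case curve for the whole movie."""
    scene = dynamic_metadata[scene_number - 1]
    # 1400 cd/m² is the display peak from the earlier tone-mapping example.
    return "gentle" if scene["peak_cd_m2"] <= 1400 else "aggressive"

print(choose_tone_map(1))  # 'gentle'     – a dim scene needs no compression
print(choose_tone_map(2))  # 'aggressive' – a bright scene needs heavy roll-off
```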

Gaming

Although HDR was initially designed for movies, the advantages for gaming are undeniable. Most modern consoles support HDR10, including the original PlayStation 4, the PlayStation 4 Pro, and the newer Xbox One models. The Xbox One also supports Dolby Vision, although it isn't supported by all TVs, as it uses a new 'low-latency' Dolby Vision format, ensuring the extra processing used for Dolby Vision doesn't add too much latency for gaming.

As with movies, game developers have to enable HDR support in their games. Unfortunately, HDR10 isn't always implemented properly, so actual results may vary. There are a handful of Dolby Vision games available now, including Assassin's Creed Origins, Battlefield 1, and Overwatch, to name a few.

Availability

Availability of the new HDR formats has drastically improved in recent years. Both Dolby Vision and HDR10 can now be found on UHD Blu-rays and are supported by most external devices, although there are still only a few movies available on disc that support Dolby Vision. The first HDR10+ movies have been released on UHD Blu-ray, but most external playback devices don't support the format yet. It's important to note, however, that if the TV itself doesn't support Dolby Vision, using an external source won't make a difference.

Monitors

Monitors have been very slow to adopt HDR. It is starting to change though, with more and more monitors now supporting HDR10. The VESA standards group has been pushing for new standards for HDR on monitors, which typically can’t hit the brightness levels necessary for a great HDR experience. Despite this, monitors are still a few years behind TVs. The few monitors that deliver a decent HDR experience are very expensive and still fall short of the HDR experience on TVs. No monitors support HDR10+ or Dolby Vision, although a few laptops have been released with Dolby Vision support. 

HLG

There is also HLG, or Hybrid Log-Gamma, which is currently supported by most major TV manufacturers. HLG aims to simplify things by combining SDR and HDR into one signal. This is ideal for live broadcasts, as a single signal can be played by any device receiving it: if the device supports HDR, it displays the HDR version; if it doesn't, the SDR portion of the signal is played. As it is intended mainly for live broadcasts, there is very little HLG content available.

Supported TVs

While the vast majority of TVs support HDR10 and many models support at least one of the more advanced formats, none of the models available in the U.S. support all formats. Sony, LG, Vizio, Hisense, and TCL support HDR10 and Dolby Vision on most of their models, whereas Samsung remains invested in HDR10+, and does not currently support Dolby Vision on any of their models. Outside of the U.S., some Panasonic TVs support all three formats.

You shouldn't expect cheaper HDR TVs to make use of all the extra capabilities of these formats. On most of them, you won't even be able to see a difference; only high-end TVs can take full advantage of them.

Dolby Vision vs. HDR10

First of all, HDR10 is a free, open standard, meaning that TV manufacturers and content creators don't need to pay to implement it, whereas Dolby Vision is owned and licensed by Dolby.

So, what exactly do you get for paying the premium for Dolby Vision?

Well, DV is capable of displaying 12-bit color depth, which amounts to 68.7 billion colors, whereas HDR10 is limited to 10-bit and 1.07 billion colors.

However, since there are no 12-bit TVs or 12-bit content available yet, Dolby Vision's color depth is downsampled to 10-bit, which provides only a subtle improvement over native 10-bit color.
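For illustration, here is a toy Python sketch of what reducing 12-bit code values to 10-bit looks like in principle. The actual Dolby Vision processing pipeline is proprietary; this only demonstrates the general idea of bit-depth reduction, with optional dithering to hide banding.

```python
# Toy illustration of mapping 12-bit code values (0-4095) to 10-bit (0-1023).
# The real Dolby Vision pipeline is proprietary; this only shows the general idea.
import random

def to_10_bit(value_12bit, dither=True):
    """Reduce a 12-bit value to 10-bit, optionally with dithering."""
    if dither:
        # Adding a little noise before truncation spreads the quantization
        # error around, which reduces visible banding in smooth gradients.
        value_12bit = min(4095, value_12bit + random.randint(0, 3))
    return value_12bit >> 2  # drop the two least-significant bits

print(to_10_bit(2048, dither=False))  # 512
print(to_10_bit(2050, dither=False))  # 512 - neighboring 12-bit values collapse
```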

Dynamic HDR

Moving on, Dolby Vision implements its metadata on a frame-by-frame basis, which makes it 'dynamic HDR', while HDR10's metadata is static.

What does this mean for you?

It means that if you are watching an HDR movie on an HDR display, HDR10 will set the same metadata for the entire film. In contrast, Dolby Vision can change information such as color and brightness dynamically, thus making the viewing experience more immersive.

Moreover, there’s a new HDR10+ format, which aims to introduce dynamic HDR to the HDR10 standard while remaining royalty-free.

At the time of this writing, HDR10+ content is only available on a few streaming services (including Amazon) and discs, but more and more TVs are beginning to support HDR10+.

Bit Depth

Bit depth describes the number of gradations of color in an image. SDR content is typically mastered at 8-bit, which allows for 16.7 million colors. HDR content is usually mastered at 10-bit, which allows for up to 1.07 billion colors. The more colors a display can show, the more realistic the image appears, with less banding and more subtle transitions in areas of similar color. For more information, have a look at our article on gradients.

Dolby Vision content allows for up to 12-bit color; HDR10 and HDR10+ are limited to 10-bit. This might not sound like a big difference, but 10-bit color equals 1.07 billion colors, whereas 12-bit increases that to an impressive 68.7 billion colors. This allows for much finer control over gradations, resulting in a more life-like image with no banding.
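The arithmetic behind those figures is straightforward: with N bits per channel and three channels (red, green, and blue), a display can address (2^N)³ distinct colors. A quick Python check:

```python
# Worked arithmetic behind the color counts quoted above: with N bits per
# channel and three channels (red, green, blue), a display can address
# (2**N) ** 3 distinct colors.

for bits in (8, 10, 12):
    per_channel = 2 ** bits      # shades per color channel
    total = per_channel ** 3     # combined RGB colors
    print(f"{bits}-bit: {per_channel} shades per channel, {total:,} colors")

# 8-bit:  256 shades per channel,  16,777,216 colors     (~16.7 million)
# 10-bit: 1024 shades per channel, 1,073,741,824 colors  (~1.07 billion)
# 12-bit: 4096 shades per channel, 68,719,476,736 colors (~68.7 billion)
```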

Availability

Unsurprisingly, HDR10 has the lead when it comes to content availability since it’s free.

Further, more TVs support HDR10 than Dolby Vision, and there are no monitors that support DV yet. The PS4/Pro and PS5 only support HDR10, while Xbox One S/X and Series S/X support both HDR technologies.

Most HDR content can be found on streaming services such as Netflix, Amazon Video, and Vudu, which carry both Dolby Vision and HDR10 titles.

Keep in mind that all TVs that support Dolby Vision are also compatible with HDR10; so, you are not forced to choose between the two formats.

There are plenty of PC and console games that support HDR10 as well.

Another important HDR format is HLG (Hybrid Log-Gamma), which was developed by the BBC and NHK for live broadcasts; HLG features static HDR and is compatible with HDR10 and DV displays.

HDR Monitors

HDR monitors have different standards than HDR TVs. You may even find 8-bit color monitors that feature HDR; these are often referred to as 'pseudo-HDR' displays since they don't offer a particularly significant improvement over standard image quality.

To distinguish capable HDR10 monitors, look for VESA's DisplayHDR certification.

Most people find that a display should have a peak brightness of at least 1,000 nits for the optimal or "true" HDR viewing experience.

Conclusion

What’s the bottom line?

For now, HDR10 is a more cost-efficient and widespread format, while Dolby Vision is the premium option.

However, with the introduction of HDR10+ and HDMI 2.1, dynamic metadata will be more accessible, so it remains to be seen whether Dolby Vision is going to be worth it.

Dolby Vision is arguably the most advanced HDR format from a technical standpoint, but although it has improved significantly, the lack of content is holding it back a bit. HDR10 has the distinct advantage of having more content available and being supported on the majority of TVs. HDR10+ almost matches the capabilities of Dolby Vision but is extremely lacking in content, and in the U.S. at least, is only supported on Samsung TVs.

Ultimately, the difference between the two formats isn't that important. The quality of the TV itself has a much bigger impact on HDR. Although the technology has improved significantly in recent years, it's still quite early days for HDR. Both formats have the ability to produce much more dynamic images than we are seeing on the best TVs today. The limitation is down to both the TV technology and the way the titles are mastered. We can't yet reach the 10,000 cd/m² maximum peak brightness and the expanded 12-bit color range.
