Streaming Media

 
Blackest Blacks: Ten Things to Know About Producing HDR
Shake your viewers all night long with the best-looking high dynamic range video imaginable. For those about to color grade, we salute you.

High dynamic range, or HDR, gives video producers an expanded range of colors and brightness to display their videos. From a standards perspective, it’s the journey from Rec. 709 video to Rec. 2020, as you can see in the excellent simulation shown in Figure 1. From a brightness perspective, it’s the journey from 300–500 nits to 1,000–4,000 nits. And from a marketing perspective, it’s the promise of the blackest black viewers have ever seen on their screens.

While the transition from HD to 4K was straightforward, the transition from standard dynamic range (SDR) to HDR is much more profound, and it requires many new tools and workflow changes. In this article, I’ll identify the 10 things you need to know about HDR and introduce you to some HDR producers and technologists, as well as the technologies they leveraged along the way.

1. It’s actually quite simple.

HDR sounds complex, and at a technical level it is. Abstractly, however, it involves just five simple concepts.

First, to acquire the expanded brightness and color palette needed for HDR display, you have to capture and maintain your video in 10-bit or higher formats. Second, you’ll need to color grade your video to fully use the expanded palette. Third, you’ll have to choose and support one or more HDR technologies to reach the broadest number of viewers. Fourth, for several of these technologies, you’ll need to manage color and other metadata through the production workflow to optimize display on your endpoints. Finally, although you’ll be using the same codecs and adaptive bitrate (ABR) formats as before, you’ll have to change a few encoding settings to ensure compatibility with your selected HDR TVs and other devices.
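To make that fifth concept concrete, here’s a minimal sketch that assembles an ffmpeg/x265 command line for an HDR10 encode. The input filename and the mastering-display values are hypothetical, and flag names can vary by ffmpeg and x265 version, so treat this as a starting point rather than a recipe.

```python
# Sketch of the final encoding step: the same HEVC codec as before,
# plus HDR10 signaling. The mastering-display values below are
# illustrative (P3 primaries, D65 white point, 1,000-nit peak); verify
# flag names against your ffmpeg/x265 build.
master_display = (
    "G(13250,34500)B(7500,3000)R(34000,16000)"  # primaries, 0.00002 units
    "WP(15635,16450)"                           # D65 white point
    "L(10000000,1)"                             # max/min luminance, 0.0001-nit units
)
x265_params = ":".join([
    "hdr10=1",                # write HDR10 SEI messages
    "repeat-headers=1",       # repeat parameter sets for stream switching
    f"master-display={master_display}",
    "max-cll=1000,400",       # MaxCLL / MaxFALL static metadata
])
cmd = [
    "ffmpeg", "-i", "graded_master.mov",   # hypothetical graded source
    "-c:v", "libx265",
    "-pix_fmt", "yuv420p10le",             # 10-bit is mandatory for HDR10
    "-color_primaries", "bt2020",
    "-color_trc", "smpte2084",             # PQ transfer function
    "-colorspace", "bt2020nc",
    "-x265-params", x265_params,
    "hdr10_output.mp4",
]
print(" ".join(cmd))
```

Note that only the color flags and the x265 metadata string differ from a garden-variety HEVC encode; the ABR packaging downstream is unchanged.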

I don’t mean to oversimplify, and you should know that VIVE Lifestyle Networks, which debuted a live HDR service at the NAB show in March, advised that it took “months of R&D” to finalize its workflow. However, we’ll cover these five steps in the discussion that follows, which should leave you with a high-level overview of the issues you’ll face when attempting to produce and distribute HDR video.

Figure 1. Rec. 709 vs. Rec. 2020 vs. P3 color spaces (from SMPTE presentation entitled SMPTE ST 2094 and Dynamic Metadata by Lars Borg, principal scientist, Adobe) 

2. All that said, you probably don’t care in the short term.

Hey, I appreciate your reading this article and all, but unless your target customers have HDR displays, there’s no sense producing HDR content. Today, and for the short-term future, the only significant concentrations of HDR displays are in the living room. If you’re producing premium content, you may want to shoot long-tail content in an HDR-compatible format, but if you’re distributing primarily to computers and mobile devices, it will be several years before a critical mass exists. If you’re not producing premium content, it will be years before you should consider producing garden-variety training, news, marketing, or other corporate-type videos in HDR.

On the other hand, if you’re a premium publisher targeting OTT, you should be well on your way with HDR plans and workflows. According to IHS Markit, about 4 million HDR TVs shipped in 2016, and 30 million units are expected by 2020. Since HDR content is coming from Netflix, VUDU, and most other premium content distributors, it’s becoming table stakes for high-end OTT.

3. There are two fundamental problems.

First, your production software is inadequate, at least in the short term. Second, your display hardware is inadequate. Why? Because of the expanded color and brightness discussed earlier. For years, we’ve been working with the Rec. 709 color gamut shown in Figure 1, the smallest triangle in the color spectrum. We shot, captured, encoded, and displayed our videos in Rec. 709. It’s the color gamut in most Windows-based monitors, and while 2015 and later Macs expanded to the Digital Cinema (P3) format, even that covers far fewer colors than Rec. 2020. Trying to color grade video footage bound for a Rec. 2020 display in a Rec. 709 or even a P3 color space is like trying to paint with colors that you can’t see.

The same holds true for brightness, which is measured in nits. SDR displays and computer monitors display at between 300 and 500 nits, while HDR displays target 1,000 nits or higher. Again, you can’t adjust your brightest brights without using hardware and software that show an accurate preview. For these reasons, at some point in the workflow, you’ll need to grade your color and brightness on hardware that can display it and in software that can drive the display.
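For a sense of how those nit values relate to the encoded signal, here’s a short sketch of the SMPTE ST 2084 “PQ” transfer function used by HDR10 and Dolby Vision. The constants come straight from the standard; the function names are mine.

```python
# PQ maps a 0-1 code value to absolute brightness in nits, topping out
# at the 10,000-nit ceiling the standard reserves for future displays.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """Nonlinear code value (0-1) -> display brightness in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(nits):
    """Display brightness in nits -> nonlinear code value (0-1)."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

print(pq_eotf(1.0))   # the full 10,000-nit ceiling
print(pq_eotf(0.75))  # roughly where today's 1,000-nit displays run out
```

The takeaway: a 10-bit PQ signal spends its code values very unevenly, packing precision into the dark end where our eyes are most sensitive, which is exactly why grading on an SDR monitor can’t show you what the encoded signal will look like.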

You won’t have to start from scratch, though. Many hardware and software components of your existing workflow will continue to work as before. For example, Adobe Premiere Pro can’t display HDR at this point (though it’s coming soon), but it can pass through HDR-related metadata. So you can still edit in Premiere Pro (or Final Cut Pro X), but you probably will need a color grading program to optimize the colors and contrast in your video.

4. You can probably use your existing camera (for VOD anyway).

There are two requirements for shooting video that can be produced as HDR. First, the camera must shoot HDR video, which means between 11 and 15 stops of dynamic range, a wide color gamut, 4K resolution, and preferably 60 fps. Second, it has to store that information in a raw or logarithmic format that preserves the full dynamic range of the sensor. This means that many older, yet high-quality digital cinema cameras, like the Red One or Epic, can still be used to shoot HDR for VOD production.

For example, VIVE Networks is using a Red Epic for VOD productions. As we’ll discuss soon, however, VIVE had to use a much newer and much more expensive Sony HDC-4300 for live and live-to-tape productions.

5. You’ll need to pick one or more HDR standards.

Your job as content producer is to make your content look as good as possible and to enable it to play on as many endpoints as possible. Table 1 presents the three most prominent options available for HDR and some of their key characteristics. Let’s identify the three, and then dig into the table.

Table 1. Comparing the top three HDR technologies 

Briefly, Dolby Vision is a proprietary system developed by Dolby, whose primary business model is licensing revenue from makers of TV sets, mobile devices, and other players. Note that Dolby Vision display hardware is backward-compatible with HDR10, so every Dolby Vision-enabled set can play HDR10.

HDR10 is an open standard supported by many companies and industry groups. Finally, Hybrid Log-Gamma (HLG) was jointly developed by the BBC and Japanese state broadcaster NHK, which needed a single format that could display on HDR and SDR 4K sets, and could be used for live and traditional broadcast channels.

The first four rows in Table 1 involve quality, starting with metadata. In Dolby Vision and HDR10, the metadata carries color and brightness information to optimize the picture on the display. With Dolby Vision, this information is dynamically updated throughout the video; with HDR10 (but not HDR10+, discussed below), it’s static, set once for the entire movie.
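As a rough illustration of just how little information HDR10’s static metadata carries, here’s a sketch computing its two headline values, MaxCLL and MaxFALL, from hypothetical per-frame luminance data. The function and sample values are mine, not from any production tool.

```python
# HDR10's static content metadata (per CTA-861.3) boils down to:
#   MaxCLL  = the single brightest pixel, in nits, across the entire program
#   MaxFALL = the highest frame-average brightness across the program
# Because both are single values, one tonal mapping must serve every
# scene, bright or dark -- the "static" limitation described above.

def static_metadata(frames):
    """frames: list of frames, each a list of pixel luminances in nits."""
    max_cll = max(max(frame) for frame in frames)
    max_fall = max(sum(frame) / len(frame) for frame in frames)
    return max_cll, max_fall

# Three hypothetical frames: two dark scenes, one highlight-heavy scene.
frames = [
    [5, 10, 20, 15],          # dark scene
    [8, 12, 25, 18],          # dark scene
    [900, 1000, 400, 300],    # bright scene with a 1,000-nit highlight
]
print(static_metadata(frames))  # prints (1000, 650.0)
```

One bright scene drives the values for the whole program, so the dark scenes get tone-mapped as if they might contain 1,000-nit highlights, which is precisely the contrast problem dynamic metadata solves.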

Figure 2 illustrates the difference. On the left, one tonal mapping value is applied to clips of varying brightness. As a result, the two darkest clips have poor contrast, though the brightest clip looks perfect. On the right, multiple tonal mapping values are applied to match the content. The two darker clips have much better contrast, while the brightest clip, mastered at the same value as the static solution, looks the same. In this manner, dynamic tone mapping, enabled by dynamic metadata, should produce a superior experience.

Figure 2. Static vs. Dynamic tone mapping (adapted from Borg presentation) 

As the table shows, HLG doesn’t use metadata. Rather, it deploys a hybrid method of storing the color and brightness information that separates the SDR data from the HDR data. SDR is encoded using a traditional gamma curve compatible with SDR displays, while the HDR data is encoded using a logarithmic curve that lets compatible HDR displays stretch the signal over the additional brightness and contrast in the display. This allows one bit stream to serve both SDR and HDR sets, and the name derives from this hybrid of a conventional gamma curve and a logarithmic curve. Later in this feature, you’ll read more about the creative challenges this technique imposes.
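The hybrid curve itself is simple enough to sketch. Below is the HLG opto-electrical transfer function as published in ARIB STD-B67 and ITU-R BT.2100, with constants from the spec; the function name is mine.

```python
import math

# HLG maps scene light E in [0, 1] to a signal value in [0, 1]. The lower
# half of the signal uses a conventional square-root (gamma-like) curve
# that SDR sets interpret correctly, while the upper half switches to a
# logarithmic curve that HDR sets stretch across their extra brightness --
# hence "hybrid log-gamma", with no metadata required.
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_oetf(e):
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # SDR-compatible gamma segment
    return A * math.log(12 * e - B) + C  # logarithmic HDR segment

# The two segments meet at E = 1/12, which lands at signal value 0.5.
print(hlg_oetf(1 / 12), hlg_oetf(1.0))
```

The design choice is visible in the code: everything below signal level 0.5 looks like ordinary gamma-encoded video, which is what lets a single HLG stream play acceptably on a non-HDR 4K set.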

Beyond metadata, Dolby Vision can process at up to 12 bits, which may help avoid color banding and similar issues compared with the other two systems, which process at 10 bits. Although the brightness of most current HDR displays peaks at around 1,000 nits, you can master Dolby Vision at up to 4,000 nits, which should make it more future-proof as more capable TVs arrive. Finally, as part of the license, Dolby Vision-equipped displays use a consistent color mapping engine that should ensure that Dolby Vision-mastered content provides better color and tone accuracy.

As stated above, backward compatibility with older 4K sets without HDR was a major driver for HLG. Dolby Vision can support backward compatibility via a dual-layer system, in which the base layer contains a legacy SDR bit-stream and an optional enhancement layer carries a supplementary signal and the Dolby Vision metadata. For delivery to known Dolby Vision-enabled systems, you can also produce a single layer that contains both video and metadata, which is the approach most producers use. HDR10 does not have a configuration that supports SDR devices.

At press time, Samsung and Amazon announced HDR10+, which is essentially HDR10 with dynamic tone mapping. This takes HDR from the static mapping on the left in Figure 2 to the dynamic mapping on the right, a very significant improvement. This feature will be available on 2017 TVs and on older HDR10-compatible TVs via a firmware update.

Figure 3 shows pre-NAB support for the various manufacturers and publishers for the technologies discussed above. For complete coverage, you’re going to have to select two or possibly three technologies and create the bit-streams necessary to support them. Hopefully, the list won’t keep growing.

Figure 3. HDR support by manufacturer and publisher (from Ultra HD 4K News) 

Beyond the features table analysis above, several considerations will likely dictate which standards you decide to support. For example, Dolby Vision is also deployed in movie theaters, so if you’re using the technology there, you have a substantial investment in expertise and equipment, and the Dolby workflow easily supports distribution to all display points. This makes Dolby support a natural choice.
