
Blackest Blacks: Ten Things to Know About Producing HDR


On the other hand, if you’re looking for live HDR technology, Dolby doesn’t yet have an offering, while both HDR10 and HLG systems are available. Geography matters as well: the technology selected by the BBC and NHK will likely dominate the U.K. and Japan. One U.K. broadcasting executive I spoke with said they adopted HLG because they knew it would likely dominate in the U.K. In this regard, when choosing which HDR technology or technologies to support, it’s not only about supporting the highest-quality technology available; it’s about supporting the technologies used by your target viewers.

6. Your workflow and toolset will be dictated by your format choice.

Your workflow will almost certainly be dictated by your HDR technology choice, particularly if you elect to produce Dolby Vision. You can see this in Figure 4, which details the Dolby Vision workflow that incorporates key development partners at every step. This graphic is from an informative overview presentation available on Vimeo at vimeo.com/channels/dolbyvision.

The Dolby Vision workflow is very structured, and includes grading and output provisions for screens ranging from digital theaters to old Rec. 709 TVs. Martin Zeichner, colorist at Deluxe’s Encore Post in New York, who mastered the second season of Netflix’s Marco Polo for Dolby Vision, describes the process: “What Dolby envisioned is that the colorist would work on the HDR image first, then Dolby Vision software would create a metadata file—much smaller than the rendered image—that would map that HDR color grade to any nit value that any TV has. It’s genius. It lets you create any deliverable you want from the HDR master.” This functionality comes at a price; special hardware is required for some of the processing, which means a capital outlay when you begin preparing your Dolby Vision master at a post house. (Click here for an extended version of our interview with Martin Zeichner.)


Figure 4. Deluxe colorist Martin Zeichner running DaVinci Resolve with Dolby Pulsar and Sony X300 monitors 

The ultimate output from this process, for most producers, is a Dolby Vision mezzanine file, an MXF file with 12-bit RGB JPEG 2000 frames and the associated metadata. Larger publishers, like Netflix, use this file as the source for all of their deliverables, from Dolby Vision and HDR10 at the high end to low-bitrate Rec. 709 encodes for mobile downloads at the low end. (Click here for an extended interview with Netflix about its approach to HDR.)

Otherwise, for HDR10 and HLG, you should be able to use most existing tools, although many producers will need both a colorist and a mastering display. Where Dolby Vision tools manage the HDR-to-SDR conversion for you, outside the Dolby workflow you’ll have to figure out these conversions yourself. You shouldn’t have to grade twice; for efficiency tips, check out Cross Converting HDR to HDR & HDR to SDR at the Mystery Box website. For those who don’t know Mystery Box, it’s a production company with a deep knowledge of HDR, and its five-part tutorial on the subject is the perfect technical primer.

As stated above, when working with non-Dolby workflows, you’ll need to choose an intermediate format that maintains the color and brightness information and any metadata. According to the Mystery Box website, “ProRes 4444/4444 (XQ), DNxHR 444, 12 bit DPX, Cineform RGB 12 bit, 16 bit TIFFs, or OpenEXR (Half Precision) are all suitable intermediate codecs, though it’s important to double check all of your downstream applications to make sure that whichever you pick will work later. Similarly, any of these codecs should be suitable for mastering, with the possibility of creating a cross converted grade from the master later.”
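If you render your intermediate with FFmpeg, the recipe can be sketched as below. This is a minimal illustration, assuming FFmpeg’s prores_ks encoder and its 4444xq profile; the function name is hypothetical, and you should verify flags against your own FFmpeg build and confirm your downstream tools accept the result.

```python
# Sketch: assembling an FFmpeg command line for a ProRes 4444 XQ intermediate.
# Assumes FFmpeg's prores_ks encoder; prores_ks takes 10-bit input,
# so yuv444p10le is used here even though the ProRes 4444 XQ container
# itself is specified at 12 bits.
def prores_xq_command(src, dst):
    return [
        "ffmpeg", "-i", src,
        "-c:v", "prores_ks",        # FFmpeg's native ProRes encoder
        "-profile:v", "4444xq",     # highest-quality ProRes profile
        "-pix_fmt", "yuv444p10le",  # 10-bit 4:4:4 chroma
        "-c:a", "copy",             # pass audio through untouched
        dst,
    ]
```

You would hand the resulting list to `subprocess.run()` rather than a shell string, which avoids quoting problems with file paths.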

7. You’ll almost certainly need a colorist.

If you’ve distributed premium content in the past, you’re undoubtedly familiar with the role of the colorist in the production process. (If you’re not, I’ll explain briefly. Colorists perform two functions: color correction, which ensures proper exposure, white balance, and contrast; and color grading, which involves creating the specific look desired by the director.)

Most video editors know color correction to some degree, but HDR’s expanded palette of color and brightness requires an expanded skill set. Where color correction is a largely technical skill, color grading is art. So if you want your production to achieve a certain look or even fully realize the creative opportunities available in HDR, you should involve a pro, at least for the first few projects.

Here’s how Deluxe colorist Zeichner describes the process: “What HDR introduces is an extension of the color palette. If you’re working with a DP [cinematographer] who’s used to the luminance range of film prints and projection or video, they may experience the extended luminance range in HDR as ‘too much.’ ... When they start the HDR grading process, DPs can be fairly conservative in their requests to extend the luminance range, then gradually start wanting to do more HDR highlights.”

Interestingly, Zeichner also recommends grading Dolby Vision and HDR10 projects differently. “With HDR10 the recommendation is generally to grade for Rec. 709 first, then grade the HDR version.” As described above, with Dolby Vision, you master HDR first, and the software maps that to SDR and Rec. 709 for you, though you can also make manual tweaks as desired.

When grading HLG, Mystery Box producer Samuel Bilodeau recommends setting up two displays, one simulating SDR, the other HDR, so you can visualize the results of the grade on both displays simultaneously. Regarding the process itself, Bilodeau shares, “The biggest complaint I have with grading in HLG is the relative contrast between the HDR and the SDR images. Because HLG runs up to 5,000 nits with its top digital values, if you’re grading in 1,000 nits you end up with a white level in the SDR version below the usual peak white. This often means that the whites in the SDR version look muddied and lower contrast than the same content graded for SDR natively.”

After sharing additional concerns, Bilodeau concluded: “Personally, I find grading in HLG compounds the minor problems of HDR with the problems of SDR, which I find extremely irritating. Rather than being happy with the grade, I’m often left with a sense of, ‘It’s okay, I guess’.” It appears that not only will you need a colorist, you’ll need a colorist with experience with your target HDR outputs.
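Bilodeau’s white-level complaint can be made concrete with the HLG transfer function itself. The sketch below uses the BT.2100 HLG OETF constants; the 5,000-nit framing is taken from his quote, and the interpretation (1,000-nit content inside a 5,000-nit-referenced signal) is my illustration, not his math.

```python
import math

# ITU-R BT.2100 HLG OETF constants
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(e):
    """Map relative linear scene light E (0..1 of nominal peak) to signal E'."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

# Full-scale light maps to full-scale signal, as expected:
peak_signal = hlg_oetf(1.0)        # ~1.0

# But grading to 1,000 nits inside a 5,000-nit-referenced signal leaves
# diffuse white at only ~69% of code range on an SDR display:
white_signal = hlg_oetf(1000 / 5000)   # ~0.69
```

That ~0.69 signal level is why the SDR derivation of a 1,000-nit HLG grade can look muddy next to a native SDR grade, where white sits at full scale.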

Even uploading HDR to YouTube requires a colorist, or at least someone familiar with color-grading software. A warning on the site states, “If you did not grade your own content in HDR, or don’t know what it means to color grade a video, you should not use the YouTube HDR metadata tool.” In other words, don’t try this at home, hire a pro.

8. Encoding will likely be the simplest operation.

As mentioned before, encoding will be relatively straightforward, though it will vary from platform to platform and program to program. For example, I spoke with Greg Mirsky, VP of product management at codec and video optimization vendor Beamr, who shared that HLG was the simplest case. Specifically, you just have to make sure that the source is 10-bit and then you encode as normal, using the Main 10 profile and Rec. 2020 color space. I asked if you needed to tell the encoder to use the special HLG transform, and he said no; Mystery Box’s Bilodeau, however, said you need to include it when encoding with FFmpeg.
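Following Bilodeau’s advice, an HLG encode with FFmpeg and x265 might be assembled as follows. This is a sketch, not a vetted recipe: the function name is hypothetical, and you should confirm that your FFmpeg build recognizes the `arib-std-b67` transfer characteristic (FFmpeg’s label for HLG) and passes these color properties through to the bitstream.

```python
# Sketch: an FFmpeg command for a 10-bit HEVC Main 10 HLG encode,
# explicitly signaling Rec. 2020 primaries and the HLG (ARIB STD-B67) transfer.
def hlg_hevc_command(src, dst):
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx265",
        "-profile:v", "main10",        # HEVC Main 10, per Mirsky's guidance
        "-pix_fmt", "yuv420p10le",     # keep the pipeline 10-bit
        "-color_primaries", "bt2020",
        "-color_trc", "arib-std-b67",  # the HLG transfer, per Bilodeau
        "-colorspace", "bt2020nc",
        dst,
    ]
```

The three color flags are what make the difference Bilodeau describes: without them the bitstream carries no transfer signaling, and an HLG-aware TV has no way to know it should apply the HLG EOTF.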

For HDR10, you need specific metadata values like the specs of the mastering monitor, and values for MaxCLL (Maximum Content Light Level) and MaxFALL (Maximum Frame-Average Light Level). Some services, like cloud provider Hybrik, can calculate these values for you by scanning the RGB values in the master.
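The scan such a service performs can be sketched simply. The version below assumes each frame has already been reduced to per-pixel light levels in nits (a real implementation, per the CTA-861.3 definitions, derives the level from the maximum of the R, G, and B components of each pixel); the function name and input shape are illustrative.

```python
# Sketch: deriving MaxCLL and MaxFALL from per-pixel light levels in nits,
# assuming frames arrive as 2-D lists of nit values.
def max_cll_fall(frames):
    max_cll = 0.0   # brightest single pixel anywhere in the program
    max_fall = 0.0  # highest frame-average light level of any frame
    for frame in frames:
        pixels = [nits for row in frame for nits in row]
        max_cll = max(max_cll, max(pixels))
        max_fall = max(max_fall, sum(pixels) / len(pixels))
    return round(max_cll), round(max_fall)
```

MaxCLL is a single-pixel maximum across the whole program, while MaxFALL first averages within each frame and then takes the maximum of those averages, which is why the two numbers diverge so sharply on content with small specular highlights.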

When such a service isn’t available, Bilodeau said most producers use estimates based on the capabilities of the mastering monitor. “For instance, the Sony BVM-X300 has a max individual pixel brightness of 1,000 nits and a max frame average of 180, and an indicator will turn on if you exceed the 180 value. If you stay below both the max pixel value (which you can ensure by watching your scopes in color correction) and keep any individual frame below the frame average of the display, you can be confident that your final file will have the MaxCLL of 1,000 and MaxFALL of 180.”

Bilodeau added that, at least for the moment, adding these values really doesn’t matter. “Fortunately, no displays currently on the market make use of the MaxCLL and MaxFALL data, so if they’re off by any amount it’s not a terrible thing.”

Dolby Vision should be the simplest case, since you’d be feeding Dolby Vision mezzanine files into a Dolby Vision-certified encoder or encoding service that knows precisely how to handle them.

9. Live is going to be expensive.

As mentioned above, you can’t currently stream Dolby Vision live, although the company is working in that direction. Both HDR10 and HLG live production workflows and distribution networks are currently available. For example, at NAB, Globecast and Harmonic partnered to demonstrate the first ultra-high-definition (UHD), high dynamic range (HDR), OTT-managed platform as a service (PaaS) for delivery of linear content over the internet to connected TVs. VIVE Lifestyle Network used the platform to deploy its new service at NAB.

As I mentioned earlier, to produce VOD video, VIVE used a Red Epic camera. For live and live-to-tape, VIVE used a Sony HDC-4300 (~$62,500), which plugs into the Harmonic-powered encoding system shown in Figure 5. The system outputs five layers of HDR video at a peak rate of 25 Mbps, which are distributed via Globecast Networks (Figure 6). During our conversations, Harmonic reps claimed that their Electra VS was the only single-rack unit encoder capable of outputting multiple layers of HDR in real time.


Figure 5. The Harmonic-powered encoder system 

There have also been multiple live trials with HLG, mostly in Europe, but also in Canada. Whether HDR10, HLG, or ultimately Dolby Vision, you should expect live HDR production to require much newer and much more expensive production gear than what you can use for VOD productions.


Figure 6. The VIVE lifestyle HDR network, encoded by Harmonic and distributed by Globecast 

10. The laws of nature still apply.

As we all know, having great-looking video is useless if you can’t deliver it to your viewers, and 4K HDR should consume anywhere from 15 to 25 Mbps. According to Akamai’s State of the Internet report for the fourth quarter of 2016, the average connection speed in the U.S. was 17.2 Mbps, with 42 percent of connections above 15 Mbps.

Those are encouraging numbers, but a useful reality check comes from Netflix’s ISP Speed Index, which measures primetime ISP performance in multiple countries. For March 2017, the top performer was Comcast at 3.94 Mbps. To be clear, not all Netflix customers were requesting 4K files, so this could be more of an average number than an indication that 4K streams weren’t getting through. Still, before you invest in creating the service, be sure you can deliver it to your viewers.
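The gap between a headline connection speed and a deliverable stream is worth making explicit. The sketch below is back-of-envelope arithmetic only; the 20 percent headroom figure is my assumption, not a number from Akamai or Netflix.

```python
# Back-of-envelope delivery check: a connection needs margin over the
# stream bitrate to absorb throughput variation (20% headroom assumed here).
def can_deliver(stream_mbps, connection_mbps, headroom=1.2):
    return connection_mbps >= stream_mbps * headroom

# The 17.2 Mbps U.S. average cannot sustain even the 15 Mbps low end of
# 4K HDR once headroom is applied (15 * 1.2 = 18 Mbps required).
average_ok = can_deliver(15, 17.2)
```

By this rough test, only the minority of connections comfortably above 15 Mbps are realistic 4K HDR targets, which reinforces the point: size your delivery plan to measured viewer throughput, not to averages.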

This article appears in the June 2017 issue of Streaming Media magazine as "Ten Things to Know About Producing HDR."
