
How To Master Remote Production

During the pandemic, the broadcast industry underwent a radical transformation in the way people collaborate on the live production of concerts, sports, and events, as well as on remote production and post-production for movies and TV shows.

Ironically, this coincided with a historically high demand for original content as we attempted to distract ourselves and quench our thirst for human connection.

This transformation had an enormous impact on remote editing and production teams dealing with the travel and in-person restrictions of the pandemic. Hollywood studios and national broadcasters experimented with new technology to enable broadcast-quality features through consumer-grade workflows.

Actors, directors, color graders, post-production editors, VFX supervisors, music composers, live event producers, on-air talent… they all needed a way to collaborate in real time to produce content, separated from their professional production equipment and brick-and-mortar studios.

And here lies the problem.

Hollywood was brought to a standstill by the Coronavirus pandemic, forcing editing and production teams to work from home. But working from home demands that you connect to your local ISP, suffer the vagaries of domestic internet connections, and work on your consumer-grade computer.

However high your home computer specs might be, it’s not your studio kit.

While real-time remote production or post-production is not new, it was historically done through file exchange and annotation, which is iterative and sequential, and thus not interactive.

Teamwork requires real-time connections to allow for collaborative post-production. That also is not new to the broadcast ecosystem. But existing solutions required dedicated appliances, networks and robust internet connection speeds to guarantee the necessary quality to work well.

When the live and film production process demands digitally connected teams, all working in real-time, you are suddenly confronted with some major problems.

Let’s consider what happens in a typical film studio. You are fully digital, connected in real time to your co-workers during:

  • Pre-visualization: location scouting and testing visual scenes with 3D tools like Unreal Engine and Unity virtual sets.
  • Pre-production: testing scene variations, including color, positioning, camera lenses, angles, and lighting, before shooting the scene.
  • Production: shooting and digitally mixing real and virtual components on virtual sets.
  • Post-production: the most demanding stage in terms of visual quality, as the finished movie is being created, with special effects, color editing, sound editing, and music composition, and with large editing teams working together to create the final product.

It is in post-production that the “heavy lifting” takes place — creating the finished movie in real time. Here the term “real-time” refers to real-time rendering: the capacity to compute the final scene with all of its visual effects, at the level of quality and resolution the studio expects for validation and release. Post-production teams use a combination of NLEs and VFX software, including Adobe Premiere and After Effects, Avid Media Composer and Pro Tools, or DaVinci Resolve, which combine editing, color correction, visual effects, motion graphics, and audio post-production, as well as Autodesk Maya, 3ds Max, or Unreal Engine for 3D and VFX.

Real-time rendering used to be a challenge even in a physical studio environment, before adding the complications of remote workflows. It requires that teams have access to dedicated networks, studio-level cameras and displays, and powerful servers located in one or more linked on-site studios. A far cry from the usual work-from-home (WFH) environment.

Post-production is a creative process that requires teamwork. The technology today is great for on-premises work between studios and sets, where team members stand and interact with each other in real life with over-the-shoulder reviews.

And the same is true for the live remote production of concerts, sports and other events, where live content needs to be captured from remote locations and managed from a central studio or control room.

However, these legacy workflows fail when confronted with this new remote WFH environment of consumer-grade hardware, desktops or laptops, and public internet access.

But, as the old adage goes, the show must go on.

And for the show to go on, broadcasters and Hollywood studios must find a way to enable a live and post-production environment for their WFH teammates. That is, to create the same professional broadcast-quality output as would be achieved in-studio with high-speed, high-capacity networks and expensive hardware and applications.

Remote Post-Production

Whether in the studio or WFH, remote post-production requires the same holy trinity:

  1. Movie/Broadcast-Quality Production
  2. Near-zero latency for team interactivity and collaboration in real time
  3. Affordability, because everybody needs to access these solutions from home.

Requirement 1: Quality

Whether you are using a desktop, laptop, browser or application at home, you have to deliver “movie quality” or “broadcast quality” production that supports the full gamut of color and sound.

Color measurement comes down to three things: color depth, color accuracy, and color sampling.

  • Color Depth: the baseline is 8-bit, improving to 10-bit and then 12-bit, which offers the fullest range of whites, reds, and blues.
  • Color Accuracy: color grading is a vital part of the post-production process, with High Dynamic Range (HDR) corresponding to the physical values of luminance or radiance that can be observed in the real world. Note that HDR can only be achieved with a minimum of 10-bit color depth.
  • Color Sampling: without going too deep into the technical details, there are three levels (4:2:0, 4:2:2, 4:4:4), with the numbers relating to the chroma components of the image. 4:4:4 is the original size of the image and the best possible chroma sampling (see the bitrate sketch after this list).
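
To make these numbers concrete, here is a rough back-of-the-envelope sketch, in TypeScript, of the uncompressed bitrate implied by each combination of bit depth and chroma sampling. The resolution and frame rate are example values chosen for illustration, not figures from this article.

```typescript
// Samples per pixel implied by each chroma sampling scheme: 4:4:4 keeps
// full chroma (3 samples/pixel), 4:2:2 halves horizontal chroma (2),
// and 4:2:0 halves chroma in both dimensions (1.5).
const SAMPLES_PER_PIXEL = { "4:4:4": 3, "4:2:2": 2, "4:2:0": 1.5 } as const;
type Sampling = keyof typeof SAMPLES_PER_PIXEL;

// Uncompressed bitrate in Mbps:
// pixels/frame x frames/second x samples/pixel x bits/sample.
function uncompressedMbps(
  width: number,
  height: number,
  fps: number,
  bitDepth: number,
  sampling: Sampling,
): number {
  return (width * height * fps * SAMPLES_PER_PIXEL[sampling] * bitDepth) / 1e6;
}

// UHD at 30 fps: consumer 8-bit 4:2:0 vs. the 10-bit 4:4:4 a colorist needs.
console.log(uncompressedMbps(3840, 2160, 30, 8, "4:2:0").toFixed(0));  // ~2986
console.log(uncompressedMbps(3840, 2160, 30, 10, "4:4:4").toFixed(0)); // ~7465
```

Compression shrinks both numbers dramatically, but the 2.5x ratio between them persists, which is why bit depth and chroma sampling dominate the bandwidth budget of a remote workflow.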

And then there is audio quality. The bare minimum is stereo audio, with 5.1 and 7.1 surround sound required for most live and post-production projects.

To honor these visual and auditory quality requirements, these features need to be supported natively in the tool we use to interact with our remote co-workers every day: the web browser. But this is easier said than done, as browser vendors need to build in this support.

To summarize, these remote production workflows must support professional-grade inputs (SDI), 10- and 12-bit color depth, HDR, 4:4:4 chroma sampling, and surround sound. WFH means that this solution needs to run on consumer-grade home desktops or laptops, and work over the public internet.

Requirement 2: Near-Zero Latency

Another challenge of remote production solutions is maintaining quality while simultaneously delivering real-time latency to enable interactivity for team members located across the world.

It is pointless to use high-latency HLS live streams, delayed by three, five, ten seconds or more, when you are trying to pull in a remote interview live on-air or review a VFX scene with your colleagues in post-production. Video conferencing apps like Zoom or Teams offer interactivity, but you sacrifice quality because you cannot control the media pipeline.

Requirement 3: Cost

Existing remote production equipment, including studio servers, application hardware, and software, is extremely costly. Some of the most-used hardware encoding and streaming “boxes” cost as much as $50,000 apiece. Most require dedicated private networks installed at home. Not a financially attractive proposition for a fully WFH remote production team — notwithstanding the performance limitations.

But don’t we already have a solution for this? What about NDI or SRT?

NDI (Network Device Interface) allows anyone to use real-time, ultra-low-latency video on existing IP video networks, and it is popular in both the production and broadcast industries for creating virtual video and audio streams. The problem with NDI is that it performs very poorly across the public internet, as it was never engineered for that purpose, and it is not supported natively in the web browser.

SRT (Secure Reliable Transport) is an open-source video streaming protocol that brings high-quality, low-latency live video over the public internet. SRT is fast becoming the successor to RTMP but, like RTMP, is not natively supported in the web browser, requiring a hardware device to decode live streams for playback. This can prove expensive and hard to manage, as you need to send a hardware device to every remote member of the production team.
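
To give a sense of how SRT is used on the contribution side, here is a hedged sketch of a Node script that wraps FFmpeg to push a file as an SRT feed. It assumes an ffmpeg binary on the PATH built with libsrt; the input file and destination address are placeholders.

```typescript
import { spawn } from "node:child_process";

// Hedged sketch: push a local file to an SRT destination via FFmpeg.
// Assumes ffmpeg was built with libsrt; addresses are placeholders.
const ffmpeg = spawn("ffmpeg", [
  "-re",             // read the input at its native frame rate
  "-i", "input.mp4", // placeholder source
  "-c", "copy",      // pass streams through without re-encoding
  "-f", "mpegts",    // SRT payloads are commonly MPEG-TS
  "srt://192.0.2.10:9000?mode=caller", // placeholder listener address
]);

ffmpeg.stderr.on("data", (chunk: Buffer) => process.stderr.write(chunk));
ffmpeg.on("close", (code) => console.log(`ffmpeg exited with code ${code}`));
```

The catch, as noted above, is on the receive side: a browser cannot consume that stream directly, so each remote viewer needs a decoder appliance or a gateway that re-terminates the stream.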

The internet, and consequently the web browser, is the only universally accessible tool for enabling the interactivity and collaboration needed by remote production teams around the world.

Dolby.io acquired Millicast in February of 2022 specifically for this reason. Millicast was able to support these requirements not only for Hollywood studios and post-production houses including NBCUniversal, ViacomCBS, HBO, and Disney, but also for live remote production and REMI use cases for the NFL, ESPN, and NASA.

Dolby.io has made encoder and decoder optimizations in WebRTC that add native support in the web browser, enabling remote production teams to work from home even for the most demanding aspects of their work, like visual effects and color grading.

Quality

Using an extended version of WebRTC, Dolby.io supports 4:4:4 video quality and 5.1 audio, while staying compatible with the Chrome browser for rendering and playback.
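
As an illustration of how browser-native playback works, the sketch below is a minimal WHEP-style WebRTC viewer built only on standard web APIs. The endpoint URL is a placeholder, and a production viewer (for example, one built with a vendor SDK) would add authentication, reconnection, and error handling.

```typescript
// Minimal WHEP-style WebRTC viewer using only standard browser APIs.
// ENDPOINT is a placeholder for a real WHEP URL.
const ENDPOINT = "https://example.com/whep/my-stream";

async function watch(video: HTMLVideoElement): Promise<void> {
  const pc = new RTCPeerConnection();

  // Receive-only: the browser pulls the produced feed down for playback.
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });

  // Attach the remote media to the <video> element as it arrives.
  pc.ontrack = (event) => {
    video.srcObject = event.streams[0];
    void video.play();
  };

  // WHEP signaling: POST the SDP offer, get the SDP answer back.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: offer.sdp,
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await res.text() });
}
```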

Collaboration

Typical latency between team members is less than 250ms, the latency needed to enable human interactivity and collaboration in real time, with security and encryption on par with what studios already use today.
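
As a rough way to check whether a given home connection can sustain that interactivity budget, a WebRTC application can read the network round-trip time from the browser's statistics API. A minimal sketch, assuming an already-connected RTCPeerConnection:

```typescript
// Return the current network round-trip time in milliseconds for the
// active connection, or undefined if no candidate pair has succeeded yet.
async function currentRttMs(pc: RTCPeerConnection): Promise<number | undefined> {
  const report = await pc.getStats();
  for (const stats of report.values()) {
    if (
      stats.type === "candidate-pair" &&
      stats.state === "succeeded" &&
      typeof stats.currentRoundTripTime === "number"
    ) {
      return stats.currentRoundTripTime * 1000; // reported in seconds
    }
  }
  return undefined;
}
```

Network round-trip time is only one component of glass-to-glass latency (capture, encode, and decode add more), but it is a quick first check on a home link.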

Bandwidth

Support for newer video codecs provides better compression that uses less bandwidth, reducing cost and adding extra resilience to handle the unreliable quality of public internet connections at home.
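
One standard-API way an application can opt into a newer codec is RTCRtpTransceiver.setCodecPreferences, sketched below. AV1 is used as an example of a newer, more bandwidth-efficient codec; availability depends on the browser, and preferences must be set before the offer/answer negotiation.

```typescript
// Reorder the browser's codec list so a newer codec (AV1 here) is
// negotiated first when available. Call before createOffer().
function preferCodec(pc: RTCPeerConnection, mimeType = "video/AV1"): void {
  const caps = RTCRtpReceiver.getCapabilities("video");
  if (!caps) return; // capability query unsupported in this browser

  const preferred = caps.codecs.filter((c) => c.mimeType === mimeType);
  if (preferred.length === 0) return; // codec unavailable; keep defaults
  const rest = caps.codecs.filter((c) => c.mimeType !== mimeType);

  for (const transceiver of pc.getTransceivers()) {
    if (transceiver.receiver.track.kind === "video") {
      transceiver.setCodecPreferences([...preferred, ...rest]);
    }
  }
}
```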

Cost

Beyond capture and output, where your usual SDI input/output cards can be used, no special hardware or network access is needed to enable the full editorial, design, and creative process required by your geographically dispersed editing team. The Dolby.io Real-Time Streaming solution works on any device or web browser, including off-the-shelf PC or Mac computers, without external encoders or decoders.

Dolby.io makes the science of sight and sound accessible to developers and creators through simple-to-use APIs. These APIs are used by broadcasters and Hollywood studios to develop in-house products for remote production. We also have customers like 5thKind that use Dolby.io to power their CORE Live product, which provides secure, real-time collaborative review for media production workflows, with high-quality livestreams and visual watermarking. Their product won NAB Show Product of the Year in 2020 and is used by many studios and post-production houses.

5thKind provides remote production workflows that help film and TV studios with daily reviews, post-production and VFX.

A key component of any production set is the Video Village, where crew and cast gather to view what is being shot on set. Video assistants are responsible for distributing on-set camera video signals to the director's monitors on set, as well as to off-set and off-site monitors for clients, crew, and other people involved in the production.

Off-camera playback allows proper shot-by-shot review without involving the camera crew or the camera's playback. Separate playback is crucial because it allows the camera operators, assistants, and the DP to work with the live image while the director reviews what was shot, freeing each department to work between setups.

Production teams have a growing need for a Virtual Video Village, where off-site editors, color graders, and VFX engineers can review on-set footage in real time to catch errors and provide feedback, reducing the expensive costs incurred from re-shooting scenes.

The fusion of computer games and movie making is all but complete, with an ever-growing need for remote production collaboration with game engines like Unreal Engine and Unity and 3D software like Maya and 3ds Max. All stages, from pre-visualization, storyboarding, and pre-production to production and post-production, are now driven by virtual visualization.

In recent years, everything from Jurassic Park to Avatar, the remake of The Lion King, and The Mandalorian has used virtual characters and virtual sets to evolve the movie production process.

Creativity has moved to post-production in a digital world where anything can be manipulated. Teams of creative artists and editors are connected by high-capacity networks to large servers, ultra-high-definition monitors, and software. With everyone co-located and working in real time, a movie is no longer “cut” from daily rushes but created in real time as it is shot on a digital set.

REMI Production

REMI (Remote Integration Model) refers to a production workflow that allows live content to be captured from remote locations and managed from a central control room. REMI helps broadcasters and live event producers achieve a number of benefits, including reduced infrastructure and production costs and improved workflows that make production teams more effective.

REMI combines broadcast-grade production tools with the flexibility of the cloud, to deliver live broadcasts and events in a collaborative, browser-based workspace. It maintains all of the key production elements that you’d find in a standard broadcast. The key difference is that your on-camera talent is in a remote location.

Todd Mason, CEO of Broadcast Management Group, works with ESPN to broadcast events like the Little League World Series and March Madness, the NCAA Men’s Basketball National Championship.

In a recent visit to their REMI facility in Las Vegas, I spoke with Todd and he mentioned that in today’s climate, with people unwilling or unable to leave their homes, a REMI production workflow makes sense. And it’s one that’s being heavily utilized by production companies and content creators throughout the country.

Any REMI production starts with capturing video and audio in the field. Regardless of the content you’re producing, you need a mechanism for capturing high-quality video and audio of your talent. Each of these video and audio signals then needs to be transmitted to your central control room.

REMI production workflows have been utilized in the sports world for years (more on that later). With a live sports production, you’re dealing with multiple cameras that all need to be transmitted to a central control room. This requires a more comprehensive camera setup at the local stadium and a reliable mechanism for transmitting those video and audio feeds.

If you’re producing a live news or entertainment program where you have individual correspondents in remote locations, your field camera kits can be more streamlined and efficient.

Your camera and audio needs in the field can vary, but they are a necessary part of any REMI production.

At Broadcast Management Group, they have developed at-home camera kits, which include a 4K PTZ camera, as part of their Live At Home REMI solution.

All REMI production workflows rely on a central control room. This is where all of your camera and audio signals are fed and where your primary switching, mixing, and transmitting takes place.

Your REMI control room can range from a flight pack to a dedicated control room to a production truck. It can – and will – vary depending on your needs. And those needs will be dictated by the content that you’re producing.

Generally speaking, a REMI control room will need the following:

  • Video switcher
  • Audio mixer
  • Graphics system
  • Record and playback system
  • Comms system
  • Inbound and outbound transmission capabilities

Again, these are baseline requirements. Your content will dictate the technical needs.

An added aspect that BMG has integrated into its Live At Home REMI solution is virtual set technology. By shipping a green screen to your talent, you can create a completely customizable, branded virtual backdrop that fits the program’s aesthetic.

Transmission and connectivity are huge components of a successful REMI production.

Your transmission needs can be broken down into three categories:

  • Connectivity at your primary talent location
  • Incoming transmission capabilities at your REMI control room
  • Outbound transmission capabilities at your REMI control room

From your primary talent location, you’ll need to send video and audio signals with little to no latency. For a live sports production utilizing REMI, these signals might be sent back via fiber or satellite. For a single-camera feed, an IP-based solution – like LiveU – may be sufficient. For anything IP-based, it’s important to assess your local bandwidth availability in advance. The best remote transmission method will depend on what’s available at your primary location and your budget tolerance.

At your REMI control room, you’ll need the ability to receive all of your incoming feeds. For example, if you’re remotely switching a sporting event with eight cameras, your REMI facility will need the infrastructure to receive those eight signals. The receive infrastructure is not always a one-to-one ratio, meaning you don’t always need one receiver for every incoming feed. For example, LiveU’s LU-2000 server can bring in up to four remote feeds.

Finally, you’ll need the ability to SEND your signal somewhere. This may mean streaming live to the web, or sending your signal out via fiber or satellite. This piece of the transmission puzzle is dependent on your content goals. For any of the options above – IP, fiber, or satellite – your REMI facility will need to have the appropriate equipment and outbound connectivity.

Key components of a successful REMI production are clear communication and real-time collaboration among all members of the team. With the team geographically dispersed across the world, even one second of delay between team members can prove disastrous.

To solve this, broadcast studios and control rooms use a Multiviewer to display multiple video sources on a single monitor, helping engineers check the signal distribution paths from their routers, which act as live preview devices. Multiviewers can show live video signals in full, split, quad, and custom configurations on TV displays. The Dolby.io Real-Time Streaming service provides a WebRTC CDN that enables Broadcast Management Group to send the Multiviewer output to remote clients and staff with broadcast quality and sub-500ms glass-to-glass latency, with native playback support on any device or web browser.
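
As a simplified illustration of the multiviewer concept, the sketch below composites up to four decoded feeds into a quad-split layout on a single canvas using standard browser APIs. A broadcast multiviewer adds source labels, tally, and audio metering, none of which is shown here.

```typescript
// Draw up to four live <video> feeds into a 2x2 quad split on a canvas,
// redrawing on every display refresh.
function startQuadMultiviewer(
  canvas: HTMLCanvasElement,
  feeds: HTMLVideoElement[],
): void {
  const ctx = canvas.getContext("2d");
  if (!ctx) return;
  const w = canvas.width / 2;
  const h = canvas.height / 2;

  const draw = () => {
    feeds.slice(0, 4).forEach((video, i) => {
      const x = (i % 2) * w;           // column 0 or 1
      const y = Math.floor(i / 2) * h; // row 0 or 1
      ctx.drawImage(video, x, y, w, h);
    });
    requestAnimationFrame(draw);
  };
  draw();
}
```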

Applications for REMI Production

Now that we’ve covered REMI production from a high level, it’s time to dig into some possible applications. What type of programming can utilize a REMI workflow?

Live Sports

We’ve already mentioned live sports. REMI workflows have been utilized for live sporting events for quite some time. From a cost perspective, REMI is an ideal solution. Even with a streamlined remote production, there are still a set number of hard costs: technical crew, OB trucks, transmissions, travel, etc. If you’re a large organization, like ESPN, with plenty of physical infrastructure and available capacity, REMI makes a lot of sense financially. You’re able to streamline your remote team while utilizing in-house staff and existing infrastructure.

Live News

Outside of live sports, REMI production workflows can be utilized for live news programming and events. REMI is a perfect fit for live or live-to-tape interviews with a host and multiple guests spread across numerous locations. Each on-camera talent can be sent an at-home camera kit before the production. Video and audio feeds from each of those kits can be transmitted to a central control room, where the program can be switched, audio can be mixed, graphics and playback assets can be added, and the program can be transmitted.

Live Entertainment

REMI production also has a place in the entertainment space. We’ve had many clients inquire about converting previously planned in-person events into virtual experiences. This has ranged from movie premieres to award shows, to fundraisers and galas. While each event is unique and has a distinct creative approach, the technical infrastructure is consistent. Anyone on-camera receives video and audio equipment, which is transmitted to a central location.

While REMI workflows have been deployed for years, they have historically been utilized in the sports world. But with remote working here for the foreseeable future, REMI production can be an ideal solution for news and entertainment programming.

Although Covid restrictions are winding down, remote production continues to be popular because it offers significant cost savings, allows you to source global talent, and reduces the carbon footprint of on-site production. A skeleton crew works at BMG’s REMI Hub in Las Vegas, with the bulk of the crew working remotely from anywhere in the world.

About Dolby.io

Dolby.io gives developers and creators access to services and APIs designed to improve the quality of the media, communications and streaming within their products. Learn how to integrate high-fidelity audio and video solutions into your real-time communications and file-based media processing applications.

This article is Sponsored Content
