Streaming Media


Choosing the Optimal Data Rate for Live Streaming

The object of the exercise was to show how output quality was affected by the data rate of the incoming video in a cloud transcoding scenario.

When you’re a hammer, the world looks like a nail. When you have objective video quality measurement tools, you can use them to test all the assumptions you’ve made in the past. One assumption many of us make is that copious outbound bandwidth is necessary to achieve high-quality streaming when producing live events. While this is true to some extent, the tests I undertook for this article to identify an optimal data rate for live streams show that you quickly reach a point of diminishing returns.

Object of the Exercise

The object of the exercise was to show how output quality was affected by the data rate of the incoming video in a cloud transcoding scenario. Specifically, services like Ustream, YouTube Live, and Brightcove/Zencoder ingest a single stream and transcode it in the cloud into an adaptive group of streams. When transmitting video to these services, higher incoming data rates generally yield better output, but outbound bandwidth at many venues is limited, and can be expensive at an offsite conference or similar event. The question is, how much difference will a typical viewer see between an affordable 3 Mbps 720p stream and a possibly extravagant 6 Mbps 720p stream?

As it turns out, not so much.
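One way to frame the gap is in bits per pixel, the rough encoding budget available for each pixel per second of video. A quick sketch (assuming 30 fps, which the article doesn’t state):

```python
def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Rough per-pixel encoding budget: bitrate divided by pixels per second."""
    return bitrate_bps / (width * height * fps)

# 720p at an assumed 30 fps (the article doesn't specify frame rate)
low = bits_per_pixel(3_000_000, 1280, 720, 30)   # ~0.109 bpp
high = bits_per_pixel(6_000_000, 1280, 720, 30)  # ~0.217 bpp
print(f"3 Mbps: {low:.3f} bpp, 6 Mbps: {high:.3f} bpp")
```

Both figures sit at or above the roughly 0.1 bits-per-pixel rule of thumb often cited for H.264, which is consistent with the idea that doubling the rate buys little visible quality at this resolution.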

How I Tested

I tested with two clips. One is a talking-head clip of me; the other is a song from a recent concert appearance by the Loose Strings band in Galax, Va. The Loose Strings clip is a single-camera concert shoot with lots of pans and zooms. It's more challenging than a talking-head clip, but not exactly a chase scene from Mission: Impossible when it comes to encoding complexity.

Both clips started life as 1080p AVCHD, which I copied to disk, imported into Premiere Pro, added timecode to, and output as ProRes. I then played the clips out via HDMI using a Blackmagic Design DeckLink HD Extreme 3D card, and captured them with a Matrox Monarch HDX.

For reasons I’ll explain in a moment, I first captured a 20 Mbps 720p stream, then a 10 Mbps stream, and then captured at decreasing data rates down to 1 Mbps. These captures served as the source test clips.

In a transcode scenario, you don't care about the quality of the incoming clip; you care about the quality of the clip encoded from that source. To measure this, I created two presets in Sorenson Squeeze, one for 640x360 output at 1.2 Mbps, the other for 1280x720 output at 2 Mbps, both using x264 with the Fast encoding preset that many live producers use during transcoding. Then I input all the source clips into Squeeze and encoded them with those presets.
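Squeeze drives these settings through its GUI, but roughly equivalent command-line transcodes could be scripted with ffmpeg. This is a sketch, not Squeeze's actual pipeline: the file names are hypothetical, and I'm assuming Squeeze's Fast x264 preset maps to libx264's fast preset.

```python
# Build roughly equivalent ffmpeg commands for the two transcoding presets.
# File names are hypothetical; libx264's "fast" preset is an assumed
# stand-in for Squeeze's Fast encoding preset.
def transcode_cmd(src: str, dst: str, width: int, height: int, kbps: int) -> list[str]:
    return [
        "ffmpeg", "-y", "-i", src,
        "-c:v", "libx264", "-preset", "fast",
        "-b:v", f"{kbps}k",
        "-vf", f"scale={width}:{height}",
        "-c:a", "aac", "-b:a", "128k",
        dst,
    ]

cmd_360p = transcode_cmd("source_10mbps.mp4", "transcoded_360p.mp4", 640, 360, 1200)
cmd_720p = transcode_cmd("source_10mbps.mp4", "transcoded_720p.mp4", 1280, 720, 2000)
```

In practice you would run each command with subprocess.run() against every captured source clip to produce the full set of transcoded outputs.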

(To keep us all sane, I’ll call the original Monarch-encoded clips the “source” clips, all produced at 720p, and the 360p and 720p Squeeze-encoded clips the “transcoded” clips.)

Next, I measured the quality of the source and transcoded clips using the Moscow State University Video Quality Measurement Tool (VQMT), and the SSIMWave Video Quality of Experience Monitor (SQM).
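Both tools are full-reference meters: they compare each encoded frame against the corresponding reference frame and report a per-frame score. As a minimal illustration of the idea (not either tool's actual algorithm), here is PSNR, one of the most common full-reference metrics, computed over a flat sequence of 8-bit pixel values:

```python
import math

def psnr(ref, test, max_val=255):
    """Peak signal-to-noise ratio in dB between two equal-length pixel sequences."""
    if len(ref) != len(test):
        raise ValueError("frames must have the same number of pixels")
    mse = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10((max_val ** 2) / mse)
```

A real tool runs a calculation like this (or a perceptual metric such as SSIM) frame by frame across the whole clip, then aggregates the scores.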

These tests raised some interesting test-related technical issues. By way of background, when testing the quality of encoded video-on-demand clips, you start with the source clip, encode it, and compare the output with the source using the selected tool. Easy peasy.

In a live scenario, the source file is harder to identify because the video is altered slightly at several points along the chain. For example, during my capture workflow, the clip was processed by the Blackmagic card for transmission via HDMI, and then scaled by the Monarch during capture. First, I tried comparing the Monarch-encoded source clips with a 720p clip scaled and output in Premiere Pro, but the scaling and other differences were so substantial that they essentially masked the true compression-related quality differences.

So, rather than comparing the Monarch-produced source clips to the Premiere Pro output, I compared them to the aforementioned 20 Mbps source clip. To analyze the transcoded clips produced in Squeeze, I encoded the 20 Mbps clip using the same 360p and 720p presets, and compared all the other transcoded clips to those outputs.

I can’t say for sure that this is the best approach--and I’m open to suggestions for future articles--but this is the procedure I used for this exercise. Now, enough navel-gazing on test procedures. Let’s get to the results.