
What Is Streaming?

At its most basic level, streaming media is the delivery of audio and video files from a server to a client over the internet or a cellular data network. The first streaming audio was delivered in 1995, while the first streaming video followed two years later; you can read more about the early history of the technology in the first version of "What Is Streaming," which we published in 2011. Needless to say, much has changed since then.

What does streaming media look like in 2019? If you’re new to the field, it probably looks like an impenetrable collection of standards, products, and technologies, and in truth, it is. But it can be broken down into a small set of decisions that streaming producers must make when defining their service. This guide identifies those decisions and points you to other content to help you make them.

Target Streaming Devices

Start by choosing the platforms that you want to deliver video to. Virtually all producers want to reach computers and mobile devices, but you have to dig a bit deeper. For example, which browsers do you want to support on computers, and how far back do you want to go? If your viewers work in government or education, you may need to support browsers like Internet Explorer 11 or earlier, which means supporting legacy formats like Flash. If you’re targeting younger viewers, Flash fallback is likely not an issue.

For mobile, you need to decide between delivering via an app or the browser. Apps enable more features and design flexibility but increase development cost and time. Browser support is faster and cheaper but offers fewer features (see "Video: Browsers vs. Apps for Content Delivery"). 

The next major set of platforms tackled by streaming producers is OTT (over-the-top) devices like Roku, Apple TV, Chromecast, and Amazon Fire TV. You’ll have to create a channel or the equivalent for each of these devices, but they represent an opportunity to reach huge swaths of viewers with each development effort. If your target viewers are younger, you may also want to support game platforms like PlayStation or Xbox.

The final set of target platforms, typically undertaken only by the largest streaming producers, is smart TVs. Though there is some standardization through organizations like the SmartTV Alliance (Philips, LG, Panasonic, Toshiba) or HbbTV, each platform will likely need a separate development effort. For an overview of supporting these platforms, download the presentation handout from this Streaming Media West workshop entitled "Encoding 2018: Codecs & Packaging For PCs, Mobile, & OTT/STB/Smart TVs."

Adaptive Bitrate (ABR) Formats

Each platform identified above supports certain adaptive bitrate (ABR) formats that dictate how the video files are encoded and packaged. If you’re distributing video to iOS devices in the Safari browser, you’ll have to package your video into the HTTP Live Streaming (HLS) format (see "What Is HLS"). If you’re distributing to Android devices via a browser, Dynamic Adaptive Streaming over HTTP, or DASH, is preferred (see "What Is MPEG DASH"). If you distribute video to mobile devices via apps, you can typically choose whichever ABR format you like.
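
To make that platform-to-format mapping concrete, here is a minimal Python sketch that chooses between an HLS and a DASH manifest based on a simplified User-Agent check. The URLs and the detection logic are illustrative assumptions, not a production-grade device-detection scheme.

```python
# Illustrative sketch: choose an ABR manifest based on a simplified
# User-Agent check. Real services use far more robust device detection;
# the URLs below are placeholders, not real endpoints.

HLS_MANIFEST = "https://example.com/video/master.m3u8"    # HLS (Apple devices, Safari)
DASH_MANIFEST = "https://example.com/video/manifest.mpd"  # DASH (most other browsers)

def pick_manifest(user_agent: str) -> str:
    """Return the manifest URL an HTML5 player should load."""
    ua = user_agent.lower()
    # Safari on iOS/macOS plays HLS natively; most other browsers are
    # typically served DASH through a JavaScript player.
    if "iphone" in ua or "ipad" in ua or ("safari" in ua and "chrome" not in ua):
        return HLS_MANIFEST
    return DASH_MANIFEST

if __name__ == "__main__":
    print(pick_manifest("Mozilla/5.0 (iPhone; CPU iPhone OS 12_0 like Mac OS X) ..."))
    print(pick_manifest("Mozilla/5.0 (Linux; Android 9) Chrome/75.0 ..."))
```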

For computers, your format decision will most likely depend upon the off-the-shelf player you select; an excellent starting point for technical readers is this video from Robert Reinhardt at Streaming Media West entitled "Choosing the Best Off-the-Shelf Video Player." Most OTT boxes and smart TVs support more than one format; the exception is Apple TV (of course), which supports only HLS. Some older gaming platforms are similarly inflexible and support only older formats like Microsoft’s Smooth Streaming.

In the end, to reach their intended viewers, most producers end up supporting at least two formats, HLS and DASH, with a smattering of support for other formats. We’ll discuss how to support multiple formats in the Streaming Packaging and Encoding Schema section below.

Encoding Ladders

Fundamental to adaptive bitrate streaming is the concept that each input file, whether live or video on demand (VOD), is encoded into a set of files at different resolutions and bitrates to optimize the playback experience for all viewers, whether they’re watching on a mobile phone over 3G or on a 4K smart TV connected via 100Mbps broadband. The configuration of these different files is called an encoding ladder; the chart below is a suggested encoding ladder from Apple’s HLS Authoring Specification for Apple Devices.

Apple’s suggested encoding ladder from the HLS Authoring Specification for Apple Devices.

Your encoding ladder will vary based upon your source video, the compression technology you use, your target platforms, and even your geography. Here are two useful videos for creating your encoding ladder: one covering bitrates and one covering resolution. And here are some observations you can use to fine-tune your encoding ladder.
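
To make the encoding ladder idea concrete, here is a hypothetical ladder expressed as a simple Python data structure. The rungs are illustrative placeholders, not Apple’s published figures, and should be tuned as described above.

```python
# A hypothetical H.264 encoding ladder for a 1080p source.
# These rungs are illustrative only; tune resolutions and bitrates
# to your content, codec, and audience as discussed above.

ENCODING_LADDER = [
    {"resolution": "1920x1080", "video_kbps": 4500, "audio_kbps": 128},
    {"resolution": "1280x720",  "video_kbps": 2500, "audio_kbps": 128},
    {"resolution": "960x540",   "video_kbps": 1500, "audio_kbps": 96},
    {"resolution": "640x360",   "video_kbps": 800,  "audio_kbps": 96},
    {"resolution": "416x234",   "video_kbps": 300,  "audio_kbps": 64},
]

for rung in ENCODING_LADDER:
    total = rung["video_kbps"] + rung["audio_kbps"]
    print(f'{rung["resolution"]:>10}: ~{total} kbps total')
```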

Digital Rights Management

If you’re distributing sensitive or premium content, you may have to protect it with digital rights management technology, or DRM (here’s a useful primer on DRM, while this article describes how Hollywood studios use DRM). As with ABR technologies, different platforms support different DRM technologies. For example, Chrome and Chromecast support Google Widevine; Apple TV, iOS, and macOS support Apple FairPlay; and Edge supports Microsoft PlayReady. Fortunately, deploying multiple DRMs is simpler than it sounds from both a technical and an administrative perspective.

Technology-wise, the transition from Flash to HTML5 was enabled by a specification called Encrypted Media Extensions (EME), which allows a single encrypted file to be played back with multiple DRM technologies. At the same time, multiple vendors offer licenses to all relevant DRMs, simplifying the commercial side.
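
As a rough sketch of the platform-to-DRM mapping described above, the following Python snippet shows how a service might look up the key system a given client would typically request. The key-system strings reflect commonly used EME identifiers, but the platform labels and the simple lookup are assumptions for illustration; real players negotiate this through EME at playback time.

```python
# Illustrative mapping of playback platforms to the DRM key system a
# player would typically request. Platform labels are simplified
# assumptions; a real player negotiates this via EME in the browser.

KEY_SYSTEMS = {
    "chrome":     "com.widevine.alpha",       # Google Widevine
    "chromecast": "com.widevine.alpha",
    "safari":     "com.apple.fps.1_0",        # Apple FairPlay Streaming
    "ios":        "com.apple.fps.1_0",
    "apple_tv":   "com.apple.fps.1_0",
    "edge":       "com.microsoft.playready",  # Microsoft PlayReady
}

def key_system_for(platform: str) -> str:
    """Return the DRM key system identifier for a given platform label."""
    try:
        return KEY_SYSTEMS[platform.lower()]
    except KeyError:
        raise ValueError(f"No DRM mapping defined for platform: {platform}")

print(key_system_for("Chrome"))  # -> com.widevine.alpha
```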

Closed Captions

For certain types of videos, closed captions may be required; for others, they may be desirable for reaching hearing-impaired viewers or for playback in loud or public places where audio may not be discernible. You can find a good overview of closed captioning in this article entitled "Closed Captioning for Streaming Media."

Streaming Codecs

Codecs are the technologies that compress audio and video, allowing you to deliver your content to viewers over a range of connections (see "What Is a Codec"). Codecs are absolutely critical to streaming video; no codecs, no streaming video.

For about the last ten years, the H.264 video codec paired with AAC audio compression has been the technology of choice for almost all streaming producers. Over the last five years, however, two newer video codecs, HEVC and VP9, have been deployed to reduce bandwidth costs and increase video quality over lower-bitrate connections. This latter point is key; for example, where H.264 can deliver a high-quality 720p stream at 2 Mbps, HEVC and VP9 can deliver a high-quality 1080p stream at the same data rate, which will look better to most viewers. In 2018, a new codec called AV1 started shipping, with another called Versatile Video Coding (VVC) to follow in 2020 or so.
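
If you want to see this efficiency difference for yourself, the sketch below uses Python to drive FFmpeg (assuming it is installed with the libx264 and libx265 encoders) and produces an H.264 and an HEVC encode of the same source at the same 2 Mbps data rate for side-by-side comparison; the filenames are placeholders.

```python
# Rough sketch: encode the same source at 2 Mbps with H.264 (libx264)
# and HEVC (libx265) for a side-by-side quality comparison.
# Assumes FFmpeg is installed with both encoders; filenames are placeholders.
import subprocess

SOURCE = "source_1080p.mp4"  # placeholder input file

ENCODES = [
    ("libx264", "h264_2mbps.mp4"),
    ("libx265", "hevc_2mbps.mp4"),
]

for encoder, output in ENCODES:
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", encoder, "-b:v", "2M",   # same 2 Mbps video bitrate
         "-c:a", "aac", "-b:a", "128k",   # AAC audio at 128 kbps
         output],
        check=True,
    )
    print(f"Wrote {output}")
```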

Choosing and deploying a codec is a complex analysis involving factors like encoding efficiency, platform compatibility, and support within an ABR technology. For an overview of these considerations, check out this video from Streaming Media East, "HOW-TO: Comparing AV1, VP9, HEVC & H.264."

Streaming Packaging and Encoding Schema

As discussed above, to reach all of your target platforms, you’ll likely have to support multiple ABR formats, typically DASH and HLS. There are two approaches: static and dynamic packaging.

With static packaging, you encode and package all files necessary to deliver both ABR formats and upload them to an origin server for distribution. Depending upon how you encode your videos, this may double your encoding cost and will certainly increase online storage costs.

The other approach is called dynamic packaging. Here, you encode all the rungs of your encoding ladder and upload those to an origin server. When a viewer clicks on your link, a separate server detects which format the player requires and automatically creates the required packaging in real time.

Dynamic packaging minimizes storage and encoding costs but requires a server running 24/7 to package the content. Typically, dynamic packaging is cheaper than static packaging when considering all associated costs, but this varies by application. To learn more about how dynamic packaging works, check out this how-to article.
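
The sketch below illustrates the core idea behind dynamic packaging: a single origin-side service inspects which manifest type the player requests and responds with HLS or DASH packaging built from the same encoded rungs. The build_hls_manifest and build_dash_manifest helpers are hypothetical stand-ins for a real just-in-time packager; this is a conceptual sketch, not production code.

```python
# Conceptual sketch of dynamic (just-in-time) packaging: one set of
# encoded rungs on the origin, with the manifest format chosen per
# request. The build_* helpers are hypothetical stand-ins for a real
# packager.
from http.server import BaseHTTPRequestHandler, HTTPServer

def build_hls_manifest() -> str:
    # Hypothetical helper: a real packager would generate this from
    # the encoded rungs stored on the origin server.
    return "#EXTM3U\n# ...renditions would be listed here...\n"

def build_dash_manifest() -> str:
    # Hypothetical helper, as above, but producing a DASH MPD.
    return '<?xml version="1.0"?><!-- MPD would go here -->\n'

class PackagingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.endswith(".m3u8"):    # player asked for HLS
            body, ctype = build_hls_manifest(), "application/vnd.apple.mpegurl"
        elif self.path.endswith(".mpd"):   # player asked for DASH
            body, ctype = build_dash_manifest(), "application/dash+xml"
        else:
            self.send_error(404)
            return
        data = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", ctype)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), PackagingHandler).serve_forever()
```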

At some point in 2020 or so, a technology called the Common Media Application Format (CMAF) will enable a single set of files to support both HLS and DASH on most newer platforms, though not on all legacy platforms. For producers who can ignore legacy viewers, CMAF will slash the storage costs associated with static packaging and make that the most affordable option.

Content Delivery Networks

Compared to most other forms of web content, like text, images, and PDF files, streaming video is much larger and therefore harder to deliver. For this reason, most streaming producers deploy a content delivery network, or CDN, to deliver their video (see "What Is a Content Delivery Network").

Larger organizations may want to deploy multiple CDNs, both for redundancy and to optimize delivery in different regions. To learn more about how and why to support multiple CDNs, check out the panel discussion from Streaming Media West entitled "CDN Optimization: Working Toward Broadcast Economics & Quality at Scale."
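
To picture how multi-CDN delivery might work in practice, here is a simplified Python sketch that splits sessions across two CDN hostnames by weight. The hostnames and weights are placeholders, and real deployments typically add failover and performance- or region-based switching on top of this.

```python
# Simplified multi-CDN selection: split traffic across CDNs by weight.
# Hostnames and weights are placeholders; real deployments also add
# failover and performance- or region-based switching.
import random

CDNS = [
    {"host": "cdn-a.example.com", "weight": 60},
    {"host": "cdn-b.example.com", "weight": 40},
]

def pick_cdn() -> str:
    """Weighted random choice of a CDN hostname for this session."""
    hosts = [c["host"] for c in CDNS]
    weights = [c["weight"] for c in CDNS]
    return random.choices(hosts, weights=weights, k=1)[0]

def manifest_url(path: str) -> str:
    return f"https://{pick_cdn()}{path}"

print(manifest_url("/video/master.m3u8"))
```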

Quality of Service (QoS) and Quality of Experience (QoE)

When streaming video is mission-critical to your organization, measuring how effectively your content is delivered becomes equally mission-critical. There are two basic technologies here: quality of service (QoS), which measures the technical effectiveness of your video infrastructure, and quality of experience (QoE), which measures the actual viewing experience.

While obviously related, the concepts are definitely separate. For example, if your video packaging is flawed, QoS could be perfect, but the viewing experience would be awful. For this reason, most larger producers use different services to monitor both. For an overview of QoE and QoS technologies, check out "Measure it, Improve it: For Video Publishers, QoE and QoS are Critical." For a tutorial on how to deploy a QoE technology, check out "How to Measure Video Encoding QoE."
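
For a sense of what QoE measurement involves, the sketch below computes two common viewer-facing metrics, startup time and rebuffering ratio, from a hypothetical list of playback events a player might report. The event names and structure are assumptions for illustration; commercial QoE services define their own reporting schemas.

```python
# Illustrative QoE calculation from hypothetical player events.
# Event names and fields are assumptions; real QoE services (and real
# players) define their own reporting schemas.

session_events = [
    {"type": "play_requested", "t": 0.0},
    {"type": "first_frame",    "t": 1.8},    # video actually started
    {"type": "rebuffer",       "t": 40.0, "duration": 2.5},
    {"type": "session_end",    "t": 300.0},
]

def startup_time(events) -> float:
    start = next(e["t"] for e in events if e["type"] == "play_requested")
    first = next(e["t"] for e in events if e["type"] == "first_frame")
    return first - start

def rebuffering_ratio(events) -> float:
    stalled = sum(e["duration"] for e in events if e["type"] == "rebuffer")
    total = next(e["t"] for e in events if e["type"] == "session_end")
    return stalled / total

print(f"Startup time: {startup_time(session_events):.1f} s")
print(f"Rebuffering ratio: {rebuffering_ratio(session_events):.1%}")
```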

We’ve thrown a lot of concepts at you in this "What Is" guide, but also a lot of resources. While this article just dips the tip of a big toe into the world of streaming media, if you’ve made it this far, you’ve got a great start.

Related Articles

How to Succeed on Patreon: A Guide for Video Publishers

Over 100,000 creators use Patreon to run their creative businesses, and brands can tap the platform's enthusiasm and resources. Here's how to succeed on Patreon.

What Is HEVC (H.265)?

Not sure what to make of the new format on the block? Read this to get up to speed on how HEVC was created, the challenges it now faces, and when it will go into everyday use.

What Is HLS (HTTP Live Streaming)?

Apple's HTTP Live Streaming (HLS) protocol is the technology used to deliver video to Apple devices like the iPad and iPhone. Here's a primer on what HLS is and how to use it.

What Is Adaptive Streaming?

A look at what adaptive streaming is, the primary technology providers, and the factors you should consider when choosing an adaptive streaming technology.

What Is H.264?

A look behind H.264, the world's most popular video codec, including encoding parameters and royalty concerns.

What Is a Content Delivery Network (CDN)?

A definition and history of the content delivery network, as well as a look at the current CDN market landscape.

What Is Streaming (2011 Version)

A high-level view of streaming media technology, history, and the online video market landscape.