
20 for 20: The Most Important Standards of the Last 20 Years


Why it matters. The MPEG-4 system contained the base media file format (MP4) long before it became its own substandard, and MP4 survived as a container format even when the MPEG-4 system faltered. Elementary streams for everything from mobile video capture to desktop video editing tools to fragmented/segmented HTTP-streaming delivery (fMP4) rely on this robust legacy container format.

THE REST OF THE BEST

The remaining 10 standards in the Top 20 received less than 50% of the overall vote, but they have been no less important to the growth of the streaming media industry.

Advanced Audio Coding (AAC)

Percentage vote: 43%

Definition & Why it matters. Advanced Audio Coding (AAC) is a lossy digital audio compression scheme, standardized in MPEG-2 (Part 7) and MPEG-4 (Part 3), that delivers noticeably better quality than MP3 (MPEG-1 Audio Layer III) at the same bitrate. AAC is most widely used by iOS devices (iPod touch, iPad, and iPhone) and the accompanying iTunes music library, but its High Efficiency (HE-AAC) version continues to make inroads into music subscription services as well as Apple HTTP Live Streaming (HLS) audio and video streaming delivery.

High Efficiency Video Coding (HEVC)

Percentage vote: 43%

Definition & Why it matters. High Efficiency Video Coding (HEVC), also known as H.265, is a joint codec specification by the ITU-T and MPEG standards committees. It is part of the MPEG-H suite, designed to succeed H.264/AVC (MPEG-4 Part 10), and promises roughly a 50% reduction in bitrate at equivalent visual quality, at the cost of significantly greater encoding complexity. Licensing issues, however, have continued to leave a significant amount of uncertainty around the adoption of HEVC, allowing alternate codecs such as the VPx series and AV1 to gain ground.

Dynamic Adaptive Streaming over HTTP (MPEG-DASH) 

Percentage vote: 43%

Definition & Why it matters. The Moving Picture Experts Group (MPEG) ratified a standard for Dynamic Adaptive Streaming over HTTP (DASH) in late 2011, partly as a way to rationalize the numerous HTTP-based adaptive bitrate (ABR) solutions offered by Adobe, Apple, Microsoft, Move Networks, and others. DASH supports either the MPEG-2 Transport Stream (M2TS) approach favored by Apple or the fragmented MP4 (ISO Base Media File Format) approach favored by Adobe and Microsoft. DASH defines two formats. The first is the Media Presentation Description (MPD), roughly equivalent to the Apple HTTP Live Streaming (HLS) manifest file, which “provides sufficient information for a DASH client for adaptive streaming of the content by downloading the media segments from a HTTP server.” The second is the segment format, which addresses the “entity body of the request response when issuing a HTTP GET request or a partial HTTP GET.”
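The shape of an MPD can be sketched programmatically. In the minimal example below, the element names follow the DASH schema, but the duration, codec string, and bitrate values are illustrative only, not taken from any real deployment.

```python
import xml.etree.ElementTree as ET

# A minimal, illustrative MPD: one period, one video adaptation set,
# and two representations at different bitrates for ABR switching.
mpd = ET.Element("MPD", {
    "xmlns": "urn:mpeg:dash:schema:mpd:2011",
    "type": "static",
    "mediaPresentationDuration": "PT60S",
    "minBufferTime": "PT2S",
    "profiles": "urn:mpeg:dash:profile:isoff-on-demand:2011",
})
period = ET.SubElement(mpd, "Period")
aset = ET.SubElement(period, "AdaptationSet", mimeType="video/mp4")
for rid, bw in (("low", "500000"), ("high", "2000000")):
    ET.SubElement(aset, "Representation", id=rid, bandwidth=bw,
                  codecs="avc1.64001f")

print(ET.tostring(mpd, encoding="unicode"))
```

A real MPD would also carry segment addressing (e.g., a SegmentTemplate or SegmentList) so the client knows which URLs to fetch; the point here is only the Period/AdaptationSet/Representation hierarchy the client walks when choosing a bitrate.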

Fragmented MP4 (fMP4)

Percentage vote: 43%

Definition & Why it matters. Fragmentation (or segmentation) is the process of dividing a larger file into a series of smaller files; segment lengths in popular HTTP-based streaming delivery typically range from 2 to 10 seconds. All on-demand HTTP-based streaming solutions fragment longer elementary streams (often based on the MP4 file format, the ISO Base Media File Format or ISOBMFF) into these shorter segments, referred to as fragmented MP4 (fMP4), and each fragment is delivered as a small file via an ordinary HTTP server to the client's media player, eliminating the need for a specialized media server. Because HTTP-based streaming delivers the same content at multiple bitrates and resolutions, segmentation also refers to the act of cutting each of those equal-length renditions at exactly the same points, so that delivery can adapt to varying network bandwidth by sending the appropriate-quality segment at any given moment.
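The ISOBMFF container underlying fMP4 is a sequence of "boxes," each prefixed with a 4-byte big-endian size and a 4-byte type code; a fragmented segment repeats a movie-fragment box (moof) followed by a media-data box (mdat). A minimal sketch of walking that structure, using synthetic box payloads rather than real encoded media:

```python
import struct

def parse_boxes(data):
    """Walk top-level ISOBMFF boxes: each starts with a 4-byte
    big-endian size (which includes the 8-byte header) followed
    by a 4-byte ASCII type code."""
    boxes, offset = [], 0
    while offset + 8 <= len(data):
        size, btype = struct.unpack_from(">I4s", data, offset)
        if size < 8:  # guard against malformed sizes
            break
        boxes.append((btype.decode("ascii"), size))
        offset += size
    return boxes

def make_box(btype, payload=b""):
    """Build one box from a type code and raw payload bytes."""
    return struct.pack(">I4s", 8 + len(payload), btype) + payload

# A toy fragment: moof + mdat, the pattern an fMP4 segment repeats.
segment = make_box(b"moof", b"\x00" * 16) + make_box(b"mdat", b"frame-bytes")
print(parse_boxes(segment))  # [('moof', 24), ('mdat', 19)]
```

Real segments nest many boxes inside moof (track fragment headers, sample tables), but every parser starts from this same size/type walk.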

MPEG-2 Part 3 (MP3)

Percentage vote: 43%

Definition & Why it matters. MP3, formally MPEG-1 Audio Layer III (extended to lower sample rates in MPEG-2 Part 3), is a popular lossy audio compression codec. Known by its shortened MP3 file extension, the codec became effectively royalty-free in 2017, when the last of its patents expired.

WebRTC (Real-Time Communications)

Percentage vote: 29%

Definition & Why it matters. Web Real-Time Communications (WebRTC) is a set of web standards for communication that Brendan Eich of Mozilla calls “a new front in the long war for an open and unencumbered web.” Earlier RTC approaches tended to be complex and costly, often requiring in-house resources to develop and integrate into traditional data services, or extensive (and expensive) licensing of audio and video technologies. Google launched WebRTC on its Chrome browser to “bring real-time communications to the web” and develop a new form of communication platform by “building a state-of-the-art media stack in your browser.” Initially, WebRTC was supported on Chrome (including Chrome for Android), Firefox, and Opera, with native Java and Objective-C bindings for mobile apps. By late 2017, Apple announced support for WebRTC in Safari 11, using WebKit on macOS High Sierra, which also supported H.265 video (HEVC). Apple's support began with WebKit version 12604.1.25.0.2 and now also allows WebRTC in native iOS apps.

Border Gateway Protocol (BGP)

Percentage vote: 29%

Definition & Why it matters. Border Gateway Protocol (BGP) is used to exchange routing information for the internet and is the protocol used between internet service providers (ISPs). Because it is used to traverse multiple service providers, each of which has its own autonomous network, BGP is considered an interautonomous system routing protocol. An ISP's network or group of networks falls under a common administration and shares common internal routing policies; BGP lets ISPs exchange customer and ISP routes across those boundaries, carrying traffic from the customer network to an ISP network, then on to another ISP, repeating this handoff until the traffic is delivered to the appropriate customer on an ISP's network.
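The path-vector idea at the heart of BGP can be sketched in a few lines: each router advertises a route as the list of autonomous systems (ASes) it traverses, prepends its own AS number before passing the route on, and, among the tie-breakers real BGP applies (local preference, MED, and more), prefers the shortest AS path. The AS numbers below are hypothetical.

```python
def best_route(routes):
    """Pick the route with the shortest AS path. This is only one of
    BGP's tie-breakers; real BGP first weighs local-pref, MED, etc."""
    return min(routes, key=len)

def advertise(path, local_as):
    """Prepend our AS before passing a route to a neighbor. A neighbor
    that sees its own AS number already in the path discards the route,
    which is BGP's loop-prevention mechanism."""
    return [local_as] + path

# Two advertisements for the same prefix arriving at AS 65001:
routes = [[65010, 65020, 65030], [65040, 65030]]
chosen = best_route(routes)          # shorter path, via AS 65040
outgoing = advertise(chosen, 65001)  # what AS 65001 tells its neighbors
print(outgoing)  # [65001, 65040, 65030]
```

Each "repeated handoff" in the paragraph above corresponds to one more AS number prepended to the path as the advertisement propagates.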

Asynchronous Transfer Mode (ATM)

Percentage vote: 29%

Definition & Why it matters. Asynchronous Transfer Mode (ATM) is a networking protocol designed to move multimedia data with high reliability and predictable latency, and some ISPs have used ATM as the protocol for their backbones. Unlike classic 10/100Mbps Ethernet, ATM commonly runs at rates from 25Mbps to 622Mbps. Rather than variable-sized packets, it transmits fixed-size 53-byte cells (a 5-byte header plus 48 bytes of payload), which keeps switching delays predictable for time-sensitive media.
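The fixed 53-byte cell makes ATM's header overhead easy to quantify, a quick sanity check on the rates above:

```python
# An ATM cell is a fixed 53 bytes: 5-byte header + 48-byte payload,
# so cell-header overhead alone consumes about 9.4% of line rate.
CELL, HEADER, PAYLOAD = 53, 5, 48

overhead = HEADER / CELL
print(f"header overhead: {overhead:.1%}")  # 9.4%

# Usable payload at a 622 Mbps line rate (the OC-12 class rate
# mentioned above), before any higher-layer adaptation overhead:
line_rate_mbps = 622
payload_mbps = line_rate_mbps * PAYLOAD / CELL
print(f"usable payload: {payload_mbps:.0f} Mbps")  # 563 Mbps
```

Adaptation layers such as AAL5 add further per-packet overhead on top of this, so the deliverable goodput is lower still.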

Motion JPEG (M-JPEG)

Percentage vote: 29%

Definition & Why it matters. According to the Library of Congress, Motion JPEG (M-JPEG) is a legacy format worth saving, because so many still images are stored in the JPEG compression format, and M-JPEG is essentially a restricted JPEG with a fixed YCbCr colorspace, encoded at 4:2:2, and using basic Huffman encoding rather than arithmetic or progressive encoding. M-JPEG encodes video one frame at a time, applying standard JPEG compression to each independent frame. It is considered an intraframe compression, offering only spatial compression; it offers no additional multiframe compression, also known as temporal or interframe compression. M-JPEG continues to be used for I-frame-only capture, despite more than a decade of competition from the more popular, interframe-compressed H.264, primarily in lower-end still-image cameras: because these cameras already capture in JPEG, they can simply apply a format wrapper around a series of JPEG stills to produce a moving-image file.
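Because each M-JPEG frame is an independent JPEG, "wrapping a series of stills" can be as simple as a container around the frames. One common delivery wrapper is HTTP multipart (multipart/x-mixed-replace), sketched below with placeholder bytes standing in for real JPEG frame data:

```python
# M-JPEG is a sequence of independently compressed JPEG frames.
# This sketches the HTTP multipart wrapper many IP cameras use;
# the frame bytes here are fake stand-ins, not real JPEG data.
BOUNDARY = b"--frame"

def mjpeg_chunk(jpeg_bytes):
    """Wrap one JPEG frame as one multipart part."""
    header = (BOUNDARY + b"\r\n"
              + b"Content-Type: image/jpeg\r\n"
              + b"Content-Length: " + str(len(jpeg_bytes)).encode()
              + b"\r\n\r\n")
    return header + jpeg_bytes + b"\r\n"

# \xff\xd8 / \xff\xd9 are the JPEG start/end-of-image markers.
frames = [b"\xff\xd8...frame1...\xff\xd9", b"\xff\xd8...frame2...\xff\xd9"]
stream = b"".join(mjpeg_chunk(f) for f in frames)
print(stream.count(BOUNDARY))  # one boundary per frame: 2
```

No inter-frame state exists anywhere in the stream, which is exactly the intraframe-only property the paragraph above describes: a player can start decoding at any boundary.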

Session Initiation Protocol (SIP)

Percentage vote: 29%

Definition & Why it matters. Session Initiation Protocol (SIP) supports the setup and tear-down of media sessions, including audio teleconferencing, videoconferencing, and unicast streaming. SIP is defined by the Internet Engineering Task Force (IETF) in RFC 3261 as an application-level signaling protocol between endpoints on an IP data network. SIP does not carry the media itself; it has traditionally been used hand-in-hand with the G.7xx audio codecs as well as Real-Time Transport Protocol (RTP) or Real-Time Streaming Protocol (RTSP) to exchange the actual audio, video, or other multimedia content between participants. Recent updates include RFC 8217 and RFC 7463, the latter of which allows for “shared appearances of an Address of Record (AOR) since SIP does not have the concept of lines” when multi-line or bridged-line participation is initiated.
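SIP messages are plain text with an HTTP-like start line and headers. A bare-bones INVITE, the request that initiates a session, can be assembled as below; the addresses, branch tag, and Call-ID values are illustrative, and the SDP body that would actually describe the media session is omitted (hence Content-Length: 0):

```python
CRLF = "\r\n"

def make_invite(caller, callee, call_id):
    """Assemble a minimal SIP INVITE start line and mandatory
    headers in the RFC 3261 text format. No SDP body is attached."""
    lines = [
        f"INVITE sip:{callee} SIP/2.0",
        "Via: SIP/2.0/UDP client.example.com;branch=z9hG4bK776asdhds",
        f"From: <sip:{caller}>;tag=1928301774",
        f"To: <sip:{callee}>",
        f"Call-ID: {call_id}",
        "CSeq: 1 INVITE",
        "Content-Length: 0",
    ]
    # A blank line terminates the header section.
    return CRLF.join(lines) + CRLF + CRLF

msg = make_invite("alice@example.com", "bob@example.org", "a84b4c76e66710")
print(msg.splitlines()[0])  # INVITE sip:bob@example.org SIP/2.0
```

The callee's agent answers with responses such as 180 Ringing and 200 OK in the same text format, after which the negotiated RTP media flows outside of SIP entirely.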

CONCLUSION

There are a number of standards that didn't quite make the cut (including the Common Encryption Scheme, as well as a patent owned by Arris that arguably covers HLS), but the Top 20 list we've covered here shows that the streaming industry of today is built on agreement around innovation adoption on a global scale. Don't forget to check out our Top 20 technologies (see page 111), which include several de facto “standards” that have seen widespread adoption, and in some cases equally widespread abandonment, over the past two decades.

[This article appears in the 2018 Streaming Media Industry Sourcebook as "20 for 20: The Most Important Standards of the Last 20 Years."]
