How to Encode for Adaptive Streaming
Industry conferences are great places to get together with colleagues and compare notes. They're also great places to learn new skills.
In a 45-minute session at the recent Streaming Media West conference in Los Angeles, Jan Ozer taught a roomful of attendees the basics and complexities of adaptive streaming.
"There's two basic types of adaptive streaming technologies: one is a server-based technology, and the primary server-based technology now is RTMP Flash. With a server-based technology, the server is in charge of delivering a different stream to the player," Ozer started out.
"With an HTTP-based system, which is HTTP Flash, HTTP Live Streaming -- or HLS, which is the Apple system -- the player monitors the same characteristics, but it's in charge of going to get a different stream when the circumstances dictate a stream change," he explained.
Ozer taught how to produce video for each type of adaptive streaming, and explained related concepts, such as keyframes, chunking, and manifest files. He also guided people in creating an adaptive strategy:
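Those three concepts fit together in the encoding step. As a rough sketch (assuming a current ffmpeg build; the bitrates and paths are illustrative, not from the session), one rendition might be encoded and chunked like this:

```shell
# Encode one H.264 rendition and chunk it for HLS.
# -g/-keyint_min force a keyframe every 60 frames (2 seconds at 30 fps),
# and -sc_threshold 0 disables scene-cut keyframes, so chunk boundaries
# align with keyframes -- a requirement for clean stream switching.
ffmpeg -i source.mp4 \
  -c:v libx264 -b:v 800k -g 60 -keyint_min 60 -sc_threshold 0 \
  -c:a aac -b:a 128k \
  -f hls -hls_time 6 -hls_playlist_type vod \
  medium/index.m3u8
```

Repeating this for each bitrate, with identical keyframe settings, produces the aligned chunks and per-variant manifests that the master playlist ties together.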
"Choice one is how many streams do you need? And at a high-level, you want relevant coverage, you want to produce enough streams so that every level that you're serving gets a decent quality iteration. If you're starting with SD [standard definition], then three to four streams is typically necessary."
How do the big video sites approach adaptive streaming? MTV, Ozer explained, went to all its online properties and asked what size windows it would be serving video to, then created streams for each window size. Unlike other sites, MTV doesn't switch video streams unless the viewer changes window sizes. MLB.com, on the other hand, is a subscription site, and offers more streams to ensure better service.
Scroll down to view and download the entire presentation:
How-To: Encoding for Adaptive Streaming
This session identifies the most relevant adaptive streaming technologies and the most critical factors for comparing them. Next, it explains how to choose the ideal number of streams and the key encoding parameters. It then surveys options for encoding and serving the streams, and closes with techniques for serving multiple target platforms, such as Flash and iDevices, with one set of encoded H.264 files.
Speaker: Jan Ozer, Principal, Contributing Editor, Streaming Media magazine, Doceo Publishing