
Building a DASH264 Client: Streaming Media East 2013


This article is a transcript of the above video, filmed at the 2013 Streaming Media East conference, in which Digital Primates' Jeff Tapper discussed how to build a DASH264 client and other MPEG-DASH-related issues.

All right. Welcome, everyone. I’m Jeff Tapper. This is a session on building a DASH264 player. Real quick, a little about me: I’m a senior consultant with Digital Primates. Our focus is building next-generation client applications. I’ve had a chance to work on video applications for many of the most-watched live broadcasts over the past decade or so. I’ve been building internet applications for the past seventeen years and had a chance to work on about a dozen books over that time.

So our agenda today: We’ll talk about what HTTP streaming is all about: why is it interesting, why do we care, and what are the options out there for HTTP streaming today? Then we’ll talk specifically about DASH as one of those options and see what it provides for us. We’ll take a look at a narrower profile, DASH264, and see what that’s all about. And then we’ll dig into the meat of it to figure out how we actually make it work in the browser.

And how many of you consider yourselves developers? Excellent. Languages you’re familiar with: are most of you comfortable with JavaScript? Okay. So the player we’ll be spending most of our time in today is a JavaScript-based player. We have re-implemented this in a couple of different technologies now, but the JavaScript one is the most interesting given all the hype around HTML5. So the fact that we can actually make this work in a browser without any plugins is kind of interesting, at least to me. And I’m guessing, since you’re here, maybe it’s interesting to you, too?

All right. So our challenge: Most folks agree that HTTP streaming is generally a good thing. Right? The internet for the past fifteen years or more (past twenty years now) has really been optimized for delivering content over HTTP. And we’ve had a number of different video streaming protocols over the years that were not HTTP-based, but were effectively just opening a socket over TCP and pushing data down. There were some benefits to that: we had much lower latency, and when we could just open a socket we could get bits down almost instantly. But we couldn’t make use of any of the caching servers out on the internet, and we had all sorts of issues with firewalls and other things of that sort. So HTTP streaming has become a much more efficient, much more elegant choice.

Video is dominating the internet. We have lots and lots of video on the internet, and we’re seeing exponential growth (not terribly surprising), even more so on the mobile side. HTTP adaptive streaming is the preferred choice these days for streaming content over the internet, and the idea is that we take one large file, segment it into a series of small files, and deliver it over HTTP to the end client. There are a lot of benefits to this, and a lot of drawbacks to this. There is more overhead on the client to stitch these segments back together into a coherent video. However, the benefits of being able to bypass all our firewall issues, leverage caching servers, and leverage all the infrastructure that exists on the internet today to support HTTP delivery far outweigh the downsides.
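
To make that a little more concrete, here is a rough sketch, in JavaScript, of what one large file split into small segments looks like from the client’s side. This is not from any particular player; the URL template, segment length, and total duration are made-up examples.

// Conceptual sketch: a single movie becomes a list of short segment URLs
// that the client fetches one after another over plain HTTP.
var segmentDuration = 4;      // seconds per segment (assumed)
var totalDuration = 600;      // a 10-minute video (assumed)
var segmentCount = Math.ceil(totalDuration / segmentDuration);

var segmentUrls = [];
for (var i = 1; i <= segmentCount; i++) {
  segmentUrls.push('http://cdn.example.com/video/seg-' + i + '.m4s'); // hypothetical URL template
}
// The player then downloads these in order (at whatever bitrate it picks)
// and stitches them back together into one continuous video.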

On the landscape we have HLS (Apple’s solution), Microsoft’s Smooth Streaming, and Adobe’s HDS. All three of these are effectively HTTP delivery solutions. There are differences in how they work internally, but ultimately they all do the same thing: they take a video, segment it, and deliver the individual pieces. The client takes the pieces and puts them back together. Make sense?

All right. So the challenge is, as we said, it’s a very efficient choice, but there are problems, of course, with HTTP streaming. Different devices support different technologies. Right? Apple devices only support HLS. Smooth Streaming is currently available for the Xbox and for Silverlight; actually, I believe Microsoft recently released a plugin for Adobe’s OSMF, the Open Source Media Framework, so you can play Smooth Streaming in Adobe’s technologies. HDS, as far as I know, is only available on the Flash platform today. Anyone know of HDS being delivered to a platform that’s not Flash? But, of course, the Flash platform is still a pretty dominant platform for video streaming these days, so there’s an awful lot of HDS out there in the world. But there’s no one standard that’s currently supported ubiquitously, and so we end up having to have several different formats of our video if we want to deliver it everywhere.

So if we want to deliver video straight to the browser, what are our options? The only thing that’s really ubiquitous is progressive download (send them the whole file), and even that has issues, because different browsers support different codecs. Right? There’s H.264, there’s WebM, there’s VP8, there’s VP9; there are different things supported in different ways.
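
If you want to see what a given browser claims it can handle with progressive download, the standard canPlayType() check on a video element is a quick test. The codec strings below are common examples for illustration, not an exhaustive list.

// Quick feature check for progressive download support.
// canPlayType() returns "", "maybe", or "probably".
var probe = document.createElement('video');
console.log(probe.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"')); // H.264 + AAC
console.log(probe.canPlayType('video/webm; codecs="vp8, vorbis"'));           // WebM (VP8)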

In terms of actually streaming over HTTP to browsers, HLS is natively supported in Safari, as long as it’s on Mac OS or iOS; Safari on Windows does not support HLS. But what’s interesting is that recently we’ve gotten the Media Source Extensions and Encrypted Media Extensions (MSE and EME are the acronyms you’ll hear around these things), which are part of a spec from the W3C. Currently, the Chrome browser supports the Media Source Extensions, and what these extensions do is allow us to hand a little bit of media data at a time to the video tag in HTML, making it possible to control HTTP streaming in a browser-based client.
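
Here is a minimal sketch of that idea, using the unprefixed MSE API as written in the spec (browsers at the time shipped it behind vendor prefixes). The mime/codec string and segment URL are placeholder assumptions, not part of any particular player.

// Feed a media segment to a <video> element through the Media Source Extensions.
var video = document.querySelector('video');
var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';

if (window.MediaSource && MediaSource.isTypeSupported(mimeCodec)) {
  var mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener('sourceopen', function () {
    var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);

    // Download one segment and hand its bytes to the browser. A real player
    // keeps a queue and appends the next segment on the 'updateend' event.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'video/segment-1.m4s'); // placeholder segment URL
    xhr.responseType = 'arraybuffer';
    xhr.onload = function () {
      sourceBuffer.appendBuffer(new Uint8Array(xhr.response));
    };
    xhr.send();
  });
}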

Now this is really interesting. While HLS is supported natively in Safari, as a developer you have no control over what happens to that stream. We simply tell Safari, “Hey, here’s a video. Go play it,” and it makes every decision that needs to be made about that video. What quality are you watching? You have no control; it decides. What are the adaptive bitrate rules? They’re built-in logic. If you want any control over what’s happening with that stream, you really can’t have it. Whereas with the Media Source Extensions, we as developers decide what bits we hand to the API and when we hand them to it. So as we deliver one piece of content, if we realize, “You know what? This one’s not playing well. We’re dropping a lot of frames. It takes longer to download the segment than it does to play out the segment,” I can switch to a lower bitrate. Or the inverse: maybe I decide, “Hey, I’ve got a lot of extra room; I want to deliver a higher-quality bitrate.” And we can determine, as we need to, exactly which segments to hand to the media buffer, which ones we want to download, which ones we want to play. And that’s why the Media Source Extensions are so interesting to me.
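
As an illustration of that kind of switching decision, a simple heuristic might look something like this. The thresholds and the representation list are assumptions for the sake of the example, not the talk’s actual player logic.

// Compare how long a segment took to download against how long it plays for,
// and step down or up through the available bitrates accordingly.
function chooseNextQuality(currentIndex, representations, downloadSeconds, segmentDurationSeconds) {
  var ratio = downloadSeconds / segmentDurationSeconds;

  if (ratio > 1 && currentIndex > 0) {
    return currentIndex - 1;               // falling behind: switch to a lower bitrate
  }
  if (ratio < 0.5 && currentIndex < representations.length - 1) {
    return currentIndex + 1;               // plenty of headroom: switch to a higher bitrate
  }
  return currentIndex;                     // otherwise stay where we are
}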

So the downside, of course, right now is that they’re not yet universally supported. They are currently supported in the release version of Chrome, and throughout the different channels of Chrome you’ll find different levels of support. We’ve been working hand-in-hand with the Chrome team to make sure that the work we do is supported in there. So in Canary, which is the nightly build of Chrome, we have far better support; in the release version we have some support, and, of course, the changes are working their way through the channels, so eventually we’ll have support for all of this. And I understand that other browser manufacturers are going to be releasing versions of their browsers with the Media Source Extensions in the very near future. I don’t know any details about timelines. Is there any more I can say on that? Is that close enough? I think that’s close enough.

Now this is interesting. So what is MPEG-DASH? How many of you know what MPEG-DASH is? Most of you. How many of you were at the panel discussion yesterday about DASH? Not as many of you as I would have thought. Okay. So, for the folks who aren’t as well versed, DASH is Dynamic Adaptive Streaming over HTTP. It’s an international open standard published by ISO. It was put together by the MPEG group starting in 2009, when they saw all the fragmentation within HTTP streaming technology. And they brought together all of the major players from the industry. Apple and Microsoft and Google and Adobe and many others came together to discuss “How can we do this better?” And what came out of that is DASH. It does everything from the simplest “Here, just play this video” to very advanced use cases with ad insertion and multi-language support and any other features you may think you need. It’s there in the spec. The spec is roughly 550 pages. It’s not light summertime reading; I wouldn’t recommend taking it to the beach with you, but it’s very informative. And, as I say, the idea is to come up with a single standard for HTTP streaming across devices.
