
Video: Where Streaming Latency is Today


Watch the complete video of this panel from Streaming Media West, DT103: The Road to Scalable, Real-Time Live Streaming, in the Streaming Media Conference Video Portal.

Read the complete transcript of this clip:

Tim Dougherty: Do you remember "Click here to download RealPlayer?" "Click here to download Windows Media Player?" I did my first stream in 1999--yes, I know, that's awesome.

RealServer, Flash, QuickTime--those are some of the old protocols, but they're super-fast. RTSP, for example, is what I believe Real was using at the time, and there are people building mobile applications to scale using RTSP because it is low latency. It doesn't have a lot of bells and whistles on it, but some of these old legacy video streaming protocols are actually remarkably fast.

Today, of course, there are the HTTP streaming protocols. That's the old Apple logo, by the way, which I think is pretty cool.

There's Adobe, San Jose Streaming, Microsoft Smooth Streaming, which was another phenomenal protocol, and then of course there's MPEG-DASH. This gives you 8-10 seconds. That's where we've been.

Where are we today? Out of the box, you get a 10-second chunk. I tried to describe a couple of minutes ago how, when you're watching a video, you're watching 10 seconds while your player downloads the next 10 seconds; then it moves into that second chunk, the next chunk arrives, and that's functionally how it works.
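To picture the fetch-ahead behavior Dougherty describes, here is a minimal Python sketch of a segment-based player loop. The function names, the 10-second default, and the segment count are illustrative stand-ins, not any vendor's actual player code.

```python
# Minimal sketch of the fetch-ahead loop described above: the player renders one
# chunk while it downloads the next. Names and numbers are illustrative only.
SEGMENT_DURATION = 10   # seconds per chunk -- the "out of the box" default cited above
SEGMENTS_TO_PLAY = 3    # just enough iterations to show the pattern

def fetch_segment(index):
    """Stand-in for an HTTP GET of segment `index` from the packager/CDN."""
    print(f"fetching segment {index} over HTTP")

def play_segment(index):
    """Stand-in for decoding and rendering one chunk of video."""
    print(f"playing segment {index} ({SEGMENT_DURATION}s of video)")

# The first chunk must be fully packaged and downloaded before playback can start,
# which is one reason larger chunks leave the viewer further behind the live edge.
fetch_segment(0)
for i in range(SEGMENTS_TO_PLAY):
    fetch_segment(i + 1)   # download ahead...
    play_segment(i)        # ...while the current chunk plays out
```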

We can change that. You get into a media server and you can say, "Don't do a 10-second chunk, do a half-second chunk or a one-second chunk."

Then the media server gets super-busy, the network gets super-busy, but you end up with better latency. You end up with 8-10 seconds instead of 30-45 seconds.
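As a back-of-envelope way to see why shorter chunks help, the sketch below assumes the player buffers roughly three segments before starting playback and adds a fixed few seconds of encode and delivery overhead; both figures are assumed, illustrative values, not measurements from the talk.

```python
# Rough glass-to-glass latency: buffered media plus encode/packaging/delivery overhead.
# The 3-segment buffer depth and 5-second overhead are assumed, illustrative values.
def estimated_latency(segment_seconds, buffered_segments=3, overhead_seconds=5):
    return buffered_segments * segment_seconds + overhead_seconds

print(estimated_latency(10))  # 35 -- in the 30-45 second range quoted for defaults
print(estimated_latency(1))   # 8  -- in the 8-10 second range for tuned, short chunks
```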

You might remember from earlier how I shake down people who think they want low latency. Sometimes tuned, optimized, super HLS--whatever you want to call it--is plenty good. Again, if you can get by and you don't need that ultra-right-now, super-low latency, modified HTTP streaming is fantastic.

Related Articles

Key Findings: Wowza 2019 Streaming Latency Report

Wowza Director of Sales Engineering Tim Dougherty distills results of Wowza's 2019 Streaming Latency Report in this clip from his presentation at Streaming Media West 2019.

4 Tradeoffs of Low-Latency Streaming

THEO Technologies' Chris Vanderheyden discusses the consequences of prioritizing low latency in streaming encoding and the benefits of chunked packaging in this clip from Streaming Media West 2019.

Video: When Low Latency Matters, and When it Doesn't

Sometimes low latency is critical, but in other streaming applications it's not worth prioritizing, Wowza Senior Solutions Engineer Tim Dougherty argues in this clip from Streaming Media West 2018.

Video: What Does Low Latency Really Mean?

RealEyes' David Hassoun discusses what low latency is and what it isn't, and sets reasonable expectations for the current content delivery climate.

Video: Best Practices and Key Considerations for Reducing Latency

Wowza Senior Product Manager Jamie Sherry discusses key latency considerations and ways to address them at every stage in the content creation and delivery workflow.

Video: Do You Really Need Low-Latency Streaming?

Wowza's Mike Talvensaari confronts the myth that low latency for large-scale streaming is always worth the expense, and discusses various applications and use cases along a continuum of latency requirements for effective delivery.

Video: Is Latency Still Hurting Live Streaming?

Ooyala's Paula Minardi and Level 3's Jon Alexander discuss the key issues facing live streaming and VOD providers regarding latency times, buffering, and meeting evolving viewer expectations.
