Video: Is Latency Still Hurting Live Streaming?
Ooyala's Paula Minardi and Level 3's Jon Alexander discuss the key issues facing live streaming and VOD providers regarding latency times, buffering, and meeting evolving viewer expectations.

In this excerpt from a panel at Live Streaming Summit, Ooyala's Paula Minardi and Level 3's Jon Alexander discuss ongoing issues with latency and buffering, and what viewers of live and VOD content are willing to accept as part of the online video viewing experience.

Learn more about the next Live Streaming Summit!

Read a complete transcript of this clip:

Jon Alexander: We definitely see latency as one of those kinds of emotional subjects, where people always want lower latency. Why would you sign up for higher latency?

On the broadcast side, there's kind of an interesting anecdote. If you talk about the NFL, the propagation of a broadcast video feed across our networks is about 10 milliseconds. If we do an encode on that, it adds about 100 milliseconds of latency. We've been seeing an increased number of broadcasters coming to us now asking for uncompressed feeds. They don't want us to do any kind of compression, even very lightweight lossless compression, because it adds latency. That additional 100 milliseconds of delay on the broadcast side is something they are trying to eliminate.

At the same time, they have to impose a three-second delay to avoid a Janet Jackson-type event. So they are trying to trim off 100 milliseconds on one end while imposing a three-second delay on the other. If you're doing satellite, the round trip up and down adds roughly two seconds of delay. End to end, even in the television best case, you're looking at 5-6 seconds. That's a lot less than the 30-40 seconds we typically see with HLS, but still non-zero.
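
To make that arithmetic concrete, here is a minimal sketch that adds up the latency components Alexander mentions. The figures are the illustrative round numbers quoted in the discussion, not measured values.

```python
# Rough end-to-end latency budget for a broadcast feed, in milliseconds.
# All figures are the round numbers quoted in the discussion.
budget_ms = {
    "network_propagation": 10,      # broadcast feed crossing the backbone
    "contribution_encode": 100,     # lightweight encode of the feed
    "broadcast_delay": 3_000,       # deliberate delay for incident protection
    "satellite_round_trip": 2_000,  # uplink and downlink
}

total_s = sum(budget_ms.values()) / 1000
print(f"Broadcast best case: about {total_s:.1f} seconds end to end")
print("Typical HLS delivery: about 30-40 seconds end to end")
```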

For me, the biggest challenge with the internet is that it is non-deterministic and there is no synchronization between devices. You could be watching on your phone, I could be watching on mine, and we're not synchronized. I think that's the experience that is more off-putting, rather than someone tweeting what's going to happen in five seconds' time.

Paula Minardi: We worked with Nice People at Work, one of our partners, to look at buffering times and how they impact live and VOD content. It was kind of interesting. The buffer ratio is the buffer time divided by the session time. We found that when the buffer ratio was under 0.2%, people were willing to stay with the content. On a VOD stream, people hung on about 16 times as long, but they dropped off very quickly once that ratio started going up. On live content, people hung on about 24 times as long.

What that tells us is that maybe viewers are a little forgiving at this point around that experience, because they value the content. I think as time goes on they are going to be less forgiving, especially those younger audiences. They are just going to start demanding it.
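
For reference, the buffer ratio Minardi describes is simply rebuffering time divided by total session time. The sketch below applies the 0.2% threshold she cites to a hypothetical viewing session; the session figures are made up for illustration.

```python
def buffer_ratio(buffer_time_s: float, session_time_s: float) -> float:
    """Buffer ratio: time spent buffering divided by total session time."""
    return buffer_time_s / session_time_s

# Hypothetical session: 3 seconds of rebuffering over a 30-minute view.
ratio = buffer_ratio(buffer_time_s=3.0, session_time_s=30 * 60)
print(f"Buffer ratio: {ratio:.2%}")                       # ~0.17%
print("under 0.2% target" if ratio < 0.002 else "over 0.2% target")
```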
