
4 Tradeoffs of Low-Latency Streaming

Learn more about low-latency streaming at Streaming Media East.

Read the complete transcript of this clip:

Chris Vanderheyden: When you are thinking about deploying your low-latency solution, it is all about tradeoffs. This slide shows a fairly graphical view: if you want the lowest latency, your viewer experience or your scalability will suffer. If you prefer a higher viewer experience, more compute cycles in your encoder, or a bigger client buffer, then your latency is going to suffer. So it's all about tradeoffs.

First tradeoff: The two biggest contributors to latency in an end-to-end video production workflow are the time you spend encoding and the size of the client buffer, and both were traditionally determined by segment size. Chunked CMAF uses chunked packaging, which allows you to push out partial content and propagate it all the way to the edge and into the client's player before the final chunk of that segment is complete. That is what low-latency chunked transfer and chunked packaging give you. But if you want very low latency, you are going to set your encoder to real-time or near-real-time settings, and then either your visual quality is going to suffer or you won't achieve as good a compression ratio. So that's something to consider.
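The benefit of chunked packaging described above can be sketched numerically. This is a hypothetical illustration, not from the talk: the segment and chunk durations are assumptions, and network and decode time are ignored.

```python
# Hypothetical sketch: how long before the first media bytes can leave
# the packager? With whole-segment delivery, nothing ships until the
# encoder finishes the full segment; with chunked CMAF, each chunk is
# pushed downstream as soon as it is encoded. Durations are assumptions.

SEGMENT_DURATION = 6.0   # seconds of media per segment (assumed)
CHUNK_DURATION = 0.5     # seconds of media per CMAF chunk (assumed)

def first_byte_delay(chunked: bool) -> float:
    """Encoding delay before the first media bytes can be sent
    downstream (ignoring network transfer and decode time)."""
    return CHUNK_DURATION if chunked else SEGMENT_DURATION

print(first_byte_delay(chunked=False))  # 6.0 -> wait for the whole segment
print(first_byte_delay(chunked=True))   # 0.5 -> first chunk ships early
```

The point is that the latency floor drops from one segment duration to one chunk duration, which is why chunk size rather than segment size dominates in low-latency CMAF deployments.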

We see most of our customers prefer to go with a better quality experience. Certainly for something like sports streaming, you don't want to watch a crappy stream on a high-end service. So we see people spend considerable time in encoding anyway.

The second place where you need to make a tradeoff is the size of your client buffer. Again, prior to chunked packaging, we needed to cache at least an entire segment. The Apple specification for HLS actually says that you need to cache at least three and a half times the segment size. So on traditional 10-second segments, that was 35 seconds in your client buffer.
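The arithmetic behind those figures is simple enough to check directly. A minimal sketch, using the 3.5x multiplier cited in the talk:

```python
# Back-of-the-envelope check of the buffer figures above: the talk
# cites Apple HLS guidance of roughly 3.5x the segment duration held
# in the client buffer.

def min_client_buffer(segment_duration: float, multiplier: float = 3.5) -> float:
    """Minimum client buffer in seconds for a given segment duration."""
    return segment_duration * multiplier

print(min_client_buffer(10.0))  # 35.0 -> the 35 seconds quoted above
print(min_client_buffer(2.0))   # 7.0  -> shorter segments shrink the buffer
```

This is why segment size was the main latency lever before chunked transfer: the client buffer scales linearly with it.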

Now, even with these low-latency protocols, a very considerable portion of the end-to-end latency still sits in the client buffer, and I'll come to that in the next slide, because you need to assess when to make a quality switch. However, with chunked transfer, chunked encoding, and chunked packaging, the smaller chunks come in at a fairly predictable rate, so we are much better able to maintain a smaller buffer size, just by having chunked packaging and chunked transfer in there.

The third tradeoff is your join latency, or what you might compare to traditional linear television: the time between clicking your remote control and the next channel appearing. In all streaming protocols--certainly segmented streaming protocols--this is influenced by the size of your segment, because every segment starts with a keyframe, and that's the point where you can enter the stream. Even with chunked packaging, you cannot just randomly jump into a segment wherever you want; you need to wait for the start of a segment. So you have a choice: if we are encoding segment number four, do we wait for segment number five and reduce our end-to-end latency, or do we start playback at the fourth segment and incur a larger latency?

So if you have six-second segments, you might incur an average of three seconds of additional end-to-end latency. There are ways of combating that. One strategy that we employ is to start playing back the previous segment and slightly increase the playback rate, to a level that's unnoticeable to viewers, to catch up to the target end-to-end latency. It's a very useful technique to employ. If you're far outside the synchronization window, you might even choose to seek. And on dedicated devices and native platforms, it might be even more useful to download that previous segment and do a fast decode without even showing the frames.
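The catch-up strategy described above can be sketched as a small calculation. The 1.05x playback rate is an assumption for illustration; the talk only says the speed-up should be unnoticeable to viewers.

```python
# Hypothetical sketch of the catch-up strategy: play the earlier
# segment slightly faster than real time until the extra join latency
# is absorbed. The 5% over-speed is an assumed "unnoticeable" rate.

def catchup_time(extra_latency: float, playback_rate: float = 1.05) -> float:
    """Wall-clock seconds needed to erase `extra_latency` seconds of
    lag while playing at `playback_rate` instead of 1.0x."""
    if playback_rate <= 1.0:
        raise ValueError("playback_rate must exceed 1.0 to catch up")
    return extra_latency / (playback_rate - 1.0)

# Joining on the previous six-second segment adds ~3 s on average:
print(round(catchup_time(3.0), 1))  # 60.0 -> about a minute at 5% over-speed
```

The tradeoff is visible in the numbers: a gentler, less noticeable rate takes proportionally longer to converge, which is why far-out-of-window clients seek instead of speeding up.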

So these are strategies that you can employ to reduce your end-to-end latency, but again, you need to make that choice about your segment size.

Switch latency is very similar. Switch latency is the time between when your player detects that it could play back a higher-quality stream--a higher resolution, a higher bitrate--and the time you are actually playing it. Why is that important? Here we see that we might be able to play at a higher quality, but we still have some content in the buffer. So what do we do with that content? Do we wait until it is fully played out, or do we just trash it? Say we trash two of the buffered segments and start downloading the high resolution right away. That's a choice: your bandwidth is going to suffer a little, but you need to maintain at least the active segment you are on. So again, segment size is very important.
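The two options above (drain the buffer versus trash it) can be compared with a minimal sketch. The function name and the numbers are illustrative assumptions, not from the talk.

```python
# Hypothetical sketch of the switch-latency tradeoff: when a better
# rendition becomes viable, either play out the buffered low-quality
# segments first (slow switch, no wasted bandwidth) or discard
# everything after the segment currently rendering and re-download at
# high quality (fast switch, some bandwidth wasted).

def switch_delay(buffered_segments: int, segment_duration: float,
                 trash_buffer: bool) -> float:
    """Seconds until the viewer actually sees the higher quality."""
    if trash_buffer:
        # The segment being rendered right now must still finish.
        return segment_duration
    return buffered_segments * segment_duration

print(switch_delay(3, 6.0, trash_buffer=False))  # 18.0 -> drain the buffer
print(switch_delay(3, 6.0, trash_buffer=True))   # 6.0  -> keep only the active segment
```

Either way, the floor on switch latency is one segment duration, which is the speaker's point: segment size governs this tradeoff too.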

Related Articles

Buyers' Guide to Low-Latency Solutions

If you need low latency, here's how to pick the technology solution that's right for you. But it's not a one-size-fits-all affair.

Why Low Latency Matters for Live Event Streaming

VisualON SVP and Head of Business Development Michael Jones discusses the challenges and timetable for reaching <1 second latency in large-scale live sports streaming in this clip from Esports & Sports Streaming Summit at Streaming Media West 2019.

Sports Fans Care More About Picture Quality Than Latency, Says Verizon Report

Viewers would rather have 4K than lower latency, according to a new study by the streaming platform and CDN that delivered the Super Bowl

Low-Latency Sports Streaming at Scale

Mux Founder & Head of Product Steve Heffernan discusses the pros and cons of different methods of lowering latency for large-scale live sports event streaming in this clip from Streaming Media West 2019.

Streaming Video Latency: The Current Landscape

NGCodec CEO, Founder & President Oliver Gunasekara breaks down the low-latency landscape for distribution in this clip from a Live Streaming Summit panel at Streaming Media East 2019.

Video: When Low Latency Matters, and When it Doesn't

Sometimes low latency is critical, but in other streaming applications it's not worth prioritizing, Wowza Senior Solutions Engineer Tim Dougherty argues in this clip from Streaming Media West 2018.

Video: Where Streaming Latency is Today

Wowza Media Systems Senior Solutions Engineer Tim Dougherty surveys the recent and current state of streaming latency in this clip from his presentation at Streaming Media West 2018.