Video: Best Practices and Key Considerations for Reducing Latency

Learn more about reducing latency at Streaming Media West.

Read the complete transcript of this clip:

Jamie Sherry: Depending on your use case, what you're trying to achieve, and how latency figures into that use case, there are different techniques for lowering latency. There isn't one silver bullet here. You can choose from many protocols or technologies to try to solve these things, and they all have trade-offs, positives and negatives. But you really need to think about it at every step in your workflow.

It's not just about the network or the buffer management, which I do have listed here. It's content creation too, and the features you need. If you need to do transcoding, for example, that's going to introduce latency; it can be minor, but it can be major, too. If you're going to manipulate content in other ways, with metadata or encryption or anything else, usually these things are OK, but again, it just depends.

It all depends, topology-wise, on where you're doing these things in relation to how you're then delivering the content to the user. I know that doesn't sound easy, but that's the way this goes right now. On the content creation side, things like codecs, bitrate, and resolution all have an impact. Then there's the streaming workflow and the devices: whatever devices you're hitting, on whatever networks they're on, WiFi, 4G, all of those bring network fluctuations, and so does the public internet. If you're on private connections, with enterprise or corporate deployments, it can be different, or it can be better, because you usually have more control there.

The workflow, again, reflects what you're trying to do with the content before you actually get it to the consumer. Buffer management covers the encoder and the player specifically, and also the server. You can reduce the buffers in these areas to almost nothing, but then you run the risk that a network fluctuation or degradation will have a negative impact on playback.
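To make that buffer trade-off concrete, here is a minimal player-side sketch, assuming the open-source hls.js player in a browser; the option names follow hls.js's published config, the values are purely illustrative rather than recommendations, and the stream URL is a placeholder.

```typescript
import Hls from "hls.js";

// Minimal sketch: shrink the player's forward buffer and live-edge distance.
// Smaller buffers cut latency but leave less headroom for network fluctuations,
// which is exactly the trade-off described above.
const video = document.querySelector<HTMLVideoElement>("video")!;

if (Hls.isSupported()) {
  const hls = new Hls({
    // Hold only a few seconds of forward buffer instead of the default ~30s.
    maxBufferLength: 4,
    // Start and stay close to the live edge (measured in segment counts).
    liveSyncDurationCount: 2,
    liveMaxLatencyDurationCount: 4,
  });
  hls.loadSource("https://example.com/live/stream.m3u8"); // placeholder URL
  hls.attachMedia(video);
}
```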

Network considerations really boil down to two areas: how you optimize delivery using the transport protocols we talked about, and the protocols that might sit on top of those, HTTP vs. RTMP vs. RTSP, and then how you reach your audience. Again, audience size, location, all those things matter in terms of latency. The front-runners that people are really trying today are things like WebRTC and WebSocket. There are definitely a lot of offerings out there that have, from the ground up, written services and software that use these under the hood to do what they do, and a lot of them are doing really good things.
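As a sketch of what the WebRTC path can look like on the playback side, the following uses only standard browser APIs; the signaling endpoint and its request format are placeholder assumptions, since each service defines its own signaling.

```typescript
// Minimal sketch of a receive-only WebRTC client using standard browser APIs.
// The signaling exchange (POSTing the SDP offer and reading the answer) is a
// placeholder; real services define their own signaling mechanism.
async function playWebRTC(video: HTMLVideoElement): Promise<void> {
  const pc = new RTCPeerConnection();

  // Attach incoming audio/video tracks to the <video> element as they arrive.
  pc.ontrack = (event) => {
    video.srcObject = event.streams[0];
  };

  // Ask for media in receive-only mode.
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Hypothetical signaling endpoint; replace with the service's own mechanism.
  const response = await fetch("https://example.com/webrtc/offer", {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: offer.sdp,
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });
}
```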

With these pieces you need to just be clear, again, whether you're dealing with proprietary protocols or standards-based ones. But, essentially, when you're not dealing with HTTP on the playback side, you need what I call aware clients. Basically, you can't just drop an HTML5 video tag in a browser and have it work; you need to actually write some code around it to make this possible. Then there's the server scale-out infrastructure piece: depending on what protocol you use, scaling it out in an origin-edge topology, a mid-tier topology, or otherwise requires a lot of servers and a lot of protocol-specific work.
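To illustrate what an aware client involves in practice, here is a minimal sketch that assumes the server pushes fragmented MP4 over a WebSocket and feeds it to the video tag through the standard Media Source Extensions API; the URL and codec string are placeholders.

```typescript
// Minimal "aware client" sketch: a bare <video> tag can't consume a WebSocket
// feed on its own, so fragmented MP4 data is wired into it via Media Source
// Extensions. The URL and codec string are placeholder assumptions.
function playOverWebSocket(video: HTMLVideoElement): void {
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener("sourceopen", () => {
    const sourceBuffer = mediaSource.addSourceBuffer(
      'video/mp4; codecs="avc1.64001f, mp4a.40.2"'
    );

    const ws = new WebSocket("wss://example.com/live/stream"); // placeholder
    ws.binaryType = "arraybuffer";

    // Queue incoming fragments and append them one at a time, since a
    // SourceBuffer rejects appends while a previous append is still updating.
    const queue: ArrayBuffer[] = [];
    const appendNext = () => {
      if (!sourceBuffer.updating && queue.length > 0) {
        sourceBuffer.appendBuffer(queue.shift()!);
      }
    };

    sourceBuffer.addEventListener("updateend", appendNext);
    ws.onmessage = (event) => {
      queue.push(event.data as ArrayBuffer);
      appendNext();
    };
  });
}
```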

CDNs are definitely investigating and trying things in these spaces. They want to keep their caching infrastructure going, and they want to support new use cases and leverage that infrastructure to support them. What I call tuned HTTP, which means reducing chunk size and tuning other aspects of the content, is the front-runner today, but there are CDNs considering WebRTC, and there are other options out there that don't use HTTP, which I'll talk about in a second.
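As a rough illustration of why reducing chunk size helps, segmented HTTP delivery tends to carry a latency floor of roughly the segment duration multiplied by the number of segments the player keeps buffered; the sketch below just does that arithmetic with illustrative numbers, ignoring encode, upload, and network time.

```typescript
// Rough illustration only: approximate latency floor for segmented HTTP
// delivery, ignoring encode, upload, and network time. Values are illustrative.
function approximateLatencyFloor(
  segmentDurationSeconds: number,
  bufferedSegments: number
): number {
  return segmentDurationSeconds * bufferedSegments;
}

console.log(approximateLatencyFloor(6, 3)); // ~18s floor with 6-second segments
console.log(approximateLatencyFloor(1, 3)); // ~3s floor with 1-second segments
```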

Related Articles

Video: When Low Latency Matters, and When it Doesn't

Sometimes low latency is critical, but in other streaming applications it's not worth prioritizing, Wowza Senior Solutions Engineer Tim Dougherty argues in this clip from Streaming Media West 2018.

Video: Where Streaming Latency is Today

Wowza Media Systems Senior Solutions Engineer Tim Dougherty surveys the recent and current state of streaming latency in this clip from his presentation at Streaming Media West 2018.

Video: What Does Low Latency Really Mean?

RealEyes' David Hassoun discusses what low latency is and what it isn't, and sets reasonable expectations for the current content delivery climate.

Video: Three Ways to Replace Flash for Low-Latency Live Streaming

Limelight's Charlie Kraus discusses three emerging strategies for delivering low-latency live streaming in the post-Flash era.

Video: What is the Best Way to Move Streams Across Unmanaged Networks?

Haivision CTO Mahmoud Al-Daccak discusses the challenges of delivering low-latency streams across unmanaged networks in varying use cases at Streaming Media West 2017.

Video: Is WebRTC the Silver Bullet for Network Latency?

Streaming Video Alliance's Jason Thibeault and Limelight's Charley Thomas address the question of whether WebRTC provides a viable solution for network latency issues in this panel from Live Streaming Summit.

Video: How to Optimize Video Delivery to iPhone Users

StackPath's Nathan Moore explains the protocols, latency, and bandwidth challenges inherent to delivering video content to iOS devices and how content providers can stream to these devices more effectively.

Video: How to Reduce Latency for Mobile VR Streaming

Yahoo Director of Engineering Satender Saroha discusses latency issues particular to VR streaming to mobile and technical measures to address them.

New Report from Wowza Evaluates Live Streaming Latency

The report, "Create the Streaming Media Experience Users Want," focuses on time to first frame and end-to-end latency in five markets: user-generated content, online gaming, sports, news, and radio.

Video: Do You Really Need Low-Latency Streaming?

Wowza's Mike Talvensaari confronts the myth that low latency for large-scale streaming is always worth the expense, and discusses various applications and use cases along a continuum of latency requirements for effective delivery.

Video: How Would an End to Net Neutrality Impact Latency?

Reel Solver's Tim Siglin, Rainbow Broadband's Russ Ham, and Verizon's Daniel Sanders discuss how attacks on Net Neutrality would impact video delivery in general and latency in particular.

Video: Is Latency Still Hurting Live Streaming?

Ooyala's Paula Minardi and Level 3's Jon Alexander discuss the key issues facing live streaming and VOD providers regarding latency times, buffering, and meeting evolving viewer expectations.
