Latency: The Final Frontier for Streaming Video Entertainment?
Streaming video has come a long way in just a short time. Compared to how long it took broadcast television to take root and gain widespread adoption, online video has happened in the blink of an eye. Of course, the environment for adoption was different. When TV first showed up, people had to buy their first television sets. They had to believe that “moving pictures” were better than the radio and make what was an expensive initial investment. Today, people have a plethora of connected devices (smartphones, computers, tablets, STBs, etc.) from which to connect to the video content they want to watch. Coupled with the birth of direct-to-consumer offerings (Netflix, Amazon Prime Video, iTunes, etc.), online video has exploded in popularity. Most recently, though, consumer interest in online video has begun to shift toward live streaming. From sporting events to concerts to reality TV to eSports, demand has been steadily increasing over the past several years.
But streaming video isn’t quite like broadcast television yet. In many cases, an online stream might be 30 seconds or even 2 minutes behind. The resulting experience is anything but satisfactory for consumers. Imagine being connected to Twitter while watching a favorite sporting event, only to see tweets roll past about a big moment, like a goal, well before you actually see it.
The culprit? Latency.
As online video has moved away from proprietary formats (such as Real-Time Messaging Protocol, RTMP, and Real-Time Streaming Protocol, RTSP) and toward chunked HTTP delivery, the result has been disastrous: latency has increased 10-fold, if not more. But perhaps an even bigger problem is the lack of industry best practices or standards on how to improve it. WebRTC. WebSockets. Chunk sizes. Although everyone is tackling the problem as best they can, they are doing so inside their own walled gardens. So NBC Sports, for example, may have a latency of a few seconds for Olympic coverage, while another content distributor may have minutes of latency for a football match. This makes for a fragmented consumer experience and, more importantly, a breakdown in trust. You see, people trust broadcast television. Traditional cable delivery just works. Sure, broadcasters have a built-in delay but, from provider to provider, the delay is the same. Consumers know what to expect.
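To see why chunked HTTP inflates latency so much, consider a rough back-of-envelope sketch. The figures below are illustrative assumptions, not measurements: they assume an HLS/DASH-style workflow where the encoder must finish a whole segment before publishing it, and the player buffers a few segments (three is a common default) before starting playback.

```python
def estimated_latency(segment_duration_s,
                      player_buffer_segments=3,
                      encode_and_package_s=1.0,
                      cdn_and_network_s=1.0):
    """Rough glass-to-glass latency for chunked HTTP streaming.

    The encoder holds the segment currently being produced, the
    player buffers several completed segments, and encoding/packaging
    plus CDN/network transit add a further fixed cost. All defaults
    are assumed, illustrative values.
    """
    return (segment_duration_s                        # segment still being encoded
            + player_buffer_segments * segment_duration_s  # player buffer
            + encode_and_package_s                    # packaging overhead
            + cdn_and_network_s)                      # CDN + network transit

# Classic 6-second segments with a three-segment buffer:
print(estimated_latency(6))   # 26.0 seconds behind live
# Shrinking segments to 2 seconds cuts that sharply:
print(estimated_latency(2))   # 10.0 seconds
```

The sketch makes the trade-off concrete: segment duration is multiplied by the player buffer depth, which is why chunk size is one of the main levers distributors experiment with inside their walled gardens.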
It’s clear that this problem needs to be solved, though, and given that most big streaming providers use CDNs, that’s where the innovation needs to happen, for two reasons. First, because the majority of video, according to Cisco’s Visual Networking Index report, will be delivered by CDNs. Second, because if the CDNs improve their delivery latency, there’s a better chance of it affecting multiple content distributors and, in essence, providing a more consistent end-user experience regardless of where the stream originates. The question, though, is whether the big CDNs can make the kinds of changes needed in the way they deliver content. Limelight Networks recently announced a buffer reduction guarantee; its ability to do so probably results from very low-level TCP stack optimization. And then there are newer entrants to the CDN market, such as Instart Logic and PhenixP2P. The latter has developed its own stack and deployed it on top of Google’s infrastructure to reduce streaming latency to near zero.
Of course, there’s no silver bullet for the latency issue. Delivery is critical, but even incremental changes to the workflow can add latency, which makes solving the problem not just a technology issue. It’s about business processes. It’s about network relationships and peering. It’s about optimizations. There are so many variables that can impact latency, it’s difficult to chase them all down. That makes delivering a great streaming end-user experience really hard. But it has to be done, and it has to be done now, if streaming video is ever going to replace traditional broadcast delivery.
[This article appears in the July/August 2017 issue of Streaming Media Magazine as "Latency: The Final Frontier?"]