Streaming Video Latency: The Current Landscape
Watch the complete panel, LS104. The Future of Video Transport, in the Streaming Media Conference Video Portal.
Read the complete transcript of this clip:
Oliver Gunasekara: This was my attempt to map the landscape for distribution. I'm not talking about contribution protocols like SRT or RTMP. I've really tried to focus on the case where you're running at scale and you want to deliver video to consumers.
Today the norm is really traditional ABR, with HLS being the dominant format alongside DASH. But the latency is around 20 seconds, and it can go higher. That is the norm out there today. If you look at linear TV, that's obviously delivered over a cable or satellite network, and it's around seven seconds.
That's a big problem, because if your neighbors are watching the sports game and you're streaming it, they're gonna be cheering way before you know about it. So there's immense pressure to get OTT below seven seconds, below linear. And that's where there's a lot of activity now in CMAF and chunked transfer.
A lot of people are beginning to do that on DASH, and HLS is coming. There are some proposals for low-latency HLS. Let's see what Apple does at their developer conference in a month's time. We're optimistic that it will be formally standardized. But that can get you, with a lot of work, to around three seconds.
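To make the 20-second versus 3-second gap concrete, here's a back-of-the-envelope latency budget in Python. All the numbers (segment and chunk durations, encode and CDN delays, buffer depths) are illustrative assumptions, not measurements from the panel, but they show why shrinking the unit of delivery from a full segment to a CMAF chunk shrinks end-to-end latency so dramatically.

```python
# Rough glass-to-glass latency budget: segment-based ABR vs. chunked CMAF.
# All figures below are illustrative assumptions, not measured values.

def abr_latency(segment_s, buffered_segments, encode_s=1.0, cdn_s=1.0):
    # A classic HLS/DASH player can only start fetching a segment after it
    # is fully encoded and uploaded, and it typically buffers several whole
    # segments before playback begins.
    return encode_s + cdn_s + segment_s + buffered_segments * segment_s

def chunked_cmaf_latency(chunk_s, buffered_chunks, encode_s=0.5, cdn_s=0.5):
    # With chunked encoding and chunked transfer, the player fetches a
    # segment while it is still being produced, so the unit of delay
    # shrinks from a whole segment to a single CMAF chunk.
    return encode_s + cdn_s + chunk_s + buffered_chunks * chunk_s

# Traditional ABR: 6-second segments, two segments buffered.
print(abr_latency(segment_s=6.0, buffered_segments=2))        # 20.0 s

# Chunked CMAF: 500 ms chunks, two chunks buffered.
print(chunked_cmaf_latency(chunk_s=0.5, buffered_chunks=2))   # 2.5 s
```

The structural point is that the player's buffer depth is measured in delivery units, so cutting the unit from seconds to hundreds of milliseconds cuts latency roughly proportionally.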
You can see that people like Twitch are already doing this. Twitch has an advantage in that they control their complete environment: their CDNs for ingest, their encoding infrastructure, their CDNs for delivery, and their client. But they are now getting into low latency, sub-three seconds, which is pretty impressive.
I think we're going to see a lot of the industry moving into that area, to get below linear. Then you come to what I would call ultra-low latency, and I've defined it here in three tiers. The first is 500 milliseconds, because there are many applications people have talked about, sports betting, et cetera, where 500 milliseconds is really good for interactivity. And WebRTC is gaining traction there.
Then there is cloud gaming, where you really want to be around 60 milliseconds, because otherwise there's too much lag. And the third extreme is what we call Cloud XR, where you might have a head-mounted display, and as you move your head you want the view to update. Ideally, you want to be sub-20 milliseconds there. So my feeling, going back to your original question, is that we will move to these standards, WebRTC, DASH, and Low-Latency HLS, for all but the sub-60-millisecond applications.