
WebRTC Deployment Basics


Watch the complete presentation from Streaming Media East, VES204, Deploying WebRTC in a Low-Latency Streaming Service, in the Streaming Media Conference Video Portal.

Read the complete transcript of this clip:

Dr. Alex Gouaillard: Encoders in WebRTC are slow, and one reason for that is that most of the people in the streaming industry, like Netflix, stream pre-recorded content and focus on decoder speed. They have all the time in the world to encode: they have multi-pass algorithms, they're going to use machine learning, per-title encoding, and so on and so forth. They're making the difference based on the fact that they have a lot of time to actually encode. But when you decode, that's the user experience, so it needs to be fast.

Take AV1 as an example: you have dav1d and a few other projects related to the AV1 codec that are decoder-only, because really that's the only thing they care about; encoding is not their problem. So Netflix and YouTube provide some self-encoded datasets for people to try the decoder against, and Firefox announced, I think, in October or November last year, support for AV1 in Firefox, but of course by that they mean support for decoding, not support in WebRTC. Right, well, you know, fair.

Well, we'll go into other claims by other people. Let's go there. So, what's the problem already? If you want to decide what is slow, you need to know what is fast, right? So what is the fastest you can go? If you look at encoding speed on a GPU, on a single computer with shared memory, life is good: the average is 11 milliseconds per frame. That's for 720p, so it can take between 5 and 15 milliseconds, and that's as fast as you can expect to go.
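To make those per-frame numbers concrete, a per-frame encode time converts directly to a frames-per-second figure. This is a minimal sketch (the function name is mine, not from the talk) using the 5-15 ms range and 11 ms average quoted above:

```python
# Convert a per-frame encode time in milliseconds to frames per second.
def fps_from_frame_time(ms_per_frame: float) -> float:
    """FPS achievable if every frame takes ms_per_frame to encode."""
    return 1000.0 / ms_per_frame

# The 5-15 ms GPU range quoted above, with the 11 ms average:
for ms in (5, 11, 15):
    print(f"{ms} ms/frame -> {fps_from_frame_time(ms):.0f} FPS")
```

At the 11 ms average, a GPU encoder sustains roughly 90 FPS per stream, which is why the talk treats this as the practical speed ceiling.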

Now what about AV1, and libaom, and things like that? What is the speed we achieve today? So this is the latest: every week, for the Alliance for Open Media, of which we are a member, we run the benchmark for everybody. That's AOM at 1080p, 720p, and VGA, and that's the Intel encoder. Those are encoding speeds only, and you see that libaom is not progressing a lot in terms of speed, in terms of FPS, while the AV1 encoder from Intel was being worked on a lot, though not so much recently. Here is the announcement by Netflix and Intel about the real-time codec. What's interesting is that this is a logarithmic scale: this is one frame per second, this is 10 frames per second, and then you go 20, 30, 40, 50, 60.

So, VGA, nobody really cares about for streaming, right, so let's be serious: 720p. You can achieve almost 30 FPS at 720p today, and that's on a Dell XPS, you know, normal stuff you can buy at Costco, not special hardware at all. You can already achieve 30 FPS, but the question, the million-dollar question, is: is 30 FPS real time or not? Unfortunately this is really misleading, and the press release from Netflix and Intel is really misleading. 30 FPS is the throughput, it's not the latency. You can do 40+ FPS with the Intel encoder, but with 6-7 seconds of latency. That's the problem today; this is not a real-time codec.

How do they do that? Well, it's a little bit like bandwidth: you can have a one-gigabit-per-second connection, but if it goes over satellite, the round trip is one second. It's the same here: they encode a lot of frames in parallel, so they achieve a global throughput of 40 or 60 FPS, but if you compute the latency, which is the time between the output of the capture, when you get the raw frame, and the output of the encoder, when you have the encoded frame, then it's seven seconds. One of the reasons for that is they have a huge frame buffer of 72 frames, and you need to fill that buffer before you can do anything, because that's not a real-time codec.
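The throughput/latency split can be sketched with a simplified model, assuming (as described above) an encoder that only starts once its 72-frame buffer is full. The function name, the 30 FPS capture rate, and the two-term latency model are my assumptions for illustration; the real pipeline the talk measured is more complex, which is how it reaches seven seconds:

```python
# Simplified model: a batch encoder that waits for its lookahead buffer
# to fill, then processes the batch at the advertised throughput.
def pipeline_stats(buffer_frames: int, capture_interval_s: float,
                   throughput_fps: float) -> tuple[float, float]:
    """Return (throughput in FPS, first-frame latency in seconds)."""
    fill_time = buffer_frames * capture_interval_s   # waiting on capture
    encode_time = buffer_frames / throughput_fps     # draining the batch
    return throughput_fps, fill_time + encode_time

fps, latency = pipeline_stats(buffer_frames=72,
                              capture_interval_s=1 / 30,  # 30 FPS camera
                              throughput_fps=40.0)
print(f"throughput: {fps} FPS, latency: {latency:.1f} s")
```

Even this toy model yields several seconds of latency at 40 FPS throughput, which is the point of the satellite analogy: high throughput says nothing about how long any individual frame waits.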

Codecs that go for quality need interframe prediction; they have motion vectors, things like that, which require having several frames at hand to do those computations. For a real-time codec, well, if you need 10 frames in a buffer, and the acquisition of a frame takes 10 milliseconds, you've already lost 100 milliseconds just filling your buffer at the beginning; that's not acceptable. So most real-time codecs will work one frame at a time: they will do intraframe prediction, and encoding, and so on, but they will remove everything that requires having several frames in the buffer. That's the notion of a real-time codec.
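The buffer-fill arithmetic above is just a multiplication, but it sets a hard latency floor before the encoder does any work at all. A minimal sketch (function name is mine), using the 10-frame / 10 ms figures from the transcript:

```python
# Latency floor imposed purely by a lookahead buffer: the encoder cannot
# start until it has acquired this many frames from the capture device.
def buffer_latency_ms(frames_required: int, ms_per_frame: float) -> float:
    """Time spent acquiring frames before encoding can begin."""
    return frames_required * ms_per_frame

# 10 frames at 10 ms acquisition each, as in the example above:
print(buffer_latency_ms(10, 10.0))  # -> 100.0
```

This is why real-time codecs drop multi-frame lookahead entirely: any buffer of N frames costs N capture intervals of latency before the first byte is encoded.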

Related Articles

What's Next for WebRTC in 2020

CosMo Software Consulting Founder & CEO Dr. Alex Gouaillard rolls out predictions for WebRTC technology in 2020 in this clip from his Video Engineering Summit presentation at Streaming Media East 2019.

Video: How Does WebRTC Differ From HLS?

Millicast's Alex Gouaillard breaks down the differences between WebRTC and HLS in the streaming pipeline in this clip from his presentation at Video Engineering Summit at Streaming Media West 2018.

Video: Pros & Cons of WebRTC for Live Streaming Playback

VideoRx CTO Robert Reinhardt discusses the benefits and drawbacks of WebRTC in this clip from his presentation in the Video Engineering Summit at Streaming Media West.

Video: Is WebRTC Today's Best Real-World Option for Low-Latency Streaming Playback?

VideoRx CTO Robert Reinhardt discusses the pros and cons of WebRTC in this clip from Streaming Media East 2018.

Video: Is WebRTC the Silver Bullet for Network Latency?

Streaming Video Alliance's Jason Thibeault and Limelight's Charley Thomas address the question of whether WebRTC provides a viable solution for network latency issues in this panel from Live Streaming Summit.