
Video: How to Optimize Video Delivery to iPhone Users

Learn more about mobile delivery at Streaming Media West.

Read the complete transcript of this clip:

Nathan Moore: What if I want to stream to iPhone users? Well, if I care about iPhone users, how do I get the video to them? I use a protocol called HLS. Great, I'm going to use a protocol called HLS. Now, how does the HLS object actually get over there? Well, you're on an iPhone, you're on a 4G network. What's wrong with 4G networks? Their connections can be spotty, and their latencies vary wildly depending upon where you are. If you've ever walked through a tunnel with your cell phone and lost the signal, you know what the problem with 4G is.

If I want to do tens of thousands of these things simultaneously, I can't just do this with one server. And lastly, where are they? This is the last problem that a lot of people don't think about, because of the old broadcast model: you have just one central broadcast tower, and it reaches everybody in the local area. That's no longer the case. The whole wide world can talk to your server. If they're very, very far away from you, then they're going to have a very bad time due to the very bad latencies, and if they're very, very close, it's going to be all right.

HLS stands for HTTP Live Streaming. That tells you it has a dependency upon the HTTP protocol, the Hypertext Transfer Protocol, but HTTP itself has a further dependency on TCP, the Transmission Control Protocol, in order to get that HTTP object there in the first place. So it's layer upon layer upon layer of different protocols, all of which have to interoperate, all of which have to operate correctly, and all of which have to do their job properly in order for your end user to get a lovely, continuous video.
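To make that layering concrete, here is a minimal sketch in Python. The hostname and playlist path are hypothetical placeholders (and real deployments would use HTTPS rather than plain port 80); the point is only to show TCP carrying HTTP carrying an HLS playlist.

```python
# Sketch of the HLS -> HTTP -> TCP layering described above.
# HOST and PATH are hypothetical placeholders, not real endpoints.
import socket

HOST = "example-cdn.com"      # hypothetical origin/CDN host
PATH = "/live/stream.m3u8"    # hypothetical HLS playlist path

# Layer 1: TCP -- a reliable byte stream to the server.
with socket.create_connection((HOST, 80), timeout=5) as conn:
    # Layer 2: HTTP -- a plain GET request written onto that TCP stream.
    request = (
        f"GET {PATH} HTTP/1.1\r\n"
        f"Host: {HOST}\r\n"
        "Connection: close\r\n\r\n"
    )
    conn.sendall(request.encode("ascii"))

    # Read the full response (headers + body) off the TCP stream.
    response = b""
    while chunk := conn.recv(4096):
        response += chunk

# Layer 3: HLS -- the response body is an .m3u8 playlist listing segments.
body = response.split(b"\r\n\r\n", 1)[-1].decode("utf-8", errors="replace")
segments = [line for line in body.splitlines()
            if line and not line.startswith("#")]
print(f"Playlist references {len(segments)} segment/variant URIs")
```

If any one of those layers misbehaves, everything above it stalls, which is exactly the fragility the transcript goes on to describe.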

Assume you want to stream video from a server to an iPhone, and your video is encoded at 1 megabit per second. Obviously these numbers are deliberately rounded to make the math super, super simple; in the real world you'll get slightly different measurements. We're sending from the server to the iPhone with 1 second of latency each way: it takes 1 second for data to reach the iPhone, and 1 second for the iPhone's reply to come back to the server.

So long as TCP can negotiate a speed of 1 megabit per second, the iPhone can play the 1 megabit per second encoded video at its intended rate, right? We do that by sending 2 megabits, then pausing: it takes 1 second for the data to go over, the iPhone sends its acknowledgement right back, and 2 megabits divided by the 2 seconds it took to deliver them is the 1 megabit per second that we needed. Great.
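As a quick check of that arithmetic, here is a small sketch using the clip's round numbers (1 Mbps encode rate, 1 second of latency each way, 2 megabits sent per round trip):

```python
# Worked version of the no-loss case using the clip's round numbers.
ENCODE_RATE_MBPS = 1.0     # video encoded at 1 megabit per second
ONE_WAY_LATENCY_S = 1.0    # 1 second server -> iPhone, 1 second back
BURST_MBITS = 2.0          # server sends 2 megabits, then waits for the ACK

round_trip_s = 2 * ONE_WAY_LATENCY_S          # 2 seconds
throughput_mbps = BURST_MBITS / round_trip_s  # 2 Mb / 2 s = 1.0 Mbps

assert throughput_mbps >= ENCODE_RATE_MBPS    # keeps up with playback
print(f"No loss: {throughput_mbps:.1f} Mbps effective throughput")
```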

What happens if we have loss? We're still encoded at 1 megabit per second, and we send 2 megabits of data. The iPhone never receives it, so it can't send an acknowledgment; the server waits 2 seconds and says, "I know you're a second away, so it should have taken 2 seconds for me to get your acknowledgment, and I didn't get it. I'd better retransmit." This time the iPhone gets it and sends its acknowledgment back, and another 2 seconds have passed. Now do the math: we just sent 2 megabits in 4 seconds, which is only half a megabit per second.
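The same sketch extended with one lost burst shows where the half-megabit figure comes from:

```python
# Worked version of the loss case: one lost burst forces a retransmit.
ONE_WAY_LATENCY_S = 1.0
BURST_MBITS = 2.0

timeout_s = 2 * ONE_WAY_LATENCY_S          # server waits one round trip before retransmitting
retransmit_rtt_s = 2 * ONE_WAY_LATENCY_S   # the retransmitted burst still needs a full round trip

total_time_s = timeout_s + retransmit_rtt_s     # 4 seconds in all
throughput_mbps = BURST_MBITS / total_time_s    # 2 Mb / 4 s = 0.5 Mbps
print(f"With one loss: {throughput_mbps:.2f} Mbps")  # half of what playback needs
```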

If you're watching a video encoded at 1 megabit per second but only receiving half a megabit per second, what happens? Delay, stuttering, buffering, buffering, buffering ... And this is why things like retransmits are so incredibly important. This is why it's amazing that this stuff works over a 4G network, which is intrinsically lossy, with intrinsically variable latencies, meaning that if we ever have to retransmit, we could be waiting milliseconds or we could be waiting several seconds; it depends. It's really quite remarkable that this stuff works as well as it does.
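One way to see why that ends in buffering is to track the player's buffer over time. This is a toy model under assumed numbers (a fixed 0.5 Mbps delivery rate from the lossy example, a 4-second starting buffer, and no adaptive bitrate switching), not how a real player is implemented:

```python
# Toy buffer model: playback drains 1 Mbps while the network delivers only 0.5 Mbps.
ENCODE_RATE_MBPS = 1.0
DELIVERY_RATE_MBPS = 0.5   # effective throughput from the lossy example above
buffer_mbits = 4.0         # assume the player starts with 4 seconds of video buffered

for second in range(1, 11):
    buffer_mbits += DELIVERY_RATE_MBPS - ENCODE_RATE_MBPS   # net drain of 0.5 Mb each second
    if buffer_mbits <= 0:
        print(f"t={second}s: buffer empty -> stall, rebuffering...")
        buffer_mbits = 0.0
    else:
        seconds_buffered = buffer_mbits / ENCODE_RATE_MBPS
        print(f"t={second}s: {buffer_mbits:.1f} Mb ({seconds_buffered:.1f} s) buffered")
```

The buffer drains by half a megabit every second and the stream stalls within a few seconds, which is the "buffering, buffering, buffering" the clip describes.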
