6 Things Every Live Stream Needs
Watch the complete presentation from Streaming Media West, LS102: Live Streaming Best Practices, in the Streaming Media Conference Video Portal.
Read the complete transcript of this clip:
Robert Reinhardt: I'm always learning what to ask my clients when new problems present themselves. You know, on a very basic level, I sometimes have to explain to my clients what live event production means, and you know, you always need something that you're gonna be capturing.
In this situation we've got a camera person in the room, along with my desktop that's being screen-grabbed in the back, and this archive will be available 'cause they're recording it on a Ninja in the back, it looks like. If they were streaming this, they'd need to, of course, have an encoder to push it out. You need an endpoint that's gonna receive it, you need a player, and you might need an archive, as well as, of course, the internet working in that room.
I would say one of the biggest problems I can have, especially if I'm remote or in a new venue, is making sure that the internet's not just working on the day of, or the day before, or the week before, whenever we're doing our speed test, but making sure that the IT resources are available should the DHCP lease expire, you know, hours before an event happens, and then we're stuck without having our internet. Capture sources could be any number of resources that are available in the production kit.
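That venue-connectivity worry lends itself to simple automation. Below is a minimal Python sketch of a pre-event reachability check against an ingest endpoint; the hostname and port are placeholder assumptions, not anything specifically recommended in the talk.

```python
import socket

def check_ingest(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the ingest endpoint succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical RTMP ingest endpoint -- substitute your CDN's hostname.
# A speed test a week out isn't enough; re-run this on site, day of.
endpoints = [("a.rtmp.youtube.com", 1935)]
for host, port in endpoints:
    print(host, "reachable" if check_ingest(host, port) else "UNREACHABLE")
```

A check like this only proves the ingest port answers; it doesn't replace a proper upload-bandwidth test, but it catches the "DHCP lease expired overnight" class of failure quickly.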
I'm gonna gloss over some of this because it's not as important. Encoders, this is an interesting point, because I know so many people among a couple of webcaster groups on Facebook where there are technically minded people discussing their solutions. And there's a lot of--I would say the vast majority--of live event production that I see happening with smaller budgets is happening with software, right? They're using--a lot of field folks are using--OBS because it's free, and it can do pretty much everything that FMLE (Flash Media Live Encoder) from Adobe did before it, and more.
You can even use FFmpeg. I did the FFmpeg workshop yesterday, and I was talking about how FFmpeg can capture from a webcam, can capture from DeckLink devices. It can capture, you know, it's amazing what you can do with FFmpeg. I usually use FFmpeg in a pinch. If something's broken, it's sort of my last resort to encode something through FFmpeg on a laptop that could drive a live stream.
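As a concrete illustration of that last-resort role, here's a sketch (in Python, just to assemble the argument list) of the kind of FFmpeg invocation that captures a webcam and pushes H.264 out over RTMP. The capture device, bitrate, and ingest URL are illustrative assumptions, not the exact command from the workshop.

```python
# Build a last-resort FFmpeg command: grab a webcam and stream H.264 via RTMP.
# Device name and ingest URL are placeholders -- adjust for your rig.

webcam = "/dev/video0"                            # v4l2 device on Linux (assumption)
ingest = "rtmp://live.example.com/app/streamkey"  # hypothetical endpoint

cmd = [
    "ffmpeg",
    "-f", "v4l2", "-i", webcam,   # capture source
    "-c:v", "libx264",            # encode to H.264
    "-preset", "veryfast",        # favor speed over compression on a laptop
    "-b:v", "2500k",
    "-g", "60",                   # keyframe interval (2 s at 30 fps)
    "-f", "flv", ingest,          # RTMP expects an FLV container
]
print(" ".join(cmd))
# To actually stream (with ffmpeg installed):
#   import subprocess; subprocess.run(cmd, check=True)
```

On macOS or Windows the capture input would be `avfoundation` or `dshow` instead of `v4l2`, and a DeckLink card would use FFmpeg's `decklink` input device.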
I joked in my workshop yesterday that I like to do non-profit work--I like to do pro bono work for non-profits that I believe in. One of those is Pacific Wild in Canada, and then the Pacific Northwest in the United States, and they wanted to have a live stream on social media outlets of the effluent coming out of fish farm processing, to show just how much of this potentially polluted fish by-product was being thrown right out into the bay where other fish would have to survive.

They were literally in a shack. Hiding. It was a covert operation. They were in a shack that barely had two bars' worth of signal, and they didn't have any encoding gear. So they were originally gonna use this really high-end Axis camera that had a waterproof casing that they could navigate--didn't need a diver--and they were gonna run this thing down. But the problem was, where this pipe was coming out, it was a good 110 feet below the surface of the water, so they didn't have a long enough cable to tether this camera down to where they wanted to grab it. They only had this ancient composite video source that could get down there, so it was a very low-quality video image, but it was all analog at that point, and that wasn't our plan. We were just gonna take this H.264 feed coming right out of the Axis camera, push it on out through their cell connection, and make it live on Facebook.
Well, at the last minute it didn't work, so I had to basically send them a Blackmagic Design Intensity Shuttle, 'cause that was the only thing I had that would work on their ancient Windows laptop. It had USB 3, thankfully, and I basically remote-controlled their computer with TeamViewer, downloaded a pre-built binary of FFmpeg, and started streaming that way. And it worked. We were able to get a stream out. They were complaining about the quality, but there wasn't much I could do about the fact that they went from a really nice-looking HD camera to an SD source over a composite video connection. And by that I mean that old-fashioned yellow RCA plug that you rarely see even on equipment anymore.
Again, just going into some definitions. The endpoint that you're publishing to could be a CDN like Akamai, it could be your own server that you're running in the cloud, or it could be a social media outlet like Facebook or YouTube. And I already mentioned streaming servers there--something that's running on-prem, at a colo, or in a cloud. I will talk more about social media outlets in a little bit.
The player could be, of course, anything--like I just said, there are a lot of player offerings on the market, a lot of them free. You can do a lot with Video.js, and like I said, a lot of my clients already come with JW Player infrastructure, so I just work with that. It's probably the player my team and I have done the most development with. And let me go over here. So, not every live stream will need these, and this is an important part of the equation to talk about with stakeholders when you're planning a live stream event.
Do they expect there to be some way to prevent others from getting into it? Do they need low latency? It's always funny to me when a client will say, "Oh, the stream's delayed." It could be up to 30 seconds with a default HLS installation, right? You've got ten-second chunks, three chunks on the server, and you might have a little bit of latency in the cloud for deploying that. And it's just like, well, is that gonna be a problem? "Oh no, I just think it's delayed." I'm like, "Okay, well, if it doesn't matter to your viewing audience, then, you know, we could make it more low-latency, but it might involve some extra time on development and tweaking our encoding settings."
It's pretty easy to get latency down to six seconds with HLS: just do two-second chunk sizes, two-second keyframe intervals, and specify the server to keep three, maybe even only two, chunks per playlist. There's more overhead in your requests that way, but it's easy to get HLS well under 30 seconds if you need to.
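The arithmetic behind those two figures can be sketched as below. The model--playlist depth times segment duration, plus a rough delivery overhead--is a back-of-the-envelope simplification, not a formal HLS latency formula.

```python
def hls_latency_estimate(segment_s: float, playlist_segments: int,
                         overhead_s: float = 0.0) -> float:
    """Rough worst-case glass-to-glass delay for plain (non-low-latency) HLS:
    players typically start playback about a full playlist behind the live edge."""
    return segment_s * playlist_segments + overhead_s

# Default-ish HLS: ten-second chunks, three on the server -> ~30 s behind live.
print(hls_latency_estimate(10, 3))  # 30.0
# Tuned: two-second chunks, three per playlist -> ~6 s behind live.
print(hls_latency_estimate(2, 3))   # 6.0
```

Shorter segments mean more playlist refreshes and segment requests per minute, which is the extra overhead mentioned above.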
Secure playback. I made that as a differentiator from DRM. DRM, of course, is gonna be there for the kind of live streams with, you know, studio-level content. Most of my clients aren't concerned with DRM encryption on their streams because, again, if it's government or municipalities, it's just not in their rule book.
They might want secure playback with encryption, though. Especially with the medical stuff that I'm doing, and for HIPAA compliance, having everything encrypted over HLS is something that needs to happen, and also maybe even having this, what we call limited access, where you're using some kind of token string in your playback URL to uniquely identify a viewer so that the stream can't get out to anyone else.
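One common way to implement that kind of limited access is an expiring, per-viewer HMAC token appended to the playback URL, which the edge server verifies before serving the playlist. This is a generic sketch, not VideoRx's actual scheme; the secret, parameter names, and TTL are assumptions.

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

SECRET = b"rotate-me"  # shared only between the token issuer and the edge server

def sign_playback_url(base_url, viewer_id, ttl_s=300, now=None):
    """Append a per-viewer, expiring token to an HLS playlist URL."""
    expires = (int(time.time()) if now is None else now) + ttl_s
    msg = f"{base_url}|{viewer_id}|{expires}".encode()
    token = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    query = urlencode({"viewer": viewer_id, "expires": expires, "token": token})
    return f"{base_url}?{query}"

def verify_token(base_url, viewer_id, expires, token, now=None):
    """Edge-side check: reject expired requests, then recompute the HMAC."""
    if (int(time.time()) if now is None else now) > expires:
        return False
    msg = f"{base_url}|{viewer_id}|{expires}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)
```

Because the viewer ID and expiry are bound into the signature, a leaked URL stops working when it expires, and a URL re-used with a different viewer ID fails verification.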
VideoRx CTO Robert Reinhardt discusses the key elements of budgeting and bidding live event streams for clients--from labor to equipment to deployment--in this clip from his presentation at Streaming Media West 2019.