Streaming Live Concerts: Backstage with TourGigs

This article explores what it takes to pull off a live multi-camera concert film, backhaul a compressed stream from a crazy location, and deliver it at scale to users on multiple screens. It also looks at different monetization models, rights management, policy enforcement, royalties, and why some shows go perfectly well and others go perfectly wrong.

Data Mining

Our customers are an invaluable source of data, and anyone with a customer pool big enough to mine should do so. When someone purchases a show by a particular band, we know what kind of music they like and what part of the country they're watching from. These are the things that tour managers like to know.
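To give a flavor of what this kind of mining looks like in code, here is a minimal sketch in Python that counts buyers per genre and region. The record fields and values are illustrative stand-ins, not an actual schema.

```python
from collections import Counter

# Hypothetical purchase records; the field names are illustrative
# stand-ins, not a real billing schema.
purchases = [
    {"band": "My Morning Jacket", "genre": "rock", "region": "CO"},
    {"band": "Umphrey's McGee", "genre": "jam", "region": "IL"},
    {"band": "Umphrey's McGee", "genre": "jam", "region": "CO"},
]

# Count buyers per (genre, region) pair: the kind of breakdown
# a tour manager can use when routing the next tour.
by_genre_region = Counter((p["genre"], p["region"]) for p in purchases)

for (genre, region), count in by_genre_region.most_common():
    print(f"{genre:>6} / {region}: {count} buyer(s)")
```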

We're constantly developing new ways to mine our customer pool for data. That said, with a customer base as big as the one we serve for these rock concerts, the permutations add up fast: Mac or PC, Android or iOS, Comcast vs. Time Warner vs. AT&T, wireless or wired in the house. Put all of those combinations together and it becomes a huge number. If you look through our user agent logs, you see people using terribly outdated browsers or an iOS version that's many releases behind, but you have to support them because they've paid money to watch your stream. We really have two choices: we can either support them or give them their money back. We want to try to support them.
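To illustrate how quickly those permutations pile up in a user agent log, here is a hedged sketch that buckets raw user agent strings by platform and flags badly outdated clients. The regex patterns and the version cutoff are simplified illustrations; real user agent parsing is much messier and usually done with a dedicated library.

```python
import re
from collections import Counter

# Simplified patterns; real user agent parsing is far messier.
# Order matters: iOS strings also contain "Mac OS X".
OS_PATTERNS = [
    ("iOS", re.compile(r"iPhone OS (\d+)")),
    ("Android", re.compile(r"Android (\d+)")),
    ("Mac", re.compile(r"Mac OS X")),
    ("Windows", re.compile(r"Windows NT")),
]

def bucket(user_agent: str) -> str:
    """Classify a user agent into a coarse platform bucket."""
    for name, pattern in OS_PATTERNS:
        m = pattern.search(user_agent)
        if m:
            # Flag iOS versions many releases behind (illustrative cutoff).
            if name == "iOS" and int(m.group(1)) < 9:
                return "iOS (outdated)"
            return name
    return "other"

# Hypothetical log lines standing in for real user agent logs.
logs = [
    "Mozilla/5.0 (iPhone; CPU iPhone OS 7_1 like Mac OS X) ...",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "Mozilla/5.0 (Linux; Android 6.0; Nexus 5) ...",
]

print(Counter(bucket(ua) for ua in logs))
```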

We've found that about 1% of our users will have issues even on a technically perfect event: if 1,000 people are watching, 10 of them just won't be able to get it to work for whatever reason. Sometimes the problems are absolutely mind-boggling and we can't solve them. But we can help the majority of that 1% get watching the show; our time-to-resolution just needs to be measured in minutes, because they don't want to miss a single song. This is a live event they've paid to see, and they want to see it while it's happening. They want to participate in the shared experience, especially if they're in the chat watching everyone else have a great time while they can't get the stream to work.

We do support by phone, chat, and email, and it's been really successful. When viewers have a problem and realize that we're there to support them and will do our best to get them online and watching the show, they become really loyal customers. As an industry, if we really expect streaming to supplant traditional broadcast, we need to make it as easy and reliable as broadcast for end users.

Success Stories and Challenges

Here are a couple of brief success stories. In 2015 we did a My Morning Jacket show at Red Rocks (Figure 4, below), a beautiful outdoor amphitheater in Colorado. The promotional engagement that we got from the band and their management was off the charts, and the success of the show reflected that. We had our all-star production crew headed by Danny Clinch, our A-list shooters, and our A-list gear, and it was one of those nights when everything coalesced perfectly. You couldn't have asked for better production. For the backhaul we used the connection available at Red Rocks; the venue has invested in a very strong, commercial-grade connection that's plenty fast.

Figure 4. My Morning Jacket at Red Rocks

A contrasting success story is Umphrey's McGee's 2014-2015 tour, which their fans call the Couch Tour. We did multiple stops, dozens of streams overall during the run. We relied heavily on the Ka-band satellite uplink because we were moving from one venue to the next so quickly that there really wasn't a way to arrange bandwidth or connectivity in advance at every site. Our crew lived on the Umphrey's McGee crew bus and was actually integrated into the production crew, which greatly reduced our costs because we weren't paying for flights or hotels. We provided the streams at a low cost both to us and to the audience at home. We also used local shooters a lot: in each city we were hitting, we had a list of people we could call to come and shoot instead of flying out shooters to each site. We made heavy use of robotically controlled pan-tilt-zoom (PTZ) cameras, as well as locked-off angles. You can get by with only a few shooters on a live stream if you have enough locked-off camera angles and you direct it correctly.

It would be dishonest to say that every stream went great. Everyone who does live production knows that there are times when things just break and you hit failure modes. It's very important to own up to that and to try to improve the technology and how we handle these kinds of challenges. Otherwise we're going to have tech writers telling us, yet again, that live streaming isn't ready for prime time. I see those kinds of headlines in the tech press all the time, and I think we just need to recognize that and keep improving.

As I said earlier, we get one shot at this. We have to be on our game. But no matter how much we plan and try to be on our game, things will break. Every component in the block diagram in Figure 2 has failure modes of its own. We don't always know how a component will fail; we can do our best to engineer around that and build in redundancy, but some part of the system is going to break in a way you weren't expecting.
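One common way to engineer around unknown failure modes is a watchdog that health-checks each component and fails over to a backup path. Here is a minimal sketch of that pattern; the component names and the stub probes are hypothetical, not a description of any particular rig.

```python
import time

# Stub health probes; in a real rig these might query an encoder's
# status API, measure uplink packet loss, or check CDN ingest heartbeats.
def check_primary_uplink() -> bool:
    return False  # simulate the primary uplink going down

def check_backup_uplink() -> bool:
    return True

# Each hypothetical component tracks its probes and which path is active.
components = {
    "uplink": {
        "probe": check_primary_uplink,
        "backup_probe": check_backup_uplink,
        "on_primary": True,
    },
}

def watchdog_pass(components: dict) -> None:
    """One watchdog sweep: fail over any component whose primary is down."""
    for name, c in components.items():
        if c["on_primary"] and not c["probe"]():
            if c["backup_probe"]():
                c["on_primary"] = False
                print(f"{name}: primary failed, switched to backup")
            else:
                print(f"{name}: primary AND backup are down; page the crew")

# Poll every few seconds; tune the interval to how fast you must react live.
for _ in range(3):
    watchdog_pass(components)
    time.sleep(5)
```

The point of the sketch is the pattern, not the probes: every component gets a cheap health check and a pre-planned fallback, so the failure you didn't predict still has a rehearsed response.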

For example, we filmed an outdoor festival where a drunk guy stumbled into the dish, knocked it over, and took the stream offline. That's much less likely to happen with a satellite truck, of course, but this was way backstage, cordoned off with bike gates, and it still happened. Weather can also cause problems, as can bad cables. We had our chat system crash early on. We've had our payment gateway crash. Once our Facebook API integration broke, which caused slow loading of our site as requests timed out. We've had the whole site crash because developers got a little too eager and pushed out hot fixes and new features right before an event. We've uncovered obscure hardware and software bugs in the systems that we use. We've had power failures.

We've also had to contend with residential cable provider outages. We can point the affected users to the Downdetector site, but at the end of the day they're going to have a bad experience because of factors beyond our control. As one of my developers says, that's life on the internet and we just have to get on with it.

The Future of Live Streaming

Live content is the most compelling video content out there. Live delivery can create a shared experience where your end users are watching the same live content as other viewers with similar interests and interacting with them on the chat. Concerts lend themselves particularly well to this shared live experience.

The audience for live streaming is going to grow organically. More and more people are getting online. More and more companies are delivering concert streams and raising awareness that this is a great way to experience live music. The Chromecasts and Fire Sticks and all these types of devices are great for enabling this experience and making it easy.

The analytics we get from our audiences are marketing and engineering gold. We're just starting to mine a lot of the marketing data; soon my engineers will start looking at the performance metrics of people watching our streams. 4K and immersive VR are just starting to ramp up, and for those of us streaming live concerts they represent great opportunities. Concerts are already augmented experiences in multiple ways: the music is really loud, there's a light show, and so on. All of that translates really well into an immersive VR experience. In a VR setting, we can do things in streaming that aren't physically possible even for concertgoers at the venue itself, so I'm very excited about what the future holds for live concert streaming.
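Circling back to those performance metrics: a simple place to start is rebuffering ratio, the fraction of a session a viewer spends stalled. Here is a minimal sketch computed from hypothetical player events; the event format is an illustration, not any real player's beacon schema.

```python
# Hypothetical player events: (timestamp_seconds, event_type).
# A real player beacon stream is much richer; this format is illustrative.
events = [
    (0.0, "play"),
    (42.0, "buffer_start"),
    (44.5, "buffer_end"),
    (300.0, "stop"),
]

def rebuffer_ratio(events):
    """Fraction of the session spent stalled in rebuffering."""
    session_start = events[0][0]
    session_end = events[-1][0]
    stalled, stall_start = 0.0, None
    for t, kind in events:
        if kind == "buffer_start":
            stall_start = t
        elif kind == "buffer_end" and stall_start is not None:
            stalled += t - stall_start
            stall_start = None
    return stalled / (session_end - session_start)

print(f"Rebuffer ratio: {rebuffer_ratio(events):.2%}")  # ~0.83%
```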