The Digital Domain: User-Generated Content Live (sort of)
It was a circus, but somehow we managed to get some phone cam footage into a live webcast of Linkin Park's Projekt Revolution show from Detroit.
I recently had the opportunity to participate in a webcast that aimed to take user-generated content to the next level. Marc Scarpa from Simply New called up with an interesting proposition. He was webcasting Projekt Revolution, Linkin Park’s annual touring extravaganza, from Detroit with a truck full of high-definition equipment. The webcast would be featured on a MySpace page created for the tour. But he wanted to add something special, something that would distinguish this webcast from the myriad other live concert webcasts. Marc is always pushing the limits of technology, and in the past we’d worked together on a number of firsts.
This time, he wanted to include footage shot on cell phones, and use those cell phones as additional cameras. The cameras would be carried by fans of the three headlining bands, and chosen from their respective fan clubs. These "camera kids" would gather footage all day long—everything from interviews with other fans to exclusive side-of-stage and backstage access with their favorite artists.
The best part was that this footage would actually be webcast live. The phones would be loaded with PocketCaster software, which enabled them to broadcast the footage to a PocketCaster server. My job would be to work with the camera kids, producing and directing their footage. We’d switch to the camera kids at predetermined times for human-interest stories, and switch to them during performances for point-of-view shots.
The PocketCaster software enables video broadcast over third-generation (3G) mobile networks, as well as Wi-Fi. It supports a number of different encoding profiles, each tailored for a particular mobile network. Since we wanted the highest possible quality, the plan was to use a local Wi-Fi network so we could broadcast at a relatively high bitrate (300Kbps at 320x240). I’ll save a full review of the PocketCaster system for a future issue. Suffice it to say, it does what it’s supposed to do, and it made me believe that cell phone video is actually cool.
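To put that 300Kbps, 320x240 profile in perspective, here's a quick back-of-the-envelope calculation of how many encoded bits each pixel gets per frame. The 15fps frame rate is my assumption (typical for phone video of that era); PocketCaster's actual profiles may differ.

```python
# Rough quality check for a 300Kbps, 320x240 mobile stream.
# NOTE: the 15fps frame rate is an assumption, not a PocketCaster spec.

def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Average encoded bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

bpp = bits_per_pixel(300_000, 320, 240, 15)
print(f"{bpp:.3f} bits/pixel")  # prints "0.260 bits/pixel"
```

Around a quarter of a bit per pixel is respectable for mobile video at this size, which is why the footage looked as good as it did over Wi-Fi.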
Switching would be done via the PocketCaster studio interface. It’s a browser-based tool that lets you switch between a number of live inputs, as well as a library of previously shot clips. There are a number of preview windows, one for each live input, and one for the clip library. Click on a preview and it becomes the master output. This output would then be scan-converted and sent to an input on the main switcher, which Marc could switch to whenever he wanted.
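The switching model described above is simple enough to sketch: a set of named preview sources (the live phone inputs plus a clip library), where selecting one makes it the master output. This is an illustrative model only; the names and structure are hypothetical, and the real PocketCaster studio interface is a browser-based tool.

```python
# Hypothetical model of the studio switching logic: click a preview,
# it becomes the master output. Source names are illustrative.

class Switcher:
    def __init__(self, sources):
        self.sources = set(sources)
        self.master = None  # nothing on the master output yet

    def select(self, name: str) -> str:
        """Simulate clicking a preview window: that source becomes master."""
        if name not in self.sources:
            raise KeyError(f"unknown source: {name}")
        self.master = name
        return self.master

sw = Switcher(["phone-1", "phone-2", "phone-3", "clip-library"])
sw.select("phone-2")
print(sw.master)  # prints "phone-2"
```

The master output from this switcher was then scan-converted into an ordinary video input, so from the main switcher's point of view the whole phone system was just one more camera.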
The DTE Energy Theater is bowl-shaped, so we knew we’d have challenges covering the entire venue with a Wi-Fi network. Also, the entire stage was made of metal, which blocks and reflects radio signals, posing another physical challenge. But with enough repeaters and powerful-enough antennas, we figured we’d be covered. During the walk-through we discovered an existing Wi-Fi network, but agreed to stay off its channels to avoid interference.
Fantastic, right? Wrong. On the day of the show, we couldn’t establish a live connection for more than a couple of minutes. Not only that, but we couldn’t get a signal at the side of the stage, or, for that matter, in front of the stage, never mind fifty yards away. We tore down our entire infrastructure and rebuilt it numerous times. Nothing worked. That was when we realized there were seventeen—count ‘em: seventeen—wireless networks within range.
Where in the world did they come from? There are only eleven available wireless channels in the 2.4GHz band, and only three of them (1, 6, and 11) are far enough apart not to overlap. No matter how you slice it, there were too many networks. Even if we could figure out where they were coming from, what could we do? Tell people to shut off their networks? Not gonna happen. We tried the AT&T MediaNet, Verizon, and Sprint mobile networks. No signal. Live just wasn’t going to happen.
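The channel math above is worth spelling out: US 2.4GHz Wi-Fi channels sit 5MHz apart, but each signal occupies roughly 22MHz, so channels fewer than five apart interfere with each other. A quick sketch shows why only three networks can coexist cleanly, let alone seventeen.

```python
# Why seventeen networks on eleven channels is hopeless: 2.4GHz channels
# are numbered 5MHz apart, but each transmission is ~22MHz wide, so any
# two channels fewer than 5 apart overlap.

def non_overlapping(channels=range(1, 12), min_gap=5):
    """Greedily pick the lowest-numbered channels spaced >= min_gap apart."""
    chosen = []
    for ch in channels:
        if all(abs(ch - c) >= min_gap for c in chosen):
            chosen.append(ch)
    return chosen

print(non_overlapping())  # prints "[1, 6, 11]"
```

With at most three clean channels available, seventeen networks guarantees that most of them are stepping on each other, which is exactly what our connection drops looked like.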
So we had the kids archive footage locally to their phones, and run back every five minutes or so. Then we’d load their footage onto a local server, and use this footage as "live." It was a circus, but somehow we managed to pull it off, and managed to get some phone cam footage into the live show. The kids loved it, we heard good reports from viewers, and apparently no one was the wiser. Another one pulled out of a hat, thanks to the colossal efforts of the PocketCaster crew and the infinite patience of Marc, the director.
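The workaround above amounts to a sneakernet playout loop: clips come back on foot, land in a folder on the local server, and get queued as "live." Here's a minimal sketch of that drop-folder pattern; the paths, the `.3gp` extension (the common phone video format of the day), and the playout callback are all my assumptions, not the actual tooling we used.

```python
# Minimal sketch of the "near-live" workaround: poll a drop folder for
# clips the camera kids carry back, and hand each new file to a playout
# callback. Paths, file extension, and callback are hypothetical.

import time
from pathlib import Path

def watch_drop_folder(folder: Path, on_new_clip, poll_seconds: float = 5.0,
                      rounds: int = 1):
    """Poll a folder; call on_new_clip once for each file not seen before."""
    seen = set()
    for _ in range(rounds):
        for clip in sorted(folder.glob("*.3gp")):  # 3GP: common phone format
            if clip.name not in seen:
                seen.add(clip.name)
                on_new_clip(clip)  # e.g., add to the "live" playout queue
        time.sleep(poll_seconds)
```

In practice a human clicking clips in the studio interface played the same role as `on_new_clip` here; the point is just that a five-minute shuttle plus a local queue is indistinguishable from "live" to the audience.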
Live user-generated content is great, but where is it all going to go? We need more bandwidth and more spectrum—that much is clear. But will it ever be enough? This is a particularly apt time to be asking these questions, with Congress about to auction off spectrum currently used by wireless audio technology. Where is that technology going to live now? I’m reminded of my friend Dennis Wilen’s law: "Bandwidth expands to fit the waste available." Dennis, you were years ahead with that one.