
Making History One Webcast at a Time


5. Execution: The strategic deployment team was organized to work quickly and decisively, partnering with Thirteen/WNET to convey specifications and secure sign-off on the full production strategy: signal acquisition, preroll, network engineering with full redundancy, and player/codec selection and integration with a CDN. All of this, and more, had to be achieved on a nonprofit organization's budget.

To ensure success on this project, the first course of action was to assemble the team. I immediately brought in my longtime colleague, PowerStream Networks CTO Paul Kromrei, as my lead partner on the project. Introductions were made via email to the leads assembled and assigned across Thirteen's internet, executive, and IT departments.

A simple project in Basecamp was set up to ensure that everyone who was brought into the loop stayed in the loop.

The CDN selected was PowerStream, which met the georestriction requirement and was implemented shortly after the first discovery call. If the end user was within the U.S., the live stream would buffer and play; users outside the U.S. were instead served a secondary video clip with a disclaimer explaining that the stream was restricted in their region.
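The decision logic is simple to sketch. The following is an illustrative example only, not PowerStream's actual implementation; the stream URLs and the GeoIP lookup stub are hypothetical stand-ins (a real deployment would query a GeoIP database or the CDN's edge).

```python
# Hypothetical sketch of the georestriction decision described above.
# URLs and the GEOIP_STUB table are placeholders, not real endpoints.

LIVE_STREAM_URL = "rtmp://example-cdn/live/main"        # assumed name
DISCLAIMER_CLIP_URL = "rtmp://example-cdn/vod/blocked"  # assumed name

# Tiny stand-in for a GeoIP lookup keyed on client IP address.
GEOIP_STUB = {
    "198.51.100.7": "US",  # documentation-range addresses (RFC 5737)
    "203.0.113.9": "DE",
}

def select_stream(client_ip: str) -> str:
    """Return the live stream for U.S. viewers; everyone else gets
    the geoblock disclaimer clip."""
    country = GEOIP_STUB.get(client_ip, "??")
    return LIVE_STREAM_URL if country == "US" else DISCLAIMER_CLIP_URL
```

Unknown addresses fall through to the disclaimer clip, which is the safe default when a rights restriction is the reason for blocking.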

Once georestriction was configured, 50 colleagues in multiple geographic locations attempted to hit the stream to verify that end-user locations were being detected and then passed or blocked correctly.

Firewall Tunnel
Thirteen/WNET directed us to the head of its IT department. He implemented all the change requests to the firewall appliance, from the ISP down to the switch leading to our encoders, and sat with us in the server room during the testing period to ensure that requirements were met 100%.

We required a series of internal static IP addresses with ports set to 100Mbps full duplex, which were readily provided. The encoders were set to point to lucky port 7777, and all other ports remained closed for security purposes.
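A spot check like the one the team ran in the server room can be sketched as a TCP-connect probe: confirm the encoder port answers and other ports refuse connections. This is a generic illustration, not the tooling actually used on site.

```python
import socket

ENCODER_PORT = 7777  # the "lucky port" from the article

def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Attempt a TCP connection; True if the port accepts it.

    A probe like this, run from inside and outside the firewall,
    verifies that the intended port is open and the rest are closed.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Running the same probe from both sides of the firewall distinguishes "service down" from "blocked by the appliance."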

Dual ISPs were provisioned within a few days, with dual public-facing static IP addresses aligned to lucky port 7777. Transport over these links supported up to five live streams, pulled into primary, secondary, and tertiary data centers housing media servers located in Columbus, Ohio; Detroit; and Miami.

When the route was traced from the primary media server to the encoder, there were a mere four hops from Miami to the encoder in New York.

Kromrei personally attended to the creative networking needs required to make the behind-the-scenes efforts infallible. The solution included 10x "hot" failover, and Kromrei stayed up late with us executing a rigorous series of QA tests and scenarios that included a series of 20 negative must-pass test case scenarios and multiple disaster recovery switchovers among the dual encoders, ISPs, and POPs.
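The switchover drills above amount to pulling from the first healthy origin in priority order. The sketch below shows that shape; the origin names follow the data centers mentioned earlier (with Miami primary, per the traceroute), but the health check is an injected assumption, not PowerStream's actual mechanism.

```python
# Illustrative failover selection, not the production implementation.

ORIGINS = ["miami", "detroit", "columbus"]  # primary, secondary, tertiary

def pick_origin(is_healthy) -> str:
    """Return the first origin whose health check passes.

    `is_healthy` would be an HTTP or stream probe in production;
    it is passed in here so the logic is testable without a network.
    """
    for origin in ORIGINS:
        if is_healthy(origin):
            return origin
    raise RuntimeError("all origins down; trigger disaster recovery")
```

Negative test cases like the 20 described above map naturally onto this: fail each origin in turn and assert the pull moves down the list.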
