Rich's Media: Close Enough for Government Work
This article first appeared in the December 2009/January 2010 issue of Streaming Media magazine.
With more than 1.8 million employees (not including the military or the U.S. Postal Service), the U.S. federal government is the nation’s largest employer. If there is one organization that always needs more and better communications and training, it’s the federal government. We’ll examine several cases that illustrate how the government has embraced streaming media in more detail in a future column, but for now, here’s a little streaming history.
Some readers may recall that there was a time (about 1994) when a networking technology called asynchronous transfer mode (ATM) was going to cure cancer and win the war. Chief among the tools in the ATM war chest was OC-3c, which provided 155Mbps of optical networking capacity, far more than mere 10Mbps Ethernet could offer; 100Mbps Ethernet was still too expensive at the time and had not yet been standardized.
In a remarkable collaboration, the new standard for video transmission, MPEG-2, was formally organized into 188-byte transport packets so that each packet would fit perfectly into ATM cells (four AAL1 cells, at 47 payload bytes apiece, carry exactly one packet). It retains this structure today. After all, the key characteristic of ATM that made it so appealing was that it could support data, voice, and video, which could not be done with Ethernet and IP at the time. Ethernet networks were largely deployed using shared media hubs, not switches; routers couldn't multicast; and "video" was thought to be a $50,000 PictureTel teleconferencing system.
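The arithmetic behind that packaging is worth a quick look. Here's a minimal sketch (assuming AAL1 adaptation, which leaves 47 payload bytes in each 53-byte cell):

```python
# Sketch: why a 188-byte MPEG-2 transport packet maps neatly onto ATM.
# Assumes AAL1 framing: each 53-byte ATM cell carries a 5-byte cell header
# plus a 1-byte AAL1 header, leaving 47 bytes of payload per cell.
ATM_CELL_BYTES = 53
CELL_HEADER_BYTES = 5
AAL1_HEADER_BYTES = 1
PAYLOAD_PER_CELL = ATM_CELL_BYTES - CELL_HEADER_BYTES - AAL1_HEADER_BYTES  # 47

TS_PACKET_BYTES = 188  # fixed MPEG-2 transport stream packet size

cells_needed = TS_PACKET_BYTES / PAYLOAD_PER_CELL
leftover = TS_PACKET_BYTES % PAYLOAD_PER_CELL

print(cells_needed)  # 4.0 -- exactly four cells per transport packet
print(leftover)      # 0   -- no padding bytes wasted
```

In other words, the 188-byte packet size divides evenly into AAL1 payloads, so no cell capacity is thrown away on padding.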
It was a fair bet that ATM would become the universal networking technology, replacing the DECnet coaxial Ethernet or the Cabletron hub in the enterprise and integrating seamlessly with the ATM backbone networks of Sprint, MCI, AT&T, and other carriers. As the networking boom unfolded, Qwest did with fiber what Eisenhower did with concrete, all using ATM; much of that buildout was driven by the expected wave of video that would require it.
There were ATM successes. The Department of Defense mandated ATM, and high-speed worldwide ATM networks (DREN, the Defense Research and Engineering Network, for example) carried plenty of ATM-based video traffic. I personally installed MPEG equipment at government installations to deliver live, classified mission video across the country and to remote locations, and it worked flawlessly; I believe it is still in use today. There were commercial successes too. Several Wall Street firms used video-over-ATM for their daily "morning call" briefings to traders. When the capital of Germany moved from Bonn to Berlin, the old and new capitals were interconnected via ATM, with video streamed in the old reliable 188-byte packets. ATM is still very much with us, as most DSL lines are actually structured as IP-over-ATM. It is an important carrier technology, but rarely do we see it in its original form.
By the final years of the Clinton administration, the public internet was in full swing and video-over-IP was mainstream (no pun intended). Switched Ethernet, with 10Mbps to the desktop and 100Mbps workgroup uplinks, was common. Voice-over-IP had helped clear the way for video-over-IP in the enterprise by forcing IT staffs to deliver a truly engineered network, as opposed to the random growth most enterprises had experienced. ("Hey, we need another computer here. Just shove in another hub to get more ports.")
It was also on the Clinton administration's watch that the National Security Council upgraded its enterprise network and deployed enterprise streaming media in the White House. Before that, every time a world emergency or a major political event took place, White House staff members would run down the hall to a room filled with TVs (like the rest of us, they often got their breaking news from cable TV). They would huddle in this room, watch the news unfold, rush back to their desks to take some sort of action, and then rush back to the room to get an update. Staffers explained this process to me when I visited the White House; they wanted to know whether there was a better way to stay on top of the news.
There was. They installed a streaming system in the White House that gave everyone immediate desktop access not only to commercial TV stations but to classified video feeds as well. This includes the West Wing's Situation Room, where enterprise streaming video gives the president and his advisors immediate and unfettered access to live world events.
Of course, video-on-demand on public-facing government websites is not new. Whitehouse.gov posted RealNetworks video in the 1990s, moved to Windows Media in later years, and now uses mostly Flash. But in December 2007, Whitehouse.gov streamed its first live event: the lighting of the National Christmas Tree and the Pageant of Peace, held annually at the White House. How? Was it via a government-provided, high-speed, iron-armored fiber, secretly buried under the National Mall and accessed through a biometric scanning device? Nope. I stood in the snow during the pageant with a battery-powered encoder and pushed a stream over a conventional Verizon cellular modem to a commercial CDN, where it appeared flawlessly on the White House homepage. It turns out that the White House is just like any other enterprise (but with much tighter security): The president's people do what works and seek the highest return for the lowest cost.
Employee alignment, in which you seek to build a common culture and provide timely training information, is high on the list of reasons for an enterprise streaming system, especially when an organization has many branch offices. The State Department has quite a few branch offices, but they're called embassies instead (see www.usembassy.gov). Embassies can't just order up more bandwidth (I wonder if you can even get a DSL line in Yaoundé, Cameroon) to deploy a single-bitrate, high-speed live stream everywhere, so it's a bit more of a challenge than for the typical commercial enterprise. Fortunately, multiple-bitrate encoding and related technologies are available, and the challenges can be overcome.
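The multiple-bitrate idea is simple enough to sketch: encode the same live stream at several rates and let each site take the best one its link can handle. The rates, headroom factor, and function below are hypothetical illustrations, not any agency's actual configuration:

```python
# Toy sketch of multiple-bitrate stream selection: publish several
# renditions of one live stream; each site picks the highest rendition
# that fits within its link capacity, minus headroom for other traffic.
# All numbers are illustrative, not a real deployment.
RENDITIONS_KBPS = [1500, 700, 300, 100]  # highest to lowest

def pick_rendition(link_kbps, headroom=0.8):
    """Return the best rendition that fits in 80% of the link by default."""
    budget = link_kbps * headroom
    for rate in RENDITIONS_KBPS:
        if rate <= budget:
            return rate
    return RENDITIONS_KBPS[-1]  # thin links fall back to the lowest rate

print(pick_rendition(10000))  # well-connected headquarters: 1500
print(pick_rendition(512))    # thin embassy link: 300
```

The fallback matters: a remote post on a marginal link still gets something watchable rather than nothing at all.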
Nevertheless, the State Department finds great value in delivering its internal State Department Channel to all employees worldwide, along with the regular electronic distribution of training material. Multicasting local TV to desktops along with TV from home and other feeds helps keep everyone up-to-date, and the ability for each ambassador to instantly broadcast to embassy staff, especially during emergencies, can make a huge difference.
State government does a lot of enterprise streaming too. I'm sure many readers have visited Orlando, Fla., for a theme park holiday. As you drive the highways to Orlando, you may notice cameras overhead, monitoring the traffic. There are hundreds of them, and each connects to a streaming encoder that sends live, broadcast-quality traffic video back to a central monitoring facility at about 5Mbps per camera. If that sounds like a lot of bandwidth, it's not (it's only 5% of a 100Mbps Ethernet port).
This is an enterprise network. We (the public) own the highways, and we sell right-of-way use to carriers to bury a fiber in a trench along the road. While they are at it, they drop in a fiber for government use. So these sorts of enterprise networks have optical bandwidth that allows them to stream at high rates via multicast. Back at the monitoring facility, multiple desktops and "video walls" display the traffic and allow the staff to reroute traffic around problem areas. Local TV stations also receive the multicast, and they, in turn, show the traffic to you.
It is true that streaming in all layers of government is becoming commonplace, and it helps to solve many problems. If only it could reduce the deficit.