How to Choose a Video Capture Card

This article first appeared in the 2007 Streaming Media Industry Sourcebook.

Just when it seems safe to make a video capture card purchase, something new comes along. What’s a content creator to do, and what criteria should one assess to make the best purchase? And is a capture card even needed now that computers are more powerful? This article provides several options to consider.

To properly assess content capture, we’re going to break down the field into three areas: live capture and transmission; asynchronous capture and delivery (live capture and encoding for later playback); and edited on-demand content—in other words, content that may be created in pieces, edited together, and then output for consumption.

Live Capture and Transmission
This area is the most misunderstood and has also seen the least innovation, at least in recent years. Perhaps the lack of innovation is due to financial difficulties facing particular industry leaders, or simply the fact that we’re in a transition from standard-definition to high-definition content as the primary medium. One thing is certain, though: live streaming has never been easier or more cost-effective.

To properly assess live streaming capture cards, one must consider three areas: the types of inputs—both audio and video—that will be captured; the format or formats and bit rate or bit rates that must be generated; and the decision regarding whether all content will generate from a single server or be mirrored on multiple servers. Let’s look at each of these three areas.

Inputs
GIGO (Garbage In, Garbage Out). Even with a high-quality camera as the acquisition device, too many content creators settle for composite inputs and substandard cabling in their streaming media projects. The rule of thumb for analog capture is that every 3 decibels of noise (snow, artifacts, etc.) results in a doubling of the bit rate required to achieve equivalent quality. The converse is also true: eliminate noise or artifacting at the acquisition device, and the bit rate will fall while the compressed content quality remains high.
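
To see how quickly that rule of thumb compounds, here is a minimal Python sketch; the 300Kbps starting figure is an assumed example, not a measured value.

```python
# Back-of-the-envelope illustration of the "3 dB of noise doubles the
# required bit rate" rule of thumb quoted above. The starting bit rate
# and noise figures are assumed examples, not measurements.

def required_bitrate_kbps(clean_bitrate_kbps, noise_db):
    """Estimate the bit rate needed to match the quality of a clean signal
    when the source carries `noise_db` decibels of noise, assuming the
    rate doubles for every 3 dB of noise."""
    return clean_bitrate_kbps * 2 ** (noise_db / 3.0)

if __name__ == "__main__":
    clean = 300  # kbps assumed sufficient for a clean S-video/component feed
    for noise in (0, 3, 6, 9):
        print(f"{noise} dB of noise -> ~{required_bitrate_kbps(clean, noise):.0f} kbps")
```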

Consider using a camera with a 3-CCD (or newer CMOS) capture chipset, and then use either S-video or component video outputs to connect the camera to the streaming media card’s input. Better yet, with the advent of digital video cameras—especially those that use USB 2 or FireWire/i.Link connectors—consider connecting directly to the capture card, or to the computer if it contains internal FireWire connectors, to maintain a direct digital path from the camera to the streaming software. On really high-end cameras, SDI (Serial Digital Interface) output is also worth considering, as several streaming capture cards offer it. A note of clarification, though: with the exception of very new cameras that use the H.264 codec, any content captured with a digital camera will still be re-encoded with a streaming codec. The reason for this re-encoding is strictly a matter of bandwidth: most digital video cameras capture at 25 megabits per second, or almost 50 times the average sustained bandwidth of a consumer’s cable modem or DSL connection.
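
The arithmetic behind that last point is worth spelling out. The short Python sketch below assumes a sustained consumer connection of roughly 512Kbps, a period-typical figure consistent with the "almost 50 times" estimate above.

```python
# Why DV footage must be re-encoded before streaming: compare the DV
# capture rate with an assumed sustained consumer connection of the era.
dv_capture_kbps = 25_000       # DV cameras record at roughly 25 Mbps
consumer_downlink_kbps = 512   # assumed sustained cable/DSL throughput (example)

ratio = dv_capture_kbps / consumer_downlink_kbps
print(f"DV capture rate is ~{ratio:.0f}x the assumed consumer connection")
```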

On the audio side, if possible, don’t use the on-camera microphone for audio capture. Doing so adds extraneous noise that is very difficult for the audio codec to handle and extremely distracting to consumers listening through low-end desktop or laptop speakers. Instead, place an external microphone as close to the subject of your video as possible, and use wireless microphones if necessary.

Inputs on streaming audio boards and analog-to-USB audio capture devices typically fall into two categories: balanced and unbalanced. Unbalanced connectors are typically RCA connectors or the noise-prone 1/8" (3.5mm) stereo jack—the same types of connectors used on VCRs and headphones, respectively. Balanced connections, on the other hand, are typically 3-pin XLR connectors, the type used on a professional microphone. The XLR connector is normally attached to a streaming media capture card by way of a breakout cable, as the connector itself is too large to fit on a standard PCI card. Some cards can also supply enough voltage through the XLR connector to power an external microphone, a capability known as phantom power.

Formats and Bit Rates
When the proper connectors have been determined, the next step in live capture is to determine the codec and bit rate of the content that will be streamed. The use of multiple codecs and bit rates used to require a 1:1 ratio of inputs to capture cards, along with a significant amount of external gear. Fortunately, companies such as Viewcast have created software solutions like SimulStream, which allows multiple bit rates or even different codecs to be captured and streamed simultaneously from a single video and audio input. At the time of this writing, the most popular live codecs were Windows Media, QuickTime, and Real, but the recent advent of a live streaming SDK for the On2 VP6 codec (better known as Flash Video 8) will probably propel that format into the top three codecs required for live streaming, especially given that the installed base of Flash players far exceeds the number of installed copies of RealPlayer.
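
The sketch below is not Viewcast’s SimulStream API; it is just a Python illustration of the underlying idea: one captured source fanned out to several hypothetical encoder targets, each with its own codec and bit rate.

```python
# Minimal sketch of single-input, multi-output live encoding. The
# EncodeTarget class, target list, and encode() stub are hypothetical
# placeholders, not a real vendor API.
from dataclasses import dataclass

@dataclass
class EncodeTarget:
    codec: str         # e.g. "Windows Media", "Real", "VP6 (Flash)"
    bitrate_kbps: int  # target video bit rate

TARGETS = [
    EncodeTarget("Windows Media", 340),
    EncodeTarget("Windows Media", 700),
    EncodeTarget("VP6 (Flash)", 500),
]

def encode(frame, target):
    # Stub: in a real system this would push `frame` into the encoder
    # configured for target.codec at target.bitrate_kbps.
    print(f"encoding frame for {target.codec} @ {target.bitrate_kbps} kbps")

def fan_out(frame, targets):
    """Hand one captured frame to every configured encoder target."""
    for t in targets:
        encode(frame, t)

if __name__ == "__main__":
    fan_out(b"raw video frame bytes", TARGETS)
```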

Delivery
Once the inputs, codecs, and bit rates have been determined, the last step in the live streaming scenario is to choose whether to deliver directly from the streaming capture device or to offload delivery to other servers that have more robust streaming and bandwidth capacity. This decision is typically based on three key criteria: number of simultaneous users, number of chosen codecs, and processor power/bandwidth available at the point of capture.
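
One way to ground that decision is to add up the outbound bandwidth the capture machine would have to sustain and compare it against the uplink available at the point of capture. The Python sketch below does exactly that; the viewer counts, bit rates, and uplink figure are assumed examples, not recommendations.

```python
# Rough capacity check for serving streams directly from the capture box.
# All figures below are assumed examples.

streams = [
    # (codec, bit rate in kbps, expected simultaneous viewers)
    ("Windows Media", 340, 150),
    ("VP6 (Flash)",   500, 100),
]

uplink_kbps = 10_000  # assumed uplink available at the point of capture

needed_kbps = sum(bitrate * viewers for _, bitrate, viewers in streams)
print(f"Outbound bandwidth needed: {needed_kbps / 1000:.1f} Mbps")
if needed_kbps > uplink_kbps:
    print("Offload delivery to dedicated streaming servers or a CDN.")
else:
    print("Direct delivery from the capture machine may be feasible.")
```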
