
Choosing a Video Codec

This article first appeared in the 2006 Streaming Media Industry Sourcebook, which was available free to all subscribers to Streaming Media magazine.

Choosing a codec used to be simple. Real charged for the server but was still cost-effective for many content distributors; Windows Media was free but offered limited server and playback support. QuickTime’s quality trailed the big two considerably at any data rate south of the gargantuan files used to transmit Star Wars, Episode One: The Phantom Menace via progressive download. These well-defined differences made it fairly easy to choose a compression technology based on financial, qualitative, or religious reasons (for QuickTime devotees, of course; and yes, we’re aware that QuickTime is really a "wrapper," not a codec in itself, but most of us use "QuickTime" as a convenient shorthand).

The codec world has gotten more complicated. Playback compatibility requirements have expanded, and you might be willing to trade UNIX playback for compatibility with a 3GPP cell phone. The availability of digital rights management (DRM) protection is critical to those selling their content, and royalty costs are becoming a consideration. Flash has evolved from a vector-based format for creative Web advertising into a mainstream video technology with a surprisingly powerful compression scheme. After years of playing quality catch-up, MPEG-4 video is starting to shine, especially in some of the new implementations of Advanced Video Coding (AVC). But will the new ones all play on your Windows or Mac machine?

Video quality at a given bit rate, while historically the most important comparative metric between codecs, may now take a back seat to device compatibility, player ubiquity, royalty cost, or DRM scheme. But that doesn’t lessen the importance of understanding how your chosen codec stacks up against the relevant competition; after all, the most visceral observations your viewers make depend precisely on quality. And whether your codec is at the top of the heap or the bottom of the barrel, the ability to eke out the last bit of potential quality via shooting techniques, encoder choice, preprocessing, or the choice between variable and constant bit rate encoding has never been more important.
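
To make the variable-versus-constant bit rate distinction concrete, here is a minimal sketch using present-day, freely available tools (ffmpeg with the libx264 codec, neither of which is among the 2006-era encoders discussed in this article); the input and output file names are hypothetical.

```python
import subprocess

SOURCE = "test_reel.avi"  # hypothetical source clip

# Constant bit rate: hold the encoder near 500Kbps by capping the
# minimum and maximum rates and sizing the rate-control buffer.
subprocess.run([
    "ffmpeg", "-i", SOURCE, "-c:v", "libx264",
    "-b:v", "500k", "-minrate", "500k", "-maxrate", "500k",
    "-bufsize", "1000k", "cbr_500k.mp4",
], check=True)

# Variable bit rate: target a constant perceptual quality (CRF) and let
# the data rate rise on hard-to-encode scenes and fall on easy ones.
subprocess.run([
    "ffmpeg", "-i", SOURCE, "-c:v", "libx264",
    "-crf", "23", "vbr_crf23.mp4",
], check=True)
```

The trade-off is the same one the study measures: CBR keeps delivery predictable for a fixed-rate connection, while VBR typically squeezes more quality out of the same average data rate.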

In the interest of full disclosure, I’ll acknowledge now that I’m the author of three research reports published by StreamingMedia.com that benchmark codec and encoder quality. As such, I’ve spent the better part of the last four months playing files produced by a number of codec vendors and encoding, playing, and re-encoding test files that I produced myself. You know the drill: this article is designed to describe the studies, providing enough relevant information to make it worth your time to read while not giving away the vital findings contained in the reports themselves. If I provide sufficient incentive to convince you to buy one or more reports, so much the better. But at the very least, this article should provide some food for thought for your next encoding session.

Formalities out of the way, let’s get started.

How We Tested
Let me start with a bit of history. Back in 1993, I produced my first codec survey in a product called the Video Compression Sampler. This product analyzed five codecs (Cinepak, Indeo, Video 1 [in 8- and 16-bit], and Xing’s proprietary MPEG and RLE), using four simple test clips ranging from a talking head to high motion, encoded at seven different data rate/resolution/frame rate combinations. Other experiments included on the disc swelled the number of files to about 300.

I encoded all the test files on an 80486 computer that took roughly 45 minutes to encode each 20-second clip. There were no frame shots and no analysis; I sold the CD with a dual-window player program that let you load and play the files side by side and draw your own conclusions.

This time, I started with a test file with 42 different scenes divided into five categories: business, action, entertainment, animation, and pan and zoom. I sent the file to a number of vendors (Apple, Microsoft, Nero, Real, and Sorenson Media) with instructions to produce files in five basic configurations, targeting 56Kbps modems, 3GPP cell phones, LANs (100Kbps and 300Kbps), and broadband (500Kbps).

Supplementing these vendor-produced files, I encoded files myself with the Macromedia Flash 8 Video Encoder, Autodesk Cleaner XL 1.5, Canopus ProCoder, Sorenson Squeeze, and On2 Flix Pro encoders. In the end, the study compared files in three groups:
• Flash, which included the new VP6 codec encoded with both the Sorenson Squeeze and On2 Flix Pro encoders, files produced by the Macromedia Flash 8 Video Encoder, and files encoded using the Sorenson Spark and Wildform codecs.
• MPEG-4, which included Apple’s H.264 technology, Sorenson’s MPEG-4, files produced by Nero using the Ateme codec, and the MainConcept Encoder.
• Proprietary, which included RealVideo, Microsoft’s Windows Media, and the best of the Flash and MPEG-4-based codecs.

Each study had three major goals:
• First, to identify the highest-quality codec in the group and rank the other participants.
• Second, to describe how to optimize files produced by the major contenders in each category through preprocessing and encoding alternatives: noise reduction, scaling, and deinterlacing in third-party programs, as well as the use of various encoding programs like Cleaner XL, Canopus ProCoder, or Sorenson Squeeze (a rough sketch of such a preprocessing chain, using current tools, follows this list).
• Third, to identify the problems commonly experienced by the major contenders and discuss shooting and scene setup techniques to minimize these problems.
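
As promised above, here is a minimal sketch of that kind of preprocessing chain. It uses present-day ffmpeg filters (yadif, scale, hqdn3d) standing in for the third-party preprocessing tools evaluated in the study; the file names and target settings are hypothetical.

```python
import subprocess

SOURCE = "interlaced_master.avi"  # hypothetical interlaced source clip

# Chain the preprocessing steps mentioned above before encoding:
# yadif deinterlaces, scale resizes to the delivery resolution, and
# hqdn3d applies mild noise reduction so fewer bits are wasted on grain.
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-vf", "yadif,scale=320:240,hqdn3d",
    "-c:v", "libx264", "-b:v", "300k",
    "preprocessed_300k.mp4",
], check=True)
```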

To compare the codecs and rate the encoding alternatives, I evaluated each of the 42 test scenes in each encoded file for still-frame quality, playback smoothness, color, and temporal quality (the lack of motion artifacts). Each main study included 25 different scoring categories (five classes of videos times five encoding configurations), plus assorted other analyses to assess encoders and encoding configurations.

Overall, I produced and/or evaluated more than 585 video files and grabbed and analyzed more than 8,000 frames from the video files in groups of four, all dutifully rated in an Excel worksheet that’s more than 1MB in size. That’s probably just the average size of a Dennis Kozlowski expense report, but pretty large for just data entry and formulae. I compiled the results into three downloadable PDF reports, with the aforementioned screenshots and encoded test files also available for download with the purchased report.
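
As a rough illustration of how that 25-cell scoring matrix fits together, here is a small sketch; the class and configuration names come from this article, but the data structure and placeholder values are hypothetical, not the actual study data.

```python
from itertools import product

classes = ["business", "action", "entertainment", "animation", "pan and zoom"]
configs = ["56Kbps modem", "3GPP phone", "100Kbps LAN", "300Kbps LAN", "500Kbps broadband"]

# One score sheet per codec: 5 classes x 5 configurations = 25 cells,
# each holding ratings for still-frame quality, playback smoothness,
# color, and temporal quality. Values are placeholders, not results.
score_sheet = {
    (cls, cfg): {"still": None, "smoothness": None, "color": None, "temporal": None}
    for cls, cfg in product(classes, configs)
}

print(len(score_sheet))  # 25 scoring categories per codec
```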

And the Winner Is?
If it were only that simple... In a way, it is. The overall best codec was RealNetworks’ RealVideo. Though it didn’t win every category in every comparison, it held up remarkably well across the board.
