Amazon Elastic Transcoder: Review
Amazon’s Elastic Transcoder is a service that encodes files stored in the Amazon cloud and delivers the output back into the Amazon cloud. In this overview, I’ll walk you through the workflow for using the service and discuss its performance, quality, and pricing. To set expectations, this isn’t a full-out, “bang it till it breaks” competitive review so much as a “here’s how it works, and by the way, we compared some aspects to other services and here’s what we found.”
Most comparisons are to Zencoder because I’m retesting all cloud vendors for a presentation at the Streaming Forum, and I completed the work for Zencoder first. I’ll fold the results from this review into the handout at the Forum, which you’ll be able to download after the conference.
Amazon pricing is strictly pay as you go, with no commitments. Encoding prices vary by region, but you pay for each minute of audio and video output, with different prices for SD and HD video. The basic structure and pricing for U.S. East is shown in Figure 1.
Figure 1. Here’s the pricing structure for the U.S. East Region.
Note that each output is charged separately, irrespective of whether it involves a transcode (very CPU-intensive) or a transmux (not very CPU-intensive). To explain: my tests involved producing twelve MP4 files and twelve HTTP Live Streaming (HLS) streams using the same H.264 parameters. You would use this schema to encode one set of files for delivery to computers via RTMP-based Flash adaptive streaming, and another for delivery to mobile devices via HLS.
The first conversion to MP4 is a transcode, while the conversion of that MP4 file to HLS output is a transmux. Amazon charges for both as if they were unrelated, and, in fact, they probably are unrelated in the Amazon system. That is, the Elastic Transcoder probably encodes the HLS output as a totally separate job.
In contrast, when working with Zencoder and other cloud vendors, the starting point for the HLS output is the previously encoded MP4 file. Not only does this accelerate the work, it affects the price, because Zencoder charges only 25% of the usual rate for transmuxing MP4s to HLS and other outputs.
My test involved encoding ten files averaging about 50 minutes apiece to the aforementioned 24 outputs. With the Amazon cloud, that cost about $270. With Zencoder, at its best pricing of $0.0125/minute for SD and twice that for HD, the same job would have cost $131.38. At a 10,000-minute-per-month commitment level, which is reasonable if you have one job at 6,000 minutes, Zencoder’s price of $0.03/minute ($0.06 for HD) would increase the total cost to $315.32. For onesies and twosies, Zencoder charges $0.05 for SD and $0.10 for HD, more than three times Amazon’s price.
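The arithmetic behind the Amazon figure is simple output-minute math. In the sketch below, the per-minute rates are the published U.S. East prices; the assumption that the 24 outputs split evenly between SD and HD is mine, chosen because it reproduces the roughly $270 total reported above.

```python
# Cost sketch for the test job: ten ~50-minute files, 24 outputs each.
# Rates are U.S. East prices ($0.015/min SD, $0.030/min HD); the even
# SD/HD split is an assumption, not a figure from the review.

SOURCE_MINUTES = 10 * 50   # ten files, ~50 minutes apiece

SD_OUTPUTS = 12            # assumed: half the 24 outputs are SD...
HD_OUTPUTS = 12            # ...and half are HD

SD_RATE = 0.015            # $ per output-minute, U.S. East
HD_RATE = 0.030

def elastic_transcoder_cost(minutes, sd_outputs, hd_outputs):
    """Amazon bills every output minute at full rate, transmux or not."""
    return minutes * (sd_outputs * SD_RATE + hd_outputs * HD_RATE)

cost = elastic_transcoder_cost(SOURCE_MINUTES, SD_OUTPUTS, HD_OUTPUTS)
print(f"${cost:.2f}")      # → $270.00
```

Because every output minute is billed at full rate, the twelve transmuxed HLS outputs cost exactly as much as the twelve transcoded MP4s; a Zencoder-style 25% transmux rate would discount half the output minutes.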
As an overview, if you’ve not worked with Amazon cloud storage before: S3 stands for Simple Storage Service, basically a storage location in the cloud. Rather than folders, S3 uses buckets to subdivide your storage; similar concept, different name. The Elastic Transcoder is a transcoding service that can only retrieve files from, and deliver files to, S3 storage. If you want to encode files on local hard drives, or deliver them to your own servers, a different cloud platform, or a content delivery network, Elastic Transcoder can’t help.
Like many cloud encoding services, Amazon offers both a user interface (UI) and an Application Programming Interface (API) for automating encoding operations and workflows. My tests all used the user interface, but the performance and quality findings apply to both the UI and the API.
Working with the UI involves three components—pipelines, presets, and jobs—which appear on the upper left once you enter the service (Figure 2). Briefly, pipelines establish queues for your various encoding jobs and identify the input and output buckets; presets define the encoding parameters to be applied to a file; and jobs apply one or more presets to a single source file in the specified bucket.
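The same three objects surface in the API. Here is a minimal sketch using the boto3 SDK’s elastictranscoder client; the bucket names, IAM role ARN, object keys, and pipeline ID below are placeholders of mine, not values from this review, and the preset ID should be verified against your own preset list.

```python
# Hypothetical request bodies for the two main API objects. Substitute
# your own bucket names, role ARN, pipeline ID, and preset ID.

pipeline_request = {
    "Name": "review-pipeline",
    "InputBucket": "my-source-bucket",
    "OutputBucket": "my-output-bucket",
    "Role": "arn:aws:iam::123456789012:role/Elastic_Transcoder_Default_Role",
}

job_request = {
    "PipelineId": "<pipeline-id>",
    "Input": {"Key": "sources/lecture01.mp4"},
    "Outputs": [
        # PresetId here is illustrative; list your presets to find real IDs.
        {"Key": "mp4/lecture01.mp4", "PresetId": "<preset-id>"},
    ],
}

# The actual calls require AWS credentials:
# import boto3
# et = boto3.client("elastictranscoder", region_name="us-east-1")
# pipeline = et.create_pipeline(**pipeline_request)
# job = et.create_job(**job_request)
```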
Figure 2. You’ll spend your time shuttling through various pipelines, jobs and presets.
As you can see in Figure 3, Amazon limits the number of pipelines, presets, and jobs in several ways. As a practical matter, the four-pipeline limit is probably the only one that will affect you, and then only if you need multiple buckets to hold different types of content. A simple workaround is to upload your content into folders within a single bucket, which a pipeline can access, rather than into multiple buckets.
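The folder workaround rests on the fact that S3 has no real folders; a “folder” is just a prefix on the object key, so one pipeline (with its one input bucket) can serve several content types. A hypothetical job spec, with placeholder IDs and keys of my own:

```python
# One pipeline, one bucket, many "folders": folders in S3 are just key
# prefixes. PipelineId and PresetId are placeholders.

job_spec = {
    "PipelineId": "<pipeline-id>",
    "Input": {"Key": "lectures/raw/lecture01.mp4"},    # "folder" = key prefix
    "OutputKeyPrefix": "lectures/encoded/",            # outputs land under this prefix
    "Outputs": [
        {"Key": "lecture01-720p.mp4", "PresetId": "<preset-id>"},
    ],
}
```

With credentials configured, this dictionary would be passed to boto3’s `create_job`, which accepts an `OutputKeyPrefix` parameter for exactly this purpose.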
Figure 3. Limitations on Pipelines, Jobs, and Presets.
Working in the UI is generally straightforward, though there are lots of minor deficits and rough edges. Let’s start with a look at pipelines. Creating one is simple: click Create New Pipeline to open the window shown in Figure 4. However, you can’t edit pipelines; to make a change, you have to delete the pipeline and start over.
Figure 4. Creating a pipeline.
One useful pipeline option is the ability to grant all users the right to view the video once it’s encoded, essentially auto-publishing the video file. You access this by clicking Add Permission at the bottom of Figure 4. One frustration: if you want to configure notifications for events like errors or completion, you can’t simply enter an email address or phone number; you have to sign up for another Amazon service (Simple Notification Service, or SNS). I’m sure this is a much more secure approach, but it’s frustrating for Amazon newbies like myself; I spent 30 minutes trying to get this set up and got nowhere.
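For what it’s worth, the notification wiring looks roughly like the sketch below: Elastic Transcoder publishes events to SNS topics, which you subscribe an email address to. The topic names, account ID, and email address are placeholders of mine.

```python
# Pipeline notifications map events to SNS topic ARNs; an empty string
# disables that event. The ARNs and email below are placeholders.

notifications = {
    "Progressing": "",
    "Completed": "arn:aws:sns:us-east-1:123456789012:transcode-done",
    "Warning": "",
    "Error": "arn:aws:sns:us-east-1:123456789012:transcode-done",
}

# With AWS credentials configured, the SNS side looks roughly like:
# import boto3
# sns = boto3.client("sns", region_name="us-east-1")
# topic = sns.create_topic(Name="transcode-done")
# sns.subscribe(TopicArn=topic["TopicArn"], Protocol="email",
#               Endpoint="me@example.com")
# ...then pass Notifications=notifications to create_pipeline().
```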
Working With Presets
The Elastic Transcoder ships with a number of system presets, and you can create your own. However, again, you can’t edit presets; to make a change, you have to create a new preset based on the old one, make the change, save the new preset, and delete the old.
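If you script against the API, that read-modify-create-delete dance can be wrapped in a helper. This is a sketch under my own assumptions (a configured boto3 elastictranscoder client and a real preset ID); the helper name is hypothetical, not part of the AWS SDK.

```python
# Presets are immutable, so "editing" one means: read it, apply changes,
# create a replacement, and delete the original. `clone_preset_with_changes`
# is a hypothetical helper, not an AWS API.
# With credentials: et = boto3.client("elastictranscoder", region_name="us-east-1")

def clone_preset_with_changes(et, preset_id, changes):
    """Create a replacement preset, carrying over everything but `changes`."""
    old = et.read_preset(Id=preset_id)["Preset"]
    # Strip fields the service computes itself; they can't be sent back.
    for computed in ("Id", "Arn", "Type"):
        old.pop(computed, None)
    old.update(changes)
    new = et.create_preset(**old)["Preset"]
    et.delete_preset(Id=preset_id)
    return new["Id"]
```

Note the deletion comes last, so a failed `create_preset` call leaves the original preset intact.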