Buyer’s Guide to Content Delivery Networks 2015
CDN-hunters, read this first for an explanation of the basic key performance indicators to explore in order to make the right choice for any size company.
As the CDN world has become commoditized in terms of bandwidth pricing, I have begun to feel this annual Buyer’s Guide article has undergone much the same transformation. Reading back over previous years’ insights on the subject reveals little in the way of significant change. The hairs one wants to split are invariably use-case specific, so I can only provide a broad narrative about the basic key performance indicators (KPIs) you want to explore and provide a starting point from which you can dig deeper.
CDNs increasingly differentiate themselves with value-added services that encompass almost the entire scope of the other Buyer’s Guides in the Sourcebook, ranging from encoding to publishing models and format support. That scope is certainly too wide to cover in a Buyer’s Guide that tries to help a generic CDN buyer.
While I see fundamental changes in the economic landscape and how CDN buyers are actually sold to, as well as some remarkably different technical methods to achieve the same content delivery goals, we almost invariably come back to the same core basic KPIs.
CDNs were, of course, originally a commercial solution. When transcontinental or intercontinental telecoms were sparse—particularly before 2002—there was a strong economic incentive to use proxy servers at each end of these premium long-haul links. That was what gave birth to the CDN between 1997 and 2002.
But as those fibres opened up and competition dragged the pricing down, the CDNs began to distinguish themselves by offering comprehensive support for occasional-use services that they ran on their servers. They were certainly among the first—if not the very first—SaaS providers.
Over the past 15 years, however, it has become possible to rent Infrastructure as a Service, and deploy niche, variable-use servers in response to demand from any enterprise, with little or no capital commitment.
This has meant that CDNs have had to focus more and more on why they are better than a DIY CDN that a publisher could create itself.
To this end, CDNs now either compete by having differentiated ecosystems that can offer more of a turnkey solution to the larger enterprise, or they compete on hair-splitting differentiation about how their servers perform relative to the networks on which they provide services.
As I mentioned above, the scope of what each CDN offers in its ecosystem is too broad for this article. But the core network-delivery hair-splitting at least has a number of fairly clear KPIs that give some grounds for comparison. In fact, that comparison is becoming so formulaic that a CDN brokerage marketplace is emerging, in which readings of these KPIs are collected in huge quantities and aggregated into data sets that publishers can mine in real time to decide which CDN is best for a given user request.
Much like stock exchanges facilitate the rapid comparison of financial data to enable software-based transactional decisions, these CDN brokerages are akin to neutral “clubs” that it is in everyone’s benefit to join, even if it creates extreme competition.
So long as the CDN is confident in its proposition and can “play the game,” it stands to benefit from publicly sharing the deep insight into its business that this shift requires.
As CDN federation was emerging between 2008 and 2013, the issue of billing through your competition was never really addressed properly. The neutral brokerage model is much more logical; a publisher has multiple CDN provider accounts, and the brokerage works out which is the best for that publisher’s end users to be serviced by. The resulting bandwidth is sold directly to the publisher from each CDN, but the extent to which any one CDN gets used is determined by the broker—and that decision-making process is what I want to focus on, since it is essentially automated CDN buying.
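The broker's per-request decision described above can be sketched as a simple scoring over recent KPI samples. This is a minimal illustration of the idea only, assuming a weighted latency/throughput score; the function name, the weights, and the data shape are my own inventions for the example, not any broker's actual algorithm.

```python
# Hypothetical sketch of a broker's per-request CDN choice: given recent
# KPI samples (latency in ms, throughput in Mbps) for each CDN as seen
# from the requesting user's network, pick the provider to serve from.
# The weights below are illustrative assumptions, not vendor values.

def choose_cdn(samples, latency_weight=0.6, throughput_weight=0.4):
    """samples: {cdn_name: {"latency_ms": float, "throughput_mbps": float}}"""
    def score(kpis):
        # Lower latency and higher throughput are both better, so latency
        # counts against the score and throughput counts for it.
        return (throughput_weight * kpis["throughput_mbps"]
                - latency_weight * kpis["latency_ms"])
    return max(samples, key=lambda cdn: score(samples[cdn]))

# Example: CDN B wins on throughput despite slightly higher latency.
samples = {
    "cdn_a": {"latency_ms": 40.0, "throughput_mbps": 8.0},
    "cdn_b": {"latency_ms": 45.0, "throughput_mbps": 20.0},
}
print(choose_cdn(samples))
```

In practice the decision would fold in many more signals (availability, cost, geography), but the shape is the same: reduce each provider's measurements to a comparable score, per request.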
Analytics Aggregators Simplify the Process
Where in previous years I would recommend that you, as a publisher, look at latency, throughput, and geolocation in detail as you select your CDN provider, these brokers now do that for you, and not just once as you establish your workflow, but pretty much every time your users request a file or stream.
I spoke to Pete Mastin at Cedexis, who asserted that Cedexis is more akin to Google Analytics than to my analogy of a stock exchange. Cedexis aggregates data and makes it available to customers in near-real time, where it underpins their own load balancing decisions. It just happens that load balancing between multiple CDN providers has an effect much like the commercial exchange model I have highlighted.
There are two core aspects to this proposition.
The first is Radar.js, a small script that all those who sign up (for free) with Cedexis then embed in their webpages. Whenever a page loads in a client browser, the script pulls a small (100KB) test file to the client, measures round-trip time, latency, and throughput, and then posts those results back to Cedexis. Radar.js is executed billions of times a month in many millions of web browsers, creating an incredibly detailed picture of how both CDNs and ISPs are performing. A recent blog post from Mastin demonstrates how the dataset can be used. For geeks such as Streaming Media’s Dan Rayburn (who regularly cites Cedexis data in his blogs) and myself, this is an indulgent insight into the real state of content delivery over the internet.
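The measurement loop a Radar-style probe performs can be sketched as follows. This is a hedged illustration in Python, not the actual Cedexis script: treating time-to-first-byte as a latency proxy and payload transfer time as the basis for throughput is my assumption about how such a probe works, and `summarize` and `probe` are hypothetical helpers, not a Cedexis API.

```python
import time
import urllib.request

def summarize(start, first_byte, done, nbytes):
    """Turn raw timestamps (seconds) and a byte count into the KPIs a
    Radar-style probe would report back to the aggregator."""
    transfer_s = max(done - first_byte, 1e-9)  # guard against divide-by-zero
    return {
        "latency_ms": (first_byte - start) * 1000.0,  # time to first byte
        "throughput_mbps": (nbytes * 8 / 1_000_000) / transfer_s,
    }

def probe(url):
    """Download one small test object and measure it (illustrative only)."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        first_byte = time.monotonic()  # headers received: rough RTT proxy
        data = resp.read()
    done = time.monotonic()
    return summarize(start, first_byte, done, len(data))

# A 100KB object that takes 50 ms to first byte and a further 100 ms to
# transfer works out to roughly 8 Mbps:
print(summarize(0.0, 0.05, 0.15, 100_000))
```

The real script also reports which CDN and ISP served the object, which is what turns billions of these tiny samples into the performance picture described above.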