
NAB 2019: Telestream Talks Cloud Transcoding and Hybrid Workflows

Transcript

Jan Ozer: Jan Ozer here with Ken Haren from Telestream. He's a marketing director there. We were having a conversation about Vantage customers. Vantage has traditionally been an on-premises solution, and a lot of those customers are moving many of their workflows to the cloud via Telestream Cloud, which came to Telestream through an acquisition a couple of years ago. Why don't you tell us about that, Ken?

Ken Haren: So we acquired a cloud services company a couple of years ago. They had a very capable cloud transcoding service: if I need to take files and flip them into some other format, I can drop them in and get them flipped out. But from a workflow standpoint, most of these cloud services don't enable the kind of really sophisticated workflow automation that Vantage does, right? On the flip side, Vantage has traditionally been an on-prem video transcoding solution. So what we have been working on, and what we've been launching, is Vantage Cloud Port, which takes the best of both worlds. It takes the service-based actions that execute in Vantage, wraps them in a microservices architecture, and makes them deployable and orchestrated through our Telestream Cloud product. So now I can build workflows where each action specifies where and how to execute based on rules. For example, if I have content that is currently parked in Amazon S3 buckets in the Northern Virginia region, I want to avoid moving that content out of that region in order to execute my workflows. If I need to transcode, QC, or standards-convert that content, I don't want to have to pull it into a separate cloud region or into my on-prem facilities to support that workflow.

So with Vantage Cloud Port, I can be very smart when I design my workflows. If my content lives on-prem, I usually want to execute the workflow where that content lives: I'm going to get better performance and I don't have to move the content anywhere. But for all the customers we have that have either completed, or are in the process of, migrating a significant portion of their content libraries into the public cloud, having to pull content back out is very problematic. So now we have a way to facilitate not just executing the content workflows in the cloud, but being smart about where we provision the resources to do that workflow execution, always enabling it in the same region and the same cloud service where the content is resident, and then completing that execution as part of a very sophisticated, very capable Vantage workflow.
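The placement logic Haren describes can be pictured with a short sketch. This is not Vantage Cloud Port's actual API or configuration syntax, just a minimal, hypothetical illustration of routing each workflow action to the region or facility where the source content already lives.

```python
# Hypothetical sketch of location-aware workflow routing; class and field
# names are illustrative, not Vantage Cloud Port's actual API.
from dataclasses import dataclass

@dataclass
class Asset:
    uri: str
    location: str            # e.g. "aws:us-east-1" or "on-prem"

@dataclass
class WorkflowAction:
    name: str                 # e.g. "transcode", "qc", "standards-convert"

def choose_execution_site(asset: Asset) -> str:
    """Run each action in the same region (or facility) where the content
    already lives, so the workflow never has to move the source file."""
    return asset.location

def run_workflow(asset: Asset, actions: list[WorkflowAction]) -> None:
    site = choose_execution_site(asset)
    for action in actions:
        # In the real product this would dispatch to a microservice
        # provisioned in `site`; here we just print the placement decision.
        print(f"{action.name}: execute in {site} against {asset.uri}")

run_workflow(
    Asset(uri="s3://library/episode-101.mxf", location="aws:us-east-1"),
    [WorkflowAction("transcode"), WorkflowAction("qc")],
)
```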

Jan Ozer: So what do I buy from you? Do I just pay a few cents an encode or am I still buying Vantage?

Ken Haren: So Vantage Cloud Port is really targeted at our existing Vantage marketplace, or at customers that are looking to deploy Vantage in-house, right? I still have a presence of Vantage on-prem, and I still have all the same capabilities; my existing Vantage systems can be enabled for Vantage Cloud Port. For the cloud-processed content workflows, every time I execute a workflow in the cloud there is a pricing matrix that specifies how much that workflow is going to cost to execute. We look at what formats you're trying to derive and what you're doing in the workflow, and we collapse all of that into a single pricing unit that you pay on an output-minute basis. So it combines this perpetually licensed on-prem environment with a scalable, cloud-deployed microservices architecture that you consume on a price-per-output-minute basis.
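As a rough illustration of that per-output-minute model, here is a minimal sketch. The format names and rates are invented for the example, not Telestream's actual pricing matrix.

```python
# Illustrative arithmetic only: the rate table below is invented for the
# example, not Telestream's actual pricing.
RATE_PER_OUTPUT_MINUTE = {
    "h264_1080p": 0.05,      # hypothetical $/output minute
    "hevc_2160p": 0.12,
}

def estimate_job_cost(output_minutes: float, formats: list[str]) -> float:
    """Collapse the derived formats into a single per-output-minute charge."""
    return sum(output_minutes * RATE_PER_OUTPUT_MINUTE[f] for f in formats)

# A 44-minute episode delivered in two formats under the made-up rates above.
print(f"${estimate_job_cost(44, ['h264_1080p', 'hevc_2160p']):.2f}")
```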

Jan Ozer: So what have you seen in terms of Vantage licenses going away? How do you manage that as a business, and where do you see it going in the next two to five years?

Ken Haren: That's a good question. I would say there are some customers that are very aggressive about their cloud deployment and are in the process of completing their cloud migration. In those instances, an on-prem presence doesn't make a lot of sense, so the traditional model of buying a software license and paying for support around that license, we think, goes away for those customers. But the vast majority of our customers are a different story. We actually did some research earlier this year where we wanted to aggregate the current Vantage footprint in the world: how much capacity does that enable for our customers? It's ridiculous; it's something like a billion output hours available for processing in the Vantage environment that is currently deployed.

We do have some customers that are completing their cloud migrations and saying that everything they do is going to execute in the cloud, that they are dematerializing their entire facilities. For those customers, I think something like Vantage Cloud Port, a per-use type of deployment, becomes the only model they're going to use. But the vast majority of our customers will retain a hybrid architecture. They want the ability to burst to the cloud for the workflows where it makes sense, or to support capacity spikes that maybe they don't want to buy and perpetually manage infrastructure around, and this is a great solution for that. But they are not going to walk away from that much deployed capacity, and, quite frankly, for many of our customers there are other ancillary systems in the media supply chain that aren't going to be moved to the cloud, or that they just don't have plans to move, so retaining that on-prem infrastructure is going to be important to them. So from our perspective, the right answer is that we should be agnostic to the deployment model. If you are in the cloud on a usage-based plan, that's great. If you are on-prem on a completely perpetually licensed plan, that's great. Or if you are somewhere in between with a hybrid deployment, we have great solutions for you there as well. I would say that's the theme we're seeing right now.

Jan Ozer: Okay, so beyond companies having a lot more media in the cloud, what's driving the need for these additional cloud encodes?

Ken Haren: So the first thing is that the media is in the cloud, right? How do I get content from my partners? Oftentimes I give them access to drop it into a shared storage environment that I have deployed in the cloud, so you want to execute the workflow where the content lives. But the number one driver for the need to provision and scale on demand that we see in our customer base is the number of distribution endpoints they are publishing to, and not just the number. I can plan around it if I know that last year I sent my content to ten endpoints, each with unique parameters around how I brand the content, how I slice and dice it, and what format I deliver to that endpoint; that's ten workflows I need to execute to deliver to ten endpoints, and I can build out capacity accordingly. What we are seeing right now is the number of new distribution deals that get signed, whether to deliver to my own direct-to-consumer platform, to electronic sell-through or digital distribution platforms, to traditional affiliates, the cable systems, the VOD packages I send; international distribution has been a big driver too. These tend to come up frequently with very little planning. I don't have time to go back to the marketing team that just signed a deal with a new digital platform we're really excited about and say, well, it's going to be six months until I can get the content library transcoded for that distribution. I need to be able to scale on demand, and not just scale it; I also need to be able to come back to that marketing director and say, hey, you signed this deal with this new digital platform, this is how much it's going to cost to get 75% of our content library transcoded and delivered into that platform, and here's the bill for that, right?

So the production systems need to be very flexible in how they scale to support this... I don't want to say chaotic time, but there are certainly a lot of new distribution endpoints that companies are looking at. It's almost like there's this big experimentation going on, right? How do I get in front of the audience? How many platforms? Is it social media? Is it these new digital platforms? Am I going into international markets? Some of the walls that traditionally bound where content went and how programmers reached their audiences are changing, and all of those things are contributing to a supply chain that needs to not just execute where the media lives, which is very important because I want to avoid egress fees when my content lives in the cloud, but also scale to meet those capacity needs without my having to provision or over-provision infrastructure.
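To make the "here's the bill" conversation concrete, here is a back-of-the-envelope sketch. Every number in it is invented for illustration; the point is that with on-demand cloud capacity, onboarding a new endpoint becomes a spend-and-capacity calculation rather than a six-month provisioning project.

```python
# Back-of-the-envelope sketch of onboarding a new distribution endpoint;
# every figure below is invented for illustration.
LIBRARY_TITLES = 4_000
AVG_RUNTIME_MIN = 45
COVERAGE = 0.75                # 75% of the library goes to the new platform
RATE_PER_OUTPUT_MIN = 0.08     # hypothetical blended $/output minute
DEADLINE_DAYS = 14
ENCODE_SPEED = 1.0             # assume roughly real-time encoding per instance

output_minutes = LIBRARY_TITLES * COVERAGE * AVG_RUNTIME_MIN
cost = output_minutes * RATE_PER_OUTPUT_MIN
minutes_per_instance = DEADLINE_DAYS * 24 * 60 * ENCODE_SPEED
instances_needed = -(-output_minutes // minutes_per_instance)  # ceiling division

print(f"{output_minutes:,.0f} output minutes, ~${cost:,.0f}, "
      f"{instances_needed:.0f} parallel instances to hit the deadline")
```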

Jan Ozer: Okay, real quickly. Is IMF going to help?

Ken Haren: So IMF is something we're actually hearing about quite frequently from our customers. Initially, with Netflix behind it, it was one more package I had to support, but as I add new publishing endpoints, it's just a very convenient way of normalizing what I'm sending out into the marketplace. I need to cut a version of this programming for the airlines I'm sending it to, for the Netflixes and Hulus I'm sending it to, for the iTunes store, and each one might have different branding, different audio elements, different video elements, maybe cut in slightly different ways. Having to store and manage unique assets, in unique formats, for every one of those distribution endpoints is something the market doesn't want to support. Right now that is the case, but they want to move away from it, and IMF seems to be a great way to enable that. By describing what needs to happen to the content in the metadata, I can inform both my workflow systems and the endpoints I'm publishing to about what it is I'm providing them. So we've heard from our customers, and I wouldn't say it's an overwhelming cry, but everyone wants help with this, and IMF seems to be a standard people can get behind, and we are seeing momentum growing on that.
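The "describe the version in metadata" idea can be sketched roughly as follows. This is not actual SMPTE IMF CPL XML, just a hypothetical data structure showing one set of master track files plus a small, metadata-only composition per delivery target, instead of a fully rendered unique asset per endpoint.

```python
# Conceptual sketch of the IMF idea: shared master essence, per-endpoint
# compositions described as metadata. Field names are illustrative only.
master_essence = {
    "video": "ep101_video_4k.mxf",
    "audio_en": "ep101_audio_en.mxf",
    "audio_es": "ep101_audio_es.mxf",
}

compositions = {
    "netflix": {"video": "video", "audio": ["audio_en", "audio_es"],
                "edits": ["remove_ad_blacks"]},
    "airline": {"video": "video", "audio": ["audio_en"],
                "edits": ["content_edit_v2", "remove_ad_blacks"]},
}

for endpoint, cpl in compositions.items():
    # Each delivery reuses the same track files; only the small composition
    # (which tracks and which edits) differs per endpoint.
    tracks = [master_essence[cpl["video"]]] + [master_essence[a] for a in cpl["audio"]]
    print(f"{endpoint}: package {tracks} with edits {cpl['edits']}")
```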

Jan Ozer: Okay, listen. I appreciate the time and have a great show.

Ken Haren: Yeah, thank you Jan, thank you.
