
How to Choose a Video Quality Metric


Read the complete transcript of this video:

The tools I use or recommend: the Moscow State University Video Quality Measurement Tool (VQMT), the SSIMplus VOD Monitor, which I use very frequently, and the Hybrik (now Dolby) Media Analyzer, which I use very, very frequently. The thing about Dolby is they've got a wonderful media analyzer tool, but it's part of a cloud encoding tool, and you can't just access the media analysis functions. You have to use the cloud encoder, and since they charge by the month, they've got a different kind of pricing scheme. Not a lot of people can use that. Unless you're using their system for encoding, you can't get to the media analyzer, so I'm not going to spend a lot of time on it. It's great for me because if I need to process a hundred files or a thousand files overnight, I can just upload them into the cloud.

I can let my AWS account spin up a hundred machines and process those files in a very short period of time. If I tried to do that on my workstation, it would probably take four or five days. But again, that's not something you can access unless you're a Hybrik customer. So the tools we'll spend our time on are the VQMT and the SSIMplus VOD Monitor.

With the VQMT tool, I load the source file--a drag and drop here--and then I can load one or two encoded files. What we're showing here is the difference between constrained VBR and constant bitrate (CBR) encoding. So I drag the source file here; the first processed file here, that's constrained VBR; the second processed file here, that's constant bitrate. Then I choose the metric--in this case, VMAF--and press start, and it produces a results visualization.

This visualization is really critical; we'll look at how I go through the analysis in a moment.

But what this tells me is the per-frame VMAF score over the duration of the file. Because I've got two files up here, the constrained VBR is in red and the constant bitrate is in green. What I notice here is that there are a lot of downward spikes in the green file. So even though the overall scores are relatively similar, I can look at this chart and see that there are some problem areas in the CBR file that probably represent visual quality differences that customers might notice.

The top graph shows the entire files. The bottom chart shows a highlighted region. So if I drag my mouse over a region here, it shows that region down here, so I can zero in on the frames I want to examine. Once I've zeroed in, I click this, and I can see the actual frames. So I go from a pure number to the ability to say, "Hey, there's a real difference here."

This is a beta of version 12, which is just out. What they added in this version--for people who have the product but haven't seen the newest version--is the ability to see the actual VMAF score for the frames when you're viewing them in this mode. So this tells me that the CBR file has a VMAF rating of 47, and the constrained VBR file over there has a VMAF rating of 69. Basically, that tells me that even though the average values are about the same, there are serious issues in the CBR file that are going to degrade viewer satisfaction.
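If you don't have VQMT handy, a chart like this is easy to approximate from any per-frame score export. Here's a minimal sketch assuming a results CSV with a "vmaf" column per frame; that column name and the file names are my assumptions, not VQMT's documented export format.

```python
# Sketch: recreate the two-series per-frame VMAF chart. Assumes each
# CSV has a "vmaf" column; file names are placeholders.
import csv

import matplotlib.pyplot as plt

def load_scores(path):
    """Read per-frame VMAF scores from a results CSV."""
    with open(path, newline="") as f:
        return [float(row["vmaf"]) for row in csv.DictReader(f)]

plt.plot(load_scores("constrained_vbr.csv"), color="red", label="Constrained VBR")
plt.plot(load_scores("cbr.csv"), color="green", label="CBR")
plt.xlabel("Frame")
plt.ylabel("VMAF score")
plt.legend()
plt.show()  # downward spikes flag the frames worth inspecting
```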

Here is the CSV file. You can get your output in either CSV or JSON. That's the source file, and those are the two files we saw analyzed. And this is kind of interesting: the VMAF mean score for the CBR file is higher, but the harmonic mean, which takes into account the variability of the scores, is higher for the VBR file. The harmonic mean is new for the Moscow State University tools; they added it to reflect files that have a lot of problem areas. Then you can see the minimum value--that's really nice because it's a mathematical representation of the ugly frames we just saw--the maximum value, the location of those two frames, and then the standard deviation, which again is very valuable because the higher the standard deviation, the higher the variability in the quality. You can see right here that the VBR file has a standard deviation of seven and the CBR file 10, so there's a lot more quality variability in that file. That's another useful gauge of how many problem areas there may be in a file. And the variance is just the standard deviation squared.
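To make those summary numbers concrete, here's a minimal sketch that computes the same statistics from a list of per-frame scores; the sample values are invented. Note how the harmonic mean, n divided by the sum of reciprocals, gets dragged down hard by a single bad frame.

```python
# Sketch: the summary statistics in that CSV, computed from per-frame
# scores. The sample values here are invented for illustration.
import statistics

scores = [69.0, 71.5, 68.2, 47.0, 70.3, 66.8]  # hypothetical per-frame VMAF

mean = statistics.mean(scores)
# Harmonic mean = n / sum(1/score): a few very low frames pull it down
# much harder than they pull down the arithmetic mean.
harmonic = len(scores) / sum(1.0 / s for s in scores)
stdev = statistics.stdev(scores)  # higher = more quality variability

print(f"mean={mean:.2f}  harmonic mean={harmonic:.2f}")
print(f"min={min(scores):.1f}  max={max(scores):.1f}")
print(f"stdev={stdev:.2f}  variance={stdev ** 2:.2f}")
```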

What do I like about the Moscow State tool? It's affordable. It's GUI and command line. It's very visual, so you can see the results very easily. It handles most major algorithms except for SSIMplus. And you can read my review of the technology at Bit.ly/VQMT_review.

On the con side, it can only compare files with the same frame rate. So if I have a 60 frames per second (fps) source and a 30fps output file, I can't get a quality rating for that file. You can do that with SSIMplus; it's one of its key advantages.
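One workaround I'd suggest (it's my own suggestion, not a VQMT feature) is to conform the lower-frame-rate encode back to the source rate with FFmpeg before comparing, keeping in mind that the duplicated frames and the near-lossless re-encode both end up being measured. A minimal sketch:

```python
# Sketch: duplicate frames in a 30fps encode up to 60fps so it can be
# compared against a 60fps source. File names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "output_30fps.mp4",
    "-vf", "fps=60",                  # repeat frames to reach the source rate
    "-c:v", "libx264", "-crf", "10",  # near-lossless re-encode to limit added error
    "conformed_60fps.mp4",
], check=True)
```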

And then, from my perspective, the other thing about Hybrik--in addition to processing a lot of files--is that it lets me download one CSV file that I can just populate into a spreadsheet, and I'm done. With the Moscow State University tool, if I analyze a hundred files, I've got to open up 100 CSV files and then copy and paste the results in. I'm not smart enough to know how to use JSON to automate that kind of input, but if you're using CSV, it takes a long time to get the data in.
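For what it's worth, that copy-and-paste step can be scripted. Here's a minimal sketch that sweeps a folder of per-file result CSVs into one summary file; the results/ folder layout and the "vmaf" column name are my assumptions about the export, not documented behavior.

```python
# Sketch: sweep a folder of per-file result CSVs into one
# spreadsheet-ready summary CSV. Column name is hypothetical.
import csv
import glob
import statistics

with open("summary.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["file", "mean_vmaf", "min_vmaf", "stdev"])
    for path in sorted(glob.glob("results/*.csv")):
        with open(path, newline="") as f:
            scores = [float(row["vmaf"]) for row in csv.DictReader(f)]
        writer.writerow([
            path,
            f"{statistics.mean(scores):.2f}",
            min(scores),
            f"{statistics.stdev(scores):.2f}",
        ])
```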

So here's the SSIMwave VOD Monitor. It's based on the SSIMplus algorithm, which was developed by the same people who invented the SSIM algorithm, so it's an advancement on that. We saw that it rates videos on a scale that corresponds with human perception. It's got multiple device ratings, and it can compare different resolutions and different frame rates--and they're here at Streaming Media West if you're interested in the product. The other thing the SSIMplus VOD Monitor can do is compute the BD-Rate functions that we learned about a few minutes ago. With the Moscow State University tool, if I want to compute BD-Rate, I have to create the CSV files, enter them into Excel, and apply the macro; with the SSIMplus VOD Monitor, it's computed for me.

They've got a comparison mode. In this case, we're looking at the difference between HEVC and H.264, so we've got two encoding ladders: this is the H.264 encoding ladder, and this is the HEVC encoding ladder. We see that at every point across the data rate spectrum, the blue line for HEVC is higher. This number here tells us that if you use the HEVC ladder, you're going to save on average about 45%--the same quality at a 45% lower data rate.

All you do is select the files you want to analyze and you can produce these numbers very quickly. It's a really good feature if you're doing a lot of codec analysis or a lot of encoder analysis.
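For readers who want to sanity-check a savings figure like that 45%, here's a rough sketch of the standard Bjøntegaard delta-rate (BD-Rate) calculation: fit log-bitrate as a polynomial of quality for each ladder, integrate both fits over the overlapping quality range, and convert the average difference back to a percentage. The rate/quality points below are invented for illustration.

```python
# Sketch: Bjøntegaard delta rate (BD-Rate) between two rate/quality
# curves. The (kbps, quality) points per codec are invented.
import numpy as np

def bd_rate(anchor, test):
    """Average bitrate change of `test` vs. `anchor` at equal quality.

    Each input is a list of (bitrate_kbps, quality) points.
    Negative result = the test codec needs less bitrate.
    """
    r1, q1 = np.log10([p[0] for p in anchor]), [p[1] for p in anchor]
    r2, q2 = np.log10([p[0] for p in test]), [p[1] for p in test]
    # Fit log-rate as a cubic function of quality.
    p1, p2 = np.polyfit(q1, r1, 3), np.polyfit(q2, r2, 3)
    # Integrate both fits over the overlapping quality range.
    lo, hi = max(min(q1), min(q2)), min(max(q1), max(q2))
    int1 = np.polyval(np.polyint(p1), hi) - np.polyval(np.polyint(p1), lo)
    int2 = np.polyval(np.polyint(p2), hi) - np.polyval(np.polyint(p2), lo)
    avg_diff = (int2 - int1) / (hi - lo)
    return (10 ** avg_diff - 1) * 100  # percent bitrate change

h264 = [(1000, 70), (2000, 80), (4000, 88), (8000, 94)]
hevc = [(600, 70), (1200, 80), (2400, 88), (4800, 94)]
print(f"BD-Rate: {bd_rate(h264, hevc):.1f}%")  # roughly -40% with these numbers
```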

I talked about Hybrik a couple of times already. You can get some of these numbers through open-source tools. You can compute VMAF with FFmpeg and VMAF Master. There are two articles on my website (bit.ly/VMaster and bit.ly/ff_vmaf_win) that give you the code you need and tell you how to do it. The problem with using the open-source tools is you get the numbers only. You don't get the ability to look at the files beneath the numbers, and I find that very, very critical.
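For a flavor of what the FFmpeg route looks like (the articles above have the definitive steps), here's a minimal sketch that scores an encode against its source with the libvmaf filter. It assumes an FFmpeg build compiled with libvmaf, and the file names are placeholders.

```python
# Sketch: score a distorted file against its source with FFmpeg's
# libvmaf filter. Requires an FFmpeg build compiled with libvmaf.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "distorted.mp4",  # first input: the encoded file
    "-i", "source.mp4",     # second input: the reference
    "-lavfi", "libvmaf=log_fmt=json:log_path=vmaf.json",
    "-f", "null", "-",      # decode and score only; no output file written
], check=True)
# Per-frame and pooled VMAF scores land in vmaf.json.
```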
