Video: What's the Best Way to Test Video Encoding Quality?
Learn more about objective quality metrics at Streaming Media West.
Read the transcript of this clip:
Jan Ozer: If you want to do a standards-based analysis of any kind of video comparison, there are standards for that. That's the gold standard. You bring people into a room, you sit them a certain distance from a TV, you use side-by-side TVs, you use triple-blind studies. All of that is very specified if you want to do it. It's also very time-consuming and very expensive when all you want to know is which keyframe interval to use. That's the standards-based approach.
The informal approach is the golden-eye test, and that's kind of what we used to do: stare at the videos side by side.
But what got me into the objective quality side was three summers ago, when I had a consulting agreement to help a company choose an HEVC codec. We set up the test grid, and it was going to be something like 180 different comparisons. With 180 comparisons, you can't run formal subjective tests, and you can't really do golden-eye comparisons either, not on any kind of time-efficient or even reasonably effective basis. So that's when I started using objective quality metrics.
And what we're doing, this is the best, this is kind of the second best, although it's pretty cumbersome, and this is the third best. So if you look at a metric like PSNR, these are pretty much tuned to measure the difference between the source and the compressed file. You take the source file and the compressed file and ask: what are the differences? You add up the differences, run some math on them, and that produces a score.
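The "add up the differences, run some math" procedure Ozer describes is essentially how PSNR works. Here's a minimal sketch, assuming each frame arrives as a flat list of 8-bit pixel values; the function name and inputs are illustrative, not any particular tool's API:

```python
import math

def psnr(source, compressed, max_val=255):
    """Peak signal-to-noise ratio between two same-sized frames,
    given as flat lists of 8-bit pixel values."""
    # Mean squared error: the per-pixel differences, squared and averaged.
    mse = sum((s - c) ** 2 for s, c in zip(source, compressed)) / len(source)
    if mse == 0:
        return float("inf")  # identical frames: no error to measure
    # Express the error relative to the maximum possible signal, in decibels.
    return 10 * math.log10(max_val ** 2 / mse)
```

Higher scores mean the compressed frame is closer to the source; lossy encodes of typical content usually land somewhere in the 30-50 dB range.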
The problem is that a lot of the time those differences aren't things people can actually see, and a great example of that is attention-focused compression. Imagine a big frame with me in it. One compression technology says, "Hey, all we care about is what's going on in the center, so we're going to blur everything else," and another says, "No, no, no, we've got to keep everything sharp." A pure math-based metric with no perceptual quality component would say the attention-focused approach was bad, because all the background was different from the original.
But really, if you include a perceptual-quality analyzer, if you look at the quality as perceived by the viewer, you would say that approach is better, because that's all people are actually seeing. So the simpler metrics are purely math-based, and then the higher-quality analysis tools, like VMAF from Netflix and some of the others, include a level of perceptual quality analysis. They say, "Okay, this is how people subjectively compare these types of videos, so we're going to include that in the formula."
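To see why a pure pixel-difference metric penalizes attention-focused compression, here is a toy illustration. This is not VMAF's actual model, just a hand-weighted mean squared error over a made-up four-pixel "frame"; all the numbers and weights are assumptions for the sake of the example:

```python
def weighted_mse(source, compressed, weights):
    """Mean squared error where each pixel's error is scaled by how
    much viewers attend to it (a toy stand-in for a perceptual model)."""
    num = sum(w * (s - c) ** 2 for s, c, w in zip(source, compressed, weights))
    return num / sum(weights)

# Four-pixel "frame": first two pixels are the subject, last two background.
source     = [100, 100, 50, 50]
blurred_bg = [100, 100, 30, 30]   # background blurred, subject untouched

uniform   = [1, 1, 1, 1]          # plain MSE: every pixel counts equally
attention = [1, 1, 0.1, 0.1]      # viewers mostly watch the subject

# Plain MSE punishes the background blur heavily:
print(weighted_mse(source, blurred_bg, uniform))    # 200.0
# An attention-weighted score barely moves, matching what viewers see:
print(weighted_mse(source, blurred_bg, attention))  # ~36.4
```

The math-based score flags the blurred-background encode as much worse, while the perceptually weighted score reflects that the part viewers look at is identical.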
Streaming Learning Center's Jan Ozer lays out the basics of objective quality metrics for encoding assessments in this clip from his presentation at Streaming Media West 2018.