
How to Measure Video Encoding QoE


Briefly, the Mux viewer experience score is a high-level metric for evaluating overall platform performance, with 100 equating to an acceptable experience and 0 meaning a frustrating one. Mux computes the viewer experience score for each video and then consolidates the scores over the defined period. The score is built from the four component metrics shown on the left of the graph (playback failures, startup time, rebuffering, and video quality), weighted according to Mux's view of how each affects QoE. When viewing the analytics, you can filter by any of these four components, as well as by any of the individual parameters that feed into each component.
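To make the roll-up concrete, here is a minimal Python sketch of how four component metrics could be normalized and combined into a single 0-100 score. The scoring curves and equal weighting are illustrative assumptions, not Mux's actual (unpublished) formula.

```python
# Hypothetical sketch of a composite viewer experience score.
# The penalty curves and equal weighting below are assumptions
# for illustration, not Mux's real formula.

def viewer_experience_score(failure_pct, startup_sec, rebuffer_pct, quality_score):
    """Combine four component metrics into one 0-100 score."""
    # Map each raw measurement onto a 0-100 scale (assumed curves).
    playback_failures = max(0.0, 100.0 - failure_pct * 20)   # heavy penalty per % of failed plays
    startup_time      = max(0.0, 100.0 - startup_sec * 10)   # roughly 10 points per second of wait
    rebuffering       = max(0.0, 100.0 - rebuffer_pct * 25)  # penalty per % of time spent rebuffering
    video_quality     = quality_score                        # assumed to arrive already on a 0-100 scale

    # Assumed equal weighting of the four components.
    components = [playback_failures, startup_time, rebuffering, video_quality]
    return sum(components) / len(components)

# A view with no failures, 1 s startup, 0.5% rebuffering, and quality 90:
score = viewer_experience_score(0.0, 1.0, 0.5, 90.0)  # -> 91.875
```

The real service presumably uses tuned, nonlinear weightings; the point of the sketch is only that one number summarizes four measurable components.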

On the upper left side of Figure 3, you can see the Filters box, from which you can add a filter to define your data further (Figure 4). For example, if you wanted to see the results only for a specific browser, OS, or CDN, you could do so by applying a filter for that parameter.


Applying a filter to your metrics

Beneath the graph are three tabs, Breakdown, Insights, and Video Views, which you can access for the overall viewer experience score or for any of the four components (Figure 5). Note that you can sort the data by name, viewer experience score average, total views, or impact on the overall score for the selected metric (in this case viewer experience) and then export the data to an XLS file.
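The "impact on the overall score" column is the most useful sort, because a group with a poor average but few views matters less than a slightly poor group with many views. One plausible way such a ranking could work (an assumption for illustration; Mux does not publish its formula) is to weight each group's deviation from the overall average by its share of views:

```python
# Hypothetical illustration of ranking breakdown rows by their impact
# on the overall score: weight each group's deviation from the overall
# average by its share of total views. An assumption, not Mux's formula.

def rank_by_impact(groups, overall_avg):
    """groups: list of (name, avg_score, views).
    Returns group names sorted by how much each drags the overall average down."""
    total_views = sum(views for _, _, views in groups)
    impact = {
        name: (overall_avg - avg) * (views / total_views)
        for name, avg, views in groups
    }
    return sorted(impact, key=impact.get, reverse=True)

# Firefox's lower average outweighs its smaller share of views here.
rows = [("Chrome", 92.0, 8000), ("Firefox", 78.0, 2000)]
worst_first = rank_by_impact(rows, overall_avg=89.2)  # -> ["Firefox", "Chrome"]
```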


Firefox is delivering a noticeably poorer quality of experience than Chrome for this Mux customer.

You can also run comparisons for any of the items within each category and display them in the graph itself. For example, in Figure 5, I’m comparing overall viewer experience between Chrome and Firefox, which you can see selected in the Breakdown below, with the color-coded chart comparing performance above.

The Insights tab is designed to help you quickly identify the categories that contribute the most to the negative score in the selected metric. For example, in Figure 6, I'm looking at Rebuffer Frequency, which is one of the five components of the overall Rebuffering Score. I see that the overall score is 0.49 rebuffer events/minute, which sounds like a lot, but is well under the 0.9 rebuffer events/minute experienced by Mux users as a whole.
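Rebuffer frequency is a simple ratio: rebuffer events divided by minutes of watch time. A quick sketch (the function and its inputs are illustrative, not a Mux API):

```python
# Minimal sketch of the rebuffer frequency metric: events per minute
# of total watch time. The function name and inputs are illustrative.

def rebuffer_frequency(rebuffer_events, watch_seconds):
    """Rebuffer events per minute of aggregate watch time."""
    watch_minutes = watch_seconds / 60.0
    return rebuffer_events / watch_minutes if watch_minutes else 0.0

# 49 rebuffer events across 100 minutes of aggregate viewing:
freq = rebuffer_frequency(49, 100 * 60)  # -> 0.49 events/minute
```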


Insights identifies the major contributors to a negative score in each category or subcategory.

By clicking Insights, I see that the largest contributor to the negative score was the set of viewers watching on the Android OS, which experienced 4.16 rebuffer events/minute, and the Chrome Mobile browser (3.62/minute). This tells me that I may have issues with the Android player. The next item in the Insights panel (which I've blurred out) is a specific video. It exhibited 4 rebuffer events/minute, which may indicate an encoding issue.

Click Video Views, and you can access the most recent video views in the defined time frame, but only the application name and OS are shown, not the title of the video or any statistics. I found it more useful to access video views from the left tab in Figure 3, since that identifies the title as well. Either way, you can click into the view shown in Figure 7 to get a feel for the video playback experience and see more details beneath the graph regarding the player used, viewers, and other metrics, like total rebuffer duration. The one thing I found missing was a way to access the viewer experience score for that particular view, which would have been informative (Mux plans to add this feature in the future).


Details regarding this video view

What Went Wrong on Dec. 2?

With this as a prologue, let’s dive into figuring out what went wrong on Dec. 2 (Figure 8). Using the time frame control at the upper-right corner, I can call up the data for that day and click Insights to see which experience contributed the most to the negative score. It looks like one viewer on a Linux system was having a very bad day.


On the trail of what went wrong on Dec. 2


By clicking video/ogg, I can dig even deeper into the data and see the videos that this viewer attempted to play. By clicking a video, I can access the specific details of that attempted play (Figure 9), including player, OS, CDN, ASN, and video source. Something was obviously very wrong.


This experience was obviously a very poor one.

At this point, I might look in the Errors tab to see if the system reported any errors that day. This reveals the information shown in Figure 10. It looks as if the media called for by the player simply wasn’t available. Since there was nothing to deliver, the playback failed.


The system logged an error with data that allowed further diagnosis of the problem.

Clicking the Code 4 error opens the screen shown in Figure 11, which allows me to sort by Browser, Country, OS, and other parameters. After clicking through to the Source Type, I see that the Ogg video caused all five errors, likely signaling some availability issue with the Ogg-encoded version of this video. Armed with this information, it should be relatively simple to diagnose what happened and resolve the issue. You could also set up an alert so that you’re notified of this or similar errors.
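The alerting step can be as simple as a threshold check on error counts. A minimal sketch, assuming you are polling error tallies from the analytics API (the function, threshold, and data shape below are illustrative; in practice you would configure Mux's built-in alerts):

```python
# Hypothetical alert check: flag any error code whose count meets or
# exceeds a threshold. The threshold and data shape are assumptions
# for illustration; real deployments would use Mux's own alerting.

def should_alert(error_counts, threshold=5):
    """Return the error codes whose counts meet or exceed the threshold."""
    return [code for code, count in error_counts.items() if count >= threshold]

# Five Code 4 errors (the unavailable Ogg source) trips the alert;
# a single Code 2 error does not.
alerts = should_alert({"4": 5, "2": 1}, threshold=5)  # -> ["4"]
```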


Hmmm. Looks like some problem with the Ogg-encoded video.

Overall, Mux provides an insightful, high-level health assessment of your delivery infrastructure while allowing you to easily diagnose any problems all the way down to the single user experience. Considering the cost and ease of integration, the service is a no-brainer for any site that’s streaming mission-critical video and not currently monitoring QoE. 

[This article appears in the 2018 Streaming Media Industry Sourcebook as "How to Measure Video Encoding QoE."]
