
3 Innovative Technologies for Improving OTT QoE


OTT video is hot. A recent report from Parks Associates predicted that video streaming services will accelerate globally over the next 5 years, with more than 310 million connected households having at least one OTT service by 2024—equating to some 586 million subscriptions overall.

Yet, a major challenge OTT service providers face is that consumers expect the OTT experience to match broadcast TV. With consumer enthusiasm for live video streaming running high, one would expect the quality of experience (QoE) to be exceptional, at least on the same level as broadcast. But that hasn’t always been the case: scalability issues sometimes cause end users to experience poor quality while watching live OTT services. This article explores those challenges and three future-forward technologies that are helping operators deliver the best QoE to subscribers on every screen.

OTT QoE Landscape and Associated Challenges

Pure OTT services are delivered over unmanaged networks shared with internet traffic, which can cause QoE issues. When a major sports tournament takes place, billions of viewers across the world are watching on a range of connected screens, including TVs, PCs, smartphones, and tablets. Delivering high-profile sports events is a challenge because viewers expect to receive a consistent, high-quality experience across all screens, with latency comparable to broadcast. There are multiple points in the delivery workflow that can create QoE issues for live OTT services. 

Figure 1 offers a generic view of the video delivery chain, with the headend including the compression and the origin server, followed by the CDN, and access network, showing all of the points in the chain that can create scalability problems.

On the signal delivery side, several issues can occur when a high number of subscribers suddenly want to view an event at the same time. Scalability issues can happen at the origin server, the CDN, the access network, and during the final stage of delivery to the home. At the origin server level, issues arise because the server is also designed to support value-added features like start-over TV, catch-up TV, and targeted ad insertion; when a high number of concurrent users are watching video, the added workload on the origin server can eventually result in HTTP 404 errors.

Scalability issues generally occur at the CDN level when video service providers haven’t properly anticipated how big an event will be and the maximum network load is reached. During the last mile of delivery (i.e., at the access network), various situations can arise that impact QoE. Typically, these problems vary depending on the type of network (fixed-line DSL, fiber, or mobile) and its associated technology. At the final stage of delivery to the home (i.e., a gateway in the home), contention can happen when too many devices connect to the same access point or when connectivity to the client is poor, thereby lowering the bitrate and impacting QoE.
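
When last-mile connectivity degrades, it is the player's adaptive-bitrate (ABR) logic that steps the stream down to a lower profile. A minimal sketch of that decision follows; the bitrate ladder and safety factor are hypothetical illustrations, not values from this article:

```python
def select_profile(ladder_kbps, measured_throughput_kbps, safety_factor=0.8):
    """Pick the highest ladder bitrate the measured throughput can sustain."""
    usable = measured_throughput_kbps * safety_factor
    candidates = [b for b in ladder_kbps if b <= usable]
    # If even the lowest profile exceeds the usable throughput,
    # fall back to it and accept the re-buffering risk.
    return max(candidates) if candidates else min(ladder_kbps)

ladder = [800, 1800, 3500, 6000]  # hypothetical ABR ladder, in kbps
profile = select_profile(ladder, 5000)  # a 5 Mbps link sustains the 3500 kbps profile
```

A real player also smooths throughput samples and accounts for buffer occupancy; this sketch only shows why a congested access point or bad client connectivity translates directly into a lower delivered bitrate.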

Another challenge with delivering OTT services is latency. Latency, the delay between live production and the end-user display, is a common OTT issue that is highly noticeable during live sports events, especially because broadcast viewers and social networks react to the action within a few seconds.

Early streaming formats, primarily developed for SVOD, were designed to avoid re-buffering when rendering video on a player. But to make that work everywhere on any device, memory buffers had to be used throughout the workflow, especially in the player, introducing end-to-end latency. Apple’s original HTTP Live Streaming (HLS) protocol, released in 2009, recommended 10-second segments and specified that players should buffer no fewer than three segments. This explains why many OTT services show a typical latency of 40 seconds or even more. Apple later revised its recommendation to six-second segments, which still equates to 18 seconds of delay on the client side. A latency of 20 seconds or more is not a problem for SVOD services, but it is an issue for live sports, where a latency of roughly 5 seconds is required; beyond that, the delay becomes quite noticeable and frustrating for viewers.
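
The latency figures above follow directly from the buffering rule. A quick sanity check of the arithmetic, counting player-side buffering only and ignoring encode, packaging, and CDN delays:

```python
def player_buffer_latency(segment_s, buffered_segments=3):
    """Minimum client-side delay implied by the 'buffer at least N segments' rule."""
    return segment_s * buffered_segments

player_buffer_latency(10)  # original HLS guidance: 30 s in the player alone
player_buffer_latency(6)   # revised six-second segments: 18 s on the client side
```

Adding encoder, packager, and CDN delays on top of the 30-second player buffer is what pushes typical services to the 40-second figure cited above.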

Video compression technology can help to improve the QoE by reducing the amount of data that is moved from the origin server to the end-user, but this comes at a cost. The lower the compression bitrate, the easier it is to deliver unicast to connected devices with superior QoE. But if the bitrate is too low, the quality will suffer and QoE will be degraded. OTT service providers can resolve this issue by using the most advanced codec (i.e., HEVC), but the current licensing terms have slowed down deployments. Therefore, the industry needs to find smarter ways to distribute content. 

Let’s take a look at three future-forward technologies that are helping operators deliver the best QoE to subscribers on every screen: content-aware encoding, CMAF for low latency, and edge scaling.

Content-Aware Encoding

Content-aware encoding (CAE) is an innovative technology currently used by Netflix (which calls it per-title or per-segment encoding) and other leading OTT service providers around the world to sidestep HEVC licensing issues and deliver the best picture quality using the AVC codec. With CAE, encoders adjust encoding parameters in real time based on video complexity. The per-title encoding technique works similarly to VBR for statmux, except that only one program is encoded and the video quality measurement is more refined, since it is based on the human visual system.

To maximize the accuracy of the video quality measurement, CAE is trained offline using artificial intelligence. For VOD, CAE can run in a single pass, as it does for live; this provides the highest scalability (i.e., encoding speed) but not the best compression efficiency. Alternatively, encoding can be done in several passes, where each encoding parameter set is encoded in parallel and the decision is made at the end of each encoding batch.
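
The multi-pass decision described above can be sketched as follows. The encoder pass and the HVS-based quality metric here are crude stand-ins (a real CAE system uses a trained perceptual model, and the bitrates and threshold are illustrative only):

```python
QUALITY_TARGET = 0.92  # hypothetical perceptual score threshold (0..1)

def encode_segment(segment, bitrate_kbps):
    """Stand-in for a real encoder pass; returns a fake encoded result."""
    return {"bitrate": bitrate_kbps}

def perceptual_quality(segment, encoded):
    """Stand-in for an HVS-based metric; here, quality just tracks bitrate."""
    return min(1.0, encoded["bitrate"] / 4000)

def choose_encoding(segment, candidate_bitrates_kbps):
    """Return the lowest-bitrate candidate that meets the quality target."""
    for bitrate in sorted(candidate_bitrates_kbps):
        encoded = encode_segment(segment, bitrate)
        if perceptual_quality(segment, encoded) >= QUALITY_TARGET:
            return bitrate, encoded  # cheapest acceptable encode wins
    top = max(candidate_bitrates_kbps)
    return top, encode_segment(segment, top)  # fall back to the best effort
```

Running the candidate encodes in parallel and picking the winner per batch is what distinguishes this multi-pass mode from the faster single-pass mode used for live.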

Table 1 demonstrates the consistent savings that can be achieved with CAE compared with CBR for HD content.

Table 1. Bitrate savings of CAE compared with CBR

In addition to providing massive bandwidth savings, CAE also enhances QoE. Because the video is compressed more efficiently, more viewers can receive the HD profiles, and the lower bitrates reduce buffering.


CMAF for Low Latency

The MPEG Common Media Application Format (MPEG-CMAF) was primarily developed to resolve interoperability issues within the streaming industry by reducing the number of different media file formats required. MPEG-CMAF is a media container standard based on fMP4 (ISOBMFF). Since it can be used by both the MPEG-DASH and HLS delivery formats with a common encryption scheme, CENC (Common Encryption), MPEG-CMAF greatly simplifies the distribution of OTT at scale.

The CMAF toolbox also provides some interesting features, including an option for Low Latency Chunk (LLC). This tool was included in the initial MPEG-CMAF specification to enable the delivery of segments in small chunks (e.g., 200 ms). This means the decoding process can start before a complete segment is encoded, packaged, and received. Enabling high performance and an end-to-end latency of three seconds or less, MPEG-CMAF with LLC is the answer to OTT service providers’ prayers, putting them on a level playing field with broadcasters for delivering live sports.
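
The latency benefit of chunking is easy to quantify: the player can begin decoding once the first chunk arrives, rather than waiting for the whole segment to be produced and delivered. A minimal illustration (network and encoder pipeline delays are ignored in this sketch):

```python
def min_startup_delay_s(segment_s, chunk_s=None):
    """Smallest delay before the player can start decoding.

    Without chunking, a full segment must exist before decode begins;
    with CMAF LLC, only the first chunk is needed.
    """
    return chunk_s if chunk_s is not None else segment_s

min_startup_delay_s(6.0)        # full-segment delivery: 6 s before decode can start
min_startup_delay_s(6.0, 0.2)   # 200 ms chunks: decode can start after 0.2 s
```

This is why chunked delivery, combined with reduced player buffering, brings end-to-end latency down to the three-second range cited above.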

In order to take full advantage of CMAF LLC, OTT service providers need to support the technology at every step of the delivery workflow, including the packager, origin, CDN, and player. CMAF LLC has the backing of the entire industry, including CDN and player vendors. It is worth noting, however, that players that support CMAF but not LLC can still decode the video after receiving the full media segment; in this case, the latency increases by a few seconds. We have measured DASH CMAF LLC on various players and network conditions, and we are confident the ~5-second broadcast-like delay will be achievable once production services are deployed.

Edge Scaling

As OTT service providers look for additional ways to boost QoE, scaling at the edge of the network has emerged as another path. The concept is simple: service providers deploy edge nodes inside the ISP access network. For live events, predominantly sports, scaling close to the subscriber provides the greatest QoE. The downside is that if the service provider dimensions and builds for these isolated traffic peaks, the deployed nodes will be underused much of the time.

Adding storage to these nodes will enable caching of non-linear (i.e., VOD, catch-up TV) services to be done closer to the end subscriber. Netflix, for example, provides a large library of non-linear content that is positioned within the ISP networks via its OpenConnect program, enabling great QoE for Netflix subscribers. While this will reduce network impact and improve QoE, these services are not usually as time-sensitive as live events, where the impact of social media can degrade the viewing experience by giving away the final score. 

Looking Ahead: What Are the Next Steps?

Given the number of commercial offerings consumers have today when it comes to video streaming services, OTT service providers don’t have a choice—the QoE they deliver must be exceptional on all devices and should be as consistent as possible over time. By deploying the latest technology innovations, including content-aware encoding, MPEG-CMAF LLC, and edge caching, service providers can deliver superior QoE with low latency matching broadcast. 

CAE, in particular, has been shown to deliver up to a 50% reduction in bandwidth for live OTT services compared with more traditional encoding. Since it is 100% H.264 compliant, OTT service providers don’t need to make any changes to today’s ecosystem to use it; the technology works on appliances, virtual machines, cloud infrastructure, and SaaS alike. As OTT video consumption continues to grow, momentum for CAE technology is expected to build.

Since CMAF LLC is standardized and mature enough to be deployed, fine-tuning the entire workflow is the next logical step. Although each component can be optimized on its own, optimizing the end-to-end system to deliver the best possible QoE will give additional momentum to the ecosystem.

On the CDN side, the most widely seen approach moving forward is the “hybrid CDN,” in which deployed edge nodes for live scaling or caching of non-linear content are combined with a public CDN for off-net reach. With this approach, the edge nodes don’t have to be dimensioned to meet the full demand of the service and are better utilized under average traffic patterns. In addition, this method allows for a pre-determined level of “burst” use, with the public CDN handling the remainder of the traffic.
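
A hybrid CDN needs a steering rule for each incoming session. A simplified sketch of the overflow logic described above, with illustrative capacity figures that are not from the article:

```python
EDGE_CAPACITY_GBPS = 100.0   # hypothetical: dimensioned for average traffic, not the peak
BURST_ALLOWANCE_GBPS = 20.0  # hypothetical: pre-determined burst headroom on the edge

def route_session(current_edge_load_gbps, session_gbps):
    """Return 'edge' or 'public-cdn' for an incoming session request.

    Sessions go to the on-net edge nodes until capacity plus the agreed
    burst allowance is reached, then overflow to the public CDN.
    """
    if current_edge_load_gbps + session_gbps <= EDGE_CAPACITY_GBPS + BURST_ALLOWANCE_GBPS:
        return "edge"
    return "public-cdn"
```

A production traffic controller would also weigh geography, per-CDN health, and priority tiers (e.g., steering a premium live event to the edge first), but the capacity-plus-burst threshold captures the core economics of the hybrid model.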

For specific live events, the traffic could be prioritized to use the deployed edge nodes, and the public CDN will handle the other services. The addition of a second or even third CDN provider to this hybrid model will enable a true multi-CDN system that makes optimal use of deployed infrastructure and the geographical coverage that the CDN providers offer. In such a scenario the traffic control system must be able to determine the optimal delivery path as well as load balance incoming traffic requests while still maintaining the integrity of the system and supporting the surge of concurrent user requests as a premium event begins.

[Editor's Note: This is a contributed article from Harmonic. Streaming Media accepts vendor bylines based solely on their value to our readers.]
