
How to Monitor Many Live Event Streams at Once

What are some of the challenges of monitoring multitudes of live streams at once, both for performance and quality? Eric Schumacher-Rasmussen, Chair, Streaming Media Conferences, and CMO, id3as, says, “I want to make sure we talk about not just large-scale live events, but organizations that are doing massive numbers of live events, either concurrent events daily or dozens of live events at the same time.” He asks Peter Wharton, Chief Strategy & Cloud Officer, TAG Video Systems, to start off by discussing how TAG handles the logistics of multiple live events.

“If you're monitoring thousands of live events or even dozens, you don't necessarily want to build monitor walls with a thousand images on them and have your operator stare at them all day long unless you want to pay psychiatrist bills as well!” Wharton says. He emphasizes that the best way to track the performance of multistream events is not by literal live feed video monitoring but rather by keeping track of analytics. “When I look at some of the customers that do really neat systems today, there's hardly any video sometimes on the wall,” he says. “It's mostly graphs, performance, and data delivery, actually telling you when something's falling off on the edge, where you need to start looking…So, it's about using big data really also to drive our workflows and not just eyes on glass.”
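Wharton's data-driven, monitor-by-exception approach can be sketched as a pass over per-stream metrics that surfaces only the streams breaching a threshold, instead of rendering a thousand thumbnails. The metric names and threshold values below are illustrative assumptions, not TAG's actual schema:

```python
# Assumed metric names and thresholds, for illustration only.
THRESHOLDS = {
    "dropped_frames_pct": 1.0,   # alert above 1% dropped frames
    "bitrate_kbps_min": 2000,    # alert below 2 Mbps
    "audio_silence_sec": 10,     # alert after 10 s of silence
}

def exceptions_only(streams: dict) -> list:
    """Return (stream_id, reason) pairs only for streams that breach a
    threshold, so operators see a short exception list rather than a
    wall of video."""
    alerts = []
    for stream_id, m in streams.items():
        if m.get("dropped_frames_pct", 0.0) > THRESHOLDS["dropped_frames_pct"]:
            alerts.append((stream_id, "dropped frames"))
        if m.get("bitrate_kbps", THRESHOLDS["bitrate_kbps_min"]) < THRESHOLDS["bitrate_kbps_min"]:
            alerts.append((stream_id, "low bitrate"))
        if m.get("audio_silence_sec", 0) >= THRESHOLDS["audio_silence_sec"]:
            alerts.append((stream_id, "audio silence"))
    return alerts
```

With a thousand healthy streams and one degraded one, the operator sees a one-line exception list, which is the "graphs and data delivery" view Wharton describes.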

Adam Miller, CEO, Nomad Technologies, says, “That just made me think of something about the AI question asked earlier. One of the things that we're doing for some of our customers now is we're analyzing for black frames. And it's simply an alert. Just that simple. In our industry, that's almost always bad. So if it's a black frame for five or 10 seconds, to your point, monitoring by exception, tell somebody: the alerts go off and Jim, the AI if you will, jumps in…”
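The black-frame alert Miller describes can be sketched as a mean-luma check that fires once when a stream stays black past a time window. The luma threshold and alert window here are assumptions for illustration:

```python
import numpy as np

BLACK_LUMA_THRESHOLD = 16   # assumed: mean 8-bit luma below this counts as "black"
ALERT_AFTER_SECONDS = 5     # assumed: alert after 5 seconds of continuous black

def is_black_frame(luma_plane: np.ndarray) -> bool:
    """Treat a frame as black when its average luma falls below the threshold."""
    return float(luma_plane.mean()) < BLACK_LUMA_THRESHOLD

class BlackFrameDetector:
    """Tracks consecutive black frames and fires a single alert per outage."""

    def __init__(self, fps: float):
        self.fps = fps
        self.black_run = 0
        self.alerted = False

    def ingest(self, luma_plane: np.ndarray) -> bool:
        """Returns True exactly once when the black run crosses the alert window."""
        if is_black_frame(luma_plane):
            self.black_run += 1
        else:
            self.black_run = 0
            self.alerted = False
        if not self.alerted and self.black_run >= ALERT_AFTER_SECONDS * self.fps:
            self.alerted = True
            return True
        return False
```

Firing once per outage, rather than once per frame, is what keeps this "just an alert" instead of another source of noise.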

Wharton comments on other ways AI can be even more helpful by learning and adapting quickly to evolving issues. “It understands that the jitter and the satellite feed should not be the same thing as the jitter coming from a cloud feed, which is going to be a lot higher because it's through public internet,” he says. “And so you start having systems that understand and adapt, and then they actually only notify you of things that really matter because you can spend a lot of time fine-tuning and probing and monitoring systems to get all those things right. And I'd love to get to that point where I actually have AI do that for you and keep it constantly up to date.”
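The adaptive behavior Wharton describes, where a cloud feed's naturally higher jitter doesn't trip an alarm tuned for a satellite feed, might look like a per-feed rolling baseline. This is a minimal statistical sketch, not an AI model, and the window and sigma values are assumptions:

```python
import statistics
from collections import deque

class AdaptiveJitterMonitor:
    """Learns each feed's own jitter baseline, so a cloud feed's normal
    jitter is not judged against a satellite feed's (illustrative sketch)."""

    def __init__(self, window: int = 100, sigmas: float = 3.0):
        self.samples = deque(maxlen=window)  # rolling window of normal jitter
        self.sigmas = sigmas

    def observe(self, jitter_ms: float) -> bool:
        """Returns True when a sample deviates from this feed's own baseline."""
        anomalous = False
        if len(self.samples) >= 10:  # wait for a minimal baseline
            mean = statistics.fmean(self.samples)
            stdev = statistics.pstdev(self.samples)
            if stdev > 0 and abs(jitter_ms - mean) > self.sigmas * stdev:
                anomalous = True
        if not anomalous:
            self.samples.append(jitter_ms)  # only learn from normal samples
        return anomalous
```

Each feed gets its own monitor instance, so the notification threshold tracks what is normal for that path rather than a single global setting that someone has to keep re-tuning by hand.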

David Hassoun, Chief Technologist, Dolby Cloud Media Solutions, Dolby.io, emphasizes the importance of data accuracy if analytics are at the core of effective multistream monitoring. “You get out what you put in, right?” he says. “You have to really focus that you're actually getting accurate information when it matters most. Because oftentimes it's not going to always be right out of box…”

Wharton agrees, saying that a flood of simultaneous information and alerts doesn't improve anything if it isn't clear where the primary focus should be at any given time. “Where, you know, the wall is full of LEDs and looks like a Christmas tree going off,” he says. “It's like, what's the point? If you don't get rid of all the red lights going off…then you don't have a monitoring system because your operators are being tuned to not look at those things.”

Hassoun says, “And that goes for every one of your systems, all the way down to your Quality of Experience (QoE) dashboards and how you're going to monitor all these streams, because there comes a point, especially in these types of situations, where you can't have enough eyes on glass, especially skilled ones. Oftentimes they understand all the pieces and parts, and they can't even see all those different views so easily. Or it's an army of people, which can happen, right? But it becomes a really big challenge. So you have to make sure that you put in that time and effort, that you're getting the right information of what matters most that you can act upon, and then you prioritize from there. We've got a hundred streams going concurrent right now, but these are the ones that are going to get the most traffic. We're going to focus our attention on these guys from eyes on glass and distribute elsewhere and pray that our monitoring works right.”
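Hassoun's triage, putting scarce skilled operators on the streams expected to draw the most traffic and leaving the rest to automated monitoring, reduces to a simple ranking. The `expected_viewers` input is an assumed metric for illustration:

```python
def assign_operator_attention(streams: dict, operators_available: int):
    """Rank concurrent streams by expected audience so skilled operators
    watch the highest-impact feeds while the rest ride on automation.
    `streams` maps stream_id -> expected concurrent viewers (assumed)."""
    ranked = sorted(streams.items(), key=lambda kv: kv[1], reverse=True)
    eyes_on_glass = [sid for sid, _ in ranked[:operators_available]]
    automated_only = [sid for sid, _ in ranked[operators_available:]]
    return eyes_on_glass, automated_only
```

The same ranking could be re-run as traffic forecasts update during the day, shifting eyes-on-glass attention without rebuilding the monitoring itself.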

Corey Smith, Sr. Director, Advanced Production Technology, CBS Sports Digital, Paramount, also emphasizes that AI must be used in the right way for its maximum effectiveness. “I think AI is an incredible tool used in the right way,” he says. “So if you have a big data cube and you're running all this analytics both from the client and from your edge origin and everything in between, why not let the AI bots actually control the micro transactional changes in your traffic flows across your different provider ecosystems? Because the AI can actually do predictive failure points on how the heat map is working across your exact particular platform and your delivery footprint.”

Wharton says, “It sees these changes over time, and it goes, ‘Oh well, this is going up over time. This is not a normal pattern.’ And so it gets you to an alarm before you actually get to the point where you have failure.”

“You take that out of the human element, right?” Smith says. “Because a human's going to sit there and watch it trend and figure out where it's going to go, where the AI system can basically say, okay, this is trending over here. Let me make micro-adjustments over here to start peeling traffic off to these other networks and therefore making a cleaner event for your customers.”
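The trend-based prediction Smith and Wharton describe can be sketched as a slope projection over a recent metric window: if the current climb rate would cross a limit soon, start shedding traffic before the failure arrives. The limit and horizon values are illustrative assumptions:

```python
def trend_slope(samples: list) -> float:
    """Least-squares slope of evenly spaced samples (units per sample)."""
    n = len(samples)
    mean_x = (n - 1) / 2
    mean_y = sum(samples) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(samples))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def should_shed_traffic(error_rates: list, limit: float = 5.0,
                        horizon: int = 10) -> bool:
    """Predictive alarm: if the error rate keeps climbing at its current
    slope, will it cross `limit` within `horizon` samples? (illustrative)"""
    slope = trend_slope(error_rates)
    projected = error_rates[-1] + slope * horizon
    return slope > 0 and projected >= limit
```

A human would watch the same graph trend upward and react late; the point of the projection is to trigger the micro-adjustment while the event is still clean for viewers.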

Learn more about multistream live event management at Streaming Media East 2023.

Related Articles

How to Monitor Cloud-Based Low-Latency Streams at Scale

What is the essential monitoring checklist for maintaining efficiency, resiliency, and performance for large-scale low-latency streaming using cloud-based workflows? Dolby.io's Director of Product Strategy and 2G Digital Post Optimization's Allan McLennan discuss the key components and tools for an effective live stream monitoring strategy in this clip from Streaming Media Connect.

How to Build Unbreakable Live Streams at Scale

Delivering reliable streams is both more challenging and more critical when streaming at scale. So how do you develop workflows to ensure your large-scale streams won't fail? Experts from Dolby.io, TAG VS, Paramount, and Nomad Technologies weigh in on network stress-testing, monitoring, and more best practices at Streaming Media West 2022.

How to Monitor and Troubleshoot Your Live Streaming Workflow

End-to-end workflows for live streaming at scale are complex affairs, with troubleshooting challenges at each stage when delivery breaks down. An expert panel from Paramount, Amagi, TAG VS, and Nomad Technologies discusses key best practices for troubleshooting large-scale streams when the pressure is on.

Latency vs. Quality for Live Streaming at Scale

How much streaming reliability and quality are worth trading for ultra-low latency, and when is one at a premium over the others? Amagi's Brian Ring, Dolby.io's David Hassoun, Nomad Technologies' Adam Miller, Paramount's Corey Smith, and Norsk CMO Eric Schumacher-Rasmussen discuss in this panel from Streaming Media West 2022.

How to Localize Live Event Streams

What are the most dynamic approaches for localizing live event streams? Dan Turow of Evertz and Marisa Elizondo of fuboTV talk about the ways their organizations are working to best integrate local and user-preference-based content and experiences into live event streams.

Paramount's Adtech Challenges for Live Streaming at Scale

What are some of the biggest adtech challenges for live streaming at scale? Jarred Wilichinsky of Paramount talks about the ways his team works to mitigate technical issues, such as minimizing latency, load testing, and correcting audio levels, along with ensuring that the ads themselves meet acceptable legal standards and practices.

What's the Best Chunk Size for Low-Latency Live Streaming?

The best chunk size for low-latency streaming is dependent on a number of factors based on different use cases, and there is often a need for some compromise and tradeoffs in quality or speed. Nadine Krefetz, Consultant, Reality Software, Contributing Editor, Streaming Media, asks three industry experts what their chunk size preferences are for their requirements.

How to Use AI in Video Workflows

Ethan Dreilinger of IBM Watson, Carlos Hernandez of SSIMWAVE, and Gordon Brooks of Zixi talk about how the key to effectively applying artificial intelligence in video workflows is knowing the differences between AI and automation and the ways they can best work together.
