Does Facebook's Video AI Suppress Some Clips? BET Says Yes
Video artificial intelligence (AI) can be used to surface content viewers are interested in, and it can also be used to hide content platforms don't want widely shared. Based on its own testing, BET says Facebook is using AI to suppress clips that feature tune-in information, forcing the network to spend money promoting clips that would otherwise enjoy strong organic sharing.
Speaking at a panel discussion on video AI and automation hosted by Wibbitz and Vidrovr, Ken Gibbs, vice president of digital video and social content for BET, said his network's tests indicate that some platforms are using AI to scrub videos for certain information—especially day, date, and time tune-in data—and then suppress those videos. In a post-panel interview, he said the suppression primarily occurs on Facebook.
"Content that's got day, date, and time—we're at a network, that's what we want to push," Gibbs said during the panel discussion. "We want you to enjoy it organically. But AI's being used to send signals to the platform to let them know that, 'Hey, this is important to their business. As a result, they've got a vested interest in it being seen. So, suppress it. Force them to put money behind it.'"
Interviewed after the panel, Gibbs explained that his observations are based on years of analytic data for shared clips. Looking at that data, he sees that Facebook AI is now scrubbing for on-screen tune-in information.
"We worked out the performance of video over the past few years and months on the social platforms, and you can tell based on the results—the viewing and the engagement and reach on the content—that content that has pertinent information like day, date, and time actually on the screen does not perform as well as that same piece of content without it," Gibbs said.
Facebook could be suppressing clips to force networks to spend money on promotion, or it could be surfacing clips that keep members on the site rather than sending them away.
"That's the message that [networks are] trying to communicate over the platform. They want you—their fan, their follower—to watch this show on this day and this time. That's why you spent the money to even create that clip and that content," Gibbs said. "The platform right now, we believe, is incorporating AI to understand this and identify this information. As a result, they're not putting these clips in front of the user as much as they would what they are calling, I believe, 'meaningful conversations' that are just meant for people to enjoy there on the platform."
BET tested its theory by sharing similar clips, some with tune-in information and others simply asking viewers to check the program out. The clips without on-screen tune-in information enjoyed far more organic sharing and were surfaced more often by Facebook.
Pictured left to right: Alex Siegman, AI Technical Program Manager, Dow Jones; Alex Feldman, Strategic Partnerships, Comscore; Ken Gibbs, VP Digital Video and Social Content, BET.