YouTube Details 4 Steps for Reducing Extremist Videos on Platform
Over the weekend, YouTube published a blog post identifying four steps it will take to reduce the visibility of extremist videos on its site or remove them outright. The issue is a constant challenge for YouTube, where 400 hours of video are uploaded every minute, and brand safety concerns earlier this year caused some advertisers to pull ads.
Improving technology is the first step. In the post, Kent Walker, general counsel for Google, says automated video analysis uncovered half of the terrorism-related videos the company deleted in the last six months. Going forward, YouTube will put more engineering resources into identifying and removing extremist videos. Because no machine model will catch every extremist video, YouTube's second step is to expand the experts in its Trusted Flagger program, adding 50 non-governmental organizations (NGOs) to the 63 already in the program. These experts will determine which videos are violent propaganda.
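The two-step review flow described above, automated analysis followed by expert human review of borderline cases, can be sketched as follows. The function names, scores, and thresholds are illustrative assumptions, not YouTube's actual policy logic.

```python
def review_video(score: float, auto_threshold: float = 0.9,
                 flag_threshold: float = 0.5) -> str:
    """Route a video based on a model-assigned extremism score.

    Hypothetical three-way routing: high-confidence cases are removed
    automatically, uncertain cases go to Trusted Flagger experts, and
    low-scoring videos are left alone.
    """
    if score >= auto_threshold:
        return "remove"        # high-confidence automated decision
    if score >= flag_threshold:
        return "human_review"  # routed to Trusted Flagger experts
    return "allow"
```

The point of the split is that automation handles volume (half of recent removals, per Walker) while humans make the judgment calls no model can.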
Much extremist content on YouTube won't qualify for deletion, but will still be inflammatory. Those videos already can't carry ads. For its third step, YouTube will no longer allow comments or endorsements on them, making them harder to find. Lastly, YouTube will use targeted online ads to steer potential ISIS recruits to videos that debunk terrorist recruiting efforts. Walker says previous tests of this system got high click-through rates and led viewers to watch over half a million minutes of anti-terrorist videos.
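The restricted state for borderline videos amounts to turning off every feature that spreads or rewards the content while leaving the video itself up. A minimal sketch, with field names assumed for illustration rather than taken from any YouTube API:

```python
from dataclasses import dataclass, replace

@dataclass
class VideoState:
    """Hypothetical per-video feature flags."""
    monetized: bool = True
    comments_enabled: bool = True
    endorsements_enabled: bool = True
    recommended: bool = True

def apply_borderline_restrictions(state: VideoState) -> VideoState:
    """Strip ads, comments, endorsements, and recommendations
    from a borderline video, as the post describes."""
    return replace(state, monetized=False, comments_enabled=False,
                   endorsements_enabled=False, recommended=False)
```

The video stays viewable for anyone who seeks it out, but loses the distribution and engagement channels that would amplify it.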
Because this issue impacts all social platforms, YouTube is joining with Facebook, Microsoft, Twitter, and others to create a forum for sharing and creating anti-extremist technology.
"Together, we can build lasting solutions that address the threats to our security and our freedoms. It is a sweeping and complex challenge," Walker writes. "We are committed to playing our part."