Does YouTube contribute to radicalization?

We work hard to protect users from extremist content by removing videos that violate our hate speech policy and our violent criminal organizations policy. Our recommendation systems also significantly limit the reach of borderline content and harmful misinformation, meaning content that brushes up against the policy line but does not cross it.

Curbing extremist content

What policies address extremist content?

Any content designed to incite violence or foment hatred towards certain groups of people is against our hate speech policy. Additionally, content promoting or glorifying terrorism is against our violent criminal organizations policy. We remove this content when flagged to us.

How does YouTube deal with content that does not violate policies but could still be considered harmful?

Sometimes, there is content that brushes up against the policy line but does not cross it. We call this borderline content. Our recommendation systems help limit the spread of borderline content, and because of this, we’ve seen more than a 70% drop in watch time of this content coming from non-subscribed recommendations in the U.S.

What is YouTube doing to specifically tackle content that promotes violent extremism and terrorism?

Content promoting or glorifying terrorist and other violent criminal organizations does not have a home on YouTube. YouTube has automated systems that aid in the detection of content that may violate our policies, including our violent criminal organizations policy. Once potentially problematic content is identified, human reviewers verify whether it violates our policies. If it does, the content is removed and used to train our machines for better coverage in the future. Machine learning now helps us take down extremist content before it has been widely viewed. Between October and December 2019, approximately 90% of the videos removed for violating our violent extremism policy were taken down before they had 10 views.
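The flow described above — automated detection, human review, removal, and feeding confirmed violations back as training data — can be sketched in simplified form. This is a hypothetical illustration only: the class, method, and threshold names below are invented for clarity and do not reflect YouTube's actual systems.

```python
from dataclasses import dataclass, field

@dataclass
class Video:
    """Minimal stand-in for an uploaded video."""
    video_id: str
    views: int = 0

@dataclass
class ModerationPipeline:
    """Hypothetical sketch of a detect -> review -> remove -> retrain loop."""
    review_queue: list = field(default_factory=list)
    removed: list = field(default_factory=list)
    training_examples: list = field(default_factory=list)

    def auto_detect(self, video: Video, risk_score: float,
                    threshold: float = 0.8) -> None:
        # Automated systems flag potentially violating uploads for review,
        # ideally before the content is widely viewed.
        if risk_score >= threshold:
            self.review_queue.append(video)

    def human_review(self, video: Video, violates_policy: bool) -> None:
        # Human reviewers confirm or clear flagged content. Confirmed
        # violations are removed and kept as training data so automated
        # detection improves over time.
        self.review_queue.remove(video)
        if violates_policy:
            self.removed.append(video)
            self.training_examples.append(video)

pipeline = ModerationPipeline()
clip = Video("abc123")
pipeline.auto_detect(clip, risk_score=0.95)    # flagged with few views
pipeline.human_review(clip, violates_policy=True)
```

After this run, the clip sits in both the removed list and the training set, mirroring the "removed and used to train our machines" step in the text.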

The YouTube community also helps us spot this content. We have a designated “promotes terrorism” flag underneath every video on YouTube that users can select when they report content. We also work with violent extremism experts through our Trusted Flagger program. Teams carefully evaluate flags 24 hours a day, 7 days a week.

We are also a founding member of the Global Internet Forum to Counter Terrorism (GIFCT), where we work with other tech companies to keep terrorist content off the web while providing training and other resources to smaller companies facing similar challenges.