INTELBRIEF

June 10, 2019

IntelBrief: YouTube Making Moves to Combat Extremism Online

A YouTube sign is shown across the street from the company's offices in San Bruno, Calif.  (AP Photo/Jeff Chiu).
  • Thousands of videos and channels that promote extremist beliefs will be removed from YouTube, according to the company.
  • To date, tech companies have mostly focused on removing jihadist-related content, but more recently, social media giants have begun to target other hateful and violent ideologies, including white supremacists and neo-Nazis.
  • While many see this as a belated move and wonder why it has taken so long, it is nevertheless a step in the right direction.
  • A major challenge for tech companies is enforcing these policies and monitoring their sites to determine what kind of speech crosses the line.

Thousands of videos and channels that advocate for and promote extremist beliefs, including many related to far-right ideologies, white supremacy, and neo-Nazism, will be removed from YouTube, according to the company. In a statement, YouTube acknowledged that 'It's our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence,' even as the company continues to struggle with how to balance content moderation with freedom of speech and where to draw the line in identifying hate speech, racism, and blatantly false content.

To date, tech companies have primarily focused on removing jihadist-related content from their sites, due in part to the rise of the so-called Islamic State and the way that group used social media to organize and spread its propaganda. But more recently, social media giants have begun to target other hateful and violent ideologies, including white supremacists, misogynists, and 'incels,' or involuntarily celibate men who promote violence against women. Additional videos will also be banned, including those peddling blatant conspiracy theories: videos that deny the occurrence of the Sandy Hook massacre, in which twenty-six people were murdered (including twenty young children), or the terrorist attacks of September 11, 2001, as well as misinformation about vaccines and their alleged links to autism. These conspiracies have been popularized by the likes of Alex Jones and Infowars and have occasionally had real-world consequences, including the PizzaGate incident, in which an individual traveled to a pizza restaurant with a rifle, claiming he was there to investigate a conspiracy theory linking the restaurant to a human trafficking ring supposedly directed by then-U.S. presidential candidate Hillary Clinton.

While many see this as a belated move and wonder why it has taken so long, it is nevertheless a step in the right direction. Analysts studying this issue believe that social media companies consistently downplayed the problem for fear of losing money. For tech companies, there is a clear monetary incentive in the form of advertising revenue, which is partly responsible for the unfettered growth of unsavory channels advocating a wide range of racist and hateful opinions. There may also be an added element of perceived political pressure. Personalities on the political right claim that YouTube's actions are biased, a notion given credence by U.S. President Donald Trump, who asserts that YouTube and other tech platforms have unfairly singled out conservative viewpoints for censorship.

A major challenge for tech companies is not just crafting and implementing policies, but enforcing them and monitoring their sites to determine what kind of speech crosses the line. In the past, lax enforcement has led to a groundswell of criticism that these tech platforms, not only YouTube but also Twitter and Facebook, have allowed extremists to meet, collaborate, and metastasize on their sites. Beyond refining its rules governing hate speech, YouTube has also vowed to take a closer look at its recommendation algorithm. Other tech companies, including Facebook, have recently become more active in moving to ban ideologies that promote violence and hate from their platforms, but it seems inevitable that those seeking to spread this kind of virulent propaganda will find their way back online, adapting to newly imposed rules and devising ways to circumvent restrictions governing online speech and behavior.


For tailored research and analysis, please contact:  info@thesoufancenter.org

 

