July 7, 2020
IntelBrief: De-Platforming Terrorists, Violent Extremists, and other Online Threats
On June 30, 2020, Facebook announced a major strategic network disruption across multiple platforms targeting the social media presence of the anti-government Boogaloo movement. According to the Facebook press release detailing the takedown, the de-platforming covered 220 Facebook accounts, 95 Instagram accounts, 28 pages, and 106 groups. Until last week, Facebook was the social media platform of choice for those who self-identify as Boogaloo Bois. Also last week, YouTube took enforcement action against a channel operated by Stefan Molyneux, whose message boards often tout white supremacist and male chauvinist theories. These actions, along with others such as YouTube’s removal of Red Ice TV and Black Pigeon Speaks, illustrate growing momentum for the removal of toxic (and often threatening) white supremacist content. Major Silicon Valley companies are turning the corner by upholding their own community and safety standards, but many questions remain unanswered, including what may be driving these new enforcement actions and what the broader consequences of the removals will be.
Observers could reasonably contend that Facebook in particular took action because of the torrent of negative publicity it suffered over its unwillingness to address President Trump’s provocative post in the wake of George Floyd’s death. In both a tweet and a Facebook post, the President used incendiary language, ‘when the looting starts, the shooting starts,’ that harkened back to a racist 1960s-era Miami police chief’s attempts to intimidate African American citizens. In response, Twitter added a public interest notice to the President’s tweet explaining that it glorified violence. Facebook eschewed similar quick action, and the public’s ire was profound. Additionally, 800 companies worldwide, including high-profile corporations like Unilever and Coca-Cola, decided to boycott advertising on Facebook or pull ads from the site because it had not done enough to stem the flow of hateful content and disinformation. The combination of a public relations disaster and the loss of its most important source of revenue, advertising dollars, may have accelerated Facebook’s decision to start labeling violative content posted by politicians and to implement a significant network disruption against the Boogaloo movement. The issue is not always as straightforward as portrayed in the media. Individuals within the Boogaloo movement transitioned slowly, and only recently turned to more violent measures. The first significant successful Boogaloo attacks were allegedly carried out in June 2020 in Oakland and Santa Cruz, California. Thus, despite being criticized for not acting quickly enough, Facebook removed Boogaloo-related material less than a month after the group’s first attack.
There are several potential consequences associated with social media takedowns of anti-government and white supremacist online content. First, Facebook did not remove every Boogaloo-related page, and followers of disabled pages can simply move to extant groups. Presumably, Facebook left those accounts and group pages up because they were not violating community standards. If more hardline Boogaloo Bois migrate to the remaining pages, those pages could see an increase in language that encourages violence. Second, Boogaloo Bois retain a presence on a number of other mainstream social media platforms. Since the Facebook disruption was not coordinated with other social media companies, Boogaloo adherents can simply expand their presence on alternative platforms like BitChute, Gab, and TikTok. Third, a significant number of hard-core Boogaloo adherents may well move to encrypted communications, especially in the wake of recently foiled Boogaloo plots in Arkansas and Nevada, which indicate that law enforcement is closely monitoring the movement.
Policy recommendations from The Soufan Center’s September 2019 report on transnational white supremacy extremism remain relevant today. While many of those recommendations, such as using sanctions to counter some white supremacist groups, which helps diminish their presence in the virtual world, have been adopted in piecemeal fashion, more can still be done. Ad hoc, social media-led network disruptions are a positive step, but it would be more effective if YouTube, Twitter, and Facebook coordinated network disruptions together, making it more difficult for bad actors to re-coalesce online. While coordination may be challenging, the Global Internet Forum to Counter Terrorism’s (GIFCT) Independent Advisory Committee may have the capability and influence to harmonize action among Silicon Valley companies. GIFCT, in fact, just announced a new director, Nicholas Rasmussen, on June 23. As the former Director of the U.S. National Counterterrorism Center (NCTC), Rasmussen brings experience in coordinating complex action that may make a concerted, unified organizational effort to counter anti-government and white supremacist propaganda online more feasible.
For tailored research and analysis, please contact: email@example.com