INTELBRIEF

September 14, 2020

IntelBrief: Disinformation in Europe: Progress and Remaining Challenges Ahead

European Commissioner for Inter-institutional Relations and Foresight Maros Sefcovic answers a question during a news conference at the European Commission headquarters in Brussels, Sept. 9, 2020. (Olivier Hoslet, Pool Photo via AP)

Bottom Line Up Front

  • On September 10, 2020, the European Commission released two reports related to social media companies’ efforts to battle disinformation in Europe.
  • The reports painted an uneven picture of progress as tech companies adopted procedures to adhere to the EU’s 2018 Code of Practice on Disinformation.
  • Recent events in the United States highlight social media companies’ inconsistent implementation of measures to fight the spread of disinformation.
  • Europe remains much further along than the United States in encouraging practical action by social media companies against the spread of disinformation.

On September 10, 2020, the European Commission (EC) released two reports related to the technology sector’s effectiveness in battling the proliferation of disinformation. Both reports contribute to ongoing analyses of tech giants’ implementation of the EC’s 2018 Code of Practice on Disinformation. The first, EC Staff Working Document 180 (SWD 180), provides a general assessment of the Code of Practice’s implementation. The second specifically examines the tech sector’s efforts to fight disinformation during COVID-19. In evaluating implementation of the Code, the analysis centers on Silicon Valley’s progress, or lack thereof, across five critical pillars: first, reducing the spread of false information via online advertising; second, enhancing the transparency of political advertising and the labeling of political messages laden with falsehoods; third, disclosing information about malicious actors who adopt tactics to amplify disinformation; fourth, establishing mechanisms that offer users alternatives to information resources known to be false; and fifth, engaging with third-party fact-checkers and independent researchers to determine the veracity of content and to improve user media literacy.

In determining progress in these five core focus areas, SWD 180 paints an uneven picture, noting on the one hand that the Code prompted ‘concrete actions and policy changes by relevant stakeholders aimed at countering disinformation.’ On the other hand, the report details additional steps that need to be taken to achieve further progress. For example, SWD 180 noted that a lack of common operating principles and procedures, transparent metrics to determine impact, and common definitions, as well as a failure to work proactively with the advertising sector, has resulted in inconsistent adoption of key Code precepts. The EC’s report also bemoaned the lack of access to key data, which makes a comprehensive evaluation of the Code’s impact nearly impossible.

Despite these structural challenges, the EC’s September 10 evaluation of the technology sector’s efforts to combat COVID-19-related disinformation is positive. Specifically, the EC said that major social media platforms had made improvements by increasing the visibility of information provided by the World Health Organization and other governmental health organizations that do not peddle false information. The EC also detailed proactive efforts to remove COVID-19-related mis- and disinformation, which was particularly damaging during the early stages of the crisis. Google’s efforts to ensure that searches on COVID-19 returned results from reputable sources fact-checked by EU-approved organizations were highlighted as a success. Also of note was the EC’s enthusiasm for Facebook and Instagram’s creation of COVID-19 information centers that directed ‘2 billion people globally’ to credible health authorities delivering unbiased guidance regarding the novel coronavirus.

The EC’s systematic evaluation of the disinformation space through an examination of social media companies’ interaction with the 2018 Code is commendable. In contrast, the United States severely lags in evaluating the performance of U.S.-based companies. While the U.S. can benefit (and has benefited) indirectly from the EU’s more rigorous approach, Europe’s leverage over American-based entities remains limited. Although Congress and the Executive branch would do well to require substantive analysis of the impact of disinformation within U.S. communities, as the EU has done, that is unlikely to occur when key political leaders wield ‘fake news’ as a cudgel against political enemies. Thus, it will remain up to corporations to take ownership of these challenges by implementing their own community, trust, and safety guidelines. Last week’s decisions by Twitter and Facebook to semi-synchronize actions to remove the U.S.-based extremist group known as the Oathkeepers from their platforms for inciting violence and spreading COVID-19 disinformation exemplify Silicon Valley’s ability to act, albeit often without alacrity. Another recent challenge facing U.S.-based social media companies is the spread of misinformation about the wildfires in the Pacific Northwest. Across multiple platforms, false claims that Antifa and Black Lives Matter protesters were intentionally setting fires prompted armed vigilantes to take to the streets to combat arsonists who never existed. Although third-party fact-checkers identified the wildfire-related misinformation, Facebook and Twitter failed to remove the false content before it spread to hundreds of thousands of users. Adopting a consolidated Code of Practice on disinformation like Europe’s may not provide a silver-bullet solution to false information, but it could help slow its spread. Social media companies may also want to consider expanding local fact-checking mechanisms; in cases like the West Coast wildfires, local experts are best suited to advise on ground truth. After all, the EC’s Codes on Disinformation and European-based fact-checkers will not be able to resolve highly localized, U.S. disinformation-generated discord.
