Suppressing Atrocity Speech on Social Media

  • Emma Irving

Extract

In its August 2018 report on violence against Rohingya and other minorities in Myanmar, the Fact Finding Mission of the Office of the High Commissioner for Human Rights noted that “the role of social media [was] significant” in fueling the atrocities. Over the course of more than four hundred pages, the report documented how Facebook was used to spread misinformation, hate speech, and incitement to violence in the lead-up to and during the violence in Myanmar. Concluding that there were reasonable grounds to believe that genocide was perpetrated against the Rohingya, the report indicated that “the Mission has no doubt that the prevalence of hate speech,” both offline and online, “contributed to increased tension and a climate in which individuals and groups may become more receptive to incitement.” The experience in Myanmar demonstrates the increasing role that social media plays in the commission of atrocities, prompting suggestions that social media companies should operate according to a human rights framework.

Copyright

This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.

References

1 Human Rights Council, Report of the Independent International Fact-Finding Mission on Myanmar, UN Doc. A/HRC/39/64, para. 74 (Aug. 24, 2018) [hereinafter FFM Report, Abbreviated Version].

2 Human Rights Council, Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar, UN Doc. A/HRC/39/CRP.2 (Sept. 17, 2018) [hereinafter FFM Report, Detailed Version].

3 Id. at para. 1354.

4 For more on these challenges, see Richard Wilson & Matthew Gillett, The Hartford Guidelines on Speech Crimes in International Criminal Law 123–29 (2018).

5 Facebook Fined £500,000 for Cambridge Analytica Scandal, BBC News (Oct. 25, 2018); James Titcomb et al., Facebook Security Breach Exposed 50 Million Accounts to Attackers, Telegraph (Sept. 28, 2018).

7 See, e.g., Max Fisher & Amanda Taub, How Everyday Social Media Users Become Real-World Extremists, N.Y. Times (Apr. 25, 2018).

8 Craig Timberg & Tony Romm, Facebook CEO Mark Zuckerberg to Capitol Hill: “It Was My Mistake, and I'm Sorry”, Wash. Post (Apr. 9, 2018).

9 UN Office of the High Commissioner for Human Rights, Guiding Principles on Business and Human Rights: Implementing the United Nations “Protect, Respect and Remedy” Framework, UN Doc. HR/PUB/11/04 (Mar. 21, 2011) [hereinafter UN Guiding Principles].

10 Id. Principle 13.

11 Id. Principle 17.

12 Human Rights Council, Report of the Special Representative of the Secretary-General on the Issue of Human Rights and Transnational Corporations and Other Business Enterprises, John Ruggie, UN Doc. A/HRC/14/27, para. 55 (Apr. 9, 2010) [hereinafter Report of the Special Rapporteur].

16 Jurisdiction over corporations is provided for in Article 46C of the Malabo Protocol; for further information on incitement under international criminal law, see Richard Wilson, Incitement on Trial: Prosecuting International Speech Crimes (2017).

18 Kai Ambos, International Economic Criminal Law, 29 Crim. L.F. 499, 506–10 (2018).

19 Malabo Protocol, supra note 14, art. 46C.

20 The Malabo Protocol requires fifteen ratifications to enter into force. As of July 2019, fifteen states have signed it but none has ratified it.

21 See Malabo Protocol, supra note 14, arts. 46E, 46E bis, and 46F.

22 European Commission Press Release IP/16/1937, European Commission and IT Companies Announce Code of Conduct on Illegal Online Hate Speech (May 31, 2016). More companies signed up in 2018. See European Commission, Countering Illegal Hate Speech Online #NoPlace4Hate (Oct. 18, 2018).

23 European Commission, Tackling Illegal Content Online: Towards an Enhanced Responsibility of Online Platforms, COM(2017) 555 final (Sept. 28, 2017). With respect to terrorist online content specifically, which may constitute hate speech and incitement but which is a much broader category of content, the European Union has initiated steps to impose binding measures on service providers to remove content. Council of the European Union Press Release, Terrorist Content Online: Council Adopts Negotiating Position on New Rules to Prevent Dissemination (Dec. 6, 2018).

24 Section 1(3) of the Network Enforcement Act (Netzwerkdurchsetzungsgesetz [NetzDG]) defines unlawful content in relation to provisions of the German Criminal Code (Strafgesetzbuch [StGB]), including §§ 91a, 111, and 130, which relate to hate speech and incitement to violence.

25 Network Enforcement Act, supra note 24, § 3.

26 Id. § 4.

28 FFM Report, Abbreviated Version, supra note 1, para. 74.

29 FFM Report, Detailed Version, supra note 2, para. 1718.

31 Article 19, Self-Regulation and “Hate Speech” on Social Media Platforms (2018) [hereinafter Article 19 report]; Global Civil Society Initiative, Manila Principles on Intermediary Liability (May 30, 2015).

32 Report of the Special Rapporteur, supra note 12, para. 17. On the problem of overbroad removals, see Avi Asher-Schapiro, YouTube and Facebook Are Removing Evidence of Atrocities, Jeopardizing Cases Against War Criminals, The Intercept (Nov. 2, 2017).

33 Bernhard Rohleder, Germany Set out to Delete Hate Speech Online. Instead, It Made Things Worse, Wash. Post (Feb. 20, 2018).

34 Report of the Special Rapporteur, supra note 12, paras. 15–24.

35 Described in detail in Article 19 report, supra note 31.

36 FFM Report, Detailed Version, supra note 2, para. 1724.

37 Id.
