JUSTICE - No. 65
generalizations, or behavioral statements (in written or visual form) — that include: ... Jewish people and rats [presumably meaning Jewish people as rats]; Jewish people running the world or controlling major institutions such as media networks, the economy or the government.25

Facebook also has a posting from April 2020 titled "An Update on Our Work to Keep People Informed and Limit Misinformation About COVID-19."26 The update says nothing about antisemitism in particular or hate speech in general. It does state: "To date, we've also removed hundreds of thousands of pieces of misinformation that could lead to imminent physical harm," suggesting that the test Facebook is using is misinformation that could lead to imminent physical harm and not, say, misinformation that could lead to eventual genocide.

Facebook does post updates to the Community Standards component of its Transparency Reports. The update of November 2020, in its Hate Speech subsection, repeats the caution against comparing Jews to rats but says nothing further on antisemitism. That component of the update also says nothing about COVID-19, as if the two subjects, COVID-19 and hate speech, were completely unrelated.27

One can see that the standard working definition of antisemitism, including its examples,28 is not used by internet providers in interpreting their terms of service. Holocaust denial is a form of antisemitism set out in the examples of the most widely accepted definition. With Holocaust denial, internet providers remove, restrict access to, or block content not because they consider it hate speech, but only because and where it is illegal. If there is a finding of illegality in a particular jurisdiction, internet providers will respect that finding for that jurisdiction.
Thus a Facebook posting states:

If, after careful legal review, we determine that the content is illegal under local law, then we make it unavailable in the relevant country or territory. For example, Holocaust denial is illegal in Germany, so if it is reported to us we will restrict this content for people in Germany.29

Using the law as a basis for blocking, removing, or banning a post is a convenient shortcut for internet providers, but it is not always appropriate, because laws may violate human rights as well as respect them. For instance, a country may have a law that nothing critical of the government can be posted on the internet, a plain violation of international freedom of speech standards. That sort of law is quite different from a law prohibiting hate speech on the internet, which conforms to international human rights standards. Internet providers need to distinguish between rights-respecting and rights-violating laws. Furthermore, while internet providers cannot avoid respecting local laws locally, they can defer to local rights-respecting laws globally, should they choose to do so.

25. Facebook, "Community Standards – Hate Speech," FACEBOOK, available at https://www.facebook.com/communitystandards/hate_speech/
26. Guy Rosen, "An Update on Our Work to Keep People Informed and Limit Misinformation About COVID-19," FACEBOOK (Apr. 16, 2020), available at https://about.fb.com/news/2020/04/covid-19-misinfo-update/
27. Facebook, "Hate Speech, Community Standards, Recent Updates," FACEBOOK (Nov. 2020), available at https://www.facebook.com/communitystandards/recentupdates/hate_speech/
28. Supra note 1.
29. Facebook Help Center, "What is a legal restriction on access to content on Facebook?" FACEBOOK, available at https://www.facebook.com/help/1601435423440616
30. "Google Transparency Report," GOOGLE (June 22, 2020), available at https://transparencyreport.google.com/youtube-policy/removals?hl=en
In the latest Google transparency report, there is even less detail.30 Google reports the percentage of videos removed from YouTube because their content was hateful or abusive. There is no breakdown between the hateful and the abusive, and nothing within the hateful category indicates what it includes.

The European Commission's code of conduct and trusted flagger system are positive, but they suffer from an over-generalized code of conduct and insufficient disclosure to the public. Other countries should adopt the positive and avoid the negative. With modifications, all countries should adopt the European Commission system. There needs to be greater elaboration of the code of conduct to include specific reference to the generally adopted working definition of antisemitism, with its examples of Holocaust denial and anti-Zionism.