JUSTICE - No. 66
introduce three categories of speech: (1) fully protected speech; (2) partly protected speech, that is, speech that governments may curtail for legitimate reasons, subject to certain conditions, in response to compelling public interests or in order to protect the rights of others; and (3) outright impermissible speech. Antisemitic speech typically falls under the third category, as it would qualify as a form of "advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence," which Article 20 requires states to prohibit by law.5

The Responsibility of Online Platforms

When analyzing the application of international human rights law to online antisemitic speech and other forms of online hate speech, one needs to understand that such speech engages the legal and ethical responsibility of private actors. Not only are the individuals who post offensive content liable for their actions under the civil or criminal law of one or more states; the platforms disseminating such content might also incur legal and ethical responsibility. What is more, unlike the U.S. government, online platforms based in the U.S. are not bound by the language of the First Amendment; they can therefore impose more stringent limits upon online expression than the First Amendment provides, in accordance with their terms of use and community standards. In fact, Section 230 of the U.S. Communications Decency Act6 encourages, but does not compel, internet service providers to apply content moderation to objectionable content (an arrangement the law dubs "good Samaritan" blocking and screening). In other words, under U.S. law, the question of whether U.S.-based online platforms choose to engage in content moderation remains a question of company policy, which in turn is driven mostly by economic considerations and ethical sensibilities.
Significantly, online platforms could incur civil or criminal liability in other jurisdictions whose laws governing online content do not provide for a Section 230-like immunity arrangement. In fact, the recent trend in Europe has been to pass legislation that imposes liability on online platforms, including U.S.-based platforms, for failing to promptly remove harmful content after they have been alerted to its presence on their platforms. Since 2017, the German Network Enforcement Act (NetzDG) has required the removal of manifestly unlawful content within 24 hours of notification, and of other content prohibited by German criminal law within 7 days; non-compliance is subject to fines of up to 50 million euros.7 In 2020, France passed even more stringent legislation against hate speech (the Avia Law), imposing heavy fines on online platforms that fail to expeditiously remove unlawful content upon notification.8 (The law has been partly set aside, however, by the Constitutional Council, which considered it incompatible with freedom of expression principles.9) The EU Draft Digital Services Act likewise requires the expeditious removal of, or disabling of access to, illegal content as a condition of immunity from liability, while also requiring online platforms to observe freedom of expression.10 Public opinion pressure against the lax standards that online platforms have put in place against hate speech, and particularly antisemitic speech, is pushing these

Notes

4. (continued) "... protection of national security or of public order (ordre public), or of public health or morals"); ibid., Art. 20 ("1. Any propaganda for war shall be prohibited by law. 2. Any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence shall be prohibited by law").

5. It is noteworthy that upon ratification of the ICCPR, the U.S. entered a reservation to Article 20, so as to clarify that it does not accept any obligation to curb hate speech in a manner that conflicts with the Constitution and the laws of the U.S. See U.S. Reservations, Declarations, and Understandings, International Covenant on Civil and Political Rights, 138 Cong. Rec. S4781-01, Sec. I(1) (daily ed., April 2, 1992). The rejection of, or at least distancing from, the applicable international standard by the U.S. serves as a backdrop to the regulatory challenges of combating online hate speech involving U.S.-based technology companies.

6. 47 U.S.C. § 230.

7. Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken, Sept. 1, 2017. English version available at https://germanlawarchive.iuscomp.org/?p=1245

8. Loi visant à lutter contre les contenus haineux sur internet, May 13, 2020, available at https://www.assemblee-nationale.fr/dyn/15/textes/l15t0388_texte-adopte-seance#B2298414350

9. Conseil Constitutionnel, Décision n° 2020-801 DC, June 18, 2020, available at https://www.conseil-constitutionnel.fr/decision/2020/2020801DC.htm

10. Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM/2020/825 final.