that limit the dissemination of hate speech and other illegal content, for many years the platforms embraced, perhaps under the influence of First Amendment ideology, a minimalist attitude toward speech regulation. A clear example of this attitude was the platforms' long-standing rejection of the role of arbiters of truth,18 even when confronted with blatant cases of malicious and patently false speech such as Holocaust denial.

The more the platforms move toward adopting robust content moderation policies to meet stakeholders' expectations for a less toxic product, the more they assume the role of "speech police."19 Since they lack democratic credentials, the more extensive the restrictions the platforms impose on online speech, the more their speech policies suffer from a legitimacy problem in the eyes of affected constituencies. Reliance on international human rights law may help lend a modicum of legitimacy to platforms' hate speech policies.

The establishment of the Facebook Oversight Board20 represents another effort by one of the leading online platforms to obtain greater public legitimacy for regulating online speech. By introducing review of borderline decisions by a diverse group of independent experts, Facebook may have been aiming to legitimize its hate speech policies and to generate some deference toward platform decisions from national regulators and courts.21 Initial decisions made by the Board in 2021 are promising, both in the degree to which they rely on international human rights law to resolve the cases before them and in the ambitious policy recommendations they make, which push for greater clarity of the applicable rules and more transparency in the application of policies.22

The Board, however, was initially authorized only to review takedown decisions; hence, its initial interventions were all in the direction of extending the scope of freedom of expression available online. Such decisions did not suitably address the accusation that online platforms are not doing enough to fight online hate speech, since, if anything, they tilted the balance toward giving offensive speech the benefit of the doubt. It remains to be seen whether the Oversight Board's new power to also review refusals to take down offensive content23 may change the perception that platforms are excessively lenient toward hate speech and, in particular, antisemitic speech.

In any event, the limits of the Oversight Board model as a tool for combatting online hate speech and applying international human rights standards online need to be acknowledged. First, the Board reviews only what one might describe as a drop in the ocean of content moderation decisions: it oversees decisions by two platforms only (Facebook and Instagram), and since it focuses on borderline cases, it considers only a handful of cases out of millions of daily postings, which include many thousands of controversial postings. Second, the Oversight Board reviews a single category of content moderation decisions: whether or not to remove offensive content. Sophisticated content moderation policies, however, can include other non-binary alternatives, such as slowing down content virality, drawing users' attention in real time to community standards, and introducing counter-messaging to offset offensive messages.
Third, the Board does not provide strong "due process" guarantees for the adequacy of platform procedures for challenging content moderation decisions and for the internal decision-making process following such a challenge. Hence, it largely neglects

18. Tom McCarthy, "Zuckerberg says Facebook won't be 'arbiters of truth' after Trump threat," The Guardian, May 28, 2020, available at https://www.theguardian.com/technology/2020/may/28/zuckerberg-facebook-police-online-speech-trump

19. David Kaye, Speech Police: The Global Struggle to Govern the Internet (Columbia Global Reports, 2019).

20. Brent Harris, "Establishing Structure and Governance for an Independent Oversight Board," Facebook (Sept. 17, 2019), available at https://about.fb.com/news/2019/09/oversight-board-structure/

21. See Tomer Shadmy and Yuval Shany, "Protection Gaps in Public Law Governing Cyberspace: Israel's High Court's Decision on Government-Initiated Takedown Requests," Lawfare, April 23, 2021, available at https://www.lawfareblog.com/protection-gaps-public-law-governing-cyberspace-israels-high-courts-decision-government-initiated

22. Case Decision 2020-003-FB-UA, Jan. 28, 2021 (Armenia/Azerbaijan Hate Speech), available at https://oversightboard.com/decision/FB-QBJDASCV/; Case Decision 2020-004-FB-UA, Jan. 28, 2021 (Brazil Adult Nudity), available at https://oversightboard.com/decision/IG-7THR3SI1/

23. Oversight Board, "The Oversight Board is Accepting User Appeals to Remove Content from Facebook and Instagram," Oversight Board (April 2021), available at https://oversightboard.com/news/267806285017646-the-oversight-board-is-accepting-user-appeals-to-remove-content-from-facebook-and-instagram/