JUSTICE - No. 66

The bottom line is that in the current normative and institutional environment, online platforms determine their own hate speech standards – independently of international human rights standards – and, given the dominance of online speech in contemporary society, the platforms thereby determine, through the application of their policies, the general societal speech ecosystem. Moreover, in a world governed by section 230 and other de facto immunity arrangements, such choices entail only limited accountability and less than full transparency,26 even in situations where the platforms fail to effectively enforce their own standards. The big question confronting policy makers at the national and international levels is whether the norms governing the global marketplace of ideas and the most prominent venues for exercising freedom of expression should continue to be set and enforced almost exclusively by private for-profit companies. The desirable alternative would be more inclusive standard-setting and law-application processes that are friendly to international human rights law and involve the greater public and its democratic representatives.

At a more systemic level, it is important to acknowledge that the exceptional risks associated with online antisemitic speech and other forms of online hate speech – their scale, scope and speed – are the result of policy choices: the use of algorithms that assign virality to contents on the basis of certain traffic features, the drive for personalization of contents, the use of filter bubbles, and the mixing of news from trusted sources with less trustworthy user-generated contents.27 These "architectural features" of the online platforms are part of their business model; they are not intrinsic features of their users' freedom of expression.
Addressing these background conditions does not require a choice between freedom of expression and censorship (unless one regards these features as forms of expression resorted to by the platforms), but rather a decision to render the practices of the platforms compatible with international human rights standards.

The Way Forward

So how does one proceed from here to address more effectively online antisemitism and other forms of online hate speech? One possible avenue, delineated in a study project I directed for the Israel Democracy Institute/Yad Vashem on needed reforms in the major platforms' online hate speech policies, includes sixteen policy recommendations for the platforms to consider.28 The most significant recommendations are that: (a) social media companies are legally and ethically responsible for harm caused by hate speech on their platforms; (b) the definition of what constitutes hate speech must be revised in accordance with international human rights law standards, giving prominence to the harm principle, including associated patterns of tension, discrimination and violence, and the harm caused by revictimization of hate crime victims due, inter alia, to the denial of such crimes; (c) regulating hate speech should not be a binary matter but should include nuanced responses (such as limits on virality and counter-messaging) and pre-emptive interventions (such as reminding users of community rules when offensive phrases are used); (d) complaint procedures must be improved and rendered more accessible and independent; and (e) companies should engage in broader consultation with stakeholders and be more transparent about the application of their hate speech policies. These recommendations are not a panacea for the challenge of defining and combating online antisemitism and other forms of online hate speech, but they could mark a step forward toward more effectively combating offensive speech in accordance with international human rights law norms.
Moreover, they could help states and non-state stakeholders to formulate more clearly their expectations from the online platforms and the policies they follow.

Yuval Shany is Hersch Lauterpacht Chair in Public International Law, Hebrew University of Jerusalem, and Vice-President for Research, Israel Democracy Institute. Between 2013 and 2020, he was a member of the UN Human Rights Committee and served as Chair of the Committee between 2018 and 2019.

26. Spandana Singh and Leila Doty, "The Transparency Report Tracking Tool: How Internet Platforms Are Reporting on the Enforcement of Their Content Rules," NEW AMERICA (April 8, 2021), available at https://www.newamerica.org/oti/reports/transparency-report-tracking-tool/

27. I am grateful to Dr. Tomer Shadmy for these points.

28. IDI–Yad Vashem Recommendations for Reducing Online Hate Speech: A Proposed Basis for Policy Guidelines for Social Media Companies and other Internet Intermediaries (2019), available at https://www.idi.org.il/media/13570/recommendations-for-reducing-online-hate-speech.pdf
