JUSTICE - No. 71

standardized procedures for reporting illegal content, the same access to complaints and redress mechanisms across the single market, the same standard of transparency for content moderation and advertising systems, and the same supervised risk mitigation strategy where very large online platforms are concerned.

The DSA will be directly applicable across the EU from February 17, 2024, fifteen months after its entry into force. By this date, Member States are required to designate competent national authorities, referred to as Digital Services Coordinators (DSCs). These authorities will be responsible for ensuring that the services established on their territory comply with the new rules, for enforcing the rules applicable to smaller and very large online platforms, and for participating in the EU cooperation mechanism of the DSA. Each Member State will appoint its DSC as an independent authority, with strong powers to perform its tasks impartially and transparently. The DSCs will also serve as important regulatory hubs, ensuring coherence and digital competence across the Union. They will cooperate within an independent advisory group, the European Board for Digital Services, which can support analysis, reports and recommendations, as well as coordinate the new tool of joint investigations by DSCs.

Large Platforms and Search Engines

The DSA also includes effective safeguards for users, including the possibility to challenge platforms' content moderation decisions on the basis of the information that platforms are now obliged to provide to users when their content is removed or restricted. It places obligations on very large online platforms and search engines, those reaching at least 45 million users, to prevent abuse of their systems by taking risk-based action, including oversight through independent audits of their risk management measures.
Platforms must mitigate risks of disinformation, election manipulation, cyber violence against women, and harm to minors online. Following their designation by the Commission, very large online platforms and search engines will have to perform an annual risk assessment and take corresponding risk mitigation measures based on the design and use of their service. Such measures will need to be carefully balanced against restrictions on freedom of expression and will be subject to independent audits.

The DSA also fosters a co-regulatory framework for online harms, under which service providers can work under codes of conduct, such as a revised Code of Practice on disinformation, and crisis protocols, to address the negative impacts of the viral spread of illegal content as well as manipulative and abusive activities.

Under the DSA, so-called "dark patterns," which are intended to deceive users into taking actions they might otherwise not take, are prohibited. Providers of online platforms are now required to organize and operate their online interfaces in a way that does not deceive, manipulate, or otherwise materially distort or impair users' ability to make free and informed decisions.

Supervision of very large online platforms and search engines will fall to the European Commission (EC), the sole entity authorized to supervise and enforce the DSA obligations that apply to these providers. In addition, the EC, together with the DSCs, will be responsible for supervision and enforcement of any other systemic issue concerning very large online platforms and very large online search engines.
The new enforcement mechanism, consisting of national and EU-level cooperation, will supervise how online intermediaries adapt their systems to the new requirements. Each Member State will need to appoint a DSC, which will be responsible for supervising the intermediary services established in that Member State and/or for coordinating with specialist sectoral authorities. Each Member State will specify in its national law the penalties, including financial fines, for any infringement, ensuring that they are proportionate to the nature and gravity of the infringement, yet significant enough to ensure compliance. With respect to very large online platforms and very large online search engines, the Commission has direct supervision and enforcement powers and can, in the most serious cases, impose fines of up to 6% of a service provider's global turnover.

Conclusion

The DSA is ambitious in its scope and was published after years of wide-ranging consultation. It should meet many of the concerns European Jews have regarding the increased promotion of antisemitism online. It places new and necessary responsibilities on the platforms, responsibilities they had vigorously resisted. Still, it will take years until states can recruit and train enough staff to understand and apply the regulations. In addition, it still fails to deal with content that is harmful but falls below the criminal threshold, which in its scale and impact