This assignment of state tasks to private parties entails the risk of legally incorrect assessments of the reported contributions. In addition, the threat of a fine under the NetzDG creates unintended tension. Operators of social networks are meant to be pressed to delete and report the posts in question quickly, and in principle this can be achieved. In view of the threat of sanctions, however, there is a danger that operators will report non-illegal posts as illegal in order to minimize the risk of fines. This impairs the efficiency of law enforcement and impinges on freedom of expression on the internet. In principle, the operators can behave in either of two ways:

a. Lacking legal expertise, they can classify almost every user-reported post as illegal and consequently report it to the law enforcement authorities under the NetzDG 2021. This would very likely overload government agencies with the sheer number of reported cases.34 An example of such rash classification of legitimate posts as hate speech is the deletion of posts and the blocking of Jörg Rupp's account. The left-wing politician was active in refugee aid and in the fight against right-wing extremism. On January 5, 2018, he posted a song lyric on Twitter. In connection with the other tweets linked by Rupp, as well as his political orientation, it was clearly recognizable that the tweet was satire. Nevertheless, Twitter deleted the post and blocked the account, and only after several interventions was the account restored.35 This example shows that the operators' assessments of posts are not free of errors, and that posts are probably screened for illegal content with the help of software programs. In this way, the operators attempt to implement the requirements of the new NetzDG cost-effectively, but this may come at the expense of lawful contributions, which can be blocked for no reason.

b. Alternatively, the social network can classify user-reported posts as "punishable" only very cautiously and therefore not forward them to the authorities. This exposes the operator to the risk of fines for insufficiently implementing the requirements of the new NetzDG. The motive here is the operators' fear that networks which, as in the case described above, act too vigorously against innocuous posts may become unattractive to users.

The results of both ways (a and b) are unsatisfactory, as each demands a utopian, perfect legal assessment of the posts by the operators of the social networks. The operators would have to strike the exact balance between over-blocking and acting with restraint. Yet they are neither competent to do this nor obliged to do so in terms of prosecution theory, since the criminal assessment of conduct is one of the core tasks of the state.

V. Outlook and Alternatives

It is currently unclear when the revised NetzDG will be discussed by the appointed Mediation Committee and when it can then be promulgated by the Federal President. Even once the new NetzDG (2021) has, after this revision, become constitutional, some questions and points of concern remain. The assignment of control tasks to private companies through the current NetzDG effectively suggests that the state is overwhelmed by the situation.
It signals an "attitude of helplessness," with the state shying away from offering its own solution to the problem. In this context, a state-run initiative would not only be desirable but also legally far more secure. The task of interpreting and applying German law should not be entrusted to predominantly American corporations. An alternative idea is a state-run reporting platform on which users can report relevant contributions across all networks. Reports could be carefully reviewed by competent staff, who would then contact the operator of the relevant social network if they concluded that a criminal offense had been committed. The illegal post would be deleted by the operator, while the author's data stored by the network would simultaneously be passed on to the law enforcement authorities for prosecution. Such a solution could also capture reports concerning smaller networks. Ideas for such systems already exist.36 However, they are run by private associations and operate on a volunteer basis. By adopting this existing infrastructure, the state could build up a corresponding platform far more quickly, relying on processes that are already in place.

34. Supra note 22.
35. Markus Reuter, "Moderation nach Gutsherrenart: Wie Twitter Accounts ohne Einordnung des Kontexts sperrt," Netzpolitik, Jan. 1, 2018, available at https://netzpolitik.org/2018/moderation-nach-gutsherrenart-wie-twitter-accounts-ohne-einordnung-des-kontexts-sperrt/.
36. Cf. Hassmelden website, available at https://hassmelden.de.