Slovak Mimicry of Online Content Moderation on Digital Platforms as a Result of the Adoption of the European Digital Services Act

ABSTRACT 

The global nature of digital platforms, particularly social media, highlights the absence of a unified legal framework regulating the content distributed to users. The issue concerns not only the quality of that content but often its problematic nature, which may conflict with the legal systems of various countries, especially the member states of the European Union. Examples include hate speech, terrorist content, discriminatory material, and images depicting child sexual abuse. Digital platforms frequently argue that they are not responsible for the nature of this content because they merely facilitate its publication rather than create it, and thus claim that they should not be held legally accountable. This article examines the research question of how the recently adopted European Digital Services Act (Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act), 2022) might change this paradigm through various legal tools. The Act aims to effectively regulate online intermediaries and platforms, including marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation services. Through a critical analysis of the provisions of the Digital Services Act, related legislation, court decisions, and the actual behaviour of digital platforms, the authors reassess the effectiveness of the various mechanisms intended for moderating content on these platforms. The primary objective is to determine how the Act shifts the legal boundaries of digital platforms’ responsibility for shared content, particularly with regard to newly defined obligations concerning user safety and new information requirements for digital platforms, such as reporting to European supervisory authorities. Special attention is given to the enhanced legal protection of minors using digital platforms, in particular the absolute prohibition on profiling them for online advertising stipulated in Article 28 of the Digital Services Act, a provision that complements the relevant rules set out in Article 22 of the GDPR. The general tightening of the conditions for presenting advertisements online is intended to curb the use of personalised advertising, which often relies on the (impermissible) profiling of ad recipients using special categories of personal data, such as racial or ethnic origin, sexual orientation, and biometric data. Finally, the authors discuss potential challenges in the practical implementation of the Digital Services Act in the individual member states of the European Union, taking into account the specifics of national legislation. To illustrate these challenges, the article analyses the legislative realities of Slovakia as a model example.

KEY WORDS 

Digital Platforms. Digital Services Act. Moderation of Online Content. Online Advertising. Social Networks.

DOI https://doi.org/10.34135/mlar-24-02-06