Simona Veleva*: Digital Services Act - Anticipating Challenges in Regulatory Implementation (ELTE Law Journal, 2024/2, 143-159)

https://doi.org/10.54148/ELTELJ.2024.2.143

Abstract

This article examines the legal aspects and challenges anticipated during the implementation of the Digital Services Act (DSA) into national regulatory frameworks. As the DSA represents a ground-breaking legislative initiative aimed at governing digital services within the European Union, the article explores practical matters concerning its implementation, the choices made by the Member States regarding the Digital Services Coordinator (DSC), and the potential hurdles faced by national regulatory authorities (NRAs) in the implementation process, taking into account the right to freedom of expression, access to information and the principle of liability exemption for intermediaries.

The research explores the legal implementation of the DSA across EU Member States, with a focus on the harmonisation of the act and the challenges posed by differing legal traditions and regulatory approaches, emphasising the importance of the European Commission in the regulation of very large online platforms (VLOPs) and very large online search engines (VLOSEs), current practices and likely future tendencies in this regard. In addition, the article examines some of the complex, specific tasks related to the enforcement of the DSA's provisions. The NRAs face particular technical and logistical challenges regarding their new monitoring competencies, which the article also addresses.

Keywords: Digital Services Act, regulation of intermediaries, freedom of expression, Digital Services Coordinator, very large online platforms (VLOPs), very large online search engines (VLOSEs)

I. Introduction

The rapid development of the digital market and information society services, especially intermediary services, has revolutionised how individuals communicate, access information, and conduct business. These trends led the European Commission to prepare a whole new digital services package designed to address the resulting challenges, making the European Union's (EU) market compatible with other global markets and, at the same time, harmonising the rules across all Member States. With the primary intention of protecting the fundamental rights of users, the Digital Services Act (DSA)[1] has applied since 25 August 2023 to designated platforms with more than 45 million users in the European Union (a threshold set at roughly 10% of the EU population[2]). As of 17 February 2024,[3] the DSA has applied to all platforms and intermediary services. Therefore, all Member States need to make certain amendments to their own legislation in order to meet these new obligations and to designate and prepare compatible and adequate national regulatory authorities (NRAs) which can apply the respective amendments.[4] The DSA has the ambitious goal of establishing clear rules and responsibilities for online platforms and service providers while promoting a safe, transparent and accountable digital environment. At the same time, however, the DSA changes the regulatory landscape established by traditional legal paradigms. This task is not easy, given the vast area of regulation the DSA covers and its relationship to several other legislative acts, some of them recently adopted, which will also need to be properly applied, such as the Copyright Directive,[5] the Audiovisual Media Services Directive (AVMSD),[6] the General Data Protection Regulation,[7] the upcoming European Media Freedom Act (EMFA)[8] and the recently adopted Artificial Intelligence Act (AI Act).[9] They all interact with this new act and impose new challenges concerning their proper application by national authorities.

The current article examines these new challenges, addressing both the legal and the practical problems that national regulators will face. Applying the DSA will raise particular legal problems in terms of proper and timely enforcement. Such problems will be both procedural (for example, the need for judicial review before administrative acts finally enter into force, which might require significant time) and substantive (for example, the need for changes in horizontal legislation to grant administrative bodies new powers). Although traditional freedom of expression constructions remain intact, implementing the DSA will require developing existing standards to cover the different forms of expression in the online environment. The implementation of the DSA will only be effective if the big platforms are cooperative and aim to comply with it properly, and if Member States take serious legal and administrative measures to address the challenges and ensure that NRAs have enough tools to provide effective, efficient and timely regulation. Further, the DSA and the problems related to it illustrate how freedom of expression has developed over the past decades as a result of technological development; traditional freedom of expression standards have evolved alongside the changes in how people communicate and participate in public discourse.

II. Scope of Regulation of Internet Intermediaries

The primary legal framework establishing and guaranteeing freedom of expression at the international level within Europe is imposed through different international treaties, mainly Article 19 of the Universal Declaration of Human Rights (UDHR), Article 11 of the EU Charter of Fundamental Rights and Article 10 of the European Convention for the Protection of Human Rights and Fundamental Freedoms (ECHR), which stipulate a clear path for Member States and a traditional relationship between the state and private individuals. In the practice of the judiciary, mainly the European Court of Human Rights (ECtHR), the regulation of freedom of expression is well described and developed and is centred around the 'three-part test' which state bodies (mainly NRAs and afterwards the courts) apply to define illegal content: namely, whether a measure of public power (i) pursues a legitimate aim, (ii) is prescribed by law, and (iii) imposes only such restrictions and sanctions as are necessary in a democratic society.[10] This traditional construction, well defined in the case law of the different jurisdictions, is applicable to the online sphere as well,[11] especially in recent years, as NRAs have been given greater responsibilities in relation to the online sphere. Back in 2018, the Committee of Ministers of the Council of Europe adopted a recommendation on the roles and responsibilities of internet intermediaries.[12] At the EU level, the Audiovisual Media Services Directive expanded the scope of regulation of online content, especially where media service providers provide non-linear media services. Only a few years after its amendment in 2018, it became clear that both markets and users' behaviour had changed and that a new legal approach was necessary. This is also evident in the approach towards the stricter regulation of influencers, for example. As of 2024, eleven NRAs in Europe define influencers as non-linear media service providers[13] and regulate them using different approaches. Most of them consider it necessary to treat services (channels) that exceed certain thresholds as mass media services. The Netherlands, for example, specifies a minimum of 100,000 followers or subscribers on one individual platform.[14] For Spain, the threshold is 1,000,000 followers, a figure calibrated to the population of the country itself (around 47 million people), and a new decree adopted in May 2024 defines such so-called high-profile influencers.[15] It is clear that more Member States will follow suit in the coming years.[16] Therefore, the new EMFA will also play a crucial role, as it will be interrelated with all the other pieces of legislation and, in many Member States, applied by the same NRAs.

All this forms an extra layer of regulation, defined in the doctrine as secondary regulation, which, unlike traditional primary regulation (the regulation of content), stipulates content moderation rules. This changes not only the way the market is constructed but also how the entire social and political debate occurs. Slowly but steadily, it also changes the traditional construction of the exercise of power through which the state can interfere with individuals' private sphere to protect different legitimate values, also protected by law. This is because more and more responsibilities are imposed on private parties, who will need to take active measures to restrict content based on the applicable legal acts. This is, however, a problem for the NRAs. They face more challenges because the scope of regulation is inevitably expanding. While traditional monitoring is based on a decision or a signal and covers radio and television, along with the most popular non-linear media services, covering influencers will require a much broader scope and will most probably also involve the use of AI in monitoring systems. However, AI tools are not equally developed in the different EU languages, creating new problems related to the equal application of such tools and their effectiveness.

The third layer of regulation under EU law is defined as tertiary regulation, which covers the regulation of regulators.[17] The regulation of NRAs is essential to prevent the misuse or misinterpretation of all these provisions, especially in light of the general provision of the Directive on Electronic Commerce[18] (e-Commerce Directive) that prohibits Member States from imposing on intermediaries a general obligation to 'monitor the information which they transmit or store' or a 'general obligation actively to seek facts or circumstances indicating illegal activity' (Article 15).[19]

These last two layers of regulation are developed and implemented in the Digital Services Act as well, which has the potential to truly reshape the media environment and especially the responsibility of the so-called Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) that dominate large portions of the digital market and also have a huge impact on the way human rights are exercised, especially freedom of expression. The arrival of the DSA marks a new moment in addressing the challenges posed by the digital sphere.[20] Amending the e-Commerce Directive without, however, revoking the liability regime for online intermediaries, it is the product of efforts to ensure the safety, integrity and fairness of platforms and services, safeguard users' rights and foster a more transparent and responsible online community. The DSA is primarily focused on setting out new responsibilities for intermediaries and introducing new approaches to illegal content. At the same time, the DSA tries to uphold the balance between innovation, freedom of expression and the protection of individuals from harmful online content and conduct. Its significance lies in upholding the principles of human rights in the digital era and in its potential to revolutionise the digital landscape entirely. So far, no legal act has imposed a blanket rule for intermediaries to remove all illegal content, no matter what its nature. In parallel, the DSA has implications for both online governance and societal well-being. This underscores the importance of having such regulation during a period of technological advancement and digital transformation. However, its proper application by Member State authorities and the huge scope of regulation that needs to be covered also amplify differences in the approaches chosen and raise challenges which need to be addressed. Outlining these problems will enhance their understanding and help ensure the proper application of the new legal provisions.

III. Defining and Restricting Illegal Content

Probably the main and hardest goal of the DSA and the entire digital services package is creating a safer, more transparent digital environment in which malicious players do not have the tools and the opportunity to undermine democratic values or, in some cases, even to change and manipulate the public debate. At the same time, the DSA seeks to limit harmful and illegal content in order to protect various legitimate aims.[21] This is possible only if certain responsibilities are imposed not only on the intermediaries but also on the NRAs in the Member States, which actually have to monitor and sanction them. However, defining and actually managing the restriction of illegal content is a difficult task, given the challenges of defining, interpreting and monitoring content while, at the same time, preserving the liability exemption already mentioned [specifically noted in Article 3(4)(a) DSA]. The exemption is further defined and confirmed in Article 4 DSA, which prescribes that service providers are not liable for transmitted information when they did not initiate the transmission, did not select its recipient, and did not select or modify the information it contains. At the same time, the main challenge will be actually to identify the illegal content.[22]

Some speech or content might not be illegal in some Member States but might be in others, and the proper navigation between these interpretations, and the impact any removal might have, is significant. This will be one of the hardest struggles, both for intermediaries and, mainly, for the regulators that have to monitor them. Implementing ambiguous provisions and applying them accordingly will be one of the main challenges for the NRAs, especially in terms of Article 17. The DSA defines the concept of 'illegal content' quite broadly.[23] Recital 12 stipulates that illegal content is content that 'under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that the applicable rules render illegal in view of the fact that it relates to illegal activities'. The Recital further gives illustrative examples of illegal content (the sharing of images of child abuse or the unauthorised use of copyright-protected material), but at the same time, the proper interpretation of what can be defined as 'illegal hate speech' or 'unlawful discriminatory content' might differ widely from country to country, given the local context, the specifics of the speech and cultural diversity.

Article 9(1) DSA obliges providers of intermediary services to remove or otherwise act against specific items of illegal content upon receiving orders issued by the relevant national judicial or administrative authorities. Paragraph (2) stipulates some restrictions, noting that the order needs to be prescribed by law and accompanied by specific reasoning explaining why the content is illegal, but also, most importantly, that the 'territorial scope of the order ... is limited to what is strictly necessary to achieve its objective'. Still, this strict limitation is open to wide interpretation and raises the question of what criteria are to be used. If content is illegal in the jurisdiction of the issuing regulator or court, this does not mean it will be illegal in other Member States where the service or the information is accessible. At the same time, regarding graphic or video content, especially copyrighted material, such an order should be applicable in all Member States. These issues need to be properly addressed, and the answers will depend on the specific content and the grounds for its unlawfulness. While a national regulator can decide on the legality of content only on the basis of its own national law or its interpretation of EU law, other national regulators or jurisdictions might not consider the same content illegal. This factor was considered in a dedicated report by the European Parliament.[24] The Parliament 'highlights that in order to protect freedom of speech [...] hosting service providers should not be required to remove or disable access to information that is legal in the Member State that they are established in, or where their designated legal representative resides or is established'. This is the reason why, in most cases, geo-blocking will only occur in the country in which the respective authority issued the order, although the possibility of wider application remains open. The European Commission explained that 'where a content is illegal only in a given Member State, as a general rule it should only be removed in the territory where it is illegal'.[25] However, disputes over interpretation might arise, including regarding hate speech and its proper definition, discrimination and disinformation.

For all these reasons, it becomes imperative to adopt a unified approach towards non-linear media service providers within the Member States. The AVMSD prescribes that an 'on-demand audiovisual media service' (ie, a non-linear audiovisual media service) means an audiovisual media service provided by a media service provider for the viewing of programmes at the moment chosen by the user and at their individual request on the basis of a catalogue of programmes selected by the media service provider.[26] In applying this definition, different NRAs cover different types of services; the main difference is that some include a wide range of online providers, including influencers, while others do not. At the same time, this definition is complemented and enhanced by the EMFA, which prescribes that 'media service' means a service as defined by Articles 56 and 57 TFEU, where the principal purpose of the service or a dissociable section thereof consists of providing programmes or press publications, under the editorial responsibility of a media service provider, to the general public, by any means, in order to inform, entertain, or educate.[27] This creates even more reason for all NRAs to expand and unify their scope of regulation. Only in this way can the DSA be applied through a unified approach to the legality of content and actually achieve its aims. Otherwise, vast areas of speech in the online sphere will remain unregulated, leaving space for 'forum shopping' in terms of regulation.

IV. Platform Accountability

The European Commission adopted a series of designation decisions under the Digital Services Act, designating a total of 25 VLOPs and VLOSEs as of July 2024.[28] Some of the latest decisions of the Commission involved platforms which clearly provide pornographic content (PornHub, Stripchat, XVideos, and XNXX), which were added in subsequent waves of designation decisions. The new compliance requirements that these platforms need to adhere to include, among others, much stronger protection of minors, user-friendly mechanisms to flag illegal content, analysis of systemic risks and the putting in place of mitigation measures, as well as the redesign of services in order to limit their content being viewed by children, providing more transparency and accountability. Compliance will need to involve serious measures to prevent minors from accessing such content, given how accessible these sites currently are. The supervision of these platforms will be a joint task of the European Commission and the Digital Services Coordinators of the Member States of establishment. The Commission is responsible for the supervision, enforcement and monitoring of the compliance of the VLOPs and VLOSEs in relation to systemic risks and how the big platforms address them, but for everything else, the Commission and the national authorities share competences.[29] For this reason, the Commission and the Digital Services Coordinators sign agreements concerning their coordinated efforts to regulate content. So far, agreements with France (Autorité de régulation de la communication audiovisuelle et numérique, Arcom), Ireland (Coimisiún na Meán), Italy (Autorità per le Garanzie nelle Comunicazioni, AGCOM) and the Netherlands (Autoriteit Consument & Markt, ACM) have been concluded. Both the regulators and the Commission are intended to be supported by a dedicated body, the European Centre for Algorithmic Transparency (ECAT), launched by the Commission in April 2023. Outsourcing monitoring and due diligence also raises specific problems, since the entire process of monitoring and sanctioning is increasingly mediated, even though, in this case, the Centre is part of the architecture of the Commission. This creates the risk of limiting content without the possibility of challenging and restoring it fast enough - or the opposite: being unable to take down and limit disinformation or illegal content efficiently and promptly while it is spreading and creating harm. In this sense, the DSA is a general tool for the entire accountability network built up among all the actors involved in the governance of algorithms, shaped by application structures and the structures of control of algorithms.[30]

Particular challenges will emerge in the future for platforms that have a significant number of users but do not reach 45 million, or whose user numbers are hard to establish. Currently, there are a few appeals against the European Commission regarding the application of the DSA to such platforms. Zalando filed a complaint,[31] and so did Amazon in two separate filings. The first complaint,[32] of 5 July 2023, alleged that Amazon's designation as a VLOP is based on discriminatory criteria and violates the principle of equal treatment. In the second, filed on 6 July 2023, Amazon requested interim measures, asking the President of the General Court to order the suspension of the operation of the contested decision in so far as it imposes, first, the obligation to provide users with an option, for each of its recommender systems, that is not based on profiling, in accordance with Article 38 DSA, and, second, the obligation to compile and make publicly available the repository required by Article 39 of that regulation.[33] These types of problems should not be underestimated. Balancing platform transparency against the legitimate interests of platforms in the confidentiality of information will be an issue that both the Commission and the NRAs must navigate in a way that does not undermine or set back platforms on the European market compared to those in other competitive markets.

Additionally, in order to meet all these new responsibilities, platforms will need to sign contracts with third parties, usually fact-checkers, not only to combat disinformation but also for general moderation. This is also one of the main obligations under the Code of Practice on Disinformation,[34] which the DSA strongly supports.[35] At the same time, this raises serious concerns regarding the core values of the right to freedom of expression. Meta, for example, currently signs contracts with third parties that provide fact-checking, and although in most countries these are well-established organisations with a good reputation, the main problem lies in the manner of outsourcing. Outsourced monitoring operations may lack the contextual understanding or nuanced judgement necessary to accurately distinguish between permissible content and prohibited material. As a result, legitimate content could be erroneously targeted, leading to censorship or infringement of freedom of expression. Although fact-checkers are not able to take down content, Meta will label it as unverified, and its algorithms will give it much less visibility.[36] Further, there is a risk of inadequate responses to disinformation or illegal content. Delays in identification, assessment and action may allow harmful content to be disseminated. Without prompt intervention, platforms may struggle to contain the dissemination of false information or harmful materials effectively. This can potentially undermine trust in the moderation process. Users and stakeholders may demand greater transparency and accountability from both platforms and external monitoring entities to ensure that decisions are made fairly, impartially and without bias.[37] The general problem with disinformation lies in its specific nature and the fact that, in essence, it is not illegal. At the same time, institutions, civil society organisations and other actors expect platforms and NRAs to be able to combat and limit it. This is a challenging task, and some scholars even speak of an Algorithmic Marketplace of Ideas,[38] which changes the traditional concepts of free speech and, in this regard, the need for and means of its lawful limitation, including through regulation.

In order to address these challenges, the implications of the outsourcing process for content moderation and users' rights should be given careful consideration. Platforms should establish effective mechanisms for oversight, review and appeal to safeguard against censorship and ensure due process. Moreover, cooperation between platforms, regulators and civil society organisations is essential when developing best practices and standards for outsourced monitoring that prioritise accuracy, fairness and accountability. Ultimately, a balanced approach is needed that upholds freedom of expression while effectively combating harmful content.

In this regard, it will be interesting to see how Article 14 is applied; it obliges providers of intermediary services to include much more transparent information in their terms and conditions, including information on any restrictions that they may impose. This also covers 'algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system'. Legitimate questions and concerns arise regarding its effective enforcement, especially as regards algorithmic decision-making in content review. As some scholars argue,[39] algorithmic decision-making appears to impose restrictions primarily relating to the scope of content aggregation. Platform liability is possible only through transparent, accountable and contestable decision-making undertaken through an algorithm that properly applies 'digital due process'.

In light of the DSA's emphasis on transparency and user autonomy, the terms of use should encompass not only the algorithmic parameters for content aggregation but also those for recommender systems. Article 14(1) explicitly refers to tools utilised in content moderation and, in this regard, excludes the sorting of information. Its primary objective is to prevent 'over-moderation' by platforms, thereby averting the unjustified removal of otherwise lawful content by providers during the filtering of material deemed illegal.

By enlisting online intermediaries as 'watchdogs', the DSA effectively delegates online enforcement to algorithmic tools.[40] Still, it needs to be underlined that, unlike under Article 17 of the Copyright Directive, hosting service providers are not obliged to use automated content moderation tools, and this was a vital part of the debate surrounding the DSA's adoption.[41] Further, the Regulation (Recital 58) outlines that, if automated methods are used in internal complaint handling systems, human review is necessary. This is an essential safeguard, but at the same time, such an approach raises the question of timely and prompt resolution.

Overall, coordinators must develop and refine moderation practices that effectively identify and remove such content without crossing the line into censorship or infringing on free speech.

V. Navigating Disinformation within the DSA Framework: Legal and Practical Considerations

One of the main goals of the DSA is to address, tackle and limit disinformation. This is one of the hardest and most challenging tasks that the act prescribes. The challenge is evident from the Recitals: the act addresses online disinformation among illegal content and 'other societal risks' (Recital 2), and its limitation is part of the drive for a more predictable and trusted online environment (Recital 9). Recital 69 notes that 'in certain cases, manipulative techniques can negatively impact entire groups and amplify societal harms, for example by contributing to disinformation campaigns or by discriminating against certain groups'. However, the DSA mainly establishes four categories of systemic risk, which must be assessed by the VLOPs and VLOSEs. The first is the dissemination of illegal content; the second relates to the actual or foreseeable impact of the service on the exercise of fundamental rights; and the third covers actual or foreseeable adverse effects on democratic processes, civic discourse and electoral processes, as well as public security. Finally, the fourth category relates to an 'actual or foreseeable negative effect on the protection of public health, minors and serious negative consequences to a person's physical and mental well-being, or on gender-based violence' (Recital 83). These risks may arise from coordinated disinformation, and platforms need to address them properly and promptly. Without a doubt, the war in Ukraine and Russian disinformation campaigns pose risks and threats to the European Union.[42] At the same time, the DSA does not limit disinformation per se, and this is exactly the type of information that can be both legal and harmful. In this sense, disinformation might be considered illegal only where it contravenes the legitimate aims prescribed by Article 10(2) ECHR or constitutes an abuse of rights[43] under Article 17 ECHR, as in the case of denial of the Holocaust[44] or of crimes against humanity. Therefore, media regulators have so far applied Article 3 AVMSD and the general rule that obliges media service providers not to allow the creation or provision for distribution of any programmes or broadcasts inciting national, political, ethnic, religious or racial intolerance, extolling or condoning brutality or violence, or any broadcasts which are adverse to, or pose a risk of impairing, the physical, mental, moral and/or social development of children. Though these obligations apply only to media service providers and not to intermediaries in general, Article 34(2) DSA obliges VLOPs and VLOSEs, when conducting their risk assessments, to take into account how the risks 'are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.' Along with the Recitals, the DSA implies that VLOPs and VLOSEs must prescribe in their terms and conditions a specific prohibition of disinformation or disinformation campaigns (Recital 84).

The absence of a standardised definition could lead to inconsistency in how platforms interpret and enforce moderation policies, resulting in a fragmented regulatory landscape. This could pose challenges for users and content creators seeking clarity on acceptable content standards, as well as for the responsible regulators, which in most cases are also the Digital Services Coordinators. To address these challenges, there is a need for clearer guidelines or definitions of disinformation within the DSA framework. This could help platforms develop more targeted and effective moderation strategies while safeguarding freedom of expression.[45] At the same time, it is imperative that NRAs intervene only minimally in the field of disinformation. Any measures should be the least invasive ones,[46] and, beyond any debate, empowering users and fostering media literacy should be the main tools for combatting disinformation.

In summary, while there is a clear intention to address disinformation within the DSA, the lack of a standardised definition poses significant challenges for both platforms and regulators in terms of effectively moderating online content without inadvertently stifling free speech or promoting censorship.

VI. Transparency and Accountability

In addition, transparency and accountability mechanisms play a key role in ensuring the effective implementation of the DSA. Platforms should provide regular and comprehensive reports, which create further transparency and explain how systemic risks are approached and tackled. These reports should cover a variety of factors, such as the number and types of items removed or flagged, the reasons behind removal actions and the effectiveness of prevention efforts. Transparent reporting not only builds trust among users and society but also makes it clearer and easier for users to assert and protect their rights. This is one of the main achievements of the DSA and, if applied properly, it can effectively change the way intermediaries function. It also rebalances power between platforms and users. Still, the decisions of platforms need to be properly justified, and there should be a possibility for external review. In this regard, out-of-court dispute settlement, as prescribed by Article 21 DSA, is a useful tool, and the Digital Services Coordinator (DSC) will have the serious task of certifying these special dispute settlement bodies.

Protecting fundamental rights will be the main goal of these mechanisms,[47] including the obligation for VLOPs and VLOSEs to act reasonably quickly when notified by 'trusted flaggers' and to provide data to 'vetted researchers'. Although the criteria for both are established in the Regulation, DSCs will require serious resources, and a grey area of interpretation exists when establishing both. Despite the detailed procedure prescribed in the DSA, without full cooperation from the VLOPs and VLOSEs, neither mechanism will be very successful, and conflicts are likely.

Notably, the DSA uses the concepts of 'manifestly illegal' content and 'manifestly unfounded' notices, meaning that it is 'evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded'. This concept is closely tied to the specific obligation of online platforms to suspend the processing of notices and complaints submitted through notice and action mechanisms and internal complaint handling systems [Article 23(2)]. Still, the term is quite vague, and different interpretations are possible.[48] Further, NRAs should be active players in these processes, identifying clear criteria for content removal and not delegating this role solely to intermediaries. This task will be challenging, and despite the new responsibilities of the European Commission, it is up to the national regulators to take an active role in the communication and the actual regulation of content.

VII. Need for Amendments in Horizontal Legislation and Proper Procedural Measures

The cornerstone of the proper and adequate application of the DSA is the appropriate amendment of numerous legal acts at the national level. Generally, most DSCs are also competent bodies under Regulation (EU) 2015/2120 of the European Parliament and of the Council of 25 November 2015, which lays down measures concerning open internet access and amends Directive 2002/22/EC on universal service and users' rights relating to electronic communications networks and services and Regulation (EU) No 531/2012 on roaming on public mobile communications networks within the Union. This particular regulation is a vital part of the proper application of the DSA, since it requires that end-users have free access to information and content. Any restrictions are applicable only if they are 'appropriate, proportionate and necessary within a democratic society, and if their implementation is subject to adequate procedural safeguards in conformity with the European Convention for the Protection of Human Rights and Fundamental Freedoms, including its provisions on effective judicial protection and due process' (Recital 13). In this regard, since the DSA is such a new instrument, deliberate procedures should be adopted in the other legal acts governing content: those in compliance with the Audiovisual Media Services Directive, but also those in the fields of national security, criminal law and so on.

Media regulation, especially once the EMFA is enforced, will require precision and amendments to some national legal acts in order to avoid possible overlaps and to address the aims of the DSA.[49] This is vital since, at present, the penalties for illegal online activities (for example, in the field of non-linear media services or video-sharing platforms) vary (fines, for example) but rarely include the removal of content. Hence, the administrative bodies responsible for the different sectors should be under special obligations to request that the DSC have the illegal content removed on the respective legal grounds. Without proper adjustment of the other legal acts, which contain provisions safeguarding the different legitimate aims prescribed in Article 10(2) ECHR, the implementation of the DSA will not be complete. Such in-depth analysis is required in all Member States. Furthermore, in the process of content removal in accordance with Article 9 DSA, the path for a relevant national administrative authority to identify illegal content and then take appropriate measures to limit it is long and may be challenged before the courts. Court procedures can sometimes take significant time. Once the court makes a final decision, the DSC can very quickly issue the order and inform the intermediaries to take down a specific piece of illegal content; however, the entire procedure may take so long that the harm caused by the content has already been fully realised. Therefore, specific safeguards should be put in place; for example, in a limited number of cases, preliminary measures or expedited proceedings should be prescribed.

Although not all Member States have yet adjusted their legislation to the DSA, many have prepared internal amendments to meet the new criteria, designate the DSC and implement the Regulation accordingly.

VIII. Addressing Technological Challenges and Resource Allocation. Coordination with Other Competent Bodies

Finally, Digital Services Coordinators must address numerous challenges that are not strictly legal ones but are nevertheless essential to the proper application of the DSA. Recruiting and preparing qualified personnel with the requisite skills and expertise will be essential. The DSA obliges Member States to guarantee enough resources for the Digital Services Coordinators, which should 'have all necessary resources to carry out their tasks, including sufficient technical, financial and human resources to adequately supervise all providers of intermediary services falling within their competence'. In this regard, given that most Digital Services Coordinators are government bodies carrying out activities associated with electronic communications, their independence and financial sufficiency need to be further guaranteed by the Member States. Only in a handful of countries did the legislator decide to appoint the media regulator as DSC rather than the communications regulator, and this occurred mainly in countries where the regulators converge. Allowing a body under the executive power to limit content is acceptable only after close scrutiny, proper application of the international standards on freedom of expression and judicial review. Even then, it is not advisable, since guaranteeing freedom of expression should remain the main task for Member States in their pursuit of a safer online environment. Nowadays, platforms have become the public arena for debate, especially for political content, and they are shifting the entire philosophy underlying public discourse. Still, NRAs have become part of this debate and should be involved only when the 'three-part test' is applicable.

At the same time, continuous training and capacity building are essential to ensure that DSCs remain up to date with the latest regulatory developments, technological advancements and best practices in digital service management.[50] DSCs, however, especially in smaller Member States, can struggle to provide competitive and adequate training and development. A multifaceted approach is necessary to address these challenges and to build capacity adequate to all the new responsibilities that DSCs, as well as other responsible bodies, have to face. This requires staff training, infrastructure building and good collaboration with different stakeholders.

Further, the DSA is a unique piece of regulation in terms of the need for serious and extensive cooperation: the different regulatory bodies that can identify any type of illegal content at the national level must cooperate with regulators in other Member States and, simultaneously, with the European Commission. This increases the difficulty of its timely and effective execution and makes the role of the DSCs even harder.

Finally, for small and medium-sized companies operating in different markets worldwide, it is becoming harder to navigate and comply with the numerous European acts that cover their services, especially in the digital realm. Although micro and small enterprises are exempt from many of the Regulation's obligations, they still have to meet numerous requirements, and administrative burdens should not hinder their growth. This is an important consideration that the EU legislator needs to take into account, given the need for the EU market to remain competitive compared to other markets.

IX. Conclusion

In conclusion, the DSA is a strong and ambitious attempt to implement an overarching approach to technological and market development in the digital sphere. Its proper implementation will raise numerous challenges for the NRAs - especially for the Digital Services Coordinators, who must coordinate the sanctioning of illegal content. Differences in national legislation regarding the definition of illegal content, along with practical problems related to the timely and proper issuing of a final order against illegal content, raise further problems for the DSCs and complicate the harmonised implementation of the DSA, potentially leading to delays and future struggles in enforcement and interpretation across the Member States. Furthermore, effective coordination among Member States and the administrative bodies at the national level, along with the vast area of regulation that the DSA covers, requires efficiency, serious resources and a very good understanding of the purposes of the regulation with respect to human rights, especially freedom of expression.

In general, the complex legal framework is in a fast-paced race with the development of the market itself, and as more and more regulations and directives are adopted, the NRAs, and the DSCs in particular, need to make adequate and intense efforts to keep up with the EU legislator's attempts to catch up with the sector. This complicates the digital environment, and the Parliamentary Assembly of the Council of Europe has even proposed a new Internet Ombudsman institution 'either as a separate body or by expanding the remit of an existing body such as a data protection agency, a media regulator, or a conventional ombudsman institution responsible for the protection of human rights'.[51] Such an idea might seem far off right now, but the online sphere and its regulation are indeed becoming challenging in many regards.

Addressing these challenges is only possible through collaboration between NRAs, policymakers, the European Commission, civil society and business, but mainly with the VLOPs and VLOSEs, which must comply with the regulation. This is the only approach that can lead to a balanced, effective and competitive market that respects and protects users and supports and further develops the intermediaries that shape the digital environment. The main conclusion, however, remains that NRAs have to adapt rapidly and effectively to the new realities. Content moderation already looks significantly different from how it did at the time of the last amendment of the AVMSD in 2018, and it will look even more different in 2028. Therefore, the Commission, the Member States and the NRAs must coordinate efficiently while making deliberate efforts to ensure that all these regulations do not slow down the operation of the market and to protect human rights. ■

NOTES

[1] Regulation (EU) No 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) [2022] OJ L277/1.

[2] See DSA Recital 76 and Article 33.

[3] See DSA Article 93(2).

[4] See DSA Recital 110 and Article 3(n).

[5] Directive (EU) No 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC [2019] OJ L130/92.

[6] Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive), as amended by Directive (EU) 2018/1808 [2013] OJ L95/1.

[7] Regulation (EU) No 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1.

[8] Commission, Proposal for a Regulation of the European Parliament and of the Council establishing a common framework for media services in the internal market (European Media Freedom Act) and amending Directive 2010/13/EU COM (2022) 457 final.

[9] Commission, Proposal for a regulation of the European Parliament and of the Council on laying down harmonised rules on Artificial Intelligence (Artificial Intelligence Act) and amending certain Union Legislative Acts COM (2021) 206 final.

[10] Adrienne Stone, Frederick F. Schauer (eds), The Oxford Handbook of Freedom of Speech (Oxford University Press 2021, Oxford) 159-172.

[11] See Editorial Board of Pravoye Delo and Shtekel v Ukraine no. 33014/05, ECHR, paras. 61-64, 5 May 2011.

[12] Recommendation CM/Rec(2018)2 of the Committee of Ministers to member States on the roles and responsibilities of internet intermediaries, adopted by the Committee of Ministers on 7 March 2018 at the 1309th meeting of the Ministers' Deputies.

[13] Analysis and recommendations concerning the regulation of vloggers. Subgroup 1 Consistent implementation and enforcement of the new AVMD framework (ERGA 2021), <https://erga-online.eu/wp-content/uploads/2021/12/ERGA-SG1-2021-Report-Vloggers.pdf> accessed 15 October 2024.

[14] France further adopted a separate Influencer Act, Law No. 2023-451 of June 9, 2023, aiming to regulate commercial influence and combat abuses by influencers on social media.

[15] Spanish Royal Decree 444/2024 of May 1, 2024, available at: <https://www.boe.es/boe/dias/2024/05/01/pdfs/BOE-A-2024-8716.pdf>

[16] European consumer laws such as the Unfair Commercial Practices Directive (UCPD) and Consumer Rights Directive (CRD) also apply to the commercial activities of influencers. These laws are designed to protect consumers from unfair or deceptive practices in commercial transactions, which can include influencer marketing activities.

[17] Joan Barata, The Digital Services Act and its impact on the right to freedom of expression: special focus on risk mitigation obligations (Plataforma en Defensa de la Libertad de Información 2021) <https://dsa-observatory.eu/2021/07/27/the-digital-services-act-and-its-impact-on-the-right-to-freedom-of-expression-special-focus-on-risk-mitigation-obligations> accessed 15 October 2024; Giovanni Sartor, Andrea Loreggia, The impact of algorithms for online content filtering or moderation (European Parliament's Committee on Citizens' Rights and Constitutional Affairs 2020) <https://www.europarl.europa.eu/RegData/etudes/STUD/2020/657101/IPOL_STU(2020)657101_EN.pdf> accessed 15 October 2024.

[18] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L 178/1.

[19] Gergely Gosztonyi, Andrej Skolkay, Ewa Galewska, 'Challenges of Monitoring Obligations in the European Union's Digital Services Act' (2024) (1) ELTE Law Journal 45-60, DOI: https://doi.org/10.54148/ELTELJ.2024.1.45

[20] Ondřej Moravec and others, 'Digital Services Act Proposal (Social Media Regulation)' (2021) 14 (2-3) Studia Politica Slovaca 166-185, DOI: https://doi.org/10.31577/SPS.2021-3.5

[21] Jens-Peter Schneider, Kester Siegrist and Simon Oles, 'Collaborative Governance of the EU Digital Single Market established by the Digital Services Act' University of Luxembourg Law Research Paper (2023) 9, 28-32.

[22] Gergely Gosztonyi, Censorship from Plato to Social Media (Springer 2023, Cham) 89, DOI: https://doi.org/10.1007/978-3-031-46529-1_6

[23] Rebecca Tushnet, 'Best Laid Plans: The Challenges of Implementing Article 17' (23 October 2023) JOTWELL <https://cyber.jotwell.com/best-laid-plans-the-challenges-of-implementing-article-17/> accessed 15 October 2024.

[24] See Report on the Digital Services Act and fundamental rights issues posed, Committee on Civil Liberties, Justice and Home Affairs, A9-0172/2020 <https://www.europarl.europa.eu/doceo/document/A-9-2020-0172_EN.html> accessed 15 October 2024.

[25] See Commission, 'Questions and answers on the Digital Services Act' (2024) <https://ec.europa.eu/commission/presscorner/detail/en/QANDA_20_2348> accessed 15 October 2024.

[26] AVMSD Article 1(1)(g).

[27] EMFA Article 2(1).

[28] Along with the designated VLOPs Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube, Zalando, Shein and Temu, and the VLOSEs Bing and Google Search, on 23 December 2023 a new set of designation decisions added PornHub, Stripchat and XVideos, with XNXX added in July 2024. The full list of decisions is available at <https://digital-strategy.ec.europa.eu/en/policies/list-designated-vlops-and-vloses> accessed 15 October 2024.

[29] Bissera Zankova, Gergely Gosztonyi, 'Quo vadis, European's Union New Digital Regulation Package?' (2021) (2) Business and Law 67-90.

[30] Florian Sauerwein, 'Emerging structures of control for algorithms on the Internet. Distributed agency - distributed accountability' in Tobias Eberwein, Susanne Fengler, Matthias Karmasin (eds), Media accountability in the era of post-truth politics (Routledge 2019, London), 196-211, DOI: https://doi.org/10.4324/9781351115780-13

[31] See Case T-348/23 (2023/C 314/13), Zalando v Commission.

[32] See Case T-367/23 Amazon v Commission.

[33] See Case C-639/23 P(R) Order of the Vice-President of the Court, 27 March 2024.

[34] See Commission Communication on the European Democracy Action Plan COM (2020) 790 final.

[35] DSA Recital 106.

[36] About Fact-Checking on Facebook and Instagram <https://www.facebook.com/business/help/2593586717571940> accessed 15 October 2024.

[37] See Maayan Perel, Niva Elkin-Koren, 'Accountability in Algorithmic Copyright Enforcement' (2016) 19 Stanford Technology Law Review.

[38] Philip M. Napoli, Social media and the public interest: media regulation in the disinformation age (Columbia University Press 2019, New York) 138-146.

[39] János Tamás Papp, 'How the DSA Aims to Protect Freedom of Speech - With Special Regards to Section 14. of the DSA.' - Part I. (2024) Constitutional Discourse <https://constitutionaldiscourse.com/janos-tamas-papp-how-the-dsa-aims-to-protect-freedom-of-speech-with-special-regards-to-section-14-of-the-dsa-part-i/> accessed 15 October 2024; Lorna McGregor, Daragh Murray and Vivian Ng, 'International Human Rights Law as a Framework for Algorithmic Accountability' (2019) 68 (2) International and Comparative Law Quarterly 309, 331, DOI: https://doi.org/10.1017/S0020589319000046

[40] Giancarlo Frosio, Christophe Geiger, 'Taking Fundamental Rights Seriously in the Digital Services Act's Platform Liability Regime' (2023) 29 (1-2) European Law Journal 31-77, DOI: https://doi.org/10.2139/ssrn.3747756

[41] European Parliament resolution of 20 October 2020 on the Digital Services Act and fundamental rights issues posed (2020/2022(INI)), point 13.

[42] Elena Sherstoboeva, 'Russian Bans on 'Fake News' about the war in Ukraine: Conditional truth and unconditional loyalty' (2024) 86 (1) International Communication Gazette 36-54, DOI: https://doi.org/10.1177/17480485231220141; Gergely Ferenc Lendvai, 'Media in War: An Overview of the European Restrictions on Russian Media' (2023) 8 (3) European Papers 1235-1245, DOI: https://doi.org/10.15166/2499-8249/715

[43] Katamadze v Georgia no. 69857/01, 02 February 2001; Norwood v the United Kingdom, no. 23131/03, 16 November 2004.

[44] Garaudy v France no. 65832/01, 24 June 2003; Witzsch v Germany no. 7485/03, 13 December 2005.

[45] Frosio, Geiger (n 40) 45.

[46] Alain Strowel, Jean De Meyere, 'The Digital Services Act: transparency as an efficient tool to curb the spread of disinformation on online platforms?' (2023) 14 (1) Journal of Intellectual Property, Information Technology and Electronic Commerce Law 1.

[47] Lani Watson, The Right to Know: Epistemic Rights and Why we Need Them (Routledge 2021, London).

[48] Mark D. Cole and others, Algorithmic transparency and accountability of digital services (European Audiovisual Observatory 2023, Strasbourg).

[49] Mark D. Cole, Christina Etteldorf, Future Regulation of Cross-Border Audiovisual Content Dissemination. A Critical Analysis of the Current Regulatory Framework for Law Enforcement under the EU Audiovisual Media Services Directive and the Proposal for a European Media Freedom Act (Nomos 2023, Baden-Baden) 92-106, DOI: https://doi.org/10.5771/9783748939856

[50] Cole and others (n 48).

[51] Standing Committee of the Parliamentary Assembly of the Council of Europe, Resolution 2334 (2020), Provisional version, 'Towards an Internet Ombudsman institution', 15 September 2020, 6.

* The author is Assistant Professor at the American University in Bulgaria. https://orcid.org/0009-0001-5044-6380.
