https://doi.org/10.56749/annales.elteajk.2020.lix.7.133
Section 4 of the Directive on electronic commerce (ECD) established significant rules on the liability of intermediary service providers for illegal content in the European Union. Over the past twenty years, however, it has become apparent that its details are not adequately developed. The Court of Justice of the European Union (ECJ) in Luxembourg has taken on a significant law-developing role in this field. Its rulings touch upon the concept of 'information society services', the active or passive role of service providers and issues concerning the prohibition of general monitoring obligations. The present study examines the practice and role of the ECJ, as this court has contributed significantly to moving EU legislation on internet liability in a direction better suited to present-day demands.
Keywords: information society services, notice-and-takedown system, prohibition of general monitoring, case law, Google France, L'Oréal, UPC Telekabel Wien, Sotiris Papasavvas, Tobias Mc Fadden, Eva Glawischnig-Piesczek
One of the current 'hot topics' about the regulation of Internet is who can be held liable for infringing content. In the European Union, the central element of the regulatory framework is Section 4 of the Directive on electronic commerce (ECD),[1] which is
entitled 'Liability of intermediary service providers'.[2] The regulatory framework employs a three-pronged set of definitions, the first two of which ('mere conduit' and 'caching') give service providers immunity from liability similar to that provided by Section 230(c)(1) of the US Communications Decency Act.[3] Of more interest, however, is the liability of hosting providers, for which the rules are set out in Article 14 of the ECD. According to this, the provider is in principle liable for the content it hosts and is exempt from liability if (a) it has no actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; or (b) as soon as it becomes aware of such activity or information, it takes immediate steps to remove or disable access to it.[4]
The (relative)[5] novelty of the European system is therefore this commonly used notice-and-takedown system (NTDS[6]), which introduced a multi-stage set of conditions and procedures: the intermediary service provider must obtain actual knowledge of manifestly illegal content and must take steps to remove it within a specified time. It can therefore be concluded that, in contrast to US regulation, the European Union has opted for a different model (also known as the 'safe harbour model'[7]), which is built around a non-automatic exemption.[8]
In addition to the NTDS, the provisions of Article 15 of the ECD should be highlighted: it states that Member States shall not impose on service providers a general obligation (a) to monitor the information they transmit or store, or (b) to actively seek facts or circumstances indicating illegal activity.[9] This rule therefore does not oblige service providers, and thus social media, to monitor content posted on their sites on a continuous basis[10] (prohibition of general monitoring).[11]
The problem that had to be faced, however, is that 'the way the ECD is implemented varies widely across the EU and that national jurisprudence on online liability remains very fragmented today'.[12] This is compounded by the Good Samaritan paradox,[13] i.e. the practice whereby platform providers prefer to remain passive because they lose the possibility of immunity from liability if they are active.
It is fair to say that, in the twenty years since the adoption of the ECD, there have been professional debates on a number of issues (such as when a provider can be said to have actual knowledge; what constitutes manifestly illegal content; within what time limit a provider must act; and whether a provider plays an active or passive role; it has also been suggested that different types of content should be subject to different procedures and that greater emphasis should be placed on cybercrime[14]), but the careful interpretation of the rules - without which it is not possible to determine whether content has been lawfully removed or whether there are censorship effects - is often left to international courts.
The two most important international courts for detailing the rules on platform provider liability in the ECD are the Court of Justice of the European Union (ECJ) in Luxembourg and the European Court of Human Rights (ECtHR) in Strasbourg.[15] These courts take into account each other's judgments in which human rights violations are invoked; furthermore, 'through judicial law development, the ECJ has developed robust doctrines of fundamental rights protection, even though formally
the ECJ is not a human rights court, so that fundamental rights protection is a mere secondary corollary of the ECJ's work'.[16] This is all the more important because, as Oreste Pollicino notes, 'The ECtHR and the ECJ protect freedom of expression in very different ways. While the former actually functions as a court of fundamental constitutional law, the latter is more influenced by the original economic nature of the European Community'.[17] These differences are also evident in the practice of the two courts in relation to issues of moderation or content regulation of platform providers.
In the Delfi case[18] before the ECtHR, Estonia noted that the ECJ 'had never adjudicated on a case similar to the Delfi case'.[19] In response, the ECtHR referred to the L'Oréal case but, in fact, the ECJ had already dealt with the issue a year earlier. Before examining the cases concerned in relation to the practice of the ECJ, it is worth referring to Oreste Pollicino's study, in which he writes that 'the ECJ takes its decisions in the context of preliminary procedures. In this case, it is the national courts that play the decisive role, which refer questions to the ECJ. This difference (compared to the ECtHR[20]) leads the Luxembourg court to play a de facto role in the adjudication of fundamental rights'.[21]
In the present study, it is therefore worth following the practice of the ECJ on this issue, as the ECJ - alongside the ECtHR, of course - has contributed a number of key judgments which will allow the EU to move in a more modern direction with regard to liability.
In the first major case on this issue, several different trademark owners sued Google's French subsidiary, and the cases were consolidated. In each case, the lawsuits concerned the use of their trademarks by the Google search engine as part of a paid referencing service called 'AdWords', with the results leading to pages offering counterfeit products. The parties concerned asked the national court to find Google liable for the infringement. Google argued that it had no control over the linked sites and therefore could not be held liable. The case was referred to the ECJ, which, in view of the complexity and importance of the case, sat as a Grand Chamber and took the view that (a) search engine services are clearly 'information society services' under the ECD[23] and (b) Google is not a passive, technical service provider, since, first, it is the company's own criteria that determine the order of the results and, second, it can change that order on the basis of the paid service.[24] The ECJ therefore indicated to the national court that a general exemption is not conceivable in the case in question, but that it is for the national court to examine whether the service provider played an active role in relation to the products concerned.[25]
In the second milestone case, the French cosmetics company L'Oréal reported to the online marketplace eBay on several occasions that counterfeit versions of its products were being sold under the L'Oréal brand name, even though the marketplace prohibits the sale of counterfeit goods in the contracts signed by its users. In addition, L'Oréal products intended for testing (and not for sale) had also been put up for sale on the website. The cosmetics company held eBay (and in particular the European subsidiary that operates eBay.co.uk) liable, and it also sued Google, which, when the names of the cosmetics products were searched for, also displayed advertisements leading to these counterfeit and not-for-sale products on eBay.
In the national proceedings, Judge Arnold outlined a number of possibilities that eBay could use to detect or minimise such problems without having to monitor the content uploaded to it across the board, but noted that the mere fact that all of these steps are possible does not mean that eBay is legally required to take them:
I am in no doubt that it would be possible for eBay Europe to do more than they currently do. For example, it would appear to be possible for eBay Europe to take some or all of the following steps, although some would be more technically challenging and costly than others:
a) filter listings before they are posted on the Site;
b) use additional filters, including filters to detect listings of testers and other not-for-sale products and unboxed products;
c) filter descriptions as well as titles;
d) require sellers to disclose their names and addresses when listing items, at least when listing items in a manner which suggests that they are selling in the course of trade;
e) impose additional restrictions on the volumes of high-risk products, such as fragrances and cosmetics, that can be listed at any one time;
f) be more consistent in their policies, for example regarding sales of unboxed products;
g) adopt policies to combat types of infringement which are not presently addressed, and in particular the sale of non-EEA goods without the consent of the trade mark owners;
h) take greater account of negative feedback, particularly feedback concerning counterfeits;
i) apply sanctions more rigorously; and
j) be more rigorous in suspending accounts linked to those of users whose accounts have been suspended (although it is fair to say that the evidence is that eBay Europe have recently improved their performance in this regard).[27]
On the basis of all these observations, the English court referred questions to the ECJ, asking, inter alia, whether and under what conditions Article 14 of the ECD applies to the operator of an online marketplace. The ECJ clearly took the view that the answer was in the affirmative, namely that the provision applies to operators of online marketplaces.[28] As regards liability, the ECJ indicated, as in the Google France case, that
an exemption from liability is conceivable for neutral, passive-type operators, but that in the present case eBay actively contributed to the success of the sales (for example, in some cases by helping to optimise prices or by advertising certain products),[29] and hence it cannot claim exemption from liability. The ECJ therefore indicated to the national court that a general exemption cannot be envisaged, but that it is for the national court to assess whether the service provider played an active role in relation to L'Oréal's products in the particular case. If the answer to that question is in the affirmative, then liability can be established. It should be noted that the ECJ judgment led some scholars (e.g. Christine Riefa)[30] to conclude that service providers would thereafter have a general duty to monitor, but the ECJ did not take such a view in the formal documents of the case.[31]
As background to the case, two film distribution companies informed UPC Telekabel in Austria that some of their copyrighted films could be downloaded without their consent from the kino.to website, which was accessible via UPC's internet service. The two companies demanded that the kino.to website be completely shut down or made inaccessible. UPC denied responsibility, claiming that it had not been involved in the transmission of the copyrighted content. The case was referred to the ECJ, which, although it relied not on the ECD but on the Copyright Directive,[33] held that an internet service provider which (also) transmits protected content to the public is an intermediary service provider, and that no contractual relationship between the internet service provider and the rightholder of the protected content is needed to establish this. The ECJ held that the choice of how and by what technical means an
internet service provider protects intellectual property is left to its own discretion,[34] and that it is for the national courts to determine whether the chosen solution is lawful. The solution must, however, satisfy two cumulative conditions: (a) the measures taken must not unnecessarily deprive internet users of the possibility of lawfully accessing the information available, and (b) those measures must prevent unauthorised access to the protected subject-matter or, at the very least, make access difficult and seriously discourage internet users from accessing content made available to them in breach of intellectual property rights.[35]
In this case, referred to the ECJ by a Cypriot court by way of a reference for a preliminary ruling, Sotiris Papasavvas sued a newspaper publishing company, its editor-in-chief and two journalists over online content that he considered defamatory. The national court asked the ECJ, inter alia, (a) whether the rules of the ECD on 'information society services' preclude the assessment of civil liability, (b) whether they apply to the online interface of a printed newspaper, and (c) whether it is relevant that the online interface is available free of charge or in exchange for payment.
On the basis of Article 3(1) of the ECD, the ECJ made it clear that Cyprus may lay down rules (in this case concerning defamation) in relation to 'information society services', as such rules are in no way excluded by the ECD. In answering the second question, the ECJ first answered the third: the relevant element for exemption from liability is not whether the content is provided for payment or free of charge, but whether the provider plays an active or passive role in relation to the content in question.[37] The ECJ clearly stated that, since the content in question was the online publication of material produced by the publisher of a printed newspaper, the provider's active role could not be called into question. However, the ECJ did not consider the online interface of a printed newspaper to be an 'information society service', and the second question was therefore not answered.[38]
Tobias Mc Fadden ran a shop selling light and sound equipment in Germany and, in order to better serve his customers, he also offered them free access to a Wi-Fi network without password protection. In 2010, Sony Music formally notified him that a copyrighted track had been made available via the network. Mc Fadden went to court seeking a negative declaration (negative Feststellungsklage, as known in German law) confirming that he was not liable for the infringement committed through the network he provided, as he had no right to control the content. Sony Music filed a counterclaim seeking damages and a declaration that Mc Fadden was directly liable. As Mc Fadden did not appear before the national court, Sony Music's counterclaim was granted. Mc Fadden appealed, and the national court referred the matter to the ECJ, asking, inter alia, (a) whether the provision of the Wi-Fi network is an 'information society service', (b) whether it is a mere transmission service to which Article 12 of the ECD applies, (c) whether it is relevant that the provider offered access to the Wi-Fi network as an additional service to its original market profile, and (d) whether the national court could order the provider to offer the Wi-Fi network only with some form of protection.
The ECJ made it clear that the provision of a Wi-Fi network is an 'information society service'. It came to this conclusion by means of a negative inference, as nothing in the definition of the term excludes it - so this case can be read as significant for the extension of the scope of the legislation. Moreover, the ECJ noted that it is sufficient for the service provider to offer the service in order to promote its original market profile;[40] for the concept to be fulfilled, it is not necessary either to charge separate remuneration for this service[41] or to have a separate contractual relationship with the users of the network.[42] With regard to liability, the ECJ - maintaining its previous position - pointed out that the exemption applies if the conditions of Article 12 of the ECD are fulfilled. The ECJ specifically underlined that the conditions of Article 14 of the ECD do not apply mutatis mutandis to a mere transmission service within the meaning of Article 12 of the ECD.[43] Of particular interest for our purposes is the question of whether the national court may require the network to be secured, on which the ECJ only partly followed Advocate General Szpunar's Opinion.[44] The ECJ stated that, although the national court is responsible for the administration of justice under national and EU law, of the three technical solutions hypothetically proposed by the national court (termination of the service,[45] password protection[46] or a general traffic monitoring obligation[47]), only password protection could pass the test of legality.[48] Even that holds only if the national court duly balances the three conflicting fundamental rights at stake, namely freedom of expression, the freedom to conduct a business and intellectual property rights.[49] The difficulty of all this in the context of constant technological development has been described by Ciarán Burke and Alexandra Molitorisová as a 'catch me if you can game'.[50] Martin Husovec went even further, arguing that the ECJ, due to its own framing problem, has thus drilled a new hole in the 'safe harbour' paradigm of protection.[51]
In the Glawischnig-Piesczek case, the ECJ had to take a position on another provision of the ECD, namely Article 15 and the prohibition of general monitoring. Whereas in the L'Oréal case examined above, and also in the two SABAM cases,[53] the ECJ had previously concluded that general monitoring cannot be required of service providers, there appears to be a slight change of direction on this issue in the present case. Here, a defamatory text about the Austrian politician Eva Glawischnig-Piesczek was published on Facebook together with her photo. She asked the service provider not only to remove the content in question, but also to remove all similar content. The national court ordered the service provider to remove not only the incriminated content but also all similar content brought to the defendant's attention by the plaintiff.
However, the national court referred the matter to the ECJ asking, inter alia, (a) whether it is possible to apply the NTDS not only to the content in question but also to any similar content that may be shared, (b) if so, what the conditions for establishing similarity are, (c) whether national legislation or a court may impose such a requirement on the provider beyond the borders of the country concerned, and (d) whether such obligations conflict with the prohibition of a 'general obligation to monitor'.
In its decision, the ECJ pointed out that 'information society services are characterised both by its rapidity and by its geographical extent'[54] and that 'there is a genuine risk that information which was held to be illegal is subsequently reproduced and shared by another user of that network'.[55] On this basis, the ECJ concluded that Member States have the possibility not only to require that content be removed, but also to impose such requirements on any similar content that may be shared.[56] The reasoning behind this decision is that if the ban were to apply only to a particular piece of content, 'the effects of such an injunction could easily be circumvented by the storing of messages which are scarcely different from those which were previously declared to be illegal'[57] and the aggrieved party would have to bring new proceedings in each case.
However, the ECJ stressed that, contrary to the national court's finding, the removal of such content should not take place only at the request of the aggrieved party, as this would impose an undue burden on the legislator or the law enforcement authorities. In order to establish identity, it is necessary to compare the content and not merely the words, but this can only be required if it 'does not require the host provider to carry out an independent assessment',[58] since otherwise an undue burden would be placed on the provider. As such comparisons can be carried out by automated means, the ECJ considered the conflicting fundamental rights to be appropriately balanced[59] and, with reference to recital 47 of the ECD, concluded that a requirement of monitoring in a specific case does not conflict with Article 15 of the ECD. With regard to the extraterritorial scope,[60] the ECJ stated
that the ECD does not contain a prohibition[61] in this respect, so that national legislation may oblige service providers to apply such a prohibition beyond their national borders 'within the framework of the relevant international law'.[62] All this - however much the ECJ refers to recital 47 of the ECD - seems to amount to general monitoring rather than monitoring in a specific case.[63]
As we have seen, the practice of the ECJ covers a wide range of questions in relation to the ECD. The ECJ has made important clarifying statements on the concept of 'information society services', establishing that a search engine service, an online marketplace and the provision of a Wi-Fi network are all covered by the concept and thus by the regulation. Moreover, the ECJ has also used negative inference, thereby extending the scope of the regulation.
This also includes the fact that the ECJ has clearly stated, on the basis of Article 3(1) of the ECD, that national legislation may establish civil liability for 'information society services', as this option is in no way excluded by the ECD. In addition, the ECJ has consistently maintained in its decisions that, in relation to the active or passive role of service providers, only neutral, passive service providers can be exempted from liability in the application of the rules.
The last significant issue that the ECJ has addressed - and on which the ECtHR has not yet reached a settled position[64] - is the general prohibition of monitoring obligations in Article 15 of the ECD. It should be noted in this context that the ECJ seems to have made a minor policy change between the L'Oréal case in 2011 and the Eva Glawischnig-Piesczek case in 2019: although it referred to case-by-case monitoring, it appears to have shifted towards accepting general monitoring in its analysis of the latter case.
Although the decisions of the ECJ and the ECtHR differ on certain issues, the case law of the two international courts contributes significantly to a better understanding of the rules of the ECD and to a more solid basis for the national courts to address the liability issues of internet service providers and of the internet as a complex and constantly changing ecosystem. ■
NOTES
[1] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce'), OJ L 178, 17.7.2000 1-16.
[2] C. Wendehorst, Platform Intermediary Services and Duties under the E-Commerce Directive and the Consumer Rights Directive, (2016) 5 (1) Journal of European Consumer and Market Law, 30-33.
[3] Communications Decency Act (CDA), Pub. L. No. 104-104 (Tit. V), 110 Stat. 133 (Feb. 8, 1996).
[4] ECD, [14].
[5] The procedure already appeared in 1998 in the DMCA [The Digital Millennium Copyright Act (DMCA), Pub. L. No. 105-304, 112 Stat. 2860 (Oct. 28, 1998)]; however, only to be applied in copyright infringement issues. See M. Peguera, The DMCA Safe Harbors and Their European Counterparts: A Comparative Analysis of Some Common Problems, (2009) 32 (4) Columbia Journal of Law & the Arts, 481-512.
[6] A. de Streel et al., Online Platforms' Moderation of Illegal Content Online. Law, Practices and Options for Reform, (European Parliament, Luxembourg, 2020) 10.
[7] T. Madiega, Reform of the EU liability regime for online intermediaries: Background on the forthcoming Digital Services Act, European Parliamentary Research Service, PE.649.404 (Brussels, 2020), https://www.europarl.europa.eu/RegData/etudes/IDAN/2020/649404/EPRS_IDA(2020)649404_EN.pdf (Last accessed: 31 December 2020) 1-2.
[8] It is important to note, however, that, under Article 14(3) of the ECD, Member States have the possibility to establish procedures to regulate the removal or withdrawal of access to information.
[9] However, it must be stressed - as has been addressed in subsequent case law - that recital 47 of the ECD states that 'this does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation'.
[10] J. van Hoboken et al., Hosting intermediary services and illegal content online. An analysis of the scope of article 14 ECD in light of developments in the online service landscape: final report, (Publications Office, Luxembourg, 2018) 45-47.
[11] J. Oster, European and International Media Law, (Cambridge University Press, Cambridge, 2017) 234-236.
[12] Madiega, Reform of the EU liability regime for online intermediaries... Summary.
[13] P. Strömbäck, Good Samaritan Paradox Paradox, Netopia, 12.06.2020., http://www.netopia.eu/good-samaritan-paradox-paradox (Last accessed: 31 December 2020).
[14] Sorbán, K., The role of Internet intermediaries in combatting cybercrime: obligations and liability, in Nemeslaki, A., et al. (eds), Central and Eastern European eDem and eGov Days, (Austrian Computer Society, Wien, 2019) 19-31. https://doi.org/10.24989/ocg.v335.1
[15] For the related practice of the ECtHR see G. Gosztonyi, How the European Court of Human Rights Contributed to Understanding Liability Issues of Internet Service Providers, (2019) (58) Annales Universitatis Scientiarum Budapestinensis de Rolando Eötvös Nominatae - Sectio Iuridica, 121-133. https://doi.org/10.56749/annales.elteajk.2019.lviii.7.121
[16] M. Daka, The European Convention on Human Rights and the European Union's system of fundamental rights protection - convergence and divergence in the European legal space, PhD Thesis, (PTE AJK, Pécs, 2020), https://ajk.pte.hu/sites/ajk.pte.hu/files/file/doktori-iskola/daka-marija/daka-marija-muhelyvita-ertekezes.pdf (Last accessed: 31 December 2020) 72.
[17] O. Pollicino, Judicial protection of fundamental rights in the transition from the world of atoms to the word of bits: the case of freedom of speech, (2019) 25 (2) European Law Journal, (155-168) 168. https://doi.org/10.1111/eulj.12311
[18] Nádori P., Delfi AS v. Észtország: strasbourgi döntés a névtelen kommentekért viselt szolgáltatói felelősségről (Delfi AS v. Estonia: Strasbourg's decision on service provider liability for anonymous comments), (2013) 10 (56) Infokommunikáció és Jog, 131-140.; Nádori P., Úton a tömeges internetes szólás jogi megítélésének új megközelítése felé. A strasbourgi Nagykamara ítélete a Delfi-ügyben (On the way to a new approach to the legal assessment of mass online speech. The Strasbourg Grand Chamber's ruling in the Delfi case), (2019) (2) In Medias Res, (343-366) 362.
[19] Delfi AS v. Estonia, App no. 64569/09 (ECtHR, 16 June 2015), [85].
[20] Author's note.
[21] Pollicino, Judicial protection of fundamental rights in the transition from the world of atoms to the word of bits: the case of freedom of speech, 161. For more details see G. de Burca, After the EU Charter of Fundamental Rights: The Court of Justice as a Human Rights Adjudicator?, (2013) 20 Maastricht Journal of European & Competition Law, 168-184. https://doi.org/10.1177/1023263X1302000202
[22] Judgment of 23 March 2010 in Joined Cases C-236/08, Google France SARL and Google Inc. v Louis Vuitton Malletier SA, C-237/08, Google France SARL v Viaticum SA and Luteciel SARL and C-238/08, Google France SARL v Centre national de recherche en relations humaines (CNRRH) SARL and Others, ECLI:EU:C:2010:159.
[23] Ibid. [110].
[24] Ibid. [115].
[25] However, it is worth highlighting what the Harvard Law Review indicated about the case and the more logical reasoning of Advocate General Poiares Maduro's Opinion delivered to the ECJ: 'The Advocate General's opinion demonstrates, however, that a more traditional analysis could have avoided the flaws in the court's reasoning while still achieving the same result. [...] Traditional doctrine would have served the ECJ well in Google France, even in the age of the internet.' N/A, Joined Cases C-236/08, C-237/08 & C-238/08, Google France SARL v. Louis Vuitton Malletier SA, (2010) 124 Harvard Law Review, (648-655) 655.
[26] Judgment of 12 July 2011 in Case C-324/09, L'Oreal SA and others v eBay International AG and others, ECLI:EU:C:2011:474.
[27] L'Oreal SA v. eBay International AG, [2009] RPC 21, [2009] ETMR 53, [2009] EWHC 1094 (Ch), [277].
[28] Judgment of 12 July 2011 in Case C-324/09, L'Oreal SA and others v eBay International AG and others, ECLI:EU:C:2011:474, [109].
[29] Ibid. [114].
[30] C. Riefa, The end of Internet Service Providers liability as we know it - Uncovering the consumer interest in ECJ Case C-324/09 (L'Oreal/eBay), (2012) (1) Zeitschrift für Europäisches Unternehmens- und Verbraucherrecht, 104-111. https://doi.org/10.1007/s13590-012-0006-x
[31] K. Gilbert, L'Oreal v. eBay: ECJ Judgment, (2011) SCL, https://www.scl.org/news/2165-l-or-al-v-ebay-ecj-judgment (Last accessed: 31 December 2020).
[32] Judgment of 27 March 2014 in Case C-314/12, UPC Telekabel Wien GmbH v Constantin Film Verleih GmbH and Wega Filmproduktionsgesellschaft mbH, ECLI:EU:C:2014:192.
[33] Although the present study does not aim to examine the 2019 copyright directive, it is worth noting that its regulation of content removal is a unique phenomenon in the system. Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC (Text with EEA relevance.), PE/51/2019/REV/1, OJ L 130, 17.5.2019 92-125. See F. Romero-Moreno, Notice and staydown and social media: amending Article 13 of the Proposed Directive on Copyright, (2019) 33 (2) International Review of Law, Computers & Technology, 187-210. https://doi.org/10.1080/13600869.2018.1475906
[34] F. F. Wang, Site-blocking Orders in the EU: Justifications and Feasibility, in 14th Annual Intellectual Property Scholars Conference (IPSC), Boalt Hall School of Law, University of California, Berkeley, August 7-8, 2014, https://www.law.berkeley.edu/files/Wang_Faye_Fangfei_IPSC_paper_2014.pdf (Last accessed: 31 December 2020) 2.
[35] Judgment of 27 March 2014 in Case C-314/12, UPC Telekabel Wien GmbH v Constantin Film Verleih GmbH and Wega Filmproduktionsgesellschaft mbH, ECLI:EU:C:2014:192, [64].
[36] Judgment of 11 September 2014 in Case C-291/13, Sotiris Papasavvas v O Fileleftheros Dimosia Etaireia Ltd and others, ECLI:EU:C:2014:2209.
[37] Ibid. [46].
[38] Ibid. [48].
[39] Judgment of 15 September 2016 in Case C-484/14, Tobias Mc Fadden v Sony Music Entertainment Germany GmbH, ECLI:EU:C:2016:689.
[40] Ibid. [43].
[41] Ibid. [41].
[42] Ibid. [50].
[43] Ibid. [65].
[44] Opinion of Advocate General Szpunar in Case C-484/14, Tobias Mc Fadden v Sony Music Entertainment Germany GmbH, ECLI:EU:C:2016:170, [125-150].
[45] Judgment of 15 September 2016 in Case C-484/14, Tobias Mc Fadden v Sony Music Entertainment Germany GmbH, ECLI:EU:C:2016:689, [89].
[46] Ibid. [99].
[47] Ibid. [87].
[48] In his opinion, Advocate General Szpunar questioned the legality of all three methods.
[49] B. J. Jütte, ECJ sheds light on liability for operators of open Wi-Fi networks, European Law Blog, 28.09.2016., https://europeanlawblog.eu/2016/09/28/ECJ-sheds-light-on-liability-for-operators-of-open-wi-fi-networks-case-c-48414-mc-fadden-v-sony-music (Last accessed: 31 December 2020).
[50] C. Burke and A. Molitorisová, What Does It Matter Who is Browsing? (2017) (8) Journal of Intellectual Property, Information Technology and E-Commerce Law, (238-253) 241.
[51] M. Husovec, Holey Cap! CJEU Drills (Yet) Another Hole in the E-Commerce Directive's Safe Harbors, (2017) 12 (2) Journal of Intellectual Property Law & Practice, (115-125) 125. https://doi.org/10.1093/jiplp/jpw203
[52] Judgment of 3 October 2019 in Case C-18/18, Eva Glawischnig-Piesczek v Facebook Ireland Limited, ECLI:EU:C:2019:821.
[53] Judgment of 24 November 2011 in Case C-70/10, Scarlet Extended SA v Société belge des auteurs, compositeurs et éditeurs SCRL (SABAM), ECLI:EU:C:2011:771; Judgment of 16 February 2012 in Case C-360/10, Belgische Vereniging van Auteurs, Componisten en Uitgevers CVBA (SABAM) v Netlog NV, ECLI:EU:C:2012:85.
[54] Judgment of 3 October 2019 in Case C-18/18, Eva Glawischnig-Piesczek v Facebook Ireland Limited, ECLI:EU:C:2019:821, [3].
[55] Ibid. [36].
[56] This is known as notice-and-staydown (NSD). A. Kuczerawy, Intermediary Liability and Freedom of Expression in the EU: from Concepts to Safeguards, (Intersentia, Cambridge, 2018) 38.
[57] Ibid. 41.
[58] Ibid. 46.
[59] The complexity of the issue is illustrated by the fact that, after the decision, some commentators have predicted the 'legal death' of active/passive differentiation. A. Andolina and A. T. Ferrari, Court of Rome and ECJ again on Internet Service Providers' liability: two days of ordinary (dis)harmonization, Lexology, 27.01.2020., https://www.lexology.com/library/detail.aspx?g=f3bcbdbb-1689-46df-a126-100d0868a2b9 (Last accessed: 31 December 2020).
[60] For an illustration of how the issues are interrelated in the field of media law, in relation to extraterritoriality and the right to be forgotten (RTBF), see Láncos P. L., Az elfeledtetéshez való jog és az extraterritorialitás kérdései (The right to be forgotten and extraterritoriality), (2017) (6) In Medias Res, 365-370.
[61] Judgment of 3 October 2019 in Case C-18/18, Eva Glawischnig-Piesczek v Facebook Ireland Limited, ECLI:EU:C:2019:821, [34].
[62] Ibid. [53].
[63] Koltay A., A szólásszabadság doktrínája és a fake news jelensége az online platformokon (Freedom of expression doctrine and the phenomenon of fake news on online platforms), in Kovács É. M. (ed.), Ünnepi kötet a 65 éves Imre Miklós tiszteletére (Festive volume in honour of the 65 years old Miklós Imre) (Ludovika Press, Budapest, 2020, 231-268) 253.
[64] See in details Gosztonyi, How the European Court of Human Rights Contributed to Understanding Liability Issues of Internet Service Providers, 121-133.
Footnotes:
[1] The Author is Assistant Professor, Eötvös Loránd University, Faculty of Law.