
Gergely Gosztonyi[1]: How the European Court of Human Rights Contributed to Understanding Liability Issues of Internet Service Providers (Annales, 2019, pp. 121-133)

https://doi.org/10.56749/annales.elteajk.2019.lviii.7.121

Abstract

Section 4 of the 2000 Directive on electronic commerce (ECD) established significant rules on the liability of intermediary service providers for illegal content. In the past twenty years, however, it has become apparent that its details are not adequately developed. The European Court of Human Rights (ECtHR), acting on the basis of the European Convention on Human Rights (ECHR), has played a significant law-developing role in this field. Its rulings touch upon the active or passive role of service providers, issues regarding the operation of the notice-and-takedown system, and the legal requirements of due notice. The present study examines the practice and role of the European Court of Human Rights, as this court has significantly contributed to the new set of proposals on digital regulation (to be introduced in 2020) moving in a direction better suited to present-day demands concerning liability.

Keywords: information society services, liability, six-part liability test, internet service providers, notice-and-takedown system, case law, Delfi AS, Magyar Tartalomszolgáltatók Egyesülete, Index.hu Zrt., Pihl, Tamiz, Magyar Jeti Zrt., Høiness

I. Introduction

Regarding one of the most important issues of internet regulation, namely who is liable for illegal content, the central element of the set of rules developed by the European Union is Section 4 of the Directive on electronic commerce (ECD),[1] entitled "Liability of intermediary service providers".[2] The set of regulations uses three different terms: "mere conduit", "caching" and "hosting". In the case of the first two, similarly to Section 230(c)(1) of the Communications Decency Act of the United States,[3] providers are exempt from liability. In the case of "hosting", however, as expounded in Article 14, providers are not liable for the information stored on condition that (a) the provider does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; or (b) the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information.[4]

This notice-and-takedown system (NTDS) is a relative[5] novelty in the European set of regulations, implementing a multi-stage system of conditions and procedures. First, the intermediary service provider must obtain actual knowledge of the illegal content and, second, it must take steps to remove that content within a certain amount of time. On this basis we can establish that the European Union, unlike the United States, has opted for a model (often called a 'safe harbour model'),[6] in which exemption from liability is not automatic.[7]

Any measure taken must have regard to Article 10 of the ECHR, which states that "everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers." The article then proceeds to state that "[...] the exercise of these freedoms may be subject to [...] formalities, conditions, restrictions or penalties", but these must pass a three-step test, which ensures that the actions taken are not arbitrary. Any interference must be "(a) suitable to achieve the legitimate aim pursued (suitability), (b) the least intrusive amongst those which might achieve the legitimate aim (necessity), and (c) strictly proportionate to the legitimate aim pursued (proportionality stricto sensu)".[8]


It is safe to say that several professional disputes have taken place in the past twenty-one years about various issues (such as what constitutes "actual knowledge" on the part of the provider; what constitutes "illegal content"; how long the "certain amount of time" within which a provider must act actually is; whether the provider is passive or active; moreover, many have put forward the idea that various types of content would require various types of procedure).[9] Still, the task of thoroughly interpreting the set of regulations - without which it is impossible to ascertain whether any content has been removed legally or some type of censorship has taken place - was left to international courts.

II. The European Court of Human Rights and the Court of Justice of the European Union

The two most important international courts regarding the ECD's detailed regulation of service provider liability and the implementation of the ECHR are the European Court of Human Rights (ECtHR) in Strasbourg and the Court of Justice of the European Union (ECJ) in Luxembourg. The main difference between them is that "while the ECJ can be seen as an integrative agent, striving for further EU harmonization, the ECtHR's mandate is that of providing a minimum human rights standards protection, beyond which wider scope is left for pluralism and national sovereignty within the EU".[10] In practice, this means that the ECJ acts as an intermediary in disputes between EU institutions, or between EU institutions and EU Member States, and also "ensures compliance with EU Treaty at the national level, and by the Treaty of Maastricht has the right to impose fines for legal entities and Member States that violated EU law".[11] On the other hand, "an adverse ECtHR ruling will result in a »more gradual (and perhaps less politically costly) implementation« of the decision than in the case of an adverse ECJ ruling".[12] These two courts take account of one another's rulings where the infringement of human rights is involved. This is indeed important because, as Oreste Pollicino states: "The ECtHR and the ECJ have protected freedom of speech in a very different manner. Whereas the former does actually work as a constitutional court of fundamental rights, the latter has been more influenced by the original economic nature of the European Community".[13] These differences are apparent in the way the two courts handle issues of content regulation and moderation by platform providers.

III. Milestone legal cases before the European Court of Human Rights

It is thus worth examining how the ECtHR approaches these questions, as it has, along with the ECJ, significantly contributed to the new set of proposals on digital regulation (to be introduced in 2020) moving in a direction more suitable for meeting present-day demands concerning liability.

1. Delfi AS v. Estonia (2015)[14]

The first milestone case was that of the Estonian online news portal Delfi, where users could comment anonymously.[15] Some of these comments proved to be contested, offensive or even defamatory.[16] The portal had an inbuilt content filter, which deleted comments containing certain "blacklisted" expressions. The only other way to have a comment deleted, however, was for other users to report it as inappropriate and then wait for Delfi to take whatever measure it deemed fit. Delfi did remove comments based on its own investigations, but quite rarely. In 2006 a particular article attracted twenty comments[17] which were deemed threatening or offensive. After the comments had been reported, the portal deleted the content, but refused to pay the damages claimed. Following a long legal procedure,[18] the case was submitted to the ECtHR, which had to establish whether platform providers are liable for content uploaded by third parties. The ECtHR ruled that Estonia did not violate Article 10 of the ECHR when its courts established the liability of Delfi for the comments on its articles. Four aspects were considered by both the Chamber and the Grand Chamber:[19]

- the context of the comments;

- the measures applied by the applicant company in order to prevent or remove defamatory comments;

- the liability of the actual authors of the comments as an alternative to the applicant company's liability; and

- the consequences of the domestic proceedings for the applicant company.[20]

The Court eventually ruled that the unlawful comments constituted hate speech, whose unlawfulness did not require any linguistic or legal analysis to establish.[21] It is important to note that the Court agreed with the Estonian court that Delfi must be regarded as a publisher (that is, a commercial entrepreneur), and that its activities in publishing the comments were not merely of a technical, automatic and passive nature. The liability of the author of the original comment was not examined in this case, but the ruling established that the complainant could have, of their own volition, sued either the internet provider or the (anonymous) author of the original comment. Paragraph 109 of the ruling stated that "established law in the European Union and other countries envisaged the notice-and-take-down system as a legal and practical framework for Internet content hosting. This balance of responsibilities between users and hosts allowed platforms to identify and remove defamatory or other unlawful speech, whilst at the same time enabling robust discussion on controversial topics of public debate [...]".[22] That is, "this system (i.e. NTDS)[23] can in the Court's view function in many cases as an appropriate tool for balancing the rights and interests of all those involved".[24] To summarise, in 2015 the ECtHR ruled that if content placed on intermediary providers constitutes hate speech, Member States may "impose liability on internet news portals [...] if they fail to take measures to remove clearly unlawful comments without delay, even without notice from the alleged victim or from third parties".[25]

The ruling was met with harsh criticism by judges András Sajó and Nona Tsotsoria in their joint dissenting opinion:

The consequences are easy to foresee. For the sake of preventing defamation of all kinds, and perhaps all "illegal" activities, all comments will have to be monitored from the moment they are posted. As a consequence, active intermediaries and blog operators will have considerable incentives to discontinue offering a comments feature, and the fear of liability may lead to additional self-censorship by operators. This is an invitation to self-censorship at its worst.[26]

Neville Cox emphasised[27] that the ruling does not take the national laws in force at the time into consideration, and does not wish (or fails) to provide a precedent for later cases.[28] Another criticism, expressed by Nádori, was that

the Grand Chamber considered the portal's comment section dissociated not only from web hosting services, but also 'other internet fora'. One wonders why, as the decision lacked any reasonable arguments. The Grand Chamber underlined that this case does not concern so-called social media sites, but it is not easy to determine what characteristics make an internet news portal differ from a social media site to such great extent in the relevant matter.[29]


2. Magyar Tartalomszolgáltatók Egyesülete (MTE) and Index.hu Zrt. v. Hungary (2016)[30]

Roughly half a year later, in another milestone case, the ECtHR chose a slightly different approach.[31] Certain comments on Index (then the market-leading Hungarian online news site) were offensive to the company of a third party ("Benkő-Sándor-sort-of sly, rubbish, mug company"). The third party criticised in the comments brought a civil action against MTE and Index.hu Zrt. When notified of the lawsuit, Index removed the comment in question. The Hungarian court ruled that Index had objective liability for unlawful comments by its readers. The ECtHR, however, later dismissed this notion, stating that "this amounts to requiring excessive and impracticable forethought capable of undermining freedom of the right to impart information on the Internet".[32] Moreover, in the Court's view, Hungary violated Article 10 of the ECHR when determining the liability of Index for the comments written on its articles. Comparing this case to that of Delfi, the ECtHR confirmed that online news portals are indeed liable for comments on their articles and content uploaded by third parties, but pointed out a significant difference between the two cases: while in the case of Delfi the content in question was undeniably unlawful and amounted to hate speech, in the case of Index the content was "only" offensive and vulgar. The ECtHR was of the opinion that, although no hate speech had been committed, the "criteria of liability" identified in the Delfi case (the context of the comments, the measures applied by the applicant company in order to prevent or remove defamatory comments, the liability of the actual authors of the comments as an alternative to the intermediary's liability, and the consequences of the domestic proceedings for the applicant company) were also relevant in this case.[33] In addition, the Court introduced two further aspects to be considered:

- the conduct of the injured party; and

- the consequences of the comments for the injured party.[34]


As regards the conduct of the injured party, the ECtHR disapproved of the fact that they "never requested the applicants to remove the comments but opted to seek justice directly in court". Regarding the consequences of the comments, it was observed that the domestic courts never examined this aspect of the case, but that it was highly unlikely that the comments would have had any negative consequences for the injured company. With these six criteria - the previous four supplemented with the two new ones - the ECtHR took a step towards deciding "whether today's ECtHR case law formulates the universal criteria for internet news portal managers' liability, or whether they are relative and should rather be applied ad hoc in each case".[35] Although experts tend to disagree on the question, Tamás Szigeti and Éva Simon are of the opinion that this case "can be viewed as a correction on a 'previous unlucky decision'[36] (that is, the Delfi case)".[37]

3. Pihl v. Sweden (2017)[38]

Unlike in the previous cases, this time the applicant was a private citizen, suing because his right to privacy and good reputation had been violated. The Swedish authorities refused to establish the liability of the operators of a blog where an entry and an anonymous comment defaming Rolf Anders Daniel Pihl had appeared. The ECtHR applied the aforementioned criteria, treating the earlier rulings as relevant precedents for this case. An intriguing element of the proceedings was that the Court attached "importance to the fact that the association is a small non-profit association, unknown to the wider public, and it was thus unlikely that it would attract a large number of comments or that the comment about the applicant would be widely read".[39] Attila Tatár noted the different approaches to what seems to be one and the same aspect: "When determining the lack of liability, in the case of the MTE the Court in Strasbourg attributed little importance to the fact that one of the applicants was the owner of one of the leading news portals in Hungary, whereas in the case of Pihl it was expressly emphasized that the small blog was run by a non-profit association".[40] In any case, the final ruling was that Sweden did not violate the ECHR.


4. Tamiz v. United Kingdom (2017)[41]

British politician Payam Tamiz was preparing for a local election when, at the end of April 2011, a blog entry concerning him and containing his photograph appeared on Blogger.com, a blog service run by Google. Several anonymous comments were posted in the comments section, many of which Mr. Tamiz found defamatory, so he used the "report abuse" function to complain. He also complained in writing, to which Google responded by requesting clarifications in writing. Google forwarded the letter of complaint to the author of the blog entry only after a few months had passed, in August 2011; the author subsequently removed the post and the comments. Payam Tamiz, however, had already sued Google, requesting the national court to establish the company's liability. The court characterised five of the comments as "mere vulgar abuse", and three as "arguably defamatory".[42] On the question of whether Google was to be regarded as a provider or a publisher, the court[43] concluded: "It is no doubt often true that the owner of a wall which has been festooned, overnight, with defamatory graffiti could acquire scaffolding and have it all deleted with whitewash. That is not necessarily to say, however, that the unfortunate owner must, unless and until this has been accomplished, be classified as a publisher."[44] The national court's opinion was that, even if Google was to be regarded as a publisher, the company had indeed done everything in its power to have the impugned content removed. An important new element in the case was the fact that the court had examined[45] the complaint sent to Google, as "appropriate notice" is necessary for the provider to have "actual knowledge of illegality".[46] The court concluded that "a report merely stating that the impugned content is defamatory is not precise and substantial enough, as it is not reasonable to expect internet providers to take every such claim for granted".[47]

Later, when heard by the ECtHR, Tamiz argued that "although Google Inc. had operated a "notice-and-take-down" process, it was inadequate in his case since four months elapsed between his "reporting abuse" and the content being removed",[48] and that the comments were plainly defamatory, as they did not contribute to a debate of public interest.[49] The ECtHR highlighted that the allegedly defamatory content "must attain a certain level of seriousness"[50] since "millions of Internet users post comments online every day and [...] the majority of comments are likely to be too trivial in character, and/or the extent of their publication is likely to be too limited, for them to cause any significant damage to another person's reputation". Moreover, Payam Tamiz, as a politician, would be expected to have a higher tolerance threshold than the average internet user.[51] The ECtHR examined whether there was any evidence of a "real and substantial tort" and ruled that "a fair balance was struck" between "the right to private life and reputation and the right to freedom of expression".[52]

The ECtHR dismissed Tamiz's argument that the Delfi criteria should be applied in this case, on the grounds that Delfi is a "professionally managed Internet news portal run on a commercial basis which published news articles of its own and invited its readers to comment on them",[53] whereas Google does not create content and has no editorial liability for the blog entries published. Moreover, as demonstrated above, in the Delfi case the Grand Chamber expressed its opinion - albeit without solid evidence - that internet news portals are different from "other Internet fora", hence the Delfi case does not constitute a precedent for this particular case. As such, it is no wonder that the ECtHR eventually ruled that the United Kingdom did not violate the ECHR.

5. Magyar Jeti Zrt. v Hungary (2018)[54]

A slightly different issue came into focus in 2013, concerning hyperlinked content. 444.hu, a news portal run by Magyar Jeti Zrt., published an article containing a link to a YouTube video in which a gypsy leader stated that football fans singing racist songs were members of the right-wing political party Jobbik, (then[55]) mostly known for its anti-gypsy activities. Jobbik sued both the gypsy leader and Magyar Jeti Zrt. for defamation. The Hungarian court ruled that the company was liable for publishing content containing false information. Years later the ECtHR, contrary to the ruling of the national court,[56] confirmed that Article 10 of the ECHR on the right to freedom of expression had been violated by holding the media company liable for content hyperlinked in its articles.[57] The ECtHR was of the opinion that hyperlinks serve as a kind of navigation tool; "they merely direct users to content available elsewhere on the Internet", and "the person referring to information through a hyperlink does not exercise control over the content to which the hyperlink enables access".[58]

In its final comments the ECtHR introduced a set of factors to be taken into consideration in future hyperlink-related cases. These are the following:[59]

- Did the journalist endorse the impugned content?

- Did the journalist repeat the impugned content (without endorsing it)?

- Did the journalist merely put a hyperlink to the impugned content (without endorsing or repeating it)?

- Did the journalist know or could reasonably have known that the impugned content was defamatory or otherwise unlawful?

- Did the journalist act in good faith, respect the ethics of journalism and perform the due diligence expected in responsible journalism?

The Court also noted that "objective liability [for hyperlinked content] could have negative consequences on the flow of information on the internet by impelling authors and publishers to refrain altogether from hyperlinking to material whose content they could not control. That could directly or indirectly have a chilling effect[60] on freedom of expression on the internet."[61]

6. Høiness v. Norway (2019)[62]

In the latest case pertaining to the issue, the ECtHR ruled that the ECHR was not violated when the national court did not find the provider of an online portal liable for certain vulgar comments posted on it. The applicant was a Norwegian citizen, Mona Høiness, who claimed that three anonymous comments constituted sexual harassment against her. The portal, Hegnar Online, removed the content in question as soon as it had been notified of it. Although the case before the ECtHR dealt with online content regulation, Article 8 of the ECHR, not Article 10, was the focus of the proceedings.[63] In harmony with the national court, the ECtHR ruled that, while the comments were vulgar, "it was not necessary to examine in depth the nature of the impugned comments, as they [...] did not amount to hate speech or incitement to violence".[64] As previously, it was declared that a working NTDS can serve as grounds for exemption from liability.

IV. Conclusion

As we have seen, the practice of the ECtHR on issues of service provider liability covers a wide range. The ECtHR has made important clarifications regarding the NTDS, which, as in the Delfi case, can be a suitable means of balancing the rights and interests of all parties concerned, and thus of establishing exemption from liability. The ECtHR has also recognised that different types of content may require different approaches (especially content which amounts to hate speech), and seems willing to extend the framework of a unified system of regulation accordingly. Another important feature of the ECtHR's rulings is its consistent standpoint regarding the active versus passive roles of internet providers, namely that only providers of a neutral, passive nature may be exempt from liability.

Unlike the ECJ, the ECtHR puts great emphasis on consciously defining the criteria that may be of help to the parties concerned, as well as to national courts and authorities applying the law. The six-part liability test established by the ECtHR follows this route, clarifying the aspects to be examined in relation to the ECD and the ECHR. These are the following:

- the context of the comments;

- the measures applied by the applicant company in order to prevent or remove defamatory comments;

- the liability of the actual authors of the comments as an alternative to the applicant company's liability;

- the consequences of the domestic proceedings for the applicant company;

- the conduct of the injured party; and

- the consequences of the comments for the injured party.


The rulings of the ECJ and the ECtHR differ on certain questions, which is not surprising, as the two courts focus on different aspects of the same issues. At the same time, the jurisprudence of these two international courts significantly contributes to national courts having well-grounded practical knowledge and precedents in questions of the liability of service providers and of the internet as a complex, ever-changing ecosystem. ■

NOTES

[1] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce'), OJ L 178, 17.7.2000 1-16.

[2] C. Wendehorst, Platform Intermediary Services and Duties under the E-Commerce Directive and the Consumer Rights Directive, (2016) 5 (1) Journal of European Consumer and Market Law, 30-33.

[3] Communications Decency Act (CDA), Pub. L. No. 104-104 (Tit. V), 110 Stat. 133 (Feb. 8, 1996).

[4] ECD, [14].

[5] The procedure already appeared in 1998 in the DMCA (The Digital Millennium Copyright Act), Pub. L. No. 105-304, 112 Stat. 2860 (Oct. 28, 1998); however, only to be applied in copyright infringement issues. See: M. Peguera, The DMCA Safe Harbors and Their European Counterparts: A Comparative Analysis of Some Common Problems, (2009) 32 (4) Columbia Journal of Law & the Arts, 481-512.

[6] M. Husovec, Holey Cap! CJEU Drills (Yet) Another Hole in the E-Commerce Directive's Safe Harbors, (2017) 12 (2) Journal of Intellectual Property Law & Practice, 125. https://doi.org/10.1093/jiplp/jpw203

[7] It is also important to note that, based on Section 4 Article 14(c), member states are entitled to implement their own regulations of removing content and restricting access.

[8] J. Oster, Media Freedom as a Fundamental Right, (Cambridge University Press, Cambridge, 2015) 123-124. https://doi.org/10.1017/CBO9781316162736

[9] Although the present study does not aim to examine the 2019 copyright directive, it is worth noting that its regulation of content removal is a unique phenomenon in the system. Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC (Text with EEA relevance.), PE/51/2019/REV/1, OJ L 130, 17.5.2019 92-125. See: F. Romero-Moreno, Notice and staydown and social media: amending Article 13 of the Proposed Directive on Copyright, (2019) 33 (2) International Review of Law, Computers & Technology, 187-210. https://doi.org/10.1080/13600869.2018.1475906

[10] E. Imbarlina, The Roles and Relationship between the Two European Courts in Post-Lisbon EU Human Rights Protection, Jurist, (12 September 2013) https://www.jurist.org/commentary/2013/09/elena-butti-lisbon-treaty (Last accessed: 31 July 2019).

[11] S. Bardarova, Comparison Between the European Court of Justice and European Court of Human Rights, SSRN Paper, (18 June 2013) https://doi.org/10.2139/ssrn.2281215, https://ssrn.com/abstract=2281215 (Last accessed: 31 July 2019).

[12] Imbarlina, The Roles and Relationship between the Two European Courts...

[13] O. Pollicino, Judicial protection of fundamental rights in the transition from the world of atoms to the word of bits: the case of freedom of speech, (2019) 25 (2) European Law Journal, 168. https://doi.org/10.1111/eulj.12311

[14] Delfi AS v. Estonia, App no. 64569/09 (ECtHR, 16 June 2015).

[15] Deák I., Ki felel a kommentekért? A Delfi AS kontra Észtország ügy (Who is liable for comments? Delfi AS v. Estonia), (2015) 8 (3) Közjogi Szemle, 56-59.

[16] E. Derieux, Responsabilité d'un portail d'actualités du fait de commentaires diffamatoires postés par des internautes, (2015) (118) Revue Lamy droit de l'immatériel, 26-29.

[17] Out of the 185 posted on the article. - Author's note.

[18] Nádori P., Delfi AS v. Észtország: strasbourgi döntés a névtelen kommentekért viselt szolgáltatói felelősségről (Delfi AS v. Estonia: Strasbourg's decision on service provider liability for anonymous comments), (2013) 10 (56) Infokommunikáció és Jog, 131-140.

[19] Delfi AS v. Estonia, App no. 64569/09 (ECtHR, 16 June 2015), [144]-[161].

[20] An interesting comparison can be made with criteria applied by the ECtHR in traditional media-related cases established in the Axel Springer v. Germany and the Von Hannover v. Germany cases: - Contribution to a debate of general interest; - How well known is the person concerned and what is the subject of the report; - Prior conduct of the person concerned; - Method of obtaining the information and its veracity; - Content, form and consequences of the publication; - Severity of the sanction imposed. Axel Springer AG v. Germany, App no. 39954/08 (ECtHR, 7 February 2012), [89]-[95]; Von Hannover v. Germany (No. 2) App nos. 40660/08 and 60641/08 (ECtHR, 7 February 2012), [108]-[113].

[21] F. Krenc and S. van Drooghenroeck, Chronique de jurisprudence de la Cour européenne des droits de l'homme, (2015) 39 (6625) Journal des tribunaux, 821.

[22] Delfi AS v. Estonia, App no. 64569/09 (ECtHR, 16 June 2015), [109].

[23] Author's note.

[24] Delfi AS v. Estonia, App no. 64569/09 (ECtHR, 16 June 2015), [159]; See more: Sepsi T., No comment? Az internetes hozzászólásokért való jogi felelősség (No comment? Liability for online comments), (2015) 19 (4) Fundamentum, 108.

[25] Delfi AS v. Estonia, App no. 64569/09 (ECtHR, 16 June 2015), [159].

[26] Ibid., Joint dissenting opinion of Judges Sajó and Tsotsoria, I. 17.

[27] N. Cox, Delfi AS v Estonia: The Liability of Secondary Internet Publishers for Violation of Reputational Rights under the European Convention on Human Rights, (2014) 77 (4) The Modern Law Review, 619-629. https://doi.org/10.1111/1468-2230.12081

[28] Roberto Spano's counter-argument is that the ECtHR's related judgments provide merely a starting point, with limited relevancy as precedent. R. Spano, Intermediary Liability for Online User Comments under the European Convention on Human Rights, (2017) 17 (4) Human Rights Law Review, 665-679. https://doi.org/10.1093/hrlr/ngx001

[29] Nádori P., Úton a tömeges internetes szólás jogi megítélésének új megközelítése felé. A strasbourgi Nagykamara ítélete a Delfi-ügyben (On the way to a new approach to the legal assessment of mass online speech. The Strasbourg Grand Chamber's ruling in the Delfi case), (2019) 8 (2) In Medias Res, 362.

[30] Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt. v. Hungary, App no. 22947/13 (ECtHR, 2 February 2016).

[31] On detailed comparison of the two cases see: J. Sidlauskiene and V. Jurkevičius, Website Operators' Liability for Offensive Comments: A Comparative Analysis of Delfi as v. Estonia and MTE & Index v. Hungary, (2017) 10 (2) Baltic Journal of Law & Politics, 46-75. https://doi.org/10.1515/bjlp-2017-0012; J. T. Papp, Liability for Third-party comments before the European Court of Human Rights -Comparing the Estonian Delfi and the Hungarian Index-MTE decisions, in M. Szabó, P. L. Láncos and R. Varga (eds), Hungarian Yearbook of International and European Law 2016, (Eleven International Publishing, The Hague, 2017) 315-326. https://doi.org/10.5553/HYIEL/266627012016004001019

[32] Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt. v. Hungary, App no. 22947/13 (ECtHR, 2 February 2016), [82].

[33] Ibid. [70].

[34] Ibid. [80]-[85].

[35] Sidlauskiene and Jurkevičius, Website Operators' Liability for Offensive Comments... 49.

[36] Szigeti T. and Simon É., A hozzászólás szabadsága: a közvetítő szolgáltatói felelősség aktuális kérdéseiről (The freedom of commenting: On present-day issues of intermediary provider liability), (2016) 20 (2-4) Fundamentum, 113.

[37] Author's note.

[38] Pihl v. Sweden, App no. 74742/14 (ECtHR, 9 March 2017).

[39] Ibid. [31].

[40] Tatár A., A tárhelyszolgáltatók körében felmerülő felelősségi kérdésekről (On liability issues for service providers), (2019) 16 (72) Infokommunikáció és Jog, 11.

[41] Tamiz v. United Kingdom, App no. 3877/14 (ECtHR, 19 September 2017).

[42] Ibid. [25].

[43] On the relevant jurisdiction in the United Kingdom see: D. McGoldrick, The Limits of Freedom of Expression on Facebook and Social Networking Sites: A UK Perspective, (2013) 13 (1) Human Rights Law Review, 125-151. https://doi.org/10.1093/hrlr/ngt005

[44] Tamiz v. United Kingdom, App no. 3877/14 (ECtHR, 19 September 2017), [29].

[45] Tamiz v Google Inc Google UK Ltd, [2013] EWCA Civ 68.

[46] See more: Bunt v Tilley & Others, [2006] EWHC 407 (QB); [2007] 1 WLR 1243; [2006] 3 All ER 336; [2006] EMLR 523; Davison v Habeeb & Others, [2011] EWHC 3031 (QB); Kaschke v Gray & Hilton, (No 2) [2010] EWHC 1907 (QB); Jameel (Yousef) v Dow Jones & Co. Inc., [2005] EWCA Civ 75; [2005] QB 946; [2005] 2 WLR 1614; [2005] EMLR 353.

[47] Tatár, A tárhelyszolgáltatók körében felmerülő felelősségi kérdésekről, 9.

[48] Tamiz v. United Kingdom, App no. 3877/14 (ECtHR, 19 September 2017), [69].

[49] Axel Springer AG v. Germany, App no. 39954/08 (ECtHR, 7 February 2012), [90]; Von Hannover v. Germany, (No. 2) App nos. 40660/08 and 60641/08 (ECtHR, 7 February 2012), [109].

[50] Tamiz v. United Kingdom, App no. 3877/14 (ECtHR, 19 September 2017), [80].

[51] See: Koltay A., A közügyek vitáinak szabadsága és a személyiségi jogok védelme (On the freedom to dispute public affairs and on defending privacy), (2019) (4) Pázmány Law Working Papers, https://plwp.eu/files/PLWP_2019_04_Koltay.pdf (Last accessed: 31 July 2019).

[52] I. Milkaite, Tamiz v. UK: Google's blog-publishing service is not liable for offensive comments, Strasbourg Observers, (23 November 2017) https://strasbourgobservers.com/2017/11/23/tamiz-v-uk-googles-blog-publishing-service-is-not-liable-for-offensive-comments (Last accessed: 31 July 2019).

[53] Tamiz v. United Kingdom, App no. 3877/14 (ECtHR, 19 September 2017), [85].

[54] Magyar Jeti Zrt. v. Hungary, App no. 11257/16 (ECtHR, 4 December 2018).

[55] Author's note.

[56] C. Vander Maelen, Magyar Jeti Zrt v. Hungary: the Court provides legal certainty for journalists that use hyperlinks, Strasbourg Observers, (18 January 2019) https://strasbourgobservers.com/2019/01/18/magyar-jeti-zrt-v-hungary-the-court-provides-legal-certainty-for-journalists-that-use-hyperlinks (Last accessed: 31 July 2019).

[57] H. Surrel, Responsabilité d'un portail d'actualité en raison de l'affichage d'un hyperlien: CEDH, 4 décembre 2018. Magyar Jeti Zrt c/Hongrie, (2019) (52) La Semaine Juridique: Édition Générale, 2376.

[58] Magyar Jeti Zrt. v. Hungary, App no. 11257/16 (ECtHR, 4 December 2018), [73]-[75].

[59] Ibid. [77].

[60] See: Belpietro v. Italy, App no 43612/10 (ECtHR, 24 September 2013), [61]; Fatullayev v. Azerbaijan, App no 40984/07 (ECtHR, 22 April 2010), [100]-[103].

[61] Magyar Jeti Zrt. v. Hungary, App no. 11257/16 (ECtHR, 4 December 2018), [83].

[62] Høiness v. Norway, App no. 43624/14 (ECtHR, 19 March 2019).

[63] C.E.D.H., 19 mars 2019. Hoiness c. Norvège, (2018-2019) (3) Auteurs & media, 380.

[64] Høiness v. Norway, App no. 43624/14 (ECtHR, 19 March 2019), [69].

Footnotes:

[1] The Author is Assistant Professor, Eötvös Loránd University, Faculty of Law.
