COMMUNIA Association - CJEU
https://communia-association.org/tag/cjeu/
Website of the COMMUNIA Association for the Public Domain

Do 90s rappers dream of electric pastiche?
https://communia-association.org/2023/09/20/do-90s-rappers-dream-of-electric-pastiche/
Wed, 20 Sep 2023

Last week Germany’s highest court, the Bundesgerichtshof (BGH), for the second time in less than a decade referred questions related to the Metall auf Metall case to the European Court of Justice. This time the BGH is asking the CJEU to explain the concept of pastiche so that it can determine whether the use of a two-second sample of Kraftwerk’s 1977 song Metall auf Metall in Sabrina Setlur’s 1997 song Nur Mir qualifies as such.

Last week’s referral is the newest development in the legal saga that started in 1999, when Kraftwerk sued Setlur’s producer Moses Pelham for the unauthorized use of the sample, and that has seen Germany’s highest court deal with the matter five times already. In response to the previous referral, the CJEU had established that the use of the sample was legal under Germany’s pre-2002 copyright rules but infringing under the post-2002 rules (which implemented the 2001 Copyright in the Information Society (InfoSoc) Directive). This conclusion was largely based on the finding that, following the adoption of the InfoSoc Directive, the concept of free use (“Freie Benutzung”) in German copyright law was contrary to EU law.

The new referral arises from the fact that, as part of its 2021 copyright revision and in order to bring German copyright law into compliance with the EU directives, Germany removed the free use provision and at the same time introduced a new exception for the purposes of caricature, parody and pastiche (§ 51a UrhG). The Hamburg Court of Appeals, to which the BGH had returned the case for a final determination, subsequently ruled that after the introduction of the new exception in 2021 the use of the sample was in fact legal again, as it constituted a use for the purpose of pastiche.

This decision has since been appealed by Kraftwerk, which is how the case came back to the BGH for another round. In the context of this appeal the BGH has now again asked the CJEU for guidance, this time on the meaning of the term pastiche in Article 5(3)(k) of the 2001 InfoSoc Directive, from which the German exception is derived. This means that this time around the CJEU’s ruling in the case will have much wider implications than for German copyright law alone. It is very likely to determine the EU legal regime for sampling.

The BGH’s referral contains two separate questions, which are described in the court’s press release (the text of the actual decision containing the questions has yet to be released by the BGH). According to the press release (translation ours)…

… the question first arises as to whether the restriction on use for the purpose of pastiche within the meaning of Article 5(3)(k) of Directive 2001/29/EC is a catch-all provision at least for an artistic treatment of a pre-existing work or other subject matter, including sampling, and whether restrictive criteria such as the requirement of humour, imitation of style or homage apply to the concept of pastiche.

The idea that uses for the purpose of pastiche serve as a sort of exception of last resort to safeguard artistic freedom is a welcome one, as it would protect the freedom to create at the EU level, as we recommend in our Policy Recommendation #7. Considering that the pastiche exception is already mandatory in the EU, a positive answer to the first part of that question by the CJEU would ensure a harmonized protection of freedom of artistic expression across the EU.

The CJEU has been suggesting for a while now that the principles enshrined in the EU Charter of Fundamental Rights are already fully internalized by EU copyright law, namely through the existing list of EU exceptions. As we have noted in our Policy Paper #14 on fundamental rights as a limit to copyright during emergencies, that is not necessarily the case: the existing exceptions do not appear to exhaust all the fundamental rights considerations imposed by the Charter and, on the other hand, not all of those balancing mechanisms have yet found full expression in the national laws of the EU Member States.

With this referral, however, the court will have the opportunity to analyze whether the EU copyright law is sufficiently taking into account artistic freedom considerations. In our view, an interpretation of the pastiche exception in light of that fundamental freedom should lead the Court to provide a broad scope that covers all forms of artistic treatment protected by the Charter.

In the press release the BGH expresses a very similar concern, noting the inherent conflict between the rigid EU copyright system and the freedom of (artistic) expression:

The pastiche exception could be understood as a general exception for artistic freedom, which is necessary because the necessary scope of artistic freedom cannot be safeguarded in all cases by the immanent limitation of the scope of protection of exploitation rights to uses of works and performances in a recognisable form and the other exceptions such as, in particular, parody, caricature and quotation.

This understanding of the pastiche exception would also align with the intent of the German legislator when introducing it in 2021. In his 2022 study on the pastiche exception conducted for the Gesellschaft für Freiheitsrechte, Till Kreutzer notes that

The German legislator has deliberately phrased the pastiche term in an open manner. It is clearly stated in the legislative materials that sec. 51a UrhG is intended to have a broad and dynamic scope of application. The pastiche exception serves to legitimize common cultural and communication practices on the internet, especially user-generated content and communication in social networks. It is supposed to be applied to remixes, memes, GIFs, mashups, fan art, fan fiction and sampling, among others.

In the context of this study Kreutzer proposes the following “copyright-specific definition” of pastiche and concludes that the concept covers the practice of sampling:

A pastiche is a distinct cultural and/or communicative artifact that borrows from and recognizably adopts the individual creative elements of published third-party works.

It will be interesting to see how the CJEU will approach the same task. In this context the second question formulated by the BGH is slightly more troubling. Here the BGH wants to know …

… whether the use “for the purpose” of a pastiche within the meaning of Article 5(3)(k) of Directive 2001/29/EC requires a finding of an intention on the part of the user to use an object of copyright protection for the purpose of a pastiche or whether the recognisability of its character as a pastiche is sufficient for someone who is aware of the copyright object referred to and who has the intellectual understanding required to perceive the pastiche.

Taking into account the facts of the Metall auf Metall case, this question does not make much sense. In 1997, when Nur Mir was recorded, the concept of pastiche did not exist in German copyright law (and neither did the InfoSoc Directive, which introduced the concept at the EU level). This makes it pretty much impossible for the record producers to have had the intention to use the snippet from Metall auf Metall for the purpose of pastiche — a purpose that, according to the BGH itself, still needs to be defined by the CJEU.

For reasons of legal certainty alone, the CJEU should reject the intention requirement and base any definition on the characteristics of the use itself, as suggested in the above-quoted definition developed by Kreutzer.

In any case, the new BGH referral is a very welcome development in the Metall auf Metall saga. It provides the CJEU with the much-needed opportunity to clarify this important concept, which played a major role in the recent discussions about Article 17 of the CDSM Directive. In order to secure a majority for the directive, the EU legislator made the pastiche exception mandatory in an effort to safeguard transformative uses of copyrighted works on user-generated content platforms.

It would only be fitting if the final legacy of Kraftwerk’s narrow-minded attempt to weaponize copyright against the creative expression of a subsequent generation of artists were, almost three decades later, a broad conceptualisation of pastiche that safeguards artistic expression across the EU.

A Two-Tier System for Freedom of Expression: Towards a Right to be Heard?
https://communia-association.org/2022/11/14/a-two-tier-system-for-freedom-of-expression/
Mon, 14 Nov 2022

COMMUNIA and Gesellschaft für Freiheitsrechte co-hosted the Filtered Futures conference on 19 September 2022 to discuss fundamental rights constraints of upload filters after the CJEU ruling on Article 17 of the Directive on Copyright in the Digital Single Market (CDSMD). This blog post is based on the author’s contribution to the conference’s third session “Beyond the Judgement: The Future of Freedom of Expression.” It is published under a Creative Commons Attribution 4.0 International licence (CC BY 4.0).

The changes brought by the CDSMD to the copyright paradigm should not be underestimated. In addition to changing the liability regime for online content-sharing service providers (see Jütte), the obligation to introduce ex ante and ex post content moderation has the potential to turn copyright into a censorship tool if not adequately balanced against the right to freedom of expression. In the long run, the implementation of article 17 CDSMD carries crucial consequences for cultural diversity, creativity and the ability to participate in political and societal discourse online.

A strong commitment to respecting the right to freedom of expression

Whilst a strong commitment to freedom of expression is to be commended, the balance to be struck between the right to property of copyright holders and the right to freedom of expression remains uncertain, as the EU legislator and the CJEU in C-401/19 leave this matter for EU member states to resolve. The potentially serious interference with the right to freedom of expression and freedom of the arts (guaranteed by articles 11 and 13 of the EU Charter of Fundamental Rights respectively) created by inadequate controls on content moderation is undeniable, as acknowledged by the AG (at 77 and 151). The AG also confirmed that this interference is ‘attributable’ to the EU legislature and should not be left in the hands of service providers (AGO, at 83-84). On the compatibility of this limitation with the right to freedom of expression, the AG went back to basics by first recalling that freedom of expression does not constitute an absolute right but amounts to a qualified right, meaning that limitations to the exercise of this fundamental freedom are possible as long as they meet the three-pronged test of: (1) being ‘provided by law’; (2) respecting the ‘essence’ of that freedom (see O’Sullivan); and (3) respecting the principle of proportionality (in accordance with art. 52(1) of the Charter, which echoes the traditional test of art. 10(2) ECHR, as confirmed by the AG at 90).

This post wishes to focus on the third step of this assessment. To determine whether the limitation on freedom of expression is proportionate, the AG first recalled that over-blocking represents an inherent risk, and that current content moderation can only look for ‘matches’ rather than identify copyright infringement (see Jacques, Garstka, Hviid and Street). As such, this obligation must be accompanied by sufficient safeguards without which art. 17 would not constitute a proportional limitation on freedom of expression. Arguably, such safeguards are provided by paras 5, 7, 8 and 9 of art. 17 (AGO at 155). Here, through a combination of the addressees of art. 17, the mandatory character of some copyright exceptions such as parody, and the requirement of a complaint mechanism to protect users’ rights, the AG finds that proportionality has the potential to be met if adequately implemented in practice to achieve the goals of the provision (at 204 and 220). Although the CJEU did not embark on a sufficiently detailed analysis to determine whether content moderation constitutes an appropriate and necessary limit on freedom of expression to meet the objectives sought, the Court mostly agreed with the AG’s opinion whilst acknowledging that, in practice, the balance of fundamental rights and, consequently, the proportionality requirement, hinges heavily on national implementation (at 99).

Understanding the margin of appreciation granted to national authorities

Given that we are (still) in the midst of national implementation of this controversial provision, understanding what kind of margin of appreciation lies in the hands of national legislators is essential.

Firstly, it is quite noteworthy that the CJEU has not demonstrated that these filters could effectively meet the objectives set by art. 17. If filters are intended, even if not explicitly mentioned in the adopted text, arguably the EU legislator or CJEU should have investigated whether the technology is up to the task.

Secondly, even if it were demonstrated that content moderation can meet the objectives of art. 17 with – let’s say – 95% accuracy, some categories of expression might be hit more disproportionately than others, which has a detrimental impact on the diversity of expressions present on these platforms. This is likely to be the case for parodic expressions (understood here as including caricatures and pastiches), since such uses imply copying by their nature and are particularly precarious when carrying offensive messages. In addition, parody cases decided by EU member states under art. 5(3)(k) InfoSoc Directive are rather inconsistent in their application of the copyright exception as to the type of uses covered (questions as to how much copying or what type of humour is allowed remain the subject of many domestic discussions). Such inconsistencies can even be found within a single member state between different courts, making it extremely difficult for OCSSPs to assess the scope of the exception or for users to justify their use and save their expressions from over-blocking.

One of the possible consequences of leaving too wide of a margin of appreciation to member states is that the safeguards intended by art. 17 result in placing too much burden on users to demonstrate the legitimacy of their use, as is seen in some member states where the broadening of the interpretation of the parody exception has led courts to require defendants to demonstrate in what manner the copying of copyright-protected content was necessary (see e.g., Malka v. Klasen saga in France). This must be avoided as it also goes against the mechanics of art. 10 ECHR, as incorporated into EU law by arts 11 and 52(3) of the Charter.

It has already been argued that the way in which the ECtHR has approached the use of humour and balanced the right to freedom of expression with the right to property, as well as other fundamental rights such as the right to protect one’s reputation, can help in colouring the application of requirements of the parody exception to achieve harmonisation and the necessary predictability (Jacques). But beyond the application of the parody exception, the ECHR framework and its jurisprudence influence the margin of appreciation left to member states in their implementation of art. 17 to ensure that the new obligations contained therein remain proportional.

When faced with conflicting fundamental rights, the CJEU reminds us that a fair balance must be struck between the fundamental rights at stake. Looking at how the balance should be struck between the users’ right to freedom of expression and the property rights of right-holders, the ECtHR’s jurisprudence becomes helpful. Whilst, in the last decades, the jurisprudence has over-emphasized the margin of appreciation granted to member states, more recent decisions operate a shift in trying to recalibrate the balance so that any restriction to freedom of expression is more predictable and the subjectivity of national judges less influential. The ECtHR has had the opportunity to establish that the member states’ margin of appreciation is wider when it comes to purely commercial expressions than it is for political or artistic expressions (see Sekmadienis Ltd. v. Lithuania, para 73; markt intern Verlag GmbH and Klaus Beermann v Germany, para 33). Furthermore, the ECtHR has begun to be stricter with Member States in emphasising that the intent and context of an expression must be appropriately taken into consideration (see Patrício Monteiro Telo de Abreu v. Portugal, paras 37, 42 and 43). As the ECtHR held in a case regarding political caricatures, when balancing freedom of expression with the right to protect one’s reputation, the member state should not excessively focus on the latter at the expense of the exercise of freedom of expression. In an even more recent case dealing with criticism of the Bible made by a Polish popstar on the radio (Rabczewska v Poland, paras 58-59), the ECtHR noted that one must consider the normal audience of the speaker to determine whether a restriction is justified.

A shift towards a framed margin of appreciation and a right of being heard online

This shift towards a framed margin of appreciation is of relevance when considering how art. 17 CDSM Directive will operate in practice to ensure that proportionality is met and that a vibrant digital public sphere remains possible. Indeed, an overview of national cases where courts have had to strike a fair balance between the right to freedom of expression and the right to property shows that the balance has often been tilted towards the right to property with relative ease. This may be due to the legal tradition of specific member states, but a sustainable digital environment calls for greater scrutiny of this balancing exercise. It may also be due to the over-emphasis on the ability to express oneself freely in the digital environment, disregarding the fact that a strong commitment to freedom of expression also implies a right to be heard. A user should have a realistic expectation of being able to be heard in the digital realm. And yet, provisions such as art. 17 CDSMD have the potential to further curtail the ability to be heard online. Putting greater emphasis on the right to be heard could help ensure that the balance between fundamental rights in the context of art. 17 is appropriately struck, by giving more weight to users and placing the burden of proof on right-holders. Where an expression has been caught through over-blocking, a greater commitment to the right to be heard could give private actors grounds to authorise the use until a decision is reached through an out-of-court mechanism or judicial decision. In essence, this focus on the right to be heard online would facilitate the satisfaction of the proportionality requirement and put the parties on an equal footing.

Finally, but nontrivially, the pledge for a strong right to freedom of expression in the digital environment has led the EU legislator to make the parody exception mandatory, meaning that member states have no choice but to introduce a parody exception for the scope of art. 17. However, some member states might not have elected to implement a parody exception under the InfoSoc Directive. This could ultimately lead to a situation where platforms that do not fall within the scope of art. 17 end up deploying such content moderation without adequate safeguards for users’ rights, leaving users in a defenceless position, unable to rely on the parody exception in that member state. Hence, national legislators have been called upon by scholars to take the opportunity of implementing the CDSM directive to introduce a broader parody exception, given its roots in the right to freedom of expression and the ECtHR jurisprudence, which consistently reminds us that the means of communication should not matter. It is important to remember that fragmentation currently remains possible even for platforms on which a lawful parody is shared, if the platform does not fall within the scope of art. 17 CDSM Directive and no conventional parody exception exists under the national copyright legal framework.

Poland, the CDSM and the Court of Justice: Still Searching for the ‘Essence’ of the Fundamental Right to Freedom of Expression
https://communia-association.org/2022/11/11/poland-the-cdsm-and-the-court-of-justice/
Fri, 11 Nov 2022

COMMUNIA and Gesellschaft für Freiheitsrechte co-hosted the Filtered Futures conference on 19 September 2022 to discuss fundamental rights constraints of upload filters after the CJEU ruling on Article 17 of the Directive on Copyright in the Digital Single Market (CDSMD). This blog post is based on the author’s contribution to the conference’s third session “Beyond the Judgement: The Future of Freedom of Expression.” It is published under a Creative Commons Attribution 4.0 International licence (CC BY 4.0).

On 26 April 2022, the Court of Justice (CJEU) delivered its decision in case C-401/19, Republic of Poland v European Parliament and Council of the European Union, concerning Article 17 of the Copyright in the Digital Single Market Directive (the CDSM). The Polish challenge centred on the argument that Article 17(4)(b) and (c) require ex ante preventative monitoring of all user uploads by content sharing service providers and are incompatible with Article 11 of the European Charter of Fundamental Rights. Strictly speaking, the text of the CDSM is silent on the need for ex ante preventative monitoring. Rather, Article 17(4)(b) and (c) require that such providers employ their ‘best efforts’ in accordance with ‘high industry standards of professional diligence’ to ensure the unavailability of protected works. Prior to the Polish challenge, the conclusion reached by commentators on Article 17 – in particular the academic community – was that these requirements pointed in one direction only: the adoption of ex ante filtering technology. Naturally, this conclusion did not manifest in a vacuum. Indeed, it is well known that the concern of European policymakers to remedy the so-called ‘value gap’ underpinned Article 17, as did a desire that content sharing service providers follow the likes of YouTube in deploying filtering solutions such as its Content ID system. Indeed, the final non-prescriptive wording of Article 17 reflects a compromise over concern that expressly mandating filtering technology was a step too far in burdening content sharing service providers and their users alike.

The Polish Challenge: Throwing Down the Gauntlet to the Court of Justice

Against the backdrop of this implicit European policy embrace of algorithmic enforcement of copyright, the Polish challenge can be viewed as a throwing down of the gauntlet before the CJEU to make this implicit policy express, or to annul Article 17 in part or in full. The choice facing the Court was to fill the policy gap leading to the compromise wording of Article 17 and justify upload filters under a fundamental rights framework, or to identify alternative preventative measures that would comply with the ‘best efforts’ criterion of Article 17(4)(b) and (c). In this respect, the challenge rested on the widely held assumption that upload filters would be incompatible, in particular, with the norms of freedom of expression – not least because current filtering technology is incapable of making a sufficient distinction between legal and illegal uses of content relative to the exceptions and limitations under the copyright acquis. In light of this, the Polish challenge asserted that Article 17 lacked sufficient safeguards to protect internet users’ right to freedom of expression. Recognising that Article 17 does mandate the use of upload filters, the CJEU nonetheless rejected the Polish challenge, confirming the compatibility of Article 17 with Article 11 of the European Charter of Fundamental Rights and laying considerable emphasis on the safeguards that exist within Article 17.

These safeguards are intended to operate both ex ante and ex post. Under Article 17(7), prior to deploying a filtering solution, cooperation between content sharing providers and right holders – i.e., the provision of information from the latter to the former – shall not result in the blocking of legal content. Article 17(4)(b) directs that right holders provide relevant and necessary information, which according to the Court protects freedom of expression, as content sharing providers will not, in the absence of such information, make such content unavailable. Moreover, the Court reiterated that a system that cannot sufficiently distinguish between lawful and unlawful content, in particular content that would require a value judgement as to its illegality, would not be compatible with Article 11 of the Charter. In short, the Court narrowed the scope of Article 17 to apply only to manifestly infringing content. Article 17(9) establishes the first ex post safeguard, in the event that content is ‘erroneously or unjustifiably’ blocked, by requiring content sharing providers to put in place effective and expeditious complaint and redress mechanisms. Moreover, the Member States are to ensure there is access to out-of-court redress mechanisms and recourse to the courts. The Court was therefore satisfied that Article 17 did – contrary to the Polish claim – contain sufficient safeguards to ensure compatibility with Article 11 of the Charter. (Many) questions however remain, not least the question as to the ‘essence’ of freedom of expression in this context.

The Fact of Safeguards Tells Us Nothing About the ‘Essence’ of Freedom of Expression

The Court’s position, which adopts a classic proportionality formula, is supported by its jurisprudence and Article 52(1) of the Charter. To this end, Article 52(1) allows for fundamental rights to be limited if justified and subject to minimum safeguards, while Promusicae makes clear that internet users’ fundamental rights can be limited under a balancing exercise to support copyright enforcement. In turn, the Court had little difficulty in establishing that the ‘essence’ of freedom of expression in the context of Article 17 was respected, in particular, by the minimum safeguards under Article 17(7)-(9). In doing so, it drew on rulings of the European Court of Human Rights that prior restraints on freedom of expression are permissible, subject to a particularly tight legal framework. However, the fact of safeguards tells us nothing about their substance, particularly their actual or likely efficacy in practice relative to protecting internet users’ right to freedom of expression. Following the ruling, the normative position for the Member States is to now answer the question of how these safeguards are to work to ensure internet users’ fundamental rights are respected. Therein lies the problem. In framing the essence of the right as unaffected by prior restraints and insulated by minimum safeguards, the Court fails to illuminate the degree to which such restraints may impact the right or indeed to what degree the minimum safeguards will need to be calibrated to protect freedom of expression.

The objection of course is that this should not be the function of the Court but rather of the Member States. Indeed, the same argument could be made concerning the Court’s filling of the policy gap by reading upload filters into Article 17 generally. In short, we are where we are because of the Court’s ruling – and we now face a dilemma. Article 17 envisages that legitimate content should not be blocked. The Court reminds us that a system that cannot sufficiently distinguish legal from illegal content will not comply with Article 11 of the Charter. Yet, the remaining safeguards within the provision deal with this precise scenario, giving rise to questions that put into sharp focus the lack of guidance from the Court as to what the substantive essence of freedom of expression is in this context. To this end, even with the Court attempting to limit the overall scope of such filtering, mistakes will happen. What then of redress?

What if Time is of the ‘Essence’ of Freedom of Expression under the CDSM?

Let us assume the content was legal but erroneously blocked – how long is too long for the block to be disproportionate in a scenario where there are apparent safeguards so that this should not have occurred at all? Does it matter? After all, a safeguard is not a guarantee per se; if it were, imbalance would surely result, as it would suggest that blocks could never occur. But this is precisely the point – if under classic proportionality balancing a block can occur, the redress mechanism must still be effective and, as Article 17(9) provides, operate without undue delay and subject to human review. The irony is that human review, unable to deliver such timely redress on sufficient grounds under Article 14 of the eCommerce Directive, is one of the main arguments in favour of upload filters. What then is a proportionate delay where the most appropriate redress is removal of the block? The answer may well prove elusive, the task being to apply objective standards to expressions that are by their nature inherently subjective and relative to the context in which they are made.

Indeed, coming within an exception or limitation only tells us the category of the expression – it says nothing as to the value of that expression to the user relative to a myriad of factors, from the imprint of user personality to the temporal resonance with society and culture at the point of upload. When it comes to freedom of expression in this context, time, and not the existence of safeguards, may well be the starting point for discerning the ‘essence’ of the right. Indeed, as we move towards greater filtered futures, we need a surer understanding of the ‘essence’ of freedom of expression in this context beyond the existence of safeguards. For the implementation of Article 17 in light of the Court’s ruling, ex ante mechanisms that allow prior user input to challenge any potential block appear a necessary minimal mitigation measure to protect internet user rights. But even this delays the inevitable question where a block is imposed on what turns out to be legal content – to what degree is time of the essence?

If the normative position following the Court’s ruling is that Article 17 is to be implemented, the assumption is that some delay is inevitable in resolving an end user’s complaint. If we accept that the essence of freedom of expression is contingent on the existence of safeguards that include redress, then this is unproblematic. However, in balancing copyright and freedom of expression in this context, the most proportionate (and appropriate) redress is the expeditious removal of the block, relative to the factors described above. If an objective standard remains elusive in this context precisely because of those factors, can we find balance? The perspective of classic fundamental rights balancing would still suggest an answer in the affirmative; after all, an end-user can still avail of out-of-court settlement procedures and the courts. However, if it is accepted that in this context the ‘essence’ of the right comes down to individualistic, nuanced and relative factors, such avenues appear wholly inadequate where time is of the essence. Following the Court’s ruling, the question of time may be derided as an abstract consideration that sits in contrast to the normative position of implementation. On the contrary, it is precisely because of the Court’s ruling that difficult questions remain surrounding the balance between freedom of expression and copyright in this context. In protecting end-users, a useful starting point would be to identify what the real ‘essence’ of freedom of expression is in this context. If, as the foregoing suggests, time is of the essence, this invites the question – is balance in practice, on these terms, possible under Article 17?

Implications from C-401/19 for National Transpositions in the Light of Freedom of Expression
https://communia-association.org/2022/11/01/implications-from-c-401-19-for-national-transpositions-in-the-light-of-freedom-of-expression/
Tue, 01 Nov 2022

COMMUNIA and Gesellschaft für Freiheitsrechte co-hosted the Filtered Futures conference on 19 September 2022 to discuss fundamental rights constraints of upload filters after the CJEU ruling on Article 17 of the Directive on Copyright in the Digital Single Market (CDSMD). This blog post is based on the author’s contribution to the conference’s first session “Fragmentation or Harmonisation? The impact of the Judgment on National Implementations.” It is published under a Creative Commons Attribution 4.0 International licence (CC BY 4.0).

Article 17 of Directive (EU) 2019/790 on copyright in the Digital Single Market (CDSMD) has been the subject of much debate since even before its enactment. The latest twist is the CJEU’s ruling in the Polish action for annulment of Article 17 CDSMD. Uncertainties about the precise and correct practical application of Article 17 CDSMD remain. The judgment, however, provides some clarity on how this norm must be transposed into national law to ensure compliance with fundamental rights, particularly with freedom of expression and information as enshrined in Article 11 of the Charter of Fundamental Rights of the European Union.

Compatibility of verbatim transpositions and why the German approach is ahead of the pack

Article 17 CDSMD is open to various interpretations – as became clear during the hearings before the CJEU. While Spain and France argued that an implementation of ex post safeguards is sufficient to protect user rights, the CJEU later confirmed the position taken by the Advocate General and Member States like Germany that Article 17 CDSMD’s ex post safeguards must necessarily be supplemented by ex ante safeguards. These should address the danger of overblocking, that is, the undue blocking of lawful content by OCSSPs, before its dissemination, in order to comply with the obligations arising from Article 17(4) CDSMD.

The judgment emphasises the need for ex ante safeguards against rampant blocking under Article 17(4) CDSMD. This fact, together with the obligation for Member States, when transposing Article 17 CDSMD, to strike a fair balance between the various fundamental rights, has raised doubts about the compatibility of verbatim transpositions. Other commentators have rejected these doubts, arguing that minimal verbatim transposition is necessary to avoid impairing the harmonisation effect of the Directive.

The CJEU did not concern itself with national transpositions, but solely with Article 17 CDSMD in its original version, and found an interpretation compatible with the freedom of expression. The CJEU held that Article 17(4) CDSMD is accompanied by appropriate safeguards. The judgment requires Member States to ensure an interpretation of the national provisions that contains these safeguards.

As the judgment itself already identifies an interpretation of the Article in line with fundamental rights, the same must surely also apply to identical wording (i.e., copy and paste transpositions) in national law. This has to be the case as Member States are bound to interpret their laws in line with the CJEU’s interpretation. National courts, when interpreting national law, must have regard to the case law of the CJEU. Therefore, a conforming interpretation of verbatim transpositions should be ensured. As a consequence, copy and paste transpositions must be considered compatible with the judgment and the fundamental right to freedom of expression.

This does not mean, however, that this kind of implementation is the best from the perspective of freedom of expression and information. Instead, a more elaborate implementation, which provides more detail on the delineation of permitted and prohibited ex ante blocking, would be the preferable way forward.

A conceivably more elaborate implementation is the German version of Article 17 CDSMD, as transposed in the act on the copyright liability of OCSSPs (UrhDaG). One aspect of particular interest is the concept of a “presumed legal use”. In summary, Germany established an additional national ex ante safeguard for content which either qualifies as minor usage or is marked by the user as legally permitted. Under certain requirements, this content is presumed to be lawful and therefore cannot be blocked by automated means implemented by the OCSSPs. If rightholders contest this content, they have to initiate the complaint procedure, which may result in the content being taken down.

While there is an ongoing discussion about the compatibility of the German mechanism with the EU template, it is true that it dares to do something that had been missing from the EU Directive: it defines circumstances under which ex ante blocking is not possible.

The need for a definition of “manifestly infringing”

It has to be said that while this constitutes a step in the right direction, the current German provisions may not be the ultimate solution. Rightholders argue that even the unjustified usage of a film sequence as short as 15 seconds can significantly harm their economic interests, when only blocked after an ex post intervention. Nevertheless, the German transposition puts requirements in black and white for the design and use of automated content recognition (ACR) technology and automated blocking based on it.

In order to protect freedom of expression, it is important to be more specific about the requirements and circumstances under which automated ex ante blocking of content is permissible. One of the key points from the judgment in the Polish case is that, for content to be blocked ex ante without freedom of expression being unjustifiably harmed, no independent assessment of its unlawfulness may be necessary. In other words, content needs to be “manifestly infringing”, which makes this term the central standard for determining whether the prevention of an upload was lawful or not.

Therefore, it should not be left to OCSSPs to determine when content is infringing enough to be regarded as manifestly infringing and can thus lawfully be blocked automatically. Rather, regulators should find ways to define requirements. This would not only provide clarity to users, rightholders and platforms, but deployed at the EU level it would also contribute to the harmonisation objective.

The implementation of the German legislator may serve as a starting point. However, it only defines circumstances under which automated blocking is not permissible, i.e., when manifestly infringing content is not present. Therefore, the law only gives a hint of a negative definition. A positive definition, which indicates when content can be blocked automatically, has yet to be found.

Implications of the judgment for the design of the complaint mechanism

In the context of national transpositions of Article 17 CDSMD, two remarks regarding current questions of implementation should be made.

The first concerns national provisions in respect of the complaint mechanism as set out in Article 17(9) CDSMD. From the judgment in the Polish case, we know that the complaint mechanism is considered as an additional (ex post) safeguard, which applies in “cases where, notwithstanding the [ex ante] safeguards […], the providers of those services nonetheless erroneously or unjustifiably block lawful content” (para 93).

The complaint mechanism is therefore intended to deal with cases where there is a dispute as to whether the content is manifestly infringing. In those cases, however, it is in the nature of things that the content in question stays offline for the duration of the complaint procedure. This presupposes that the basic requirements for ex ante safeguards have been implemented and that the ex post complaint mechanism only applies in exceptional cases. It is only under these conditions that provisions like the Italian one, under which all contested content shall remain disabled for the duration of the complaint procedure, can be considered compatible with the judgment.
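To make the interplay between the two stages easier to follow, here is a minimal sketch in Python of the decision flow described above. It is purely illustrative and not drawn from the text of the Directive or the judgment: the function names, the boolean inputs and the string outcomes are assumptions chosen for readability.

```python
def ex_ante_decision(matches_reference_file: bool, manifestly_infringing: bool) -> str:
    """Ex ante stage: only content whose infringing character is manifest may be blocked
    automatically; anything requiring an independent legal assessment must stay online."""
    if matches_reference_file and manifestly_infringing:
        return "blocked"
    return "published"


def ex_post_complaint(ex_ante_outcome: str, user_complains: bool, complaint_upheld: bool) -> str:
    """Ex post stage (Article 17(9)): review of erroneously or unjustifiably blocked content.
    While the complaint procedure is pending, the contested upload stays offline."""
    if ex_ante_outcome != "blocked" or not user_complains:
        return ex_ante_outcome
    return "reinstated after human review" if complaint_upheld else "remains blocked"


# Illustrative run: a disputed block is only corrected at the ex post stage.
outcome = ex_ante_decision(matches_reference_file=True, manifestly_infringing=True)
print(ex_post_complaint(outcome, user_complains=True, complaint_upheld=True))
```

The sketch simply makes the point of the preceding paragraphs explicit: automated blocking is confined to the manifestly infringing case, and every genuinely disputed case is pushed into the ex post procedure.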

The Commission’s category of “earmarked content” needs revision

The second aspect relates to earmarked content as mentioned in the Commission’s Guidance on Article 17. The Guidance defines earmarked content as content flagged by rightholders that is particularly valuable and could cause them significant harm if it remains available without authorization (examples include pre-released music or films). According to the Guidance, earmarked content should be specifically taken into account when assessing whether OCSSPs have made their best efforts to ensure the unavailability of content.

What is highly problematic about this provision, however, is that OCSSPs would be forced to exercise particular care and diligence in this case, which would ultimately result in a higher blocking rate and ignore the requirements of the judgment in the Polish annulment action. As a solution, the Commission presents in the Guidance a rapid ex ante human review, which takes place for earmarked content detected by the filters before that content goes online.

This, however, does not comply with Article 17(8) CDSMD and what follows from the Glawischnig-Piesczek and recent Poland cases. According to these cases, a provider can only be required to remove content where a detailed legal examination is not necessary. And, although framed as “rapid ex ante review”, this is nothing other than a detailed legal examination.

Therefore, the Commission needs to revise its Guidance on this point and Member States should choose an implementation of earmarked content which respects the case law. A possible solution could be to use an earmark mechanism not ex ante but ex post. Content which is marked by rightholders as of significant economic value and matches content uploaded by users could be processed through an accelerated complaint procedure. This would be similar to what Article 19 of the Digital Services Act (DSA) establishes.

The DSA needs to fix Article 17 CDSMD

The DSA raises hopes for more harmonisation of details related to the interpretation of Article 17 CDSMD. Due to a largely overlapping scope of application for OCSSPs in the area of copyright, it can be assumed that the DSA provisions apply on the basis of a lex generalis relationship to Article 17 CDSMD. Provisions such as Article 17 DSA, which sets out detailed rules for a complaint mechanism, or Article 19 DSA, with its trusted flagger regime, could influence the way Article 17 CDSMD works in practice.

Due to its nature as a regulation, the DSA should lead to greater harmonisation. In order to achieve this, it would in addition be necessary to use the aforementioned revision of the Guidance to develop a positive definition of manifestly infringing content which can be used as a basis for designing the algorithms of OCSSPs.

The Impact of the German Implementation of Art. 17 CDSM Directive on Selected Online Platforms
https://communia-association.org/2022/10/26/the-impact-of-the-german-implementation-of-art-17/
Wed, 26 Oct 2022

Based on the joint paper with Alexander Peukert, “Coming into Force, not Coming into Effect? The Impact of the German Implementation of Art. 17 CDSM Directive on Selected Online Platforms.”

COMMUNIA and Gesellschaft für Freiheitsrechte co-hosted the Filtered Futures conference on 19 September 2022 to discuss fundamental rights constraints of upload filters after the CJEU ruling on Article 17 of the Directive on Copyright in the Digital Single Market (CDSMD). This blog post is based on the author’s contribution to the conference’s first session “Fragmentation or Harmonisation? The impact of the Judgement on National Implementations.” It is published under a Creative Commons Attribution 4.0 International licence (CC BY 4.0).

On 26 April 2022, the CJEU dismissed the annulment action initiated by the Republic of Poland against Art. 17 of Directive 2019/790 on copyright and related rights in the Digital Single Market (CDSMD). According to the Grand Chamber of the CJEU, the provision imposes a de facto obligation on service providers to use automatic content recognition tools in order to prevent copyright infringements by users of the platform. While this obligation leads to a limitation of the freedom of expression of users, appropriate and sufficient safeguards accompany the obligation, ensuring respect for the right to freedom of expression and information of users and a fair balance between that right and the right to intellectual property. However, the CJEU’s guidance as to how such safeguards have to be implemented in detail remains vague (C-401/19).

User safeguards in the German implementation

The Member States of the European Union follow different approaches when it comes to the implementation of Art. 17 CDSMD. The result is a legal fragmentation of platform regulations and uncertainty for service providers, rightholders and users alike as to the prerequisites under which OCSSPs have to operate. When the German Act on the Copyright Liability of Online Content Sharing Service Providers (OCSSP Act) entered into force, imposing several detailed obligations on service providers, many considered this new law a model for the remaining implementations of other Member States. With its unique system, in which ex ante duties to block unlawful content are inseparably intertwined with ex ante duties to avoid the unavailability of lawful user content, the German OCSSP Act contains provisions which could pass as sufficient safeguard mechanisms within the meaning of the decision of the CJEU. In response to the debate at the EU level and in other Member States, the OCSSP Act introduces a new category of “uses presumably authorised by law” – i.e., any use covered by a statutory limitation to copyright – which, as a rule, must not be blocked ex ante.

“Uses presumably authorised by law”, as laid down in sec. 9 of the OCSSP Act, can either be minor uses, which do not exceed the thresholds of sec. 10 OCSSP Act, or – if that is not the case – uses which the user has flagged as legally authorised as per sec. 11 OCSSP Act. Both minor uses and flagged UGC must contain less than half of one or several other works (with an exception for images) and must combine this third-party content with other content. If these requirements are met cumulatively, the service provider must communicate the respective UGC to the public up until the conclusion of a complaints procedure. Thus, the category of “uses presumably authorised by law” enables the user to upload the content without interference by an automated copyright moderation tool. Rightholders, on the other hand, are equipped not only with the possibility to initiate an internal complaints procedure but also with a “red button” which leads to the immediate blocking of content if it impairs the economic exploitation of premium content by the rightholder.
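The cumulative requirements described above can be read as a small decision procedure. The following Python sketch models that reading; it is an illustrative simplification rather than the statutory text, and the field names, the 0.5 cut-off standing in for “less than half of a work” and the omission of the detailed quantitative thresholds of sec. 10 OCSSP Act are all assumptions made for readability.

```python
from dataclasses import dataclass


@dataclass
class Upload:
    is_minor_use: bool                  # stays below the quantitative thresholds of sec. 10 OCSSP Act
    flagged_as_authorised: bool         # the user marked the use as legally permitted (sec. 11 OCSSP Act)
    share_of_other_work: float          # fraction of the third-party work that is reused
    work_is_an_image: bool              # images are exempt from the "less than half" requirement
    combined_with_other_content: bool   # third-party material is combined with the user's own content


def presumably_authorised_by_law(u: Upload) -> bool:
    """Rough approximation of 'uses presumably authorised by law' (sec. 9 OCSSP Act)."""
    # The upload must either be a minor use or be flagged by the uploader as legally authorised ...
    if not (u.is_minor_use or u.flagged_as_authorised):
        return False
    # ... it must contain less than half of one or several other works (images excepted) ...
    if u.share_of_other_work >= 0.5 and not u.work_is_an_image:
        return False
    # ... and the third-party content must be combined with other content.
    return u.combined_with_other_content


def ex_ante_outcome(u: Upload, red_button_pressed: bool) -> str:
    """Outcome of the ex ante stage as described in the post (not a complete model of the Act)."""
    if presumably_authorised_by_law(u):
        # Presumably lawful uploads stay online until a complaints procedure concludes,
        # unless the rightholder invokes the "red button" for premium content.
        return "blocked immediately" if red_button_pressed else "online pending any complaint"
    return "may be blocked by automated means"
```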

In sum, the German OCSSP Act provides a well-balanced legal framework. However, the quality of a statute is not only measured by its text and the concepts applied but also by the practical impact on the behaviour of its addressees. The question arises whether the OCSSP Act is able to deliver on its promises or if it turns out to be a toothless tiger in practice.

Effects of the German OCSSP Act

Against this background, Alexander Peukert and I have analysed whether the enactment of the German OCSSP Act in August 2021 had an immediate impact on platforms’ policies. The results of the study are compiled in a paper published in January 2022, which was the foundation for the presentation at the Filtered Futures conference on 19 September 2022 in Berlin. For the purpose of answering the question of what factual effect the OCSSP Act has actually had on the platform policies, the study examines the terms and conditions of several service providers both before and after the enactment of the OCSSP Act on 1 August 2021. We reviewed and analysed the German-language websites of eight services as to whether their terms and conditions and other publicly accessible copyright policies changed upon the entry into force of the German OCSSP Act. The data was collected at four points in time between July and November 2021. At all four points, we analysed the source-data as to whether the service provider implemented six selected mandatory duties, including the possibility for rightholders to submit reference files, the flagging option, the red button solution and a complaint system in accordance with the requirements of the OCSSP Act. With a total of 514 saved documents, including terms and conditions, general community and copyright guidelines, complaint forms, FAQs and other relevant copyright help pages, the paper allowed us to identify the practical effect of the German OCSSP Act over time on individual services, and across the eight services covered.

The results of the data collection are twofold. On the one hand, the changes which could be observed in the terms and conditions of the platforms over time are minor. On the other hand, there were differences between the service providers with regard to their level of compliance with the statutory duties of the OCSSP Act even before its enactment (see table below).

[Table: compliance of the platforms with the OCSSP Act]

Most changes we witnessed concerned the duty of the OCSSPs to guarantee a “notice and prevent” procedure. The flagging option, by contrast, was not clearly laid out in the terms and conditions of any service provider, at best vaguely indicated by YouTube and Facebook. The obligation of the service providers to inform their users about all statutory limitations under German copyright law in their terms and conditions was not fully met by the services, as they primarily referred to exceptions and limitations under “fair use” or EU law, but never to the exceptions and limitations under German copyright law.

Conclusions and outlook

In the conclusions of the paper, we raise the question of why larger platforms such as YouTube, Facebook and Instagram display a higher compliance with the OCSSP Act than comparatively small content sharing platforms. Furthermore, we note that it has become apparent that different Member State implementations and generally uncertain legal circumstances on the EU level impair the willingness of OCSSPs to take measures. It remains to be seen whether the decision of the CJEU will have any noticeable impact on the platform-side implementation. Lastly, the study brings to light the consequences of the lack of sanctions for failure to implement the user rights, in particular the regime regarding “uses presumably authorised by law”, i.e., minor or pre-flagged uses, in the German OCSSP Act.

In its essence, the study can serve as a starting point for further research. While the findings of the study reflect the changes of platform policies and primarily offer a text-based evaluation, they may provide incentives to investigate the upload process and other functionalities of the service providers further. More in-depth research on the legal and practical aspects of the new era of platform regulation is necessary to close the gap in legal doctrinal research on the implementation of Art. 17 CDSMD on platform level. The recently published decision of the CJEU and its emphasis on sufficient user safeguards adds fuel to this fire.

Implementation Imperatives for Article 17 CDSM Directive
https://communia-association.org/2022/10/24/implementation-imperatives-for-article-17-cdsm-directive/
Mon, 24 Oct 2022

COMMUNIA and Gesellschaft für Freiheitsrechte co-hosted the Filtered Futures conference on 19 September 2022 to discuss fundamental rights constraints of upload filters after the CJEU ruling on Article 17 of the Directive on Copyright in the Digital Single Market (CDSMD). This blog post is based on the author’s contribution to the conference’s first session “Fragmentation or Harmonisation? The impact of the Judgement on National Implementations.” It first appeared on Kluwer Copyright Blog and is here published under a Creative Commons Attribution 4.0 International licence (CC BY 4.0).

The adoption of the CDSM Directive marked several turning points in EU copyright law. Chief amongst them is the departure from the established liability exemption regime for online content-sharing service providers (OCSSPs), a type of platform that (at the time) was singled out from the broader category of information society service providers regulated for the last 20 years by the E-Commerce Directive. To address a very particular problem – the value gap – Article 17 of the CDSM Directive changed (arguably, after the ruling in YouTube/Cyando, C-682/18) the scope of existing exclusive rights, introduced new obligations for OCSSPs, and provided a suite of safeguards to ensure that the (fundamental) rights of users would be respected. The attempt to square this triangle resulted in a monstrosity of a provision. In wise anticipation of the difficulties Member States would face in implementing, and OCSSPs in operationalizing, Article 17, the provision itself foresees a series of stakeholder dialogues, which were held in 2019 and 2020. Simultaneously, the drama was building up with a challenge launched by the Republic of Poland against important parts of Article 17. Following the conclusion of all these processes, and while (some) Member States are considering reasonable ways to transpose Article 17 into their national laws, it is time to take stock and to look ahead. The ruling in Poland v Parliament and Council (C-401/19) is a good starting point for such an exercise.

Shortly after the CDSM Directive was adopted, the Republic of Poland sought to annul those parts of Article 17 which it argued, and the Court later confirmed, effectively require OCSSPs to prevent (i.e. filter and block) user uploads. Preventive control of user uploads constitutes, the Court confirmed, a limitation of the right to freedom of expression as protected under Article 11 of the EU Charter of Fundamental Rights. In its argumentation Poland suggested that it might not be possible to cut up Article 17, a provision of ten lengthy paragraphs. Indeed, the intricate relations between specific obligations, new liability structures and substantive and procedural user safeguards cannot be seen in isolation, and therefore require a global assessment. And this is what the Court embarked upon. Whilst assessing the constitutionality of Article 17, the Luxembourg judges, in passing, provided some valuable insights into how a fundamental rights compliant transposition might look, but also left crucial questions unanswered.

The Court was very clear that any implementation of Article 17 – as a whole – must respect the various fundamental rights that are affected. However, the judges in Luxembourg did not go into detail on how Member States should transpose the provision. Of course, it is part of the nature of directives that Member States have a certain margin of discretion as to how the objectives pursued by a directive will be achieved. To complicate matters, the Court stated expressly that the limitation on the exercise of the right to freedom of expression contained in Article 17 was formulated in a ‘sufficiently open’ way so as to keep pace with technological developments. Together with the intricate structure of Article 17 itself, this openness tasks Member States with a difficult exercise: to arrive at a transposition of Article 17 (and of course also other provisions of the CDSM Directive) that achieves the objective pursued while respecting the various fundamental rights affected (see Geiger/Jütte). The various national implementation processes have already demonstrated that opinions differ as to what constitutes a fundamental rights compliant implementation, and proposals have been made at different points of the spectrum between rightsholder-friendly and user-friendly transpositions.

The Court’s ruling makes a few very important statements in this respect and thereby sets the guardrails beyond which national transpositions should not venture. First and foremost, control of user uploads must be strictly targeted at unlawful uses without affecting lawful uses. Recognizing that the employment of online filters is necessary to ensure the effective protection of intellectual property rights, the ruling highlights the various safeguards that ensure that the limitation of the right to freedom of expression is a proportionate one. Implicit here is that the targeted filtering of user uploads is a limitation of this fundamental right that necessitates six distinct measures, which, in concert, ensure respect for the rights of users on OCSSP platforms.

Targeted Filtering

To limit the negative effects of overblocking and overzealous enforcement, any intervention by OCSSPs must be aimed at bringing the infringement to an end and should not interfere with the rights of other users to access information on such services. This suggests that it must be clear that only unlawful or infringing content is targeted. The Court elaborates that within the context of Article 17 only filters which can adequately distinguish between lawful and unlawful content, without requiring OCSSPs to make a separate legal assessment, are appropriate. This is problematic, since copyright infringements are context-sensitive, in particular in relation to potentially applicable exceptions and limitations. Requiring rightholders to obtain court-ordered injunctions before content can be subject to preventive filtering and blocking seems unreasonable, considering the amount of information uploaded onto online platforms. On the one hand, copyright infringements are different from instances of defamation or other offensive speech, which was the subject of the preliminary reference in Glawischnig-Piesczek (C-18/18). On the other, the requirement of targeted filtering seems to eliminate the proposal made by the Commission in its Guidance (see Reda/Keller) to allow rightholders to earmark commercially sensitive content, which could be subject to preventive filtering for a certain limited amount of time. Unfortunately, the Court does not elaborate further on how OCSSPs can target their interventions. It merely states that rightholders must provide OCSSPs with the relevant and necessary information on unlawful content, the failure to remove which would trigger liability under Article 17(4)(b) and (c). What this information must contain remains unclear. Arguably, rightholders must make it very clear that certain uploads are indeed infringing.

User Safeguards

In terms of substantive safeguards, Article 17 takes a frugal approach. To avoid speech being unduly limited, it requires Member States to ensure that certain copyright exceptions are available to users of OCSSPs. These exceptions are already contained in Article 5 of the Information Society Directive as optional measures, but Article 17(7) makes them mandatory (see Jütte/Priora). Arguably, not much changes with the restatement of existing exceptions (now in mandatory form). Moreover, the danger of context-insensitive automated filtering persists, even though users enjoy these substantive rights.

Therefore, Article 17 foresees procedural safeguards, which is where the balance in Article 17 is struck. With the Court having confirmed that preventive filtering is an extreme limitation of freedom of expression, the importance of the procedural safeguards cannot be overstated (see Geiger/Jütte). The ruling, in its relevant parts, can be described as anticlimactic. Instead of describing how effective user safeguards must be designed, the judgment merely underlines that these safeguards must be implemented in a way that ensures a fair balance between fundamental rights. How such a balance can be achieved was demonstrated in summer 2022, when the Digital Services Act (which is still to be formally adopted) took shape. A horizontally applicable regulation that amends the E-Commerce Directive, the DSA puts flesh on the bones that the CDSM Directive so clumsily constructed into a normative skeleton.

The DSA provides far more detailed procedural safeguards than the CDSM Directive. It is also lex posterior to the CDSM Directive, which is, in turn, lex specialis to the DSA. Their relation, but arguably also their genesis, holds the key to outlining not necessarily the solution to the CDSM conundrum, but the questions that national legislators must ask.

The DSA sets out how hosting services and online platforms must react to notifications of unlawful content and how they must handle complaints internally; it also outlines a system for out-of-court dispute settlement, which requires certification by an external institution. Some of these elements are, in embryonic form as mere abstract obligations, already contained in Article 17 of the CDSM Directive. And by definition, OCSSPs are hosting providers and online platforms in the parlance of the DSA, which is why these rules should also apply to them. OCSSPs, however, incur special obligations and are excluded from the general liability exemption of the E-Commerce Directive and (soon) the DSA, because they are more disruptive: of the use of works and other subject matter protected by copyright, and of the rights of users, the latter as a result of obligations incurred because of the former.

That some of the rules introduced by the DSA must also apply to OCSSPs has been argued elsewhere (see Quintais/Schwemer). It has been suggested that the rules of the DSA should apply in areas in which the DSA leaves Member States a margin of discretion or where the CDSM Directive is silent. CDSM rules that derogate from those of the DSA, specifically the absence of a liability exemption for user-uploaded content, will certainly remain unaffected. But there are good arguments to be made why, in areas of overlapping scope, the DSA should prevail or systematically supplement the CDSM Directive. In any event, the DSA rules must form the floor of safeguards that Member States have to provide, and that floor should be elevated in relation to the activities of OCSSPs. The reason is a shifting of the balance between the fundamental rights concerns, which relates to the last paragraph of the CJEU’s ruling in Poland v Parliament and Council. The obligation to proactively participate in the enforcement of copyright intensifies the intervention of OCSSPs, as opposed to the merely reactive intervention required under the rules of the DSA. The effects for rightholders are beneficial (although the rationale for Article 17 suggests that it addresses a technological injustice) in the sense that OCSSPs must intervene in a higher volume of cases; the negative effects are borne by users of platforms, whose rights are limited as a result. Arguably, this must be balanced by a higher level of protection for users, in this case in the form of stronger and more robust procedural safeguards. A further elevation of substantive safeguards would in itself not be helpful, since their enjoyment effectively relies on procedural support.

As a result, Member States should, or even must, ensure that the elaboration of user safeguards in the form of internal complaint mechanisms and out-of-court dispute settlement mechanisms finds concrete expression in their national transpositions. These safeguards should be more robust than those provided by the DSA. Ideally, this robustness will be written into national laws and not left to be defined by OCSSPs as executors of the indecisiveness of legislators. The difficulty lies, of course, in the uncertainty of technological progress, the development of user behaviour and the rise and fall of platforms and their business models. Admittedly, some sort of flexibility is necessary; the Court has recognized this explicitly. But if the guardrails that guarantee compliance with fundamental rights are not, and possibly cannot be, written into the law, they must be determined by another independent institution. One such institution, understood more broadly, could be the stakeholder dialogues required pursuant to Article 17(10) CDSM Directive. The Court listed them as one of the applicable safeguards, and a continuous dialogue between OCSSPs, rightholders and users could serve to define these guardrails more concretely. Such a trilogue, however, must take as its point of departure the ruling in C-401/19 and learn from the misguided first round of stakeholder dialogues.

Other institutions that could progressively develop standards for user safeguards are formal independent bodies, such as the Digital Services Coordinators (DSCs) required under the DSA. Under the DSA, DSCs have the task, amongst others, of certifying out-of-court dispute settlement institutions and awarding the status of trusted flagger. Their tasks could also include general supervision and auditing of OCSSPs with regard to their obligations arising not only under the DSA, but also under the CDSM Directive. In the context of the Directive, DSCs could also be tasked with developing and supervising a framework for rightholders as ‘trusted flaggers’ on online content-sharing platforms. The role of DSCs under the DSA and their potential role in relation to OCSSPs is still unclear. But the reluctance of the legislature to give substance to safeguards and to clarify the relation between the DSA and the CDSM Directive mandates that the delicate task of reconciling the reasonable interests of rightholders and the equally important and vulnerable interests of users be managed by independent arbiters. Leaving this mitigation to platform-based complaint mechanisms or independent dispute settlement institutions is an easy solution. A constitutionally sound approach would try to resolve these fundamental conflicts at an earlier stage instead of making private operators the guardians of freedom of expression and other fundamental rights. While the Court of Justice of the European Union has not stated this expressly, its final reference to the importance of implementing and applying Article 17 in light of, and with respect for, fundamental rights can be understood as a warning not to take the delegation of sovereign tasks too lightly.

The post Implementation Imperatives for Article 17 CDSM Directive appeared first on COMMUNIA Association.

The Filtered Futures conference programme is now live https://communia-association.org/2022/08/31/the-filtered-futures-conference-programme-is-now-live/ Wed, 31 Aug 2022 08:18:51 +0000

COMMUNIA and Gesellschaft für Freiheitsrechte are pleased to announce the detailed programme of the Filtered Futures conference.

Taking place on Monday, September 19th, in Berlin at Robert Bosch Stiftung, Filtered Futures will discuss the consequences of the CJEU ruling on Article 17 of the Copyright Directive for fundamental rights. 

Registration for in-person attendance is now closed. It will be possible to follow the live stream of the conference here.

After the closing of the conference, COMMUNIA will be hosting a networking reception from 17:00 to 19:00.

Programme

08:45-09:15 Door Opening

09:15-09:45 Welcome and Opening Remarks by Susanne Zels (Robert Bosch Stiftung) and Felix Reda (GFF – Society for Civil Rights)

10:00-12:00 Session 1: Fragmentation or Harmonization? The impact of the Judgment on National Implementations – While the CJEU has rejected the Polish challenge to Article 17, the Court has formulated a number of requirements for ensuring that national implementations are fundamental rights compliant. In this light, the opening session of the conference will examine the consequences of the judgment for Member States’ implementations of Article 17. What are the requirements established by the judgment for national legislators? How do the existing national implementations measure up to these requirements? Which implementation strategies are available to those member states that still have to implement the directive? And have platforms already reacted to the existing national implementations?

  • Bernd Justin Jütte (​​University College Dublin): Imperatives for implementing Article 17: the importance of national implementations.
  • Finn Hümmer (Stockholm University): Implications from C-401/19 for national transpositions under the light of freedom of expression.
  • Jasmin Brieske (Goethe University Frankfurt am Main): The impact of the enactment of the German OCSSP Act on selected online platforms.
  • Christina Angelopoulos (University of Cambridge): The national implementations of Article 17 of the EU’s CDSM Directive.
  • Moderator: Paul Keller (COMMUNIA)

12:00-13:30 Lunch Break

13:30-15:00 Session 2: Balancing Enforcement & Usage Rights in Practice – Protecting legal forms of expression from automated blocking decisions by online platforms is not just a task for the national legislators when transposing Article 17, but also a question of implementation of those provisions by regulators and courts. Who is going to ensure that filtering systems will leave legal uses of copyright-protected works unaffected in practice? How can the balance of competing rights be enforced in cross-border situations? How does the ban on general monitoring obligations as interpreted by the CJEU constrain the content moderation obligations of platforms – in the context of Article 17, but also when applied to other types of illegal content? Will the Digital Services Act improve users’ access to effective remedies against over-blocking?

  • Natasha Mangal (University of Strasbourg): Regulating Creativity Online: Proposal for an EU Copyright Institution.
  • Daniel Holznagel (academia): Don’t touch the ceiling – Why we should not narrow the EU no-monitoring-obligation-rules.
  • Martin Husovec (London School of Economics): Mandatory Filtering Does Not Always Violate Freedom of Expression: Lessons from Poland v Council and European Parliament.
  • Moderator: Felix Reda (GFF – Society for Civil Rights)

15:00-15:30 Coffee Break

15:30-17:00 Session 3: Beyond the Judgment: The Future of Freedom of Expression – In its ruling, the CJEU was of the view that the procedural safeguards present in Article 17 protect the ‘essence’ of the right to freedom of expression of the users of online sharing platforms. But many argue that filtering mechanisms can still pose real risks to fundamental freedoms and to the flourishing of parodies, caricatures and pastiche. Is the CJEU’s classical approach to proportionality balancing apt in a filtered online environment? Do we need a new conceptualisation of the ‘essence’ of fundamental rights? Can the case law from the CJEU and the European Court of Human Rights on freedom of expression offer avenues to secure the future of parodic uses? Finally, are we moving towards a European Right to Remix?

  • Kevin O’Sullivan (Dublin City University): A new conceptualisation of the ‘essence’ of fundamental rights.
  • Sabine Jacques (University of East Anglia, Law School): A two-tier system for freedom of expression.
  • Till Kreutzer (iRights.Law): Towards a European Right to Remix (?) – On the new Pastiche exception in the German Copyright Act.
  • Moderator: Teresa Nobre (COMMUNIA)

17:00-19:00 Reception hosted by COMMUNIA

*All times are indicated in CEST.

The post The Filtered Futures conference programme is now live appeared first on COMMUNIA Association.

Join us for the Filtered Futures conference on 19 September 2022 https://communia-association.org/2022/08/05/join-us-for-the-filtered-futures-conference-on-19-september-2022/ Fri, 05 Aug 2022 07:57:44 +0000

On September 19th, 2022, we are organising — together with Gesellschaft für Freiheitsrechte — the Filtered Futures conference on fundamental rights constraints of upload filters after the CJEU ruling on Article 17 of the copyright directive.

The CJEU decision on Article 17 of the copyright directive has defined a framework for the use of automated content moderation. The Court considers filtering obligations compatible with the right to freedom of expression and information as long as they are limited to use cases that allow for a robust automated distinction between legal and illegal content. In the context of Article 17, upload filters may therefore only be used by online platforms to block manifest infringements of copyright law. The Court leaves it up to the Member States to ensure that legal uses remain unaffected by their national transpositions of Article 17.

The judgment raises a host of important questions for the enforcement of copyright law as well as for the compatibility of upload filters with fundamental rights even beyond copyright law. To discuss these consequences, COMMUNIA and Gesellschaft für Freiheitsrechte are jointly hosting the Filtered Futures conference on Monday, September 19th, at Robert-Bosch-Stiftung in Berlin. Please see below for the preliminary conference programme. A more detailed version of the programme with session descriptions will follow in early September.

Registrations for attending the conference in person are now open. Please note that participation is limited. Registrations will be considered on a first-come, first-served basis.

CONFERENCE PROGRAMME:

08:45-09:15 Registration

09:15-09:45 Opening remarks (Felix Reda, GFF)

09:45-10:00 Coffee break

10:00-12:00 Session 1: Fragmentation or Harmonization? Impact of the Judgment on National Implementations (Christina Angelopoulos, Jasmin Brieske, Finn Hümmer, Bernd Justin Jütte. Chair: Paul Keller, COMMUNIA)

12:00-13:30 Lunch break

13:30-15:00 Session 2: Balancing Copyright & Usage Rights in Practice (Daniel Holznagel, Martin Husovec, Natasha Mangal. Chair: Felix Reda, GFF)

15:00-15:30 Coffee break

15:30-17:00 Session 3: Beyond filters: Impacts of the Judgment on Freedom of Expression (Sabine Jacques, Till Kreutzer, Kevin O’Sullivan. Chair: Teresa Nobre, COMMUNIA)

17:00-19:00 COMMUNIA Reception

Participation is free of charge and a light lunch will be served.

The post Join us for the Filtered Futures conference on 19 September 2022 appeared first on COMMUNIA Association.

Filtered Futures: a Conference to examine upload filters after the CJEU ruling on Art. 17 https://communia-association.org/2022/06/14/filtered-futures-a-conference-to-examine-upload-filters-after-the-cjeu-ruling-on-art-17/ Tue, 14 Jun 2022 07:09:37 +0000

The recent CJEU decision on Article 17 of the copyright directive has defined a framework for the use of automated content moderation. The Court considers filtering obligations compatible with the right to freedom of expression and information as long as they are limited to use cases that allow for a robust automated distinction between legal and illegal content. In the context of Article 17, upload filters may therefore only be used by online platforms to block manifest infringements of copyright law. The Court leaves it up to the Member States to ensure that legal uses remain unaffected by their national transpositions of Article 17.

The judgment raises a host of important questions for the enforcement of copyright law as well as for the compatibility of upload filters with fundamental rights even beyond copyright law. To discuss these consequences, together with Gesellschaft für Freiheitsrechte we are jointly organizing the “Filtered Futures” conference on Monday, September 19th 2022, in Berlin.

We are inviting papers from all disciplines that contribute to the conference theme. To present your work at Filtered Futures, please complete the submission form by July 10th, 2022. The form asks for a short abstract of your talk. All applicants will be notified by July 22nd, 2022.

In addition, we will offer an opportunity to present your work to a broader audience through the COMMUNIA website or a dedicated publication.

It will be possible for a limited number of people to attend the conference without presenting their work. Please request participation by emailing: uploadfilter@freiheitsrechte.org

Participation will be free of charge. A light lunch will be served. A limited budget to support travel and accommodation expenses for presenters is available.

Possible topics for conference contributions include:

  1. The impact of the ruling on existing national implementations of Article 17:
    1. How are verbatim implementations to be interpreted?
    2. Does the Court mandate or enable a harmonized EU-wide technical implementation of Article 17 by platforms?
    3. Do any national implementations violate the standards set by the ruling?
    4. What role will the Commission guidance play in application of Article 17?
  2. Rights and obligations of rights holders and users:
    1. standards for “information provided by rightsholders”
    2. enforcement of user rights
    3. measures against misuse of copyright enforcement tools
    4. sanctions for non-compliance beyond platform liability?
  3. Minimum fundamental rights safeguards for the use of upload filters:
    1. different standards for voluntary (based on terms and conditions) and mandatory filtering by platforms?
    2. Do filters sufficiently distinguish between legal and illegal uses?
    3. ex-ante safeguards for use of upload filters
  4. Impacts on the relationship of Article 17 to other norms:
    1. intermediary liability for platforms that don’t qualify as OCSSPs
    2. Digital Services Act
    3. other sector-specific content regulation (TERREG, protection of minors)
  5. Implications of the ruling on CJEU freedom of expression case-law:
    1. prior restraint and its necessary safeguards
    2. scope of ban on general monitoring obligations

The post Filtered Futures: a Conference to examine upload filters after the CJEU ruling on Art. 17 appeared first on COMMUNIA Association.

Video recording of the COMMUNIA Salon on the CJEU decision on Article 17 https://communia-association.org/2022/05/04/video-recording-of-the-communia-salon-on-the-cjeu-decision-on-article-17/ Wed, 04 May 2022 15:12:25 +0000

On the 28th of April, we hosted the second COMMUNIA Salon of 2022 to discuss the implications of the CJEU judgment in Case C-401/19, which rejected the request of the Polish government to annul Article 17 and confirmed that this provision can be reconciled with the right to freedom of expression provided that certain users rights safeguards are in place.

The Salon started with João Pedro Quintais (Assistant Professor at the Institute for Information Law (IViR), University of Amsterdam), who presented an overview of the case and, according to his preliminary reading, the three main takeaways of the judgment. First, the Court clarified that Article 17 follows a normative hierarchy, where the obligation of result to protect user rights or freedoms takes precedence over the obligations of best efforts that exist for preventive measures. Second, the ruling makes it clear that ex-post procedural safeguards are insufficient to deal with overblocking; ex-ante safeguards are also required to protect user rights or freedoms. Finally, with regard to filtering measures, it appears that it will be difficult to argue that the judgment leads to a conclusion that is different from the AG Opinion, according to which only manifestly infringing content can be blocked at upload.

Next, Marco Giorello (Head of the European Commission’s Copyright Unit at DG CONNECT) shared his first insights on the judgment. Giorello started by saying that the Commission was satisfied that the Court had not only confirmed the validity of Article 17 but had also largely confirmed the interpretation of the provision brought forward by the Commission. He highlighted that, since the judgment did not define how exactly national legislators have to implement Article 17, the Commission’s guidelines for the implementation of Article 17 (which Giorello could not yet confirm whether the Commission would revise in light of the judgment) could help legislators, courts and market players get a sense of what could be a practical way of implementing the general principles drawn by the CJEU. Finally, he added that, while it is not possible to draw firm conclusions on what the judgment means for the Member States’ implementations (namely, whether they could implement Article 17 literally), it is very clear that ex-post redress mechanisms are not enough and that there needs to be ex-ante consideration of user rights, leading to the distinction between lawful and unlawful content at upload.

The third speaker, Felix Reda (former MEP and Control © project lead at the Gesellschaft für Freiheitsrechte), started by highlighting that, given that the CJEU had already confirmed that, under certain circumstances, automated content recognition technologies can or should be used, he was quite happy with the outcome of the judgment: the Court now sets specific requirements for upload filters, namely that they cannot be used unless they can ensure that lawful content does not get blocked, which is a very high bar to meet. Reda then focused his intervention on the question of who has to define the ex-ante safeguards against overblocking. In his view, the platforms cannot be the ones defining the technical parameters of the upload filters. According to Reda’s reading of the judgment, this follows logically from the conclusion that platforms cannot be required to employ upload filters that do not adequately distinguish between legal and illegal content, together with the conclusion that they cannot be required to make an independent assessment of the lawfulness of the content at upload. As a consequence, verbatim implementations of Article 17 appear not to be enough: Member States need to define ex-ante safeguards in the law (or, possibly, in secondary legislation).

Finally, Eliška Pírková (Global Freedom of Expression Lead at Access Now) presented the civil society and fundamental rights perspective and connected the discussion with the recently finalized Digital Services Act. Pírková started by recalling that civil society has for many years challenged the deployment of upload filters because they impose ex-ante restrictions on legal forms of expression. Still, since upload filters are a reality, she welcomed the fact that online platforms do not have to turn into judges of the legality of uploaded content, that filtering systems must be able to recognize, and not automatically block, lawful content, and that ex-ante safeguards for the fundamental rights of users are required. She then turned to discussing the relationship between the horizontal umbrella framework provided by the DSA and the sectoral legislation that precedes it, such as Article 17 of the DSM Directive.

The panel was followed by a Q&A session with the participants.

The post Video recording of the COMMUNIA Salon on the CJEU decision on Article 17 appeared first on COMMUNIA Association.
