Implementation Imperatives for Article 17 CDSM Directive

COMMUNIA and Gesellschaft für Freiheitsrechte co-hosted the Filtered Futures conference on 19 September 2022 to discuss fundamental rights constraints of upload filters after the CJEU ruling on Article 17 of the Directive on Copyright in the Digital Single Market (CDSMD). This blog post is based on the author’s contribution to the conference’s first session “Fragmentation or Harmonisation? The impact of the Judgement on National Implementations.” It first appeared on Kluwer Copyright Blog and is here published under a Creative Commons Attribution 4.0 International licence (CC BY 4.0).

The adoption of the CDSM Directive marked several turning points in EU copyright law. Chief amongst them is the departure from the established liability exemption regime for online content-sharing service providers (OCSSPs), a type of platform that (at the time) was singled out from the broader category of information society service providers regulated for the last 20 years by the E-Commerce Directive. To address a very particular problem – the value gap – Article 17 of the CDSM Directive changed (arguably, after the ruling in YouTube/Cyando, C-682/18) the scope of existing exclusive rights, introduced new obligations for OCSSPs, and provided a suite of safeguards to ensure that the (fundamental) rights of users would be respected. The attempt to square this triangle resulted in a monstrosity of a provision. In wise anticipation of the difficulties Member States would face in implementing, and OCSSPs in operationalizing, Article 17, the provision itself foresees a series of stakeholder dialogues, which were held in 2019 and 2020. Simultaneously, the drama was building up with a challenge launched by the Republic of Poland against important parts of Article 17. Following the conclusion of all these processes, and while (some) Member States are considering reasonable ways to transpose Article 17 into their national laws, it is time to take stock and to look ahead. The ruling in Poland v Parliament and Council (C-401/19) is a good starting point for such an exercise.

Shortly after the CDSM Directive was adopted, the Republic of Poland sought to annul those parts of Article 17 which, it argued, and the Court later confirmed, effectively require OCSSPs to prevent (i.e. filter and block) user uploads. Preventive control of user uploads constitutes, the Court confirmed, a limitation of the right to freedom of expression as protected under Article 11 of the EU Charter of Fundamental Rights. In its argumentation, Poland suggested that it might not be possible to cut up Article 17, a provision of ten lengthy paragraphs. Indeed, the intricate relations between specific obligations, new liability structures and substantive and procedural user safeguards cannot be seen in isolation, and therefore require a global assessment. And this is what the Court embarked upon. Whilst assessing the constitutionality of Article 17, the Luxembourg judges, in passing, provided some valuable insights into how a fundamental rights compliant transposition might look, but also left crucial questions unanswered.

The Court was very clear that any implementation of Article 17 – as a whole – must respect the various fundamental rights that are affected. However, the judges in Luxembourg did not go into detail on how Member States should transpose the provision. Of course, it is part of the nature of directives that Member States have a certain margin of discretion as to how the objectives pursued by a directive will be achieved. To complicate matters, the Court stated expressly that the limitation on the exercise of the right to freedom of expression contained in Article 17 was formulated in a ‘sufficiently open’ way so as to keep pace with technological developments. Together with the intricate structure of Article 17 itself, this openness tasks Member States with a difficult exercise: to arrive at a transposition of Article 17 (and of course also other provisions of the CDSM Directive) that achieves the objective pursued while respecting the various fundamental rights affected (see Geiger/Jütte). The various national implementation processes have already demonstrated that opinions differ as to what constitutes a fundamental rights compliant implementation, and proposals have been made at different points of the spectrum between rightholder-friendly and user-friendly transpositions.

The Court’s ruling makes a few very important statements in this respect and thereby sets the guardrails beyond which national transpositions should not venture. First and foremost, control of user uploads must be strictly targeted at unlawful uses without affecting lawful uses. Recognizing that the employment of online filters is necessary to ensure the effective protection of intellectual property rights, the ruling highlights the various safeguards that make the limitation of the right to freedom of expression a proportionate one. Implicit here is that the targeted filtering of user uploads is a limitation of this fundamental right which requires six distinct measures to be put in place, measures which, in concert, ensure respect for the rights of users on OCSSP platforms.

Targeted Filtering

To limit the negative effects of overblocking and overzealous enforcement, any intervention by OCSSPs must be aimed at bringing the infringement to an end and should not interfere with the rights of other users to access information on such services. This suggests that it must be clear that only unlawful or infringing content is targeted. The Court elaborates that, within the context of Article 17, only filters which can adequately distinguish between lawful and unlawful content, without requiring OCSSPs to make a separate legal assessment, are appropriate. This is problematic, since copyright infringements are context-sensitive, in particular in relation to potentially applicable exceptions and limitations. Requiring rightholders to obtain court-ordered injunctions before content can be subject to preventive filtering and blocking seems unreasonable, considering the amount of information uploaded onto online platforms. On the one side, copyright infringements are different from instances of defamation or other offensive speech, which was the subject of the preliminary reference in Glawischnig-Piesczek (C-18/18). On the other, the requirement of targeted filtering seems to eliminate the proposal made by the Commission in its Guidance (see Reda/Keller) to allow rightholders to earmark commercially sensitive content, which could be subject to preventive filtering for a certain limited amount of time. Unfortunately, the Court does not elaborate further on how OCSSPs can target their interventions. It merely states that rightholders must provide OCSSPs with the relevant and necessary information on unlawful content, the failure to remove which would trigger liability under Article 17(4)(b) and (c). What this information must contain remains unclear. Arguably, rightholders must make it very clear that certain uploads are indeed infringing.

User Safeguards

In terms of substantive safeguards, Article 17 takes a frugal approach. To avoid speech being unduly limited, it requires Member States to ensure that certain copyright exceptions are available to users of OCSSPs. These exceptions are already contained in Article 5 of the Information Society Directive as optional measures, but Article 17(7) makes them mandatory (see Jütte/Priora). Arguably, not much changes with the introduction of existing exceptions (now in mandatory form). Moreover, the danger of context-insensitive automated filtering persists, even though users enjoy these substantive rights.

Therefore, Article 17 foresees procedural safeguards, and it is here that the balance in Article 17 is struck. With the Court having confirmed that preventive filtering is an extreme limitation of freedom of expression, the importance of the procedural safeguards cannot be overstated (see Geiger/Jütte). The ruling in its relevant parts can be described as anticlimactic. Instead of describing how effective user safeguards must be designed, the judgment merely underlines that these safeguards must be implemented in a way that ensures a fair balance between fundamental rights. How such a balance can be achieved was demonstrated in summer 2022, when the Digital Services Act (which is still to be formally adopted) took shape. A horizontally applicable regulation that amends the E-Commerce Directive, the DSA puts flesh on the bones of the normative skeleton that the CDSM Directive so clumsily constructed.

The DSA provides far more detailed procedural safeguards than the CDSM Directive. It is also lex posterior to the latter, which is, however, lex specialis to the former. Their relation, but arguably also their genesis, holds the key not necessarily to the solution of the CDSM conundrum, but to the questions that national legislators must ask.

The DSA sets out how hosting services and online platforms must react to notifications of unlawful content and how they must handle complaints internally; the DSA also outlines a system for out-of-court dispute settlement, which requires certification by an external institution. Some of these elements are already contained, in embryonic form as mere abstract obligations, in Article 17 of the CDSM Directive. And by definition, OCSSPs are hosting providers and online platforms in the parlance of the DSA, which is why these rules should also apply to them. OCSSPs, however, incur special obligations and are excluded from the general liability exemptions of the E-Commerce Directive and (soon) the DSA, because they are more disruptive – of the use of works and other subject matter protected by copyright, and of the rights of users, the latter as a result of obligations incurred because of the former.

That some of the rules introduced by the DSA must also apply to OCSSPs has been argued elsewhere (see Quintais/Schwemer). It has been suggested that the rules of the DSA should apply in areas in which the CDSM Directive leaves Member States a margin of discretion or where it is silent. CDSM rules that derogate from those of the DSA, specifically the absence of a liability exemption for user-uploaded content, will certainly remain unaffected. But there are good arguments to be made why, in areas of overlapping scope, the DSA should prevail or systematically supplement the CDSM Directive. At the very least, the DSA rules must form the floor of safeguards that Member States have to provide, a floor which should be raised in relation to the activities of OCSSPs. The reason is a shift in the balance between the fundamental rights concerned, which relates to the last paragraph of the CJEU’s ruling in Poland v Parliament and Council. The obligation to proactively participate in the enforcement of copyright intensifies the intervention of OCSSPs, as opposed to the merely reactive intervention required under the rules of the DSA. The effects for rightholders are beneficial (although the rationale for Article 17 suggests that it addresses a technological injustice) in the sense that OCSSPs must intervene in a higher volume of cases; the negative effects are borne by users of platforms, whose rights are limited as a result. Arguably, this must be balanced by a higher level of protection of users, in this case in the form of stronger and more robust procedural safeguards. A further elevation of substantive safeguards would in itself not be helpful, since their enjoyment relies effectively on procedural support.

As a result, Member States should, or even must, ensure that user safeguards in the form of internal complaint mechanisms and out-of-court dispute settlement mechanisms find concrete expression in their national transpositions. These safeguards should be more robust than those provided by the DSA. Ideally, this robustness will be written into national laws and not left to be defined by OCSSPs as executors of the indecisiveness of legislators. The difficulty lies, of course, in the uncertainty of technological progress, the development of user behaviour and the rise and fall of platforms and their business models. Admittedly, some sort of flexibility is necessary, as the Court has recognized explicitly. But if the guardrails that guarantee compliance with fundamental rights are not, and possibly cannot be, written into the law, they must be determined by another independent institution.

One such institution, understood more broadly, could be the stakeholder dialogues required pursuant to Article 17(10) CDSM Directive. The Court listed them as one of the applicable safeguards, and a continuous dialogue between OCSSPs, rightholders and users could serve to define these guardrails more concretely. Such a trilogue, however, must take as its point of departure the ruling in C-401/19 and learn from the misguided first round of stakeholder dialogues. Standards for user safeguards could also be progressively developed by formal independent institutions, such as the Digital Services Coordinators (DSCs) required under the DSA. Under the DSA, they have the task, amongst others, of certifying out-of-court dispute settlement institutions and awarding the status of trusted flagger. Their tasks could also include general supervision and auditing of OCSSPs with regard to their obligations arising not only under the DSA, but also under the CDSM Directive. In the context of the Directive, DSCs could also be tasked with developing and supervising a framework for rightholders as ‘trusted flaggers’ on online content-sharing platforms.

The role of DSCs under the DSA and their potential role in relation to OCSSPs are still unclear. But the reluctance of the legislature to give substance to safeguards and to clarify the relation between the DSA and the CDSM Directive mandates that the delicate task of reconciling the reasonable interests of rightholders with the equally important and vulnerable interests of users be managed by independent arbiters. Leaving this mitigation to platform-based complaint mechanisms or independent dispute settlement institutions is an easy solution. A constitutionally sound approach would try to solve these fundamental conflicts at an earlier stage instead of making private operators the guardians of freedom of expression and other fundamental rights. While the Court of Justice of the European Union has not stated this expressly, its final reference to the importance of implementing and applying Article 17 in light of, and with respect for, fundamental rights can be understood as a warning not to take the delegation of sovereign tasks too lightly.
