On 15 October, the European Commission held the first of the stakeholder
dialogues, mandated by Article 17 of the EU copyright Directive,
inviting 65 organisations to help map current practices, and opening the
door for deeper collaboration in the future.
Organisations from all sides of the debate were able to present their
positions. While the first meeting focused on music, software and
gaming, the next one will focus on audiovisual, visual, sports and text.
These live-streamed dialogues are probably the last window of
opportunity at the EU level for those who campaigned against upload
filters in the copyright Directive to achieve the alleged goals of the
Directive (harmonisation and modernisation of the copyright framework)
without the collateral damage to citizens' liberties. If the dialogues
fail to achieve this, the battle will move to EU Member States.
The Copyright Directive was adopted in June 2019 as part of plans to
unite Europe's Digital Single Market, just over a year after the General
Data Protection Regulation (GDPR) was adopted, and in the midst of an
ongoing struggle over the proposed ePrivacy Regulation. The contentious
Directive was welcomed by rightsholders who were keen to see online
platforms take responsibility for copyright infringement, but it drew
criticism from civil society, academia, the UN Special Rapporteur on
Freedom of Expression David Kaye, and even Edward Snowden, for enabling
the removal of citizens' legal content by automatic filters.
"Techno-solutionism" as a knee-jerk reaction
Techno-solutionism describes attempts to solve any and all problems with
technology. The technologically-focused approach taken in the Directive
and advocated for by some rightsholders is the wrong solution for the
alleged problem (lack of negotiating power between rightsholders and
streaming services). The upload filters deriving from Article 17 are
severely error-prone (from cat purring being mistaken for copyrighted
music, to evidence of war crimes being lost) and do not understand the
full range of nuanced human expression, for example caricature, parody
or pastiche. This situation empowers tech giants, harms small and medium
enterprises, and fails to adequately protect authors. Furthermore,
Article 17(7) of the Directive offers only limited mandatory exceptions
for the use of content for quotation, parody or pastiche. Member States
still have the opportunity to go beyond these exceptions and make all
exceptions and limitations mandatory. However, the proposed automated
filters will not be able to analyse most of them; a more nuanced
approach to copyrighted content, including human supervision, will be
needed.
Violations and harms in the current situation
More than just theoretically flawed, the application of the copyright
Directive could lead to violation of freedoms. So-called "copyright
trolling" is a phenomenon used to either extort or censor individual
users. When implementing the Directive, Member States should enable
systems that penalise such abuses. Furthermore, the use of automated
filters may collide with Article 22 of the GDPR which gives the right to
data subjects not to be subject to a decision based solely on automated
processing if that decision significantly affects them. How this will be
dealt with in practice remains to be seen.
Fundamental incompatibility with the human right to redress
The right to redress is fundamental if this Directive is to avoid
collateral damage. The current redress mechanism has already been
shown to be inadequate: platforms are likely to invoke their Terms of
Service as an excuse to delete content rather than go through the
hassle of deciding whether an exception or limitation in the Directive
protects a user's right to use copyrighted content. We hope that
the non-judicial redress mechanisms mentioned in Article 17(9) are
easily and freely available to anyone needing them.
Reframing the debate to prevent violations of free expression
If the goal is indeed to target services that unfairly benefit from
authors' work, then the definition of Online Content Sharing Service
Providers (OCSSPs) must be made more specific. It has to better reflect
the few services that profit from large-scale copyright infringement to
the extent that they become alternatives to paid streaming services,
while failing to adequately remunerate rightsholders.
Another possible solution is to reverse the burden of proof so that
disputed content is not immediately removed. In essence, silence cannot
play to the disadvantage of citizens: if platforms ask rightsholders for
a licence, and the rightsholder does not react, this should mean that
the "best efforts" threshold to obtain a licence has been met by the
platform. If a rightsholder asks to block the content of a user and the
user claims that they were within their rights, the silence of the
rightsholder should imply that the disputed content stays or is
reinstated as soon as possible. In the case of disagreement in the
dispute, human intervention would be appropriate.
The next stakeholder meeting will be held on 5 November.
First meeting of the Stakeholder Dialogue on Art 17 of the Directive on
Copyright in the Digital Single Market (15.10.2019)
Organisation of a stakeholder dialogue on the application of Article 17
of Directive on Copyright in the Digital Single Market (28.08.2019)
All you need to know about copyright and EDRi (15.03.2019)
Copyfails: time to #fixcopyright! (23.05.2016)
Article 17 Stakeholder Dialogue: We'll Continue to Advocate for
Safeguarding User Rights (08.10.2019)
(Contribution by Ella Jakubowska, EDRi intern)