It is perhaps understandable that European lawmakers have large Internet companies in their sights. The Cambridge Analytica scandal and subsequent data breaches at Facebook led to admonishments and fines. Germany and France have passed laws on misinformation in the wake of findings that malicious actors could have exploited the way news stories are displayed and spread on sites like Twitter.
The recent and controversial vote to adopt the proposed Directive on Copyright in the Digital Single Market could be read as another act in a tale of European lawmakers holding American web giants to account. But when it comes to Article 13 [now 17], which concerns the liability of Internet platforms, the end result resembles a Wild West scenario in which the most powerful are likely to thrive and the rights of citizens will be less well protected. The Directive appears to reward large Internet incumbents by solidifying their competitive advantage, while neglecting the expressive freedom of European Internet users.
Before discussing the specifics of the Directive, it is important to understand why the current intermediary liability regime came to be balanced the way it is, and why the newly adopted Article 17 upsets that balance so significantly. By the late 1990s, the growth of the Internet had created a new problem for lawmakers. Many web services had grown to a scale at which it was impossible for their operators to inspect each individual message circulating on their systems. Some of those messages might contain material infringing the copyright of third parties. But without monitoring and inspection, who should be held liable for these infringements?
For the past 20 years, the Digital Millennium Copyright Act (DMCA) in the United States and the E-commerce Directive (2000/31/EC) in Europe have provided online service providers with a safe harbour from liability for infringements carried out by their users. A system of notice-and-takedown balanced the needs of copyright owners and platforms by requiring the latter to remove infringing content once notified by a rights holder. Due process was built in through rules about the specific content of notifications and counter-notifications, which placed obligations on the different parties. Over time, this evolved into an increasingly automated system in which ever-larger volumes of takedown requests are received and processed by machine.
Under the notice-and-takedown regime which has operated until now, a considerable burden is borne by rights holders. They must seek out, locate, and notify platforms about instances of infringement. This requires investment, and large media rights holders have invested heavily in copyright enforcement, either in-house or outsourced to third-party agents like Web Sheriff. At the same time, Internet platforms have had to invest in systems to receive and process takedown requests.
The current balance has some considerable benefits. It is easier to detect and remove an instance of straight piracy than to detect a transformative use which might benefit from a fair dealing exception, such as a parody or review. Because of the resource requirement, rights holders are incentivised to go after clear cases of infringement. The law is also balanced to require that rights holders consider exceptions and limitations to copyright before issuing a takedown request. Users who feel that their use is ‘fair’ may issue a counter-notice, although empirical research shows this feature has been under-used. The aim of protecting freedom of expression is built into the existing regime, although it could be improved further. Importantly, the current regime applies equally to different types and sizes of online intermediaries, and favours start-up innovation.
The newly adopted Copyright Directive changes the balance of intermediary liability and moves Europe in a different direction from the current global standard. First, Article 17 creates a new legal category of service providers: online content-sharing service providers (read: YouTube, Instagram). The Directive requires these providers to obtain a licence from the rights holder for any work uploaded by their users for which they do not already have authorisation. It is expected that providers will negotiate licence rates with major rights holders (such as music publishers and collective management organisations) to compensate owners for potentially infringing uploads. But for the many other uploads where the rights holder is unknown, service providers will be required to carry out a ‘best efforts’ diligent search for a rights holder from whom to request a licence, or be liable for infringement.
What about remixes, mash-ups and memes? The Directive contains language aimed at protecting new expressive uses covered by exceptions, such as parody, criticism and review, but the determination of whether a use benefits from an exception now falls to the platform operator. With incentives re-balanced this way, is an online content-sharing service provider likely to take a risk on a given piece of user-generated content, or to remove it and avoid liability? Another concern is the scale of uploads that providers must handle: human review of every upload to check whether a use is fair is practically impossible. So, if the law is implemented this way, the likely result will be algorithmic filtering of uploaded content, erring on the side of removal to avoid liability for user-generated content.
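To make the incentive problem concrete, here is a toy sketch, not a description of any platform’s actual filter: it treats a fingerprint match score as a rough probability of infringement and sets a blocking threshold by weighing an assumed liability cost against an assumed cost of wrongly removing lawful content. All names, scores and cost figures below are invented for illustration.

```python
# Toy model of an upload filter (illustrative assumptions throughout).
from dataclasses import dataclass


@dataclass
class Upload:
    title: str
    match_score: float  # 0.0-1.0 similarity to a fingerprinted work (hypothetical)


def decide(upload: Upload, block_threshold: float) -> str:
    """Block anything whose match score reaches the threshold; no human review."""
    return "block" if upload.match_score >= block_threshold else "publish"


# Assumed costs: hosting one infringing upload vs. wrongly removing one lawful remix.
COST_OF_LIABILITY = 100.0
COST_OF_OVERBLOCK = 1.0

# Treating the match score as a rough probability of infringement, a cost-minimising
# platform blocks whenever score * COST_OF_LIABILITY > (1 - score) * COST_OF_OVERBLOCK,
# i.e. above the threshold COST_OF_OVERBLOCK / (COST_OF_OVERBLOCK + COST_OF_LIABILITY).
threshold = COST_OF_OVERBLOCK / (COST_OF_OVERBLOCK + COST_OF_LIABILITY)  # ~0.01

uploads = [
    Upload("verbatim film rip", 0.97),
    Upload("three-second meme clip", 0.40),
    Upload("parody song using original backing track", 0.85),
]

for u in uploads:
    print(f"{u.title}: {decide(u, threshold)}")
# With liability priced this way, all three uploads are blocked,
# including the parody that an exception would likely cover.
```

Because nothing in the regime prices over-removal, the threshold in such a model collapses towards zero and the parody is filtered alongside the verbatim copy.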
It is easy to see how this law could prove remunerative for large holders of copyright. It is less clear how it will help smaller rights holders, who might never be found and might lack the resources to engage in enforcement. And beyond pitting large Internet platforms against the expressive behaviour of their own users, it is not necessarily bad news for the bottom line of Internet giants like YouTube. Because the video-sharing platform controls a large audience, it will be among the first stops for rights holders wishing to sell licences. It also controls a highly effective, industry-leading content detection system, and algorithmic filtering technology that can be deployed to satisfy the requirements of the new regime. This gives YouTube a competitive advantage over other service providers who operated comfortably under the notice-and-takedown regime but must now start from scratch.
Reading the proposed compromises to the draft Directive is illustrative. We see the European Parliament and the Council presidency grappling with many of the same balancing issues that gave us the current intermediary liability regime in the first place. Worried about the negative impact on digital innovators, negotiators introduced a clause to exclude smaller, early-stage start-ups from the requirement to license. But what should be the precise cut-off in terms of annual turnover? How many years should a service provider be permitted to operate under reduced liability? The final text of the Directive sets the threshold at €10 million annual turnover and three years of operation, with a provision that the Commission review the system after the law is implemented.
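For illustration only, the two figures above can be read as a simple eligibility test. The sketch below models nothing beyond the turnover and age thresholds named in the text; the Directive’s full test contains further conditions and definitions that are not captured here.

```python
# Illustrative only: the start-up carve-out thresholds named above
# (annual turnover below EUR 10 million, fewer than three years of operation).
# The Directive's full test contains further conditions not modelled here.

TURNOVER_CEILING_EUR = 10_000_000
MAX_YEARS_OF_OPERATION = 3


def qualifies_for_reduced_liability(annual_turnover_eur: float,
                                    years_of_operation: float) -> bool:
    """Return True only if the service stays under both thresholds."""
    return (annual_turnover_eur < TURNOVER_CEILING_EUR
            and years_of_operation < MAX_YEARS_OF_OPERATION)


# A two-year-old start-up with EUR 2m turnover qualifies; a five-year-old
# platform with EUR 50m turnover does not.
print(qualifies_for_reduced_liability(2_000_000, 2))   # True
print(qualifies_for_reduced_liability(50_000_000, 5))  # False
```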
Similarly, the negotiations focused on the definition of the ‘best efforts’ that service providers must make to locate rights holders and obtain licences. Rather than solving existing problems, the Directive creates a raft of new legal uncertainties. How is a rights holder to know whether a particular platform falls under the reduced liability regime? For platforms, what standard of diligence must be applied when searching for rights holders who have not given prior authorisation? Reliance on a vaguely defined diligent search standard in the Orphan Works Directive (2012/28/EU) gave cultural institutions little clarity about best practices and led to a sprawling patchwork of national standards, which in turn increased transaction costs for would-be digitisers of cultural heritage, the very effect lawmakers sought to avoid.
There has been broad opposition to Article 13 [17] from law and economics scholars, open knowledge advocates, innovators and users. By changing the balance of intermediary liability, the EU has rejected the legal certainty and due process of 20 years of notice-and-takedown, opting instead for a disorderly regime in which innovation will be riskier and the player with the quickest filters is likely to prevail.
♣♣♣
Notes:
- This blog post appeared originally on LSE Media Policy Project.
- The post gives the views of the author, not the position of LSE Business Review or the London School of Economics.
- Featured image by Clker-Free-Vector-Images, under a Pixabay licence
Kris Erickson is associate professor in media and communication at the University of Leeds. His areas of expertise are innovation policy; online communities; creative industries; copyright; open source; and peer production.