Why #Article13 inevitably requires filters

When the German MEP Axel Voss took over the new EU Copyright Directive and reinstated the controversial Article 13, he was explicit that the point of the rule was to make all online services use filters, similar to YouTube's Content ID, to screen everything their users posted and block anything that seemed to match any unlicensed copyrighted work, anywhere.

But filters are so expensive that only US Big Tech companies can afford them. They are incapable of distinguishing fair dealing (including things like the music playing in the background of the video of your child's first steps) from infringement, and they are incredibly error-prone — to say nothing of the problems of allowing anyone in the world to claim creative works as their copyright, with no means to weed out false and fraudulent claims.

So the word "filters" was eventually purged from the Directive, and with weeks to go before the final vote around March 25, backers of Article 13 are insisting that the Copyright Directive has nothing to do with filters.

But as Communia's detailed legislative analysis of Article 13 shows, there is no way to comply with its rules without filters.

Article 13(4) lists four different measures that platforms have to implement, two of which do not apply to smaller or newer platforms:

1. All platforms have to make “best efforts” to license all copyrighted works uploaded by their users (the yellow box in the middle). We have already established that it is impossible to actually license all works, so a lot depends on how “best efforts” will be interpreted in practice. On paper this rather vague term is highly problematic, since there is a virtually unlimited number of rightsholders who could license their works to the platforms. It will be economically impossible for all but the biggest platforms to conclude licenses with large numbers of rightsholders. For all other platforms this provision will create substantial legal uncertainty, which will mean lots of expensive legal advice — making it very difficult for smaller platforms to survive.

2. In addition, all platforms will have to make “best efforts to take down works upon notice from rightsholders”. This provision re-introduces the notice-and-takedown obligation that platforms already have under the E-Commerce Directive, and as such it is nothing new.

3. On top of this, all platforms with more than 5 million monthly users will also need to implement a “notice and stay down” system (the top-most red box). This means they will need to ensure that works that have been taken down after a notice from rightsholders cannot be re-uploaded to the platform. That requires platforms to implement filters that can recognise these works and block them. Technologically, these filters work the same way as the more general upload filters introduced in the next step.

4. Finally, all platforms that have existed for more than three years, or that have more than €10 million in annual revenue, will need to make “best efforts to ensure the unavailability of specific works for which the rightsholders have provided the service providers with the relevant and necessary information”. At scale, this obligation can only be met by implementing upload filters that block the upload of the works identified by rightsholders.

Taken together, this means that only a small number of platforms (those that are less than three years old and have less than €10 million in revenue) will be temporarily exempted from the obligation to implement upload filters. Regardless of how often proponents of Article 13 stress that the final text does not contain the word “filters”, there can be no doubt that adopting Article 13 will force almost all platforms in the EU to implement such filters.

A final x-ray of Article 13: legislative wishful thinking that will hurt user rights [Communia]

(via Techdirt)