Article 13 is the controversial, on-again/off-again proposal to make virtually every online community, service, and platform legally liable for any infringing material posted by their users, even material that appears only briefly, and even when there was no conceivable way for the online service provider to know that a copyright infringement had taken place.
This will require unimaginable sums of money to even attempt, and the attempt will fail. The outcome of Article 13 will be a radical contraction of alternatives to the U.S. Big Tech platforms and the giant media conglomerates. That means that media companies will be able to pay creators less for their work, because creators will have no alternative to the multinational entertainment giants.
Throwing Creators Under the Bus
The media companies lured creators' groups into supporting Article 13 by arguing that media companies and the creators they distribute have the same interests. But in the endgame of Article 13, the media companies threw their creator colleagues under the bus, calling for the deletion of clauses that protect artists' rights to fair compensation from media companies, prompting entirely justifiable howls of outrage from those betrayed artists' rights groups.
But the reality is that Article 13 was always going to be bad for creators. At best, all Article 13 could hope for was to move a few euros from Big Tech's balance-sheet to Big Content's balance-sheet (and that would likely be a temporary situation). Because Article 13 would reduce the options for creators by crushing independent media and tech companies, any windfalls that media companies made would go to their executives and shareholders, not to the artists who would have no alternative but to suck it up and take what they're offered.
After all: when was the last time a media company celebrated a particularly profitable year by increasing their royalty rates?
It Was Always Going to Be Filters
The initial versions of Article 13 required companies to build copyright filters, modeled after YouTube's "Content ID" system: YouTube invites a select group of trusted rightsholders to upload samples of works they claim as their copyright, and then blocks (or diverts revenue from) any user's video that seems to match these copyright claims.
There are many problems with this system. On the one hand, giant media companies complain that such filters are far too easy for dedicated infringers to defeat; and on the other hand, Content ID ensnares all kinds of legitimate forms of expression, including silence, birdsong, and music uploaded by the actual artist for distribution on YouTube. Sometimes, this is because a rightsholder has falsely claimed copyrights that don't belong to them; sometimes, it's because Content ID generated a "false positive" (that is, made a mistake); and sometimes it's because software just can't tell the difference between an infringing use of a copyrighted work and a use that falls under "fair dealing," like criticism, commentary, parody, etc. No one has trained an algorithm to recognise parody, and no one is likely to do so any time soon (it would be great if we could train humans to reliably recognise parody!).
Copyright filters are a terrible idea. Google has spent a reported $100 million (and counting) to build a very limited copyright filter that only looks at videos and only blocks submissions from a select group of pre-vetted rightsholders. Article 13 covers all possible copyrighted works: text, audio, video, still photographs, software, translations. And some versions of Article 13 have required platforms to block infringing publications of every copyrighted work, even those that no one has told them about: somehow, your community message-board for dog-fanciers is going to have to block its users from plagiarising 50-year-old newspaper articles, posts from other message-boards, photos downloaded from social media, etc. Even the milder "compromise" versions of Article 13 required online services to block publication of anything they'd been informed about, with dire penalties for failing to honour a claim, and no penalties for bogus claims.
But even as filters block things that aren't copyright infringement, they still allow dedicated infringers to operate with few hindrances. That's because filters use relatively simple, static techniques to inspect user uploads, and infringers can probe the filters' blind-spots for free, trying different techniques until they hit on ways to get around them. For example, some image filters can be bypassed by flipping the picture from left to right, or rendering it in black-and-white instead of color. Filters are "black boxes" that can be repeatedly tested by dedicated infringers to see what gets through.
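The asymmetry above can be made concrete with a toy sketch. The code below implements a deliberately naive "average hash" matcher (real systems such as Content ID are far more sophisticated, but face the same cat-and-mouse dynamic); every name in it is illustrative, not drawn from any real filtering product. It shows how an exact re-upload of a claimed image is caught, while a trivial left-to-right mirror of the same image slips through:

```python
# Toy demonstration: a naive "average hash" image filter, and how a
# trivial transformation (horizontal mirroring) evades it. Purely
# illustrative; real copyright filters are more robust, but infringers
# probe them the same way -- by testing transformations until one works.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 ints) into a bit string:
    '1' where a pixel is brighter than the image's mean, else '0'."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Count the bit positions where two equal-length hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def is_blocked(pixels, blocklist, threshold=4):
    """Block an upload whose hash is within `threshold` bits of any
    claimed work's hash -- a simplified stand-in for a copyright filter."""
    h = average_hash(pixels)
    return any(hamming(h, claimed) <= threshold for claimed in blocklist)

# A "claimed" 4x4 image: bright left half, dark right half.
claimed_image = [
    [200, 210, 10, 20],
    [220, 205, 15, 5],
    [215, 230, 25, 10],
    [225, 240, 5, 30],
]
blocklist = {average_hash(claimed_image)}

# Re-uploading the identical image is caught...
print(is_blocked(claimed_image, blocklist))   # True

# ...but mirroring it left-to-right flips every hash bit and evades the
# filter, while remaining instantly recognisable to a human viewer.
mirrored = [list(reversed(row)) for row in claimed_image]
print(is_blocked(mirrored, blocklist))        # False
```

The point of the sketch is the asymmetry: the infringer only needs to find one transformation the filter misses, while the legitimate user who is wrongly blocked has no equivalent trick to try.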
For non-infringers — the dolphins caught in copyright's tuna-nets — there is no underground of tipsters who will share defeat-techniques to help get your content unstuck. If you're an AIDS researcher whose videos have been falsely claimed by AIDS deniers in order to censor them, or a police brutality activist whose bodycam videos have been blocked by police departments looking to evade criticism, you are already operating at the limit of your abilities, just pursuing your own cause. You can try to become a filter-busting expert in addition to your research, activism, or communications, but there are only so many hours in a day, and the overlap between people with something to say and people who can figure out how to evade overzealous (or corrupted) copyright filters just isn't very large.
All of this put filters into such bad odor that mention of them was purged from Article 13, but despite obfuscation, it was clear that Article 13's purpose was to mandate filters: there's just no way to imagine that every tweet, Facebook update, message-board comment, social media photo, and other piece of user-generated content could be evaluated for copyright compliance without an automated system. And once you make online forums liable for their users' infringement, they have to find some way to evaluate everything their users post.
Just Because Artists Support Media Companies, It Doesn't Mean Media Companies Support Artists
Spending hundreds of millions of euros to build filters that don't stop infringers but do improperly censor legitimate materials (whether due to malice, incompetence, or sloppiness) will not put any money in artists' pockets.
Which is not to say that these won't tilt the balance towards media companies (at least for a while). Because filters will always fail at least some of the time, and because Article 13 doesn't exempt companies from liability when this happens, Big Tech will have to come to some kind of accommodation with the biggest media companies — Get Out Of Jail cards, along with back-channels that media companies can use to get their own material unstuck when it is mistakenly blocked by a filter. (It's amazing how often one part of a large media conglomerate will take down its own content, uploaded by another part of the same sprawling giant.)
But it's pretty naive to imagine that transferring money from Big Tech to Big Content will enrich artists. Indeed, since there's no way that smaller European tech companies can afford to comply with Article 13, artists will have no alternative but to sign up with the major media companies, even if they don't like the deal they're offered.
Smaller companies play an important role today in the EU tech ecosystem. There are national alternatives to Instagram, Google, and Facebook that outperform U.S. Big Tech in their countries of origin. These will not survive contact with Article 13. Article 13's tiny exemptions for smaller tech companies were always mere ornaments, and the latest version of Article 13 renders them useless.
Smaller tech companies will also be unable to manage the inevitable flood of claims by copyright trolls and petty grifters who see an opportunity.
Smaller media companies — often run by independent artists to market their own creations, or those of a few friends — will likewise find themselves without a seat at the table with Big Tech, whose focus will be entirely on keeping the media giants from using Article 13's provisions to put them out of business altogether.
Meanwhile, "filters for everything" will be a bonanza for fraudsters and crooks who prey on artists. Article 13 will force these systems to err on the side of over-blocking potential copyright violations, and that's a godsend for blackmailers, who can use bogus copyright claims to shut down artists' feeds, and demand money to rescind the claims. In theory, artists victimised in this way can try to get the platforms to recognise the scam, but without the shelter of a big media company with its back-channels into the big tech companies, these artists will have to get in line behind millions of other people who have been unjustly filtered to plead their case.
If You Think Big Tech Is Bad Now…
In the short term, Article 13 tilts the field toward media companies, but that advantage will quickly evaporate.
Without the need to buy or crush upstart competitors in Europe, the American tech giants will only grow bigger and harder to tame. Even the aggressive antitrust work of the European Commission will do little to encourage competition if competing against Big Tech requires hundreds of millions for copyright compliance as part of doing business — costs that Big Tech never had to bear while it was growing, and that would have crushed the tech companies before they could grow.
Ten years after Article 13 passes, Big Tech will be bigger than ever and more crucial to the operation of media companies. The Big Tech companies will not treat this power as a public trust to be equitably managed for all: they will treat it as a commercial advantage to be exploited in every imaginable way. When the day comes that FIFA or Universal or Sky needs Google or Facebook or Apple much more than the tech companies need the media companies, the tech companies will squeeze, and squeeze, and squeeze.
This will, of course, harm the media companies' bottom line. But you know who else it will hurt?
Artists. Because media giants, like other companies that enjoy a buyer's market for their raw materials — that is, art and other creative works — do not share their windfalls with their suppliers, but they absolutely expect their suppliers to share their pain.
When media companies starve, they take artists with them. When artists have no other option, the media companies squeeze them even harder.
What Is To Be Done?
Neither media giants nor tech giants have artists' interests at heart.
Both kinds of company are full of people who care about artists, but institutionally, they act for their shareholders, and every cent they give to an artist is a cent they can't return to those investors.
One important check on this dynamic is competition. Antitrust regulators have many tools at their disposal, and those tools have been largely idle for more than a generation. Companies have been allowed to grow by merger, or by acquiring nascent competitors, leaving artists with fewer media companies and fewer tech companies, which means more chokepoints where they are shaken down for their share of the money from their work.
Another important mechanism could be genuine copyright reform, such as re-organizing the existing regulatory framework for copyright, or encouraging new revenue-sharing schemes such as voluntary blanket licenses, which could allow artists to opt into a pool of copyrights in exchange for royalties.
Any such scheme must be designed to fight historic forms of corruption, such as collecting societies that unfairly share out license payments, or media companies that claim those payments for themselves. That's the sort of future-proof reform that the Copyright Directive could have explored, before it got hijacked by vested interests.
In the absence of these policies, we may end up enriching the media companies, but not the artists whose works they sell. In an unfair marketplace, simply handing more copyrights to artists is like giving your bullied kid extra lunch-money: the bullies will just take the extra money, too, and your kid will still go hungry.
Artists Should Be On the Side of Free Expression
It's easy to focus on media and art when thinking about Article 13, but that's not where its primary effect will be felt.
The platforms that Article 13 targets aren't primarily entertainment systems: they are used for everything, from romance to family life, employment to entertainment, health to leisure, politics to civics, and more besides.
Copyright filters will impact all of these activities, because they will all face the same problems of false positives, censorship, fraud, and more.
The arts have always championed free expression for all, not just for artists. Big Tech and Big Media already exert enormous control over our public and civic lives. Dialing that control up is bad for all of us, not just those of us in the arts.
Artists and audiences share an interest in promoting the fortunes of artists: people don't buy books or music or movies because they want to support media companies, they do it to support creators. As always, the right side for artists to be on is the side of the public: the side of free expression, without corporate gatekeepers of any kind.