Why the EFF is Wrong and the DMCA Safe Harbour Must Be Scrapped

Popzazzle | Monday, 16 November 2020 |

"Look at this!... That's over FOUR MILLION Google results for people on YouTube EXPLICITLY saying they're in breach of copyright, and yet, oh look, the videos are still up."


Photo by Ricardo Resende on Unsplash (image modified).

By Bob Leggitt
© Popzazzle

It's inevitable. The Digital Millennium Copyright Act's “safe harbour” is always going to be championed by the tech oligarchs who exploit it to steal content by proxy. But bizarrely, it's also advocated by a lot of high-integrity, highly-respected people and organisations. One such example is the Electronic Frontier Foundation - more often cited by its abbreviation, the EFF.

Routinely, the EFF is firmly against the abuses perpetrated by big tech, and it frequently speaks for and acts on behalf of the powerless against the mighty. It's an organisation that small contributors to the Web desperately need. But it supports the DMCA safe harbour, and in so doing it awards licence to massive, multi-billionaire corporations to steal content by proxy. To exploit the small creatives who produce that content - many of whom have incredibly low incomes.

Big tech has not only knowingly exploited impoverished creatives to fuel and feed its multi-billion dollar advertising machine. It has also deliberately vilified copyright holders who have the nerve to insist that their creative work is not casually thrown around the Internet for other people's gain. How? By labelling their efforts to stop the grand banquet of content theft as “censorship”.

That's right. We now have an environment where, metaphorically, the millionaire shoplifter can send his kids into a struggling clothing store to steal him a leather jacket, and when the shopkeeper calls the police, the millionaire shouts to the public: “Hey everyone! This shopkeeper is trying to censor your right to look at this leather jacket!”... And then society screams: “How DARE that shopkeeper try to censor our right to look at that leather jacket???!!!”

That's how monumentally E-F-effing stupid this is.

In official language, the DMCA “safe harbour” is the Online Copyright Infringement Liability Limitation Act. It was introduced in 1998, as a means to protect website owners from copyright lawsuits in instances where the infringing content was posted by a third party, and the site owner could not realistically be expected to be aware of the infringement.

But online, the world was very different in 1998 from the way it is today. There was virtually no means for members of the public to re-post media content without paying for hosting. And search engines were just that. Search engines. They were not the virally-sensitive, central banks of content and information that Google and others have become in the interim. Dial-up Internet connections struggled even to load images, so the devaluation of media through mass redistribution was not a thing back then.

In the 1990s, the primary format for user-generated content (UGC) publication was the forum or message board. Such boards were not designed to encourage, or even facilitate, crowdsourced redistribution of a submission, as later social networks did.

And because it was so impractical to post images, the average poster was only posting text. It could still be stolen text, obviously - hence the safe harbour - but the theme of the boards was much more focused on discussion than entertainment. And that limited the scope for copyright theft. You can't meaningfully interact entirely in stolen words. You can now, however, populate a video channel entirely with stolen videos. And populate a Twitter feed entirely with stolen photos. And let's not pretend the platforms don't know those feeds are there.

If something appeared on a UGC site in the 1990s, it generally appeared once, in text format, in a segregated pocket of a heavily disconnected Web. That was much easier for a rightsholder to control than the real-time, universal distribution network we have today.

Since the Web wasn't centrally distributed from search engines and instant-reach viral platforms in '98, the likelihood of a piece of stolen content spreading across the entire Web in the blink of an eye was close to nil. Hence, when the safe harbour was introduced, copyright-protected content was not at anything like the risk it's at today.

Even when image hotlinking first arrived and random members of the public could finally place photos on forums for free, a stolen photo could sit on a board without greatly affecting the copyright holder's ability to profit. It wasn't until Google Images came along in the early 2000s that this began to change.

Today, the mutual feeder system of search engines, social media and the “megablogs” acts as an almost instantaneous universal broadcast mechanism. It's a mechanism with built-in amplification protocols, no ceiling, no attribution prompt, and no stop button.

With the help of the public, this mechanism can completely devalue a piece of media content within a day - making it universally available for free, before its copyright holder has a hope of getting it taken down. The public are big tech's proxy. They're the worker ants transporting media content to its new destinations - pushing it into every corner of big tech's real estate. And because of the DMCA safe harbour, big tech can place the entire blame on the public. Big tech itself does not have to take a single iota of responsibility for the catastrophic damage its mechanism can do to the value of a digital work.

You're familiar with that work. You know that picture. It's Internet famous. But not only do you not know who created it - you also in many cases can't even find out who created it. No one knows, and no one cares, and there's now such an unthinkably large number of sources for that picture, that locating the original is like looking for a needle in a haystack. This is not what the safe harbour was designed to protect. It was designed to protect owners of public discussion boards from being sued for a one-off instance of inadvertent copyright infringement.

It's since been bastardised as a licence to authorise out-of-control, industrial-scale re-posting of media content, sucking that content dry for maximum ad-revenue over the course of a few days, and leaving its value completely shot.

The safe harbour law as it stands now needs to be scrapped, and replaced with a new law that protects only responsible distributors of content.

That means people who do as much as possible to avoid breaching copyright, and who minimise the possibility of content theft. Big tech is not doing that. Big tech is doing the opposite: doing as much as possible to encourage the breach of copyright, and maximising the possibility of content theft. We have to have a law that prevents billionaire businesses from using the general public as a shield for corporate freeloading.

We have to stop looking at the good that the DMCA safe harbour is supposed to have done, and notice the bad. We have to recognise the vast number of smaller websites that big tech's ceilingless content distribution machine has put out of business. We have to stop seeing the copyright holders as Mr Big, and the copyright thieves as the little people socking it to the man. The big corporations have the money to manage copyright theft. Impoverished artists and photographers don't. And today, copyright infringement is far more commonly a case of Mr Big exploiting the work of the little people, than vice versa.

When you side with a protection that allows billionaires to exploit the work of low-paid or unpaid artists for profit, without compensation, and without asking, you are NOT a champion of morality. You are a champion of skulduggery.

That's especially true when people with a lot of integrity and credibility support the idea of weakening copyright law, because their backing hands ammunition to the skulduggerous parties who seek to bypass copyright for profit.

Take someone like Cory Doctorow, for example. Popular guy with great integrity and sky-high cred. Doctorow has a fairly soft take on copyright, and he does support the right of creators to set the copyright terms of their choice. He's a creator himself.

But his speech in favour of copyright law being relaxed at the bottom end is too easily seized upon and distorted by those who just want to entitle themselves to all creative work for free. Selected parts of Doctorow's message are generating citation fodder and propaganda for the anti-copyright campaigner. Alias the “anti-censorship” campaigner, who doesn't actually care about censorship, and who indeed indulges in censorship. But who very much does care about being free to steal other people's work for profit without being in any way accountable for it. Alias big tech.

Doctorow sees two separate realms of copyright control, which are at present covered by the same set of rules, but should, in his view, be considered separately. One is small scale personal redistribution, which is most often cited as “lending and borrowing of books”, but which in the 2020s more likely means individual consumers copying and passing on digital works to their friends and associates for free. And the other is corporate-level redistribution, in which commercial enterprise is gaining wealth through publication to the masses.

In short, Doctorow believes commercial redistributors should be subject to tougher controls than private individuals who are not sharing content for profit.

He absolutely has a point when he differentiates between commercial enterprise redistributing for profit, and private individuals community-sharing. As creators of content we feel differently when we see one consumer passing our work to a friend and saying: “Hey, you'll like this, it's great!”, from the way we feel when we see some PoS with a splog nicking it to ride out their ads. So I get it, Cory, I really do. And unlike the Internet-age organisations profiting from unauthorised content redistribution, Doctorow is not a hypocrite. He distributes his own work on Creative Commons licences. He's putting his money where his mouth is.

Conversely, big tech is as hypocritical as Hell. It will champion the rights of the people to steal (sorry, “curate”) creative content all day. "CoS TaKiNg AwAy ThE rIgHt To StEaL pHoToS wOuLd Be CeNsOrShIp". That theft drives its giant ad machine. It's in its interests. But when it comes to the public's right to steal big tech's software code and redistribute that for free, there's suddenly a U-turn, and it thinks that should be punishable with a jail term. When it's big tech's own intellectual property being debased and devalued, "freedom of expression" magically becomes serious crime.

What's the difference? Whether you're redistributing a photo of Margate or the source code for Facebook, you are still adopting management rights over a piece of digital matter. And you either agree that that should be allowed, or you don't. You can't agree that it should be actively encouraged when it disadvantages other people, but should carry a death sentence when it disadvantages you. That's not an ethical stance. It's abject sociopathy.

And the problem with the Doctorow-type vision is that these massive tech companies, who have an unthinkable amount of control over society, will motivate private individuals to steal and redistribute on their behalf, under the banner of “community-sharing”. The problem is that there's no volume-ceiling to “community-sharing”. Where's the line? Is it okay to share with one person but not two? Or five people but not six? Or five hundred people but not five hundred and one? The digital world facilitates exponential upscaling of distribution, with NO STOP BUTTON. And that upscaling doesn't reach any kind of cut-off on the basis that the original share came from a private individual who only intended a couple of friends to see it.
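The no-ceiling point can be made concrete with a toy model. The following is a minimal sketch with purely illustrative numbers (the initial share count and re-share factor are my assumptions, not platform data), showing how a single private "community share" scales generation by generation with no cut-off:

```python
# Minimal model of viral re-sharing: one private share, then each
# recipient passes the item on to a fixed number of further contacts.
# All figures below are illustrative assumptions, not platform data.

def total_reach(initial_recipients: int, reshare_factor: int, generations: int) -> int:
    """Count everyone who has received the item after n re-share generations."""
    reach = 0
    current = initial_recipients
    for _ in range(generations):
        reach += current           # everyone reached in this generation
        current *= reshare_factor  # each recipient re-shares onward
    return reach

# One person shares with 2 friends; each recipient re-shares to 3 more.
for gen in (1, 5, 10, 15):
    print(gen, total_reach(2, 3, gen))
```

Under these toy assumptions the share intended for two friends passes fifty thousand recipients within ten generations and fourteen million within fifteen - the "exponential upscaling" the paragraph above describes, with nothing in the mechanism that stops it.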

So if you're going to make the copyright rules less stringent for private individuals, who are not sharing content for money, you're actually playing into the hands of these huge tech companies whose mission is to exploit community for profit. Their express goal is to use private community as a proxying system to feed their guzzling, devaluing, advertising treadmills. Even though the private individuals are not necessarily making money out of the content, money is still being made by the tech companies, and money is still being lost by the rightsholders.

Richard “copying is not theft” Stallman has a more extreme take on copyright, and doesn't like rightsholders being able to manage their rights at all. He's another voice against big tech in so many respects, but he's on the side of the cyber giants when it comes to their right to steal by proxy. This is a man who thinks the Stop Online Piracy Act would have been a bad thing.

To a point, I understand his angle on keeping the Internet “free”. I know there's a flipside to the DMCA, which potentially allows it to be turned against free speech. And I'm a privacy advocate. I can see that corporations are abusing rights management as a means to inflict oppressive spying regimes on the public. For example, the likes of Adobe will no longer let you purchase a copy of Photoshop. You instead have to use the product as a service, which requires you to access their servers, which enables Adobe to spy on and log everything you do. It's both a copyright control mechanism (in that it genuinely does prevent the software being stolen and redistributed) and a spying mechanism. But it doesn't need to be a spying mechanism. It's only a spying mechanism because Adobe set it up that way.

So you stop the spying with clearer and much more aggressively enforced privacy law. Not by slackening the concept of copyright. Start prosecuting privacy-rapists and giving them long jail terms, and they will soon stop. When we catch a peeping tom climbing a ladder to leer through a woman's bedroom window, we punish the offender, for the actual offence. We don't go to the ladder manufacturers and say: “Hey, can you slacken your ladders a bit please?”.

Copyright needs to protect every creator, from rich to poor. And it can't do that while the people who control the World Wide Web are allowed to encourage copyright theft.

That's right, encourage. I'm not calling for social media to be banned. Or for search engines to be banned. I'm calling for the law to force the proprietors of these machines to act in a responsible manner. To show that when it's feasible, they are actually doing something to minimise copyright theft. If anyone really believes that these online megapowers are even vaguely attempting to do that at present, they are certifiably mad.

YouTube Copyright

Look at this...

YouTube search: "I do not own" "copyright"

That's over FOUR MILLION Google results for people on YouTube explicitly saying they're in breach of copyright, and yet, oh look, the videos are not only still up, but also searchable on the video platform's companion search engine. Okay, so those results will incorporate some other uses of the keywords, so it's probably not four million in that specific search. But if you broaden the search to include alternative ways of making the same admission, it's bound to be in that area. You can argue about the precise volume, but it's still an unthinkable number of self-acknowledged copyright thieves that Google is happy to passively ignore.

Here's another 355,000 results for Facebook...

Facebook search: "I do not own" "copyright"

That is NOT managing copyright to the best of your ability but needing protection for the times where you're unaware of a breach. That is abusing a naïve copyright law to profit from theft by proxy.
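For what it's worth, the "broaden the search to include alternative ways of making the same admission" step can be sketched mechanically, by generating one exact-phrase query per wording. The phrase list below is my own illustrative guess at common disclaimer wordings (only "I do not own" comes from the searches above), and any real count would need each variant vetted for false positives:

```python
# Build exact-phrase search queries for common self-admission wordings.
# DISCLAIMER_PHRASES is an illustrative assumption, not a vetted list -
# only "I do not own" is taken from the searches shown above.

DISCLAIMER_PHRASES = [
    "I do not own",
    "no copyright infringement intended",
    "all rights belong to their respective owners",
    "I don't own the rights",
]

def build_queries(site, phrases=DISCLAIMER_PHRASES):
    """Return one exact-phrase query per wording, scoped to a single site."""
    return ['site:{} "{}" "copyright"'.format(site, p) for p in phrases]

for query in build_queries("youtube.com"):
    print(query)
```

Each generated query uses the standard `site:` operator and quoted exact phrases, so the per-phrase result counts can simply be summed (minus any overlap) to estimate the broader total the post refers to.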

That is why the DMCA safe harbour law needs to be scrapped, and replaced with a law for the 2020s. It's why the people whose support we need most - like the EFF - need to urgently revisit this issue and think it through more carefully. On the World Wide Web of the 2020s, the woefully outdated safe harbour renders copyright unfit for purpose. We do need freedom to speak. But stealing a video is not speaking. It's stealing a video.

And yes, I know I'm posting on a UGC platform, and I want that platform, and others like it, to remain available for legitimate use. But I believe that's possible without also granting such platforms complete immunity from liability, however irresponsible they are. As things stand, all a platform or site has to say after hosting stolen content for five years is "we weren't notified by the copyright holder". And that's enough to give them immunity from liability. They don't owe the copyright holder a penny of the ad revenue they made. That's theirs, for being the poor little victims of a thief, who gave them some stolen goods to profit from. Even if the thief is literally saying "I stole this content". Still immune. We're not given this immunity when we receive stolen goods in the offline world. There, liability is discretionary. If it's shown that we knew we were receiving, there's a penalty. If it's shown that we didn't know, we're not liable. Does that not make more sense?

So we have to ditch the idea of complete immunity and introduce a new law that looks at intent, discerns whether site administrations are genuinely unaware of breaches or are taking the piss, and places the liability on the most appropriate parties. If Twitter knew that 100,000 users were breaching copyright, but did nothing to address that problem, why should it not be liable? Give me one good reason. There isn't one.

I want people to be able to speak freely but responsibly, and I support fair use. Both of those concepts are discretionary. But blanket immunity is stupid, and I've shown in this post what happens when you award it. Let's have a discretionary law, for a discretionary matter. Let's scrap total immunity, and align the receipt of stolen digital goods with the receipt of stolen material goods. Let's force punishment upon bad intent.

Bob 'Interesting' Leggitt is a print-published writer, multi-instrumentalist and twice Guitarist of the Year finalist, Google-certified digital marketer, image manipulation expert, virtual musical instrument builder, "Twitter detective", and author of successful blogs such as Planet Botch, Twirpz and Tape Tardis. | [Twitter] | [Contact Details]