Death by MMC: How Wikipedia Strangled The Information Highway

Popzazzle | Wednesday, 13 July 2022
"It would not be an exaggeration to say that Wikipedia has done everything it could possibly get away with doing to deny the prosperity of its sources."
[Image: vista comprising reservoir and trees, with text: "The photographer's credit is in the reservoir. The credit of the person who wrote 'The photographer's credit is in the reservoir' is behind the trees."]

DEATH BY MM-WHAT?

If you web-search the acronym MMC, I'm sure you'll find every trivial meaning you could possibly conceive. What you almost certainly won't find is the meaning that resonates above all others within the cybertech cartel.

Google and Startpage will lead you to a Wikipedia (where else?) disambiguation page, citing nearly 70 possible interpretations for MMC, but mysteriously excluding the one that most matters to Wikipedia. The one which, indeed, defines Wikipedia.

DuckDuckGo gives us a top result of Marsh McLennan, a couple of nice little plugs for Microsoft (obviously), and, oooh, a Free Dictionary rundown, giving us over 140 possibles. But alas, once again, the one we want is absent.

So what, actually, is an MMC? An MMC is a website or platform characterised by Massive Multiauthor Collaboration. The term is used by Silicon Valley lawyers to define... Well, in theory, all user-generated behemoths like Wikipedia. But in practice, the acronym's main purpose has been to retrospectively exclude Wikipedia, specifically, from its own copyright conditions. Between 2007 and 2009, when the world's most prominent online encyclopedia found itself trapped in a copyright bubble of its own making (PDF file), the MMC categorisation gave the platform a very tenuous, very unfair, very slimy escape route. Of all websites on the WWW, Wikipedia knows what MMC means.

But screwing over thousands of contributors in the most brazen digital publishing rights swindle the mainstream Web has ever seen is far from the only dark pattern that Wikipedia has exhibited. In this post, we'll see how the world's biggest MMC set out not to work in harmony with the valuable resources of the wonderful Web, but to destroy its own feeder mechanism and become a classic Big Tech parasite. Like Facebook, Wikipedia would seek to suck in and own everything of value, kicking the sources into the gutter with a campaign of organised anti-attribution.

If you've ever wondered why Wikipedia is nearly always at the top of your search results, and you can no longer find those specialist sites, written from first-hand experience by experts, all is about to become clear...

BUT ISN'T WIKIPEDIA A CHARITY THINGY?

It's a 501(c)(3) nonprofit. But only because that's the most profitable means of existence it could find. When Wikipedia launched, almost as a joke side-project in conceptual terms, it did so as a .com business, with the intention to monetise through advertising. But its content licencing model left the site open to replication, and its original, for-profit parent, Bomis, quickly found that a non-commercial replica of Wikipedia could gain more traction than Wikipedia itself. Could potentially, indeed, destroy a commercial Wikipedia. Fortunately for Wikipedia, replication only got as far as Spain before Bomis boss Jimmy Wales elected to renounce Wikipedia's commercial goals. Subsequently, Bomis was ditched in favour of Wikimedia, which is now the "nonprofit" Wikipedia parent.

But to be clear, the shift to nonprofit status was, ironically, to secure the best financial return for Wales personally, as well as the best future prospects for the project. We should not assume that a nonprofit was founded because its guiding hands were charitable. Sometimes, a nonprofit just proves to be the most lucrative option for the boss(es). And that was absolutely the case with Wikipedia.

CRUSHING COMPETITION THROUGH ANTI-ATTRIBUTION

Throughout most of its life, Wikipedia has been fiercely anti-competitive, and has sought to bury the very sources from whom it acquired its information and content.

At the very beginning in the first half of 2001, Wikipedia had quite a laissez-faire aura about it. However, in 2002 the platform began to outwardly show a much more Machiavellian side. That year, it expressed a hard policy of NOT giving inline links to quality outside sources...

"Don't use external links where we'll want Wikipedia links [...] For example, if you're writing an article about Descartes and you know of a great article about rationalism online, don't link the word "rationalism" to that article."

"DON'T. LINK." Square that with the "share the love" message that Wikimedia and the Big Tech cartel promote in their anti-copyright drives.

This was a blatant measure to withhold both traffic and search visibility from source sites. Even in the early 2000s, Google's search algorithms ranked sites based on incoming links. So this "no linking" policy would inevitably deprive sources of due recognition in every sense. Wikipedia boss Jimmy Wales was an expert on Google's ranking system, and had, in 2000, given Wired an interview on the subject. He would not have been under the remotest illusion about the implications of denying external links.

Wikipedia also calculatedly sought to deprive photographers of recognition, instructing Wikipedians specifically to separate the image credit from the work. Again, from 2002...

"Don't put photo credits in articles or on the images themselves; put them on the description page."

So, you want article readers to see the photo, but not the photographer. Got it. "Share the love" indeed.

Around this time, Wikipedia articles broadly offered no links to external content at all. But this was a major issue for the platform's credibility, since links were references, and encyclopedic articles without any references tend to garner little trust among academics. Wikipedia's low cred with academics was something that former paid editor Larry Sanger publicly pointed out in 2004. By then, external links were well established on Wikipedia pages, but they were always located at the bottom of an article, where the fewest people would see or follow them. Moreover, the external link blocks were not a solution to the problem Sanger highlighted, since the article bodies still offered no references or citations.

The obvious solution would have been to add reference links into the article body text at the appropriate places. But Wikimedia knew that if it did this, its platform would not remain sufficiently anti-competitive. Linking from the article body would allow readers to discover great content elsewhere, and Wikimedia wanted to avoid that at all costs.

Duly, in 2006, the now explosively expanding MMC came up with the unnecessarily cumbersome system of referencing online sources with citation numbers and footnotes. Doing this where the source was hyperlinkable seemed insane. The whole point of the hyperlink is that you can cite without having to add visual interruptions and bloat out the footnotes. That's literally what hyperlinks were designed to eliminate. But Wikipedia was much more interested in crushing its sources than creating a user-friendly referencing system. So it referenced online content in exactly the same way it referenced printed matter - with an archaic, print-derived mechanism to obfuscate the path of attribution. All along, the express goal was to ensure that the vast majority of readers would not click external links.

The citation numbers were actually links. But rather than link to the source, they linked to a footnote at the bottom of the same Wikipedia page. A mindless and unnecessary obstruction to the reader, who obviously wants to read the citation itself rather than a pointless footnote offering another link to the citation.
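The indirection described above can be sketched in miniature. This is a hypothetical, simplified fragment, not Wikipedia's actual markup: the citation marker's link points only at a fragment on the same page, and the external URL lives one hop further away, in the footnote.

```python
import re

# Hypothetical page fragments (not Wikipedia's real HTML) illustrating
# the footnote indirection: the superscript citation number links to a
# same-page anchor, not to the external source.
article_body = 'Descartes was a rationalist.<sup><a href="#cite_note-1">[1]</a></sup>'

# The footnote, elsewhere on the page, is what actually holds the URL.
footnotes = {"cite_note-1": "https://example.com/rationalism"}

# Following the citation marker's href lands the reader on a fragment
# of the same page...
fragment = re.search(r'href="#([^"]+)"', article_body).group(1)
print(fragment)  # cite_note-1

# ...and only a second click, from the footnote, reaches the source.
print(footnotes[fragment])  # https://example.com/rationalism
```

One click with an inline hyperlink; two with the footnote system. That extra hop is the whole complaint.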

So by 2007, the Wikipedia system of anti-attribution was sufficiently robust to gutter the sources, right? Wrong. From that year, all of Wikipedia's external links were full-on crippled for SEO potential by the insertion of a "nofollow" attribute. What is "nofollow"? To quote Wikipedia itself...

"nofollow is a setting on a web page hyperlink that directs search engines not to use the link for page ranking calculations."

This hidden signal to Google expressed that the linked content was of no relevance or value (a patent lie in Wikipedia's case), and meant that the source sites being "credited" would not receive their rightful search visibility status.
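To make the mechanism concrete: a minimal sketch, using an invented example URL rather than any real Wikipedia markup, of what a nofollow link looks like and how a crawler-style check would detect it. The attribute is invisible to the reader but plainly visible to a search engine.

```python
from html.parser import HTMLParser

# Hypothetical external-link markup in the style described above.
# rel="nofollow" tells search engines to ignore this link when
# calculating the target page's ranking.
SNIPPET = '<a rel="nofollow" href="https://example.com/source">Original source</a>'

class NofollowFinder(HTMLParser):
    """Collects the href of every <a> tag carrying rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.nofollow_hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attr_map = dict(attrs)
            # rel can hold multiple space-separated tokens, so split it.
            if "nofollow" in attr_map.get("rel", "").split():
                self.nofollow_hrefs.append(attr_map.get("href"))

finder = NofollowFinder()
finder.feed(SNIPPET)
print(finder.nofollow_hrefs)  # ['https://example.com/source']
```

To the reader, the link works exactly as normal; only the ranking signal is withheld.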

As a final insult, when Wikipedia increased the size of its article body text in 2014, its citation text and links retained the smaller print. Just another step in a relentless campaign to minimise attribution.

It would not be an exaggeration to say that Wikipedia has done everything it could possibly get away with doing to deny the prosperity of its sources.

THE GREAT COPYRIGHT SWINDLE

If you're also keeping an eye on my Neocities blog, you may have seen the most recent post about Creative Commons, which lays bare the reality of the kind of hack-ass "copyleft" schemes that have driven Wikipedia from day one.

It's worth reading that post for some additional background, even if I do say so myself. But here we'll see a working example of how those codged-up non-copyrights only serve the forces of power. For the digital working class who produce the content, they're not worth the split second their codeblocks take to download.

"We had hope for a genuine free culture. That hope has been destroyed. By people who want to own it."

When Wikipedia hit the Web, there was no such thing as Creative Commons. The rough equivalent during the encyclopedia's run-up to launch in 2000 and early 2001 was the GNU Free Documentation License (GFDL). This licence was really created for the hard copy manuals that accompanied software shipped under the Free Software Foundation's General Public License (GPL). So it wasn't a made-to-measure fit for Wikipedia, but there was nothing else suitable off the peg, and Bomis already had it in use on Wikipedia's then more serious sister site Nupedia, so the GFDL was baked in by default.

But through the mid 2000s, Creative Commons established itself as the new go-to licencing option for freely-distributable creative work. After years of Wikipedia leapfrogging other content venues through use of their info and a poisonous dose of anti-attribution, a separate stream of content licencing had at last sparked a strong basis for competition. A rapidly growing volume of Creative Commons content now provided a means for alternative publishers to build powerful rivalry to Wikipedia.

Best of all, it appeared that this time, there was no anti-competitive option left on the table for Jimmy Wales' gargantuan website. Wikipedia's existing GFDL licence was incompatible with Creative Commons, because the active GFDL 1.2 required that any modified work be released under the same licence. In other words, you couldn't release an edit to a GFDL article under Creative Commons. Or vice versa.

This left Wikipedia unable to leverage combinations of content from the GFDL and Creative Commons, and importantly, unable to legally import and integrate sections of Creative Commons from external projects. All of Wikipedia's existing content was confined to the GFDL forever more. And since it was Creative Commons, and not the GFDL, that was now the runaway choice elsewhere, Wikimedia was contained in a licencing bubble it could not, apparently, escape.

MOVING THE GOALPOSTS

What came next was pretty much the most breathtaking rights swindle the mainstream Web has witnessed. A move that perfectly illustrated how useless codged-up non-copyright schemes really are at protecting creators' rights, and how good they are at serving the dominant powers.

Double-desperate to re-licence all of its existing content, Wikimedia talked GFDL providers the Free Software Foundation into updating the licence's terms - specifically to allow Wikipedia to transition to Creative Commons. The FSF duly inserted a double-dodgy clause granting a time-limited exemption from the single-licence lock-in - for "MMCs" only. In other words. For Wikipedia only. This clause, released in a new version of the licence, would allow Wikipedia to preferentially sidestep a binding to which everyone else was subject, so as to suck in even more of the Web's external value, in its characteristic, anti-competitive, bullish, Big Tech stylee.

I was brought up to believe special help should be given to the smallest and most vulnerable. Not to the biggest and most powerful. But yeah, whatevs.

"When you contribute valuable, new information to the open Web, all you are doing is gifting a proxied update to a platform that is going to bury you."

Just one thing, as Lieutenant Columbo would say... The original contributors had licenced their work to Wikipedia on the old GFDL - not the new one. And the old GFDL did not have the clause. Wikipedia knew it could not get every contributor to accept the revised GFDL. Many authors would no longer even be contactable, and a huge number of the contactables would actively decline for a range of reasons I'll list shortly. So regardless of the FSF's special, discriminatory gift, this, surely, was now a dead end...

Not for a pathological, rights-trampling control freak like Wikimedia. Rather than respect the terms under which contributors had contributed, Wikimedia decided it would be sufficient simply to cast a vote among a calculatingly limited selection of contributors, and use a majority verdict as permission to re-licence the entire platform to the updated GFDL - claused for re-licencing to Creative Commons.

As I said. Breathtaking.

Despite thousands of Wikipedia contributors voting against (PDF file) the re-licencing of their work, the platform re-licenced regardless. It was thus written into history that when you contribute to an MMC, regardless of whatever bollocks is written in the licence, the platform will OWN your work. If, as a producer of content, you cannot uphold the original terms of the copyright licence on which you contributed, that copyright is a sham, and in reality you are signing away ALL your rights to "entrepreneurs". They call these licences "copyleft" for a reason. Unlike real, statutory copyright, their goalposts are movable, because changes to them can be made at Silicon Valley's behest, without due political process. They are literally not copyrights.

So not only was Wikipedia mugging off its external sources. It was now mugging off its internal sources too.

BUT HOW DID WIKIPEDIA CONTRIBUTORS LOSE OUT IF GFDL AND CREATIVE COMMONS SHARE ALIKE ARE SIMILAR?

The greater ill was the disregard for creators' rights, as originally agreed. With real copyright, intellectual property cannot be re-licenced by its consumer. What Wikipedia did was equivalent to you or me purchasing a software product, then re-writing the licence to suit our own ends, and then expecting the software vendor to accept our revised version of the licence. Imagine trying to tell Microsoft you've switched its software to a different licence, so now you can do X, Y and Z with it!

But there were also real differences between the GFDL and Creative Commons Share Alike, which every individual creator was entitled to reject. Broadly, the GFDL demands better recognition of collaborative contributors and seeks to more rigorously uphold the minutiae of the licencing terms. For example...

  • It requires retention of a modification history, which can better identify who did what in heavily modified work. This would be particularly beneficial to collaborators when work was exported outside Wikipedia.
  • It's arguably less prone to misuse than Creative Commons BY-SA, since the CC licence by default presents only a summary of its terms, and unlike the GFDL, does not require the full licence to accompany the work. Perhaps most critically, the full GFDL licence is much more readable than the full Creative Commons BY-SA, which is built around legalese, and is thus harder for a non-lawyer to understand.
  • It incorporates preservation clauses which additionally help creators to avoid "airbrushing".
  • Some authors of GFDL-licenced work supported the Free Software Foundation itself, and desired that their work remain licenced to represent their advocacy of the FSF and GNU.

Other disturbing elements of the re-licencing included the facts that...

  • The vote was not fairly conducted. A humungous number of contributors - those who would be more likely to vote "no" - were disqualified from the vote. The vote was also publicised, communicated and informed in a biased fashion.
  • Voting per se was a charade, because Wikimedia would have re-licenced even if its rigged vote had managed to swing against the move.

The fiasco also demonstrated that supposedly freedom-oriented licencing is nothing of the sort. The GFDL actually ended up restricting Wikipedia and its content. The restrictiveness of "free licencing" is in fact so profound that it would theoretically still be possible to derail Wikipedia and Big Tech in the long term by introducing a new "copyleft" licence with much better creator benefits, and a clause expressly prohibiting use of the content by outlets with more than, say, 100,000 pages. Call it the "Give Someone Else a F***ing Chance!" licence. Let the content library grow, and watch more modestly sized resources drift back into the picture. I might even release some top drawer shit on that licence myself, if it ever hit cyberspace.

SPIRAL OF SPIN

If you produce original content - and particularly if you invest in information research - you will probably know what a spin is. A spin is a re-wording of existing written material, which has become the cornerstone of the online plagiarism industry. A spin exploits the fact that copyright can only be applied to intellectual property, and not to information or concepts. Where writing is concerned, that's basically just the wording. An author can't copyright the information their research uncovered, and they can't copyright a broad idea. What a spin does is take the information and the ideas, but change the wording so the new publisher can't be sued for plagiarism.

By now, you probably won't be at all surprised to discover that Wikimedia has encouraged its contributors to spin other people's work...

"[I]t is legal to read an encyclopedia article or other work, reformulate the concepts in your own words, and submit it to Wikipedia"

Translation: Go FoRtH AnD sTeAl.

They even helpfully provided a guide on the exact extent to which their contributors should reword original researchers' hard work in order to dodge lawsuits.

And whilst we're on the subject of lawsuit-dodging, I should stress that the nice, caring, sharing people at Wikimedia took the trouble to warn contributors not to link to illegal copies of creators' work. Aww, shucks. Maybe we do matter after all... Nah. Before you post that thank-you card, let's have a quick peep at Wikimedia's rationale for this parental piece of advice. Did they want to avoid linking to copyright infringements because it harms creators? Nope. They wanted to avoid it because...

"Linking to a page that illegally distributes someone else's work sheds a bad light on Wikipedia"

Yep. That's Wikimedia. Always unselfishly thinking of other people.

THE DAMAGE

It's difficult for the general public to criticise Wikipedia. If you're not someone who's had a website plundered for value and then trampled out of the search results with anti-attribution and cartel collusion, Wikipedia seems friendly. Well, unless you wanna access an external source, obviously. But there are some wider streams of damage that the copyright-bashing, Big Tech cartel causes.

The top side-effect of Wikipedia and Big Tech's small-publisher-trampling campaign has been the withdrawal of many smaller (and even larger) publishers from the open Web.

There's a limit to the number of times a creator will place their work on the open Web, only to have it spun, stolen and source-diluted to the point where it becomes tantamount to public domain. Serving content behind a paywall keeps out the scraper and search bots that spew the work onto the redistribution highway. And a paying audience is much less likely to redistribute, because why would they want to be the mug who foots the bill for everyone else's free consumption?

The asset-strip and bury effect of Wikipedia has massively encouraged paywalling.

Accordingly, the open Web has lost many of its groundbreakers and true innovators. Some have taken their skills to private clients. Some have retreated back to the medium of print, where their work can at least undergo its initial distribution round without being sucked up by the tech cartel.

We might think we're basking in knowledge courtesy of Wikipedia. But it's biased, often wrong, and worst of all, it's now very frequently served as the sole reference source in a sea of search engine spam. In other words, it's hard to cross-reference. It has no style, no passion, no enthusiasm, no storification. It offers no inspiration. And all that is a product of its design, which refused to fairly recognise creators. We never see direct contributions to Wikipedia from creative stars, because, realistically, why on Earth would any creative star contribute to a project that treats creators like shit? Takes everything. Gives nothing.

And what happens when we, as online consumers, want to go deeper than entry-point, encyclopedic coveralls? Be honest - there's nothing there, is there? Outside of funded research, content-marketing and the commercial news machine, no one in the 2020s is going to spend weeks producing valuable insight, and then drop it in front of the Big Tech bulldozer to feed Wikipedia and Co, with no prospect of reward. Here's the deal...

If you would not contribute to Wikipedia itself, there is no point, in the 2020s, in contributing valuable, new information to the open Web at all. Because when you contribute valuable, new information to the open Web, all you are doing is gifting a proxied update to a platform that is going to bury you.

As we head into the future, we may slowly return to the culture of the 1980s, in which only adverts were free, and discerning consumers bought their entertainment. To say that we're not already part way down that regressive road would be a lie. We had hope for a genuine free culture. That hope has been destroyed. By people who want to own it.

The blame for Wikipedia's destruction of Web 1.0 - or at least its information sector - does not rest wholly with Wikipedia. Google, in particular, has awarded the platform not only unprecedented search visibility, but also huge "donations". In combination, these awards have given the MMC both a deeply unfair advantage over smaller sites, and a necessary bias towards Google's political agenda. A bias that manifests itself as much in what Wikipedia doesn't say as what it does.

Wikipedia is not the kindly, independent voice it so wants the world to believe it is. It's a go-faster lever in Big Tech's creator-abusing bulldozer. Power-crazed, Machiavellian, and anti-competitive to the hilt.

And when you pay it a visit, you should never forget that the only reason you're not looking at an ad-strewn sewer of product placement, is that the grant, donation and tax-relief business model assured Jimmy Wales a better financial return. For him, it was about money from the start, and it still is. That's why his policy is to crush and destroy, and not to share the love.

Bob Leggitt
Post author Bob Leggitt is a print-published writer and digital image creator, multi-instrumentalist, twice Guitarist of the Year finalist, web page designer and software developer.