Wikipedia's editors cut out the Daily Mail
News that a few dozen Wikipedia editors had decided that the Daily Mail should be “generally prohibited” from being used as a source caused consternation among many in the media.
That’s not least because the anarchic way in which policy is set at the “world’s public library” is a million miles away from what most of the journalists and academics referenced by the site are used to.
The decision, disclosed recently, did not even involve Katherine Maher, executive director of the Wikimedia Foundation, which runs Wikipedia, and that’s by design: the foundation doesn’t interfere in editorial policy. Speaking from the US in February, Maher seems relaxed about the process that led to the decision, and the possibility that it could be reversed.
“We are always looking for what characterises reliability, and the various characteristics of reliability, and what the [discussion about using the Mail] really focuses on is issues of fact-checking prior to publication and the issuing of corrections when articles are wrong,” she said.
“It’s my understanding that in this instance they were looking at how well the Daily Mail adheres to those standards of reliability. I presume that should circumstances change, Wikimedians would be very open to reconsidering the usage of the Daily Mail as a source they can use as widely as in the past. There’s nothing to stop it being used again.”
Maher took charge of the foundation last summer, after two years as its chief communications officer following a career in IT and advocacy taking in Unicef and the World Bank. Does she find it stressful being responsible for the health of such a huge resource while having little control over what its millions of editors put on it, or the rules they apply?
“That would only be the case if somebody stepped into the Wikimedia Foundation with some sort of expectation it controls Wikipedia. It doesn’t set editorial policy and everybody knows and accepts that. This is a community with a foundation, not a foundation with a community.”
That community, numbering more than 30m registered accounts, more than 130,000 of which made edits in February, has become hugely influential.
But while Wikipedia has become an invaluable tool, it has also been criticised for inaccuracies both large and small. As the Mail pointed out to the Guardian in its response to the decision to prohibit its use as a source, the newspaper itself banned reliance on Wikipedia as a “sole source” in 2014.
That particular prohibition is one Maher would actually agree with, and she says Wikipedia should be a “jumping-off point”, rather than a source in its own right.
“If you are going to do original research, especially if you are going to write a paper or do a piece for publication, it should go into more detail, talk to primary and secondary sources and the like. Wikipedia fills a different role.”
Wikipedia fulfils that role surprisingly well, whether it’s to settle a pub bet about who won the FA Cup in 1976 (Southampton) or look up the name of the Prussian general who helped Wellington beat Napoleon at Waterloo (Gebhard Leberecht von Blücher). For many, it’s the only reference work we’ll regularly consult, and most of the time it’s right.
Much of its ongoing success is due to the behind-the-scenes work of the foundation, which is all the more impressive because of its size. While its commercial web peers such as Google, Facebook or Amazon have tens, if not hundreds, of thousands of employees, the Wikimedia Foundation’s latest budget makes room for just 277.
The foundation is funded through donations, with this financial year’s target set at $67m (£53m). The average donation is $15, and its December fundraising campaign hit its $25m goal in record time, though there has been criticism of the decision to keep the campaign going after it crossed the line.
About half that income goes on software engineering and new technology, and much of the rest goes on community outreach to editors, both through programmes organised by the foundation and via grants to groups around the world. A big focus is building contributions to the non-English language versions.
“There are more than five million articles on English Wikipedia, but then the drop-off is significant,” Maher told the Guardian during a visit to London a couple of weeks before the Mail decision. “We cover some of the major European languages really well. Russian is quite large, Japanese as well. But when you then get into some of the other languages such as Arabic, it is only 464,000 articles, for 350 million speakers. There is a clear gap, and we need to be better at engaging with those communities.”
Another core job for the foundation, and Maher, is political advocacy. While copyright and press freedom are important issues for Wikipedia, there is one area even more fundamental to its operation, the rules that protect web firms from full liability for what their users post.
No matter how hard Wikipedia’s volunteers work, wrong and sometimes defamatory entries will inevitably appear, with editors engaged in a game of whack-a-mole to correct them. Like other web platforms, Wikipedia has some protection in law, which effectively treats it as a carrier of information rather than a publisher, meaning its liability for what users post is limited.
That protection is integral to the way Wikipedia operates, but increasing concerns about “fake news”, copyright violations and hate speech, particularly on Facebook, have led to pressure for stricter rules on legal liability. That would be a huge and costly problem for Facebook, but it could prove fatal to Wikipedia.
“When we talk about reform, regardless of what position or side of the discussion you are on, rarely does that include what happens to the world’s public library [Wikipedia] or what happens to the world’s archive of all things digital, the Internet Archive,” says Maher.
“We think it’s critical to make sure there is space in those conversations for a voice that is not necessarily the biggest, not necessarily the loudest and not necessarily the best funded.
“We are one of the only non-commercial organisations out there that is engaged with making sure information is available and reliable online. Sometimes [in] those conversations, because of the dominance or commercial nature of the internet, there is not a perspective of a site like Wikipedia, so we want to be in some of those conversations.”
That doesn’t mean Maher is not in favour of improving the way platforms tackle problem content, and she thinks Wikipedia offers some pointers for the likes of Facebook. Wikipedia’s transparency around editing creates accountability that she says is lacking in most other web platforms.
“One of the things with that algorithmic news feed is that you have no idea why that information is being presented to you, you don’t know whether it is because it is trending, whether it is because it is popular in its network, whether it is relevant to the things you are interested in.
“That element of trust that people can make decisions, when you strip away the context and you strip away the transparency in the algorithmic feed, really goes away. That’s when it becomes harder and harder for us to be able to make sense of signal from noise.”
It sounds great until you realise quite how unusual Wikipedia’s way of operating is, which is underlined by Maher’s own wonder that it operates at all. “There’s this expression about Wikipedia that it’s amazing it works in practice, because in theory it is a total disaster, and yet it does,” she says. “And the fact that it does is a testament to the fact that there are people out there who care very passionately about doing this and getting it right.”