
The academic community failed Wikipedia for 25 years — now it might fail us


As the free online encyclopedia Wikipedia marks its 25th anniversary this month, the academic community must confront an uncomfortable truth: we have systematically failed our greatest knowledge commons.


Despite a 2005 Nature investigation showing that Wikipedia’s accuracy was comparable to Encyclopaedia Britannica’s (see Nature 438, 900–901; 2005) — and years of follow-up research confirming that its specialist articles in areas such as health and psychology are often reasonable alternatives to professional sources — academia still treats Wikipedia with unwarranted scepticism. Many students trust it; most scholars do not.

I have witnessed this academic snobbery first-hand through a decade of service on the Wikimedia Foundation Board and through extensive research and publications on Wikipedia.

However, generative-artificial-intelligence systems trained heavily on Wikipedia are now threatening the future of this free, volunteer-driven resource. The stakes have changed — and academics must take note. Large language models offer instant, Wikipedia-derived answers without any attribution. When AI chatbots provide seemingly authoritative responses drawn from Wikipedia’s very pages, why would anyone navigate to the source, let alone contribute to it?

This parasitic relationship endangers the last bastion of freely accessible, human-curated knowledge and undermines the premise of collaboration on which many of Wikipedia’s knowledge-sharing practices rely.


Wikipedia represents something unprecedented: the only major platform on which truth emerges through transparent debate, rather than algorithmic opacity or corporate interests. Every edit is logged, every discussion archived. In an era of AI hallucinations, black-box algorithms and widespread disinformation, Wikipedia’s radical transparency has become even more essential.

Humans bring irreplaceable elements to knowledge creation that AI cannot replicate: the ability to discover archival materials, photograph under-documented places, engage in nuanced (even heated) debate about neutrality and verifiability, and understand cultural contexts across more than 300 language editions.

We academics, who claim guardianship of knowledge, have abdicated our responsibility to what is arguably the world's most-consulted reference work. We forbid students from citing Wikipedia while quietly turning to it ourselves, because we know that it significantly influences the scientific literature: papers receive a boost in citation counts and social impact after being cited on Wikipedia.