It can be hard to believe that an encyclopedia edited entirely by legions of faceless volunteers could prove to be so good at reporting the facts. But almost two decades after it sputtered into existence, Wikipedia has proven itself to be one of the web’s most trusted sources of information. It’s reliably excellent not only at all the stuff you’d find in an old, paper-bound encyclopedia, but also at the kinds of present-day political issues that tend to be the subjects of falsehoods and half-truths. Two edits are made every second on Wikipedia, and on pages for people like President Trump or Hunter Biden, volunteer editors wage a tireless battle against flocks of trolls and partisans wielding fake “facts” and partisan perspectives.
Against the backdrop of elections and an ever-quickening news cycle, when the slightest tweet can shape political views or shake markets, Wikipedia’s lessons about misinformation are only growing more vital. Social media companies like Facebook and Google already use Wikipedia links and information in their own fight against fake news, and now Twitter says it’s considering combating problematic tweets with a system that works “like Wikipedia.”
For this story, part of our series this week on Hacking Democracy, I explore what makes the site so resilient to a barrage of bad actors, and what its volunteer editors can teach Big Tech and the rest of us at a time of growing distrust. “The biggest threat,” says Muboshgu, a long-time editor, “is that we lose sight of what’s actually true.”