Wikimedia CEO on facts, hoaxes and the promise of Wikipedians

CEO Katherine Maher has spent the past five years leading the Wikimedia Foundation through an environment of growing disinformation. Photo submitted by Wikimedia Foundation


Despite leading Wikipedia and its parent organization for the past five years, a period in which online disinformation has skyrocketed, outgoing Wikimedia Foundation CEO and executive director Katherine Maher says she’s optimistic about the web resource’s future.

The site is thriving thanks to its diverse community of contributors, known as “Wikipedians,” Maher said Thursday.

“When you meet Wikipedians from every corner of the globe, from every background imaginable, from grannies to 13-year-old students, from Kazakhstan to Uruguay to Korea, you are reminded of all the things that unite us, the ways in which our diversity enriches what we know,” Maher said during a Conversations event hosted by Canada’s National Observer.

“I leave knowing that one of the things that makes us human is our curiosity, and one of the things that is the record of our curiosity is our knowledge.”

The event, hosted by Canada’s National Observer editor-in-chief and founder Linda Solomon Wood, focused on how Wikipedia has dealt with disinformation and changing information ecosystems given its status as a first-line research resource for billions around the world.

Outgoing Wikimedia CEO Katherine Maher spoke with Linda Solomon Wood on March 18, 2021.


Maher recalled a 2005 hoax targeting the late John Seigenthaler, longtime publisher of The Tennessean newspaper, as the moment Wikipedia realized it could be hijacked to help spread disinformation. An anonymous contributor had created a Wikipedia article suggesting Seigenthaler was suspected of assassinating U.S. president John F. Kennedy. A friend of Seigenthaler’s discovered the article, and a Wikipedia editor removed it.

“(That incident) led Wikipedia to realize that we’d gone from being an experiment to really something that had an impact on the public discourse,” said Maher. That experience forced Wikipedia to “tighten up” a number of its policies around biographies of living people, given the potential ramifications on a person’s life. “That really set the stage for a close appreciation (among) Wikipedia editors for ‘What does it mean to hold the responsibility of not just being this public, free resource, but also perhaps the primary resource in many instances?’”

Much has changed since 2005. Quickly stymied bad-taste hoaxes have given way to a destructive and largely unstoppable global disinformation machine that spreads radical far-right conspiracy theories like QAnon alongside COVID-19 and climate change denial. Where a rogue Wikipedia article in the early 2000s might reflect poorly on an individual, now it could influence one of the site’s billion monthly readers.

Maher said although Wiki policies have been developed to combat this manipulation, tactics used to spread disinformation are likewise becoming more advanced. “They tend to be sophisticated technical efforts to circumvent the policies of Wikipedia, and they tend to be employed by people who have resources to do so,” she said.

Wikipedia has for the most part kept this disinformation machine in check via a growing international roster of editors and contributors, Maher said. To assist editors in identifying bad actors — who, Maher said, range from paid public relations experts to government and non-government personnel alike — an algorithm tracks edits and flags contributions from suspicious or new accounts that haven’t previously been part of the Wikipedia system.

“We don’t always get it right,” Maher admitted. “But by and large, as soon as something comes to the attention of the Wikipedia editing community or the public, editors are extremely responsive and are able to not only go in and lock that article down, but also to correct the record and make sure that it’s reverted to the most accurate and most recent form.”

Maher also explained that despite Wikipedia’s efforts in recent years to grow and diversify its editor and contributor community, only 18 per cent of the site’s notable biography pages are about women. Maher tied the figure to a 2015 London School of Economics study that found media outlets cited men as expert sources five out of six times, a pattern that stunts the public record of women experts and leaders.

To address this gap, Maher said one of the Wikimedia Foundation’s policy targets for 2030 is knowledge equity, an initiative to map and fill gaps in Wikipedia’s knowledge base. Maher said the foundation offers grants to support this work. “What are the ways in which we can support communities in really thinking about proactively redressing gaps and exclusion that have both been structural because of lack of access to time, electricity, computers and also intentional and cultural aspects of power?”

Maher described this campaign, and Wikipedia’s push for trustworthy and open information, as “a lifetime of work” rather than a one-stop fix. “We’re not always going to get it right,” said Maher. Being honest with readers in order to build trust, she explained, is among the most useful strategies in fighting disinformation.

Maher said the immediate goal was not to “win the war” against disinformation, but simply “to do our best, every day.”
