Wikipedia’s Army of Volunteers Is Its Superpower
The massive online encyclopedia is driven not by a CEO or board but by 250,000 volunteers who rule by democratic consensus and help fund the group.
November 25, 2025 | Read Time: 12 minutes
By Tamara Straus, Senior Editor
In 2018, Jessica Wade, a research associate in Imperial College London’s physics department, came across a video on a CIA Twitter account about a then-unknown Black woman scientist, Gladys West. Intrigued, Wade began digging. She discovered that West’s work at the Naval Surface Warfare Center was key to the development of GPS used in smartphones and navigation systems and decided the mathematician deserved an entry in the online encyclopedia Wikipedia.
“I hit publish at about 11 p.m. London time,” Wade says, “and by the morning someone added a photo.” From there, the Gladys West page grew organically. A few months later, the BBC featured West in its top 100 Women list. West – who was 87 then and is now 95 – was inducted into the Air Force Space and Missile Pioneers Hall of Fame in 2018 and received the Royal Academy of Engineering’s highest individual honor, the Prince Philip Medal, in 2021.
All because of a Wikipedia page, researched and written by a volunteer miffed that so many women scientists were absent from history. Since then, Wade has written 2,300 Wikipedia articles on women scientists and scientists of color in her “spare time.”
For almost 20 years, Wikipedia has ranked among the world’s top 10 most visited websites, currently attracting 15 billion visits a month to its 65 million articles in 343 languages. It’s the only site in the top 10 that is a nonprofit and the only one that protects privacy by not tracking or selling user data. The encyclopedia is also a growth story: After becoming a nonprofit project of the Wikimedia Foundation in 2003, its revenue ballooned from $80,000 in 2004 to $185 million in 2024, which it uses primarily for technology infrastructure, supporting volunteer communities, and covering administrative costs. Wikimedia has received big grants — one of the first was $3 million from the Alfred P. Sloan Foundation in 2008, followed by large donations from the Google, Omidyar, and Stanton foundations — and recently raised a $170 million endowment. But Wikipedia’s financial success is actually driven by individual donations that average $11, some of which come from volunteers like Wade.
“Those $11 donations make up about 85 to 90 percent of our budget,” says Lisa Seitz-Gruwell, president of the Wikimedia Endowment and deputy to the CEO of the Wikimedia Foundation.
Wikipedia’s strength lies in its vast army of volunteers and its unique structure. Unlike many volunteer-centric organizations, such as the Red Cross or ACLU, Wikipedia’s 250,000 monthly volunteer editors are not just additional hands; they are central to the nonprofit’s operation, even a reason for its existence. What one reads on Wikipedia — and how and why entries remain published, edited, or removed — are part of a collective decision-making process driven by volunteers, not by a CEO or board. Volunteers also collectively own the content they produce and ensure fellow contributors adhere to strict guidelines.
Because of this structure, the nonprofit is hard to sue even as it has been drawn into partisan political wars, with Tesla CEO Elon Musk launching a conservative alternative called Grokipedia and Senator Ted Cruz singling Wikipedia out as “woke.” Wikipedia volunteers and Wikimedia leaders believe the platform has become a mediating force against the misinformation and conspiracy theories that have overwhelmed social media. But even with 25 years of accomplishments under its belt, Wikipedia is facing one of its biggest challenges yet. AI companies scrape its massive trove of information to develop their large language models and also divert readers from the site, cutting into those ever-important small donations.
Chris Albon, Wikimedia’s director of machine learning and data, says AI can be an amazing tool, for Wikipedia and many other organizations. But the nonprofit’s strategy is not to replace human-created content; it’s to use AI for automating tedious tasks, allowing volunteers to focus on work requiring human judgment, deliberation, and consensus-building.
“At the end of the day,” Albon says, “this is not a technology product — this is a human knowledge project. The greatest resource Wikipedia has is the only thing that’s ever made Wikipedia great, which is all the volunteers who have spent billions and billions of hours of their time. After they put their kids to bed, after they make dinner and do laundry, they sit down and donate hours of their life.”
Driven by Community, Not the C-Suite
Co-founder Jimmy Wales could have taken the encyclopedia private when the site went viral in 2003, loaded it with ads and cookies, and become a billionaire. But as he’s explained in interviews and his recent book, The Seven Rules of Trust, he decided the site should remain a noncommercial, decentralized, community-driven project funded by donations. That way, it could make good on its vision “to create a world where every single human being can freely share in the sum of all knowledge.”
The road to that aspiration has not been without bumps. In the early days, Wikipedia was considered unreliable, belittled as user-generated junk. Around 2008, after the arrival of Sue Gardner, a Canadian Broadcasting Corporation journalist and the Wikimedia Foundation’s first executive director, the encyclopedia started to improve its accuracy through community policies — such as “neutral point of view,” “verifiability,” and “no original research” — enforced by volunteer editors who debated facts and corrected errors.
The user-generated approach, which attracted so much initial criticism, has turned out to be the organization’s strongest feature.

Today Wikipedia’s volunteer editors are organized in a loose global community, through smaller topic-specific “WikiProjects” (covering topics like AI-generated content and how to describe the Israeli–Palestinian conflict) and through elected roles that handle oversight and administration. Some specialized groups, like administrators and the arbitration committee, have elected roles with greater editorial control. The Wikimedia Foundation handles technological and financial support for the platform but does not edit the encyclopedia itself. Wikipedia acknowledges the encyclopedia should not be used as a primary source for research, though judging by its monthly visits — nearly double the global population — much of the world is using it as a starting point.
Wade, who runs Wikipedia “edit-a-thons” in Britain to show people how to contribute, says she hasn’t encountered problems with factual inaccuracy or bias. “Wikipedia is almost immediately fact-checked by a team of international editors, none of whom have any kind of skin in the game,” she says. “There’s a huge amount more fact-checking compared to popular science books.”
Yet Wikipedia is still far from the global encyclopedia that many of its volunteers and Wikimedia’s staff of 650 want it to be. Although there are increasing numbers of contributors and editors from small and developing countries, more than half of its pages are in European or Slavic languages. Over the past five years, to address that lack of diversity, Zdenko (Denny) Vrandečić, head of Wikimedia’s special projects, has been working on a machine-driven project called Abstract Wikipedia that aims to cultivate volunteer contributors from more of the world’s 7,000 languages.
The project translates simplified content — about, for example, the Olduvai Gorge, an archaeological site in northern Tanzania — into text that can be used across Wikipedia’s language editions. Instead of needing to add to the Olduvai Gorge page in English or German, contributors who speak Swahili or any of Tanzania’s 120 indigenous languages can provide information in their native tongue, adding to what is known about the site and decreasing the risk the information is out-of-date or biased.
“We don’t want the situation where knowledge is only coming from the Western world,” says Vrandečić, a prominent computer scientist who came to Wikipedia as a volunteer on its Croatian pages. “We really want to make sure that people from everywhere can contribute to the projects in their languages.”
Abstract Wikipedia was a finalist in the MacArthur Foundation’s 100&Change competition this year. Although it did not win the $100 million grand prize, the nonprofit intends to keep raising money for the project to train and bring more volunteers to Wikipedia.
“The next billion internet users will be coming online over the next five years,” says Maryana Iskander, Wikimedia Foundation CEO. “With Abstract Wikipedia, we’ll be able to share high-quality information with them and with billions more, regardless of the language they speak. And just as importantly, they will be able to share their knowledge on Wikipedia for people in all languages.”
Can 250,000 Volunteers Be Sued?
Despite its efforts to maintain neutrality, Wikipedia has been dragged into politics. In April, Ed Martin, then Washington, D.C.’s interim U.S. attorney, sent a letter that claimed Wikipedia was disseminating “propaganda” and threatened to rescind the tax-exempt status of the Wikimedia Foundation. In August, the House Oversight Committee launched an investigation into potential manipulation of Wikipedia articles for what it termed propaganda and for publishing “anti-Israel information.” In October, Senator Ted Cruz sent a letter to Iskander highlighting concerns about Wikipedia’s neutrality, alleging that its list of “reliable sources” favors left-wing outlets, that the foundation contributes to left-wing organizations, and that some coordinated editing campaigns on the platform have spread misinformation and propaganda.
“The reality is that threats have always been a fact of life for Wikipedia, whether it’s questions about privacy, our editors, or government policies that seem to go counter to the philosophy of free knowledge,” says Raju Narisetti, a longtime Wall Street Journal editor who is global publishing director of McKinsey & Company and a board member of the Wikimedia Foundation. “We have dealt with this before, and we will deal with it in the future.”
The group’s volunteer structure can insulate the organization from some critiques, but it can also be messy. For example, Wikipedia’s volunteer administrators decided to lock edits on a page called “Gaza genocide” because they could not agree on certain facts. Co-founder Jimmy Wales weighed in on the article’s discussion page, arguing it failed to meet standards of neutrality by stating in Wikipedia’s voice that Israel is committing genocide in Gaza.

Iskander and Wikimedia’s chief communications officer, Anusha Alikhan, say they spend a lot of time explaining to people how Wikipedia does and doesn’t work, pointing to the encyclopedia’s user-generated and -negotiated approach to knowledge creation, the “page history” that provides transparent documentation of every single edit, and the fact that Wikipedia’s content is owned by its individual contributors.
“Neutrality is one of its oldest and most fundamental principles, which means always striving to write articles based on reliable sources, fairly, proportionally, and without bias,” Iskander says. “I encourage anyone with concerns to engage constructively on Wikipedia to report violations or make suggested changes.”
Numerous Wikipedia analysts and insiders note that one of Wikipedia’s strengths is that it’s hard to sue because the Wikimedia Foundation doesn’t write, own, or control the content. Pressure, whether from governments or critics, often fails to stick.
“Wikipedia is uniquely positioned to resist this kind of pressure because ultimately the users have all say on the rules and the content,” says Sverrir Steinsson, a University of Toronto public policy professor who studies Wikipedia. “So governments or businesses or private actors who want to use lawsuits against Wikipedia to change content — those things have not, at least from what I’ve seen, had much of an impact.”
While social media has been swamped with misinformation, conspiracy theories, and rabbit holes of extremism, Steinsson’s research shows that Wikipedia’s content has been more reliable and reflective of scientific consensus. “The people engaging with one another on Wikipedia — while they do engage in conflict — there is a deliberative, democratic aspect,” he says. “They have to engage with each other in a civil manner, which seems to contrast with how people talk to each other on Twitter or how they argue with one another in Facebook groups.”
“The nonprofit is keenly aware of the importance of its volunteers and sees its mission as supporting the work of the volunteers rather than the other way around,” says Stephen LaPorte, general counsel of the Wikimedia Foundation. “The better work that volunteers do and the more content governance decisions that can be handled independently by volunteers, the better positioned the legal department is to defend and support the projects.”
Racing With and Against AI
Wikipedia’s sea of volunteers has spent the last quarter century creating a vast and structured trove of information. The nonprofit argues this feat has been a boon for humanity, unlocking information about the world for everyone. But in doing so, it has become one of the largest sources of training data for generative AI tools.
In April, the Wikimedia Foundation warned that heavy AI-driven scraping was overloading Wikipedia’s systems. The organization reported that since January 2024, bandwidth consumed by AI bots had jumped by 50 percent, creating increasing risks and expenses.
One solution has been to charge tech companies for scraping information used to build their highly profitable AI models. Revenue from Wikimedia Enterprise, a data service launched in 2021, helps the nonprofit pay for its six independent server farms and cover its engineering and product staff, who make up close to 60 percent of employees. Deals with some of the biggest tech companies have brought in about 5 percent of the foundation’s 2024-25 revenue, says Wikimedia Endowment President Seitz-Gruwell.
AI is also eating into the organization’s formidable reach. Wikipedia reported an 8 percent drop in human visitors from March to August 2025 compared with the same months in 2024. Wikipedia is becoming “more vital but less visible,” says Alikhan, a shift that creates two new problems. First, the reduced human traffic jeopardizes the nonprofit’s fundraising model, she says, which depends on people coming to the site and giving an average gift of $11.
Second, the drop in readers shrinks the pool of people who might get inspired to become Wikipedia editors. “‘Knowledge is human’ is our tagline, and that’s the message we’re trying to send to companies like OpenAI,” Alikhan says. “And not only knowledge is human, knowledge needs humans.”
Seitz-Gruwell says part of Wikipedia’s aim going forward will be to serve as a “human check” on knowledge increasingly created by AI and to keep “reaching people who the big for-profit companies have no interest in reaching … because they can’t pay.”
“Wikipedia can teach all of us about resiliency and adaptability,” says Iskander, who steps down as the organization’s fourth leader in January. “It’s faced countless changes and challenges, and it’s endured by keeping its mission and values as the north star. I’m certain whatever lies ahead, Wikipedia and its movement will keep on doing what they do best: providing reliable, trustworthy, human-created, and moderated knowledge for all.”
Reporting for this article was underwritten by a Lilly Endowment grant to enhance public understanding of philanthropy. The Chronicle is solely responsible for the content. See more about the Chronicle, the grant, how our foundation-supported journalism works, and our gift-acceptance policy.