Wikipedia, arguably the internet’s most vital public resource, has shaped our collective understanding since its founding in 2001 by Larry Sanger and Jimmy Wales. Operating as a nonprofit, it thrives on a decentralized network of anonymous volunteers who adhere to strict guidelines for cordial engagement and transparent editing – a model that has earned it the reputation of being ‘the last best place on the internet’.
However, this digital commons now finds itself at the heart of a storm, frequently targeted by figures like Elon Musk, congressional Republicans, and various right-wing influencers (including co-founder Sanger himself), all alleging bias. This escalating tension mirrors wider societal debates about how we achieve consensus, maintain civility, and navigate the complex landscape of truth and facts.
In response, Jimmy Wales offers his insights in a new book, ‘The Seven Rules of Trust: A Blueprint for Building Things That Last.’ Set to be released this month, the book explores how Wikipedia’s principles can foster trust in our fragmented society. We sat down with Wales to discuss his vision for Wikipedia, the challenges it confronts, and his unwavering belief in humanity’s fundamental good faith. (Editor’s Note: This interview was conducted weeks before a recent incident where an armed individual disrupted a Wikipedia conference in Manhattan.)
Q: Your new book addresses the precarious state of trust today. How do you characterize this widespread ‘trust deficit’?
A: I differentiate between the swirling chaos of politics, journalism, and cultural battles, and the quiet trust that still underpins our daily lives. Most people, day-to-day, believe others are inherently good, simply trying their best. The current crisis of trust in institutions—politics, media, business—stems from identifiable issues, and I believe we possess the tools to mend it.
Q: As the creator of Wikipedia, a platform widely recognized for its high level of public trust, you speak from a unique position. How do you foster that trust?
A: Wikipedia, in my eyes, is always a work in progress. Our commitment to transparency is a significant factor in why people trust us. Readers often see banners indicating, for example, ‘The neutrality of this page has been disputed’ or ‘This section lacks citations.’ This openness, a willingness to admit uncertainty, is rare and deeply appreciated in today’s information landscape.
Q: Wikipedia’s open-source, decentralized nature means it’s powered by thousands of volunteer editors. You don’t ‘run’ it, do you?
A: Oh no, it runs me! [Chuckles]
Q: How do these volunteer editors navigate and resolve disagreements on contentious subjects?
A: On highly controversial topics, such as abortion, our approach is to meticulously document the dispute itself. Instead of advocating for one side (e.g., ‘abortion is a sin’ or ‘abortion is a human right’), we present each viewpoint neutrally, stating, ‘The Catholic Church maintains this position, while critics offer these counterarguments.’ I firmly believe readers seek a comprehensive understanding of the various perspectives, not just a single narrative. They want to grasp the full spectrum of the debate.

Q: The ‘talk’ tab on every Wikipedia page, showcasing the history of discussions and disagreements, embodies the site’s commitment to transparency.
A: Precisely. These talk pages allow users to delve into past debates, contribute new perspectives, or even suggest compromises. My observation is that even highly ideological individuals often find common ground there, secure in their convictions. It’s often those less confident in their own beliefs – many of whom you’ll encounter on platforms like X – who react with fear, panic, or anger to differing opinions, rather than engaging in a reasoned articulation of their own stance.
Q: Wikipedia’s strength in human-driven editing also presents a vulnerability, as anonymous editors face threats, doxxing, and governmental pressure, even forcing some to flee their countries, as seen in Russia and India. Is this a growing concern?
A: Absolutely. We’re observing a global surge in authoritarian tendencies towards censorship and information control, often cloaked in benign rhetoric like ‘protecting children.’ Despite this, Wikipedians demonstrate remarkable resilience and courage. Much of this pressure stems from a fundamental misunderstanding by politicians and leaders about Wikipedia’s operational model. They often mistakenly believe it’s centrally controlled by the Wikimedia Foundation—the charity I established to manage the website—and thus susceptible to external influence. However, our community maintains genuine intellectual independence. Ensuring the safety of our volunteers in perilous situations is a paramount concern for us.
Q: In the U.S., Republican Representatives James Comer and Nancy Mace recently sent a letter to Wikimedia, seeking records and analysis on editors and bias concerning Israel, citing concerns about ‘foreign operations and individuals at academic institutions subsidized by U.S. taxpayer dollars to influence U.S. public opinion.’ What’s your take on this?
A: We’ve addressed the reasonable components of their inquiry. This situation, however, highlights a profound lack of understanding regarding Wikipedia’s functioning. The notion that perceived bias on our platform warrants a congressional investigation is, quite frankly, preposterous. As for ‘cloak-and-dagger’ activities, we have nothing useful to report; our Wikipedians are simply a community of dedicated, good-natured enthusiasts.
Q: The Heritage Foundation, known for Project 2025, has reportedly threatened to expose your editors. How do you safeguard them?
A: This is quite embarrassing for the Heritage Foundation; I recall a time when they were intellectually esteemed.
Q: There appears to be a concerted effort from the political right to target Wikipedia. What do you believe is driving this?
A: It’s complex. Some criticisms might stem from genuine concern, perceiving bias in our content. Elon Musk, for instance, argues that Wikipedia’s strict reliance on mainstream media, which he deems biased, results in a skewed perspective. This is a critique that deserves consideration from all media outlets. Elsewhere, and not just in the U.S., facts themselves are under attack. If your policies clash with established facts, then straightforward explanations can become uncomfortable for some. But we will not capitulate to misinformation. We’re not going to entertain notions like ‘science isn’t valid’ or ‘the Covid vaccine killed half the population.’ That’s absurd, and we simply won’t publish it. They’ll just have to accept that.
Q: Let’s address the recent controversy surrounding the assassination of Charlie Kirk. Senator Mike Lee harshly criticized Wikipedia for initially labeling Kirk as a ‘far-right conspiracy theorist,’ a description that has since been removed. While some on the left considered this accurate, those on the right decried it as biased. How do you navigate such conflicting viewpoints?
A: The proper approach is to acknowledge the full scope of the issue. Undeniably, Charlie Kirk was a controversial figure. He was seen as a hero by many and a villain by others. His views often diverged from mainstream scientific thought but resonated strongly with certain religious perspectives. Our role, which I believe we fulfilled in this instance, is to objectively describe all these facets. For someone unfamiliar with Kirk, our page should provide a comprehensive overview: who he was, why he garnered support, his arguments, and the statements that provoked controversy. It’s about enabling a complete understanding of a public figure within the context of their world.
Q: So, regarding the terms ‘far right’ and ‘conspiracy theorist’ used on Kirk’s page, do you concede that critics had a valid point about their inappropriateness?
A: It hinges on the nature of the criticism. If the issue is that a certain word was present for a brief period, one must remember Wikipedia’s dynamic, process-driven nature—it’s a continuous discourse. However, if prominent individuals have indeed labeled him a ‘conspiracy theorist,’ that becomes a verifiable part of his public narrative. While Wikipedia itself shouldn’t adopt such labels, we absolutely must document that these accusations were made.
Q: Elon Musk has repeatedly criticized Wikipedia, labeling it ‘Wokepedia’ and attempting to launch his own ‘Grokipedia’ to eliminate ideological bias. How do you perceive the impact of such attacks on public trust in Wikipedia, especially given how external narratives can erode confidence, as seen in journalism?
A: It’s difficult to quantify. Many people have very low trust in Elon Musk due to his frequent outlandish statements. Ironically, his attacks often spur increased donations to Wikipedia. While not my preferred fundraising method, it reveals a strong negative public reaction to his rhetoric. As I’ve told him personally, this kind of attack is ultimately counterproductive, even for those who align with his views. If people are falsely led to believe Wikipedia is overrun by ‘woke’ activists, two outcomes emerge: well-meaning conservatives, whom we genuinely want to engage, will disengage, thinking it’s a lost cause; and actual ‘woke’ activists might see it as an invitation to inject their agendas, which we also actively discourage.
Q: You mentioned discussing this with Elon Musk. Can you elaborate on when these conversations occurred and their nature?
A: We’ve engaged in various discussions over the years, often via text. He tends to be far more respectful and subdued in private, as one might expect from someone with such a prominent public persona.
Q: When was your most recent exchange?
A: That’s a good question. I believe it was the morning after the last election; he texted me, and I congratulated him.
Q: More recently, a controversy arose from a hand gesture Elon Musk made, which was interpreted in various ways and led to his dissatisfaction with its depiction on Wikipedia.
A: I did hear from him regarding that. In that instance, I defended our entry after reviewing it. Wikipedia’s account was purely factual: it stated he made the gesture, noted the extensive media coverage and diverse interpretations, and included his denial of it being a Nazi salute. I fail to see how such an objective presentation could cause offense. If Wikipedia had declared ‘Elon Musk is a Nazi,’ that would be fundamentally incorrect. But to simply report, ‘He made this gesture, it garnered significant attention, and some perceived it as a Nazi salute,’ is precisely what Wikipedia should do.
Q: Considering your observation that Elon Musk is personable in private but dramatically different publicly, do you believe he acts in good faith?
A: Attempting to decipher Elon Musk’s motivations strikes me as a futile exercise, so I won’t endeavor to do so.
Q: Not to press the point, but you previously stated that people, on a human level, are generally kind and that we should assume good faith. Yet despite your private interactions, Elon Musk publicly attacks your organization, potentially undermining support for Wikipedia.
A: I honestly don’t believe he possesses the power that he, and many others, attribute to him to truly harm Wikipedia. We will endure for centuries; he will not. As long as we remain true to Wikipedia’s core principles, public affection for us will persist. All the external clamor and incessant ranting are mere distractions. The true essence lies in authentic human knowledge, meaningful discourse, and a sincere engagement with the complex issues of our time. That, fundamentally, is invaluable. I genuinely hope Elon reconsiders his stance. In the interim, I see no need for us to dwell on it excessively.
Q: Why do you think the internet, as a whole, didn’t evolve in a more Wikipedia-like fashion—more collegial, community-driven, and intellectually curious?
A: Having experienced the early internet, particularly Usenet—a vast, decentralized message board akin to today’s Reddit, but largely unmoderated by design—I recall its notorious toxicity. So, I believe some of these challenges are simply inherent human issues. The difference today is that our lives are so deeply intertwined with online spaces, amplifying the impact significantly.
Q: At a crucial juncture, you opted to establish Wikipedia as a nonprofit, foregoing significant commercial capitalization. Companies like OpenAI began with a similar ‘open source for the greater good’ ethos but have since transitioned into multibillion-dollar enterprises. What are your thoughts on this shift, and more broadly, do you believe financial motives fundamentally alter such projects?
A: Yes, I believe money profoundly shifts the equation in many ways. While there’s nothing inherently wrong with for-profit ventures, even a nonprofit must establish a sustainable business model. For Wikipedia, our operational costs are manageable; we don’t need billions to thrive. The community-driven development that defines Wikipedia would likely be compromised if our board prioritized investor profitability. My most popular tweet was a simple ‘Not for sale’ in response to a journalist suggesting Elon Musk buy Wikipedia. It is genuinely not for sale. I like to imagine I’d refuse a $30 billion offer if I owned it outright, but that hypothetical is irrelevant because we are a charity. Neither I nor the board are compensated in that manner, and that financial independence is crucial to our mission. We simply don’t operate with commercial interests.
Q: Wikipedia co-founder Larry Sanger recently spoke with Tucker Carlson, garnering considerable right-wing attention. Sanger has previously labeled Wikipedia ‘one of the most effective organs of establishment propaganda in history,’ alleging a liberal bias. He’s now proposing reforms like ‘reveal who Wikipedia’s leaders are’ and ‘abolish source blacklists.’ What’s your reaction to his current stance?
A: I haven’t watched the interview; I simply cannot tolerate Tucker Carlson. So, I can’t comment on the specifics. However, the premise that all sources are equally valid, or that Wikipedia is wrong to prioritize mainstream media, quality newspapers, and magazines while making editorial judgments, is something I will not apologize for. A core tenet of mine is that Wikipedia must always be open to criticism and adaptation. Therefore, if a critique identifies a specific bias or systemic flaw, we take it seriously. We ask: ‘Can we improve Wikipedia? Is our editor composition optimal?’ Simultaneously, our long-term vision demands that we don’t pander to fleeting public outrage. Our survival depends on upholding our values and trustworthiness. We will continue our work, striving for excellence; there’s little else to be done.
This interview has been edited and condensed from two separate conversations. You can listen to and follow ‘The Interview’ on major podcast platforms. (Video direction by Zackary Canepari).