Wikipedia stands as a monumental online resource, shaping our collective understanding of nearly every subject imaginable. It functions as the internet’s closest approximation to a public utility. Established in 2001 by Larry Sanger and Jimmy Wales, Wikipedia has always operated as a nonprofit, relying on a vast network of mostly anonymous volunteer editors. The platform is built on principles of courteous engagement and transparent change-making, earning it the reputation of being ‘the last best place on the internet.’
However, in recent times, Wikipedia has found itself under fire. Figures like Elon Musk, congressional Republicans, and prominent right-wing influencers frequently accuse the site of bias. (Interestingly, even co-founder Larry Sanger now echoes this sentiment.) This ongoing friction mirrors broader societal debates about consensus, civility, shared reality, and the very nature of truth and facts.
Now, Jimmy Wales is releasing his new book, ‘The Seven Rules of Trust: A Blueprint for Building Things That Last.’ In this work, Wales endeavors to apply Wikipedia’s foundational principles to our increasingly divided and trust-deficient global landscape. I had the opportunity to discuss with Wales what he believes makes Wikipedia so effective, the various challenges it currently faces, and his unwavering conviction in humanity’s inherent good faith. It’s important to note that our conversation took place several weeks before a distressing incident involving an armed individual at a Wikipedia conference in Manhattan. More information about that event can be found in our related coverage.
This is a very precarious moment for trust, and your new book is all about that. Big picture: How would you describe our current trust deficit? I differentiate between the turmoil in politics, journalism, and culture wars, and the realities of daily life. In our everyday interactions, people generally still trust one another. Most believe that others are fundamentally good, navigating life with genuine intentions. Yet, the deep crisis we observe in political institutions, journalism, and business stems from distinct sources and is a problem that can be actively addressed.
One reason you can be an authority on this is that you created something that scores very high on trust. Wikipedia, in my eyes, isn’t perfect. But a significant part of why people place their trust in us is our commitment to transparency. You’ll often see banners at the top of pages indicating, for instance, ‘The neutrality of this page has been disputed’ or ‘The following section doesn’t cite any sources.’ Users appreciate this honesty. Few other platforms today are willing to openly admit when they aren’t entirely certain about something.
Wikipedia is famously open-source. It’s decentralized and essentially run by thousands of volunteer editors. You don’t run Wikipedia. It runs me. [Laughs]
How do those editors resolve disputes when they don’t agree? Consider a contentious topic like abortion. Our approach is to report on the existence of the dispute itself. So, instead of declaring abortion a sin or a human right, we would state, ‘The Catholic Church maintains this position, and critics have responded with these arguments.’ I believe this is what readers truly seek. They don’t want a single, biased viewpoint; they want to grasp the various arguments and understand all sides of an issue.

Every page has what’s called a talk tab, where you can see the history of the discussions and the disputes, which relates to another principle of the site: transparency. Exactly. Often, you can delve into the talk page, review the historical debate, and even contribute your perspective: ‘Actually, I still believe you’re mistaken. Here are additional sources and information.’ You might even propose a compromise. In my experience, even highly ideological individuals on both sides are often more comfortable with this process because they are secure in their convictions. I find that those who lack confidence in their own values and beliefs—many of whom are active on platforms like X—tend to react with fear, panic, or anger when confronted with disagreement, rather than engaging in a constructive exchange.
One of Wikipedia’s superpowers can also be a vulnerability. Human editors can be threatened, even though they’re supposed to be anonymous. You’ve had editors doxxed, pressured by governments to doctor information. Some have had to flee their home countries. I’m thinking of what’s happened in Russia, India, where those governments have really taken aim at Wikipedia. Would you say this is an expanding problem? Yes, I would. We are witnessing a global surge in authoritarian tendencies toward censorship and control over information. These efforts frequently appear disguised as noble causes, such as ‘protecting children.’ However, Wikipedians exhibit remarkable resilience and courage. In many instances, the problem stems from a profound misunderstanding among politicians and leaders about Wikipedia’s operational structure. Many mistakenly assume it’s centrally controlled by the Wikimedia Foundation—the charity I established to own and operate the website—and thus believe they can exert pressure. The community, however, possesses genuine intellectual independence. Nevertheless, I am deeply concerned about our volunteers who face dangerous circumstances, and ensuring their safety is paramount.
I want to bring up something that just happened here in the United States. In August, James Comer and Nancy Mace, two Republican representatives on the House Oversight Committee, wrote a letter to Wikimedia requesting records, communications, and analyses concerning specific editors, as well as any reviews of bias regarding the state of Israel in particular. They are, and I’m going to quote here, “investigating the efforts of foreign operations and individuals at academic institutions subsidized by U.S. taxpayer dollars to influence U.S. public opinion.” Can you tell me your reaction to that query? We responded to the reasonable portions of their request. We believe there’s a significant misunderstanding about how Wikipedia functions. The notion that an alleged bias is a legitimate subject for a congressional investigation is, frankly, absurd. Regarding their questions about clandestine activities, we simply won’t have anything useful to provide. I know the Wikipedians—they are a group of well-intentioned enthusiasts.
The Heritage Foundation, the architect of Project 2025, has said that it wants to dox your editors. How do you protect people from that? It’s quite embarrassing for the Heritage Foundation; I recall when they were intellectually respected.
But it does seem as if there is this movement on the right to target Wikipedia, and I’m wondering why you think that’s happening. It’s difficult to pinpoint an exact reason. Some of it might stem from genuine concern if they perceive Wikipedia as biased. For example, Elon Musk has argued that Wikipedia is biased due to its stringent rules about citing only mainstream media, asserting that mainstream media itself is biased. This is an interesting critique, worthy of reflection by everyone, including the media. Additionally, in various parts of the world, not just the U.S., facts can be perceived as threatening. If your policies or beliefs clash with established facts, then you might find it uncomfortable for people to simply present those facts. However, we are not going to concede by saying, ‘Perhaps science isn’t valid after all. Maybe the COVID vaccine killed half the population.’ No, it didn’t; that’s absurd, and we will not publish such claims. They will have to come to terms with that.

I want to talk about a recent example of a controversy surrounding Wikipedia, and that’s the assassination of Charlie Kirk. Senator Mike Lee called Wikipedia “wicked” because of the way it had described Kirk on its page as a far-right conspiracy theorist. I went to look, and at the time that we’re speaking, that description is now gone. Those on the left would say that description was accurate. Those on the right would say it was biased. How do you see that tension? The correct approach is to address all facets of the issue. The least controversial statement about Charlie Kirk is that he was a controversial figure. I don’t believe anyone would dispute that. To present him as someone who was both a hero to many and demonized by others, holding views that diverged from mainstream scientific thought but aligned closely with certain religious perspectives—these are the details we strive to describe effectively, as I believe we have in this instance. If you knew nothing about Charlie Kirk beyond his assassination, you’d want to learn about his supporters, their reasons, his arguments, and what he said that caused offense. That’s simply part of understanding the world.
So those words that were there — “far right” and “conspiracy theorist” — those were, in your view, the wrong words, and the critics of Wikipedia had a point? Well, it depends on the specific criticism. If the critique is merely that a certain word appeared on a page for 17 minutes, one must understand Wikipedia’s nature: it is a continuous process, a discourse, a dialogue. However, to the extent that prominent individuals labeled him a conspiracy theorist, that is part of his documented history. While Wikipedia itself shouldn’t necessarily label him that, we must certainly document the fact that others did.
You mentioned Elon Musk, who has come after Wikipedia. He calls it Wokepedia. He’s now trying to start his own version of Wikipedia called Grokipedia. And he says it’s going to strip out ideological bias. I wonder what you think attacks like his do for people’s trust in your platform. Because as we’ve seen in the journalism space, if enough outside actors are telling people not to trust something, they won’t. It’s incredibly difficult to predict. For many, their trust in Elon Musk is quite low given his frequent outlandish statements. When he attacks us, people tend to donate more money. While it’s not my preferred fundraising method, the reality is that many react very negatively to his behavior. One point I emphasize in my book, and one I’ve shared with Elon Musk himself, is that this type of attack is self-defeating, even if you agree with his sentiment. If he falsely convinces people that Wikipedia has been infiltrated by ‘woke’ activists, two outcomes emerge: our genuinely kind and thoughtful conservative contributors—whom we highly value and wish to attract more of—will disengage, believing the platform is compromised. Conversely, actual ‘woke’ activists might mistakenly see it as their ideal haven to vent their frustrations against the world. We don’t desire either of these scenarios.
You said you talked to Elon Musk about this. When did you talk to him about it, and what was that conversation like? We’ve had various exchanges over the years. He texts me sometimes, and I text him. He is notably more respectful and reserved in private, which is to be expected given his prominent public persona.
When was the last time you had that exchange? That’s a good question. I’m not sure. I believe it was the morning after the last election; he texted me then, and I congratulated him.
A more recent dispute arose over the hand gesture he made, which was interpreted in different ways; he was upset about how it had been characterized on Wikipedia. I heard from him after that. In that specific instance, I pushed back because I checked Wikipedia’s entry. It was purely factual: it reported that he made the gesture, it received significant news coverage, many interpreted it in a certain way, and he denied it was a Nazi salute. I fail to see why anyone would be upset with such a straightforward presentation. If Wikipedia had stated, ‘Elon Musk is a Nazi,’ that would be absolutely incorrect. But to simply document, ‘He made this gesture, it garnered considerable attention, and some people said it resembled a Nazi salute’? That’s precisely what Wikipedia ought to do.

Do you think Elon Musk is acting in good faith? You’re saying that in private he’s nice and cordial, but his public persona is very different. I think it’s a futile exercise to try and decipher what goes on in Elon Musk’s mind, so I won’t attempt it.
I don’t mean to press you on this, but I’m just trying to refer to something that you said, which is that people, human to human, are nice, that we should assume good faith. And so you’re saying that Elon, one on one, is lovely. But he is attacking your institution and potentially draining support for Wikipedia. I don’t believe he possesses the power he attributes to himself, or that many others believe he does, to harm Wikipedia. We will endure for centuries; he will not. As long as we remain true to Wikipedia’s essence, people will continue to value us. All the external noise and all these rants are not the core of what we do. The true value lies in genuine human knowledge, authentic discourse, and a sincere effort to grapple with the complex issues of our time. That is extraordinarily valuable. Therefore, I hope Elon will reconsider and change his perspective; that would be beneficial. In the meantime, I don’t think we need to dwell on it.
Why do you think the internet didn’t go the way of Wikipedia — collegial, working for the greater good, fun, nerdy? I’m old enough to have experienced the internet in the Usenet era, which was essentially a massive message board—somewhat akin to today’s Reddit, but designed to be decentralized, uncontrollable, and largely unmoderatable. It gained notoriety for being incredibly toxic. So, I believe some of these challenges are simply inherent human issues. However, since we now live so much of our lives online, the impact is, of course, far greater.
You chose at a certain point to make Wikipedia a nonprofit. You chose not to capitalize on the success of Wikipedia. OpenAI started as an “open source for the greater good” project, kind of like Wikipedia. And they’ve now shifted into being a multibillion-dollar business. I’d love to know your thoughts on that shift for OpenAI, but more broadly, do you think the money really changed the equation? I absolutely believe that money fundamentally altered the dynamic in numerous ways. While there’s nothing inherently wrong with for-profit companies, even as a nonprofit, you must have a sustainable business model to cover operational costs. For Wikipedia, this isn’t an overwhelming burden; we don’t require billions to function. The community-driven development of Wikipedia, for example, wouldn’t necessarily thrive if the board were composed of investors primarily concerned with profitability. My most memorable tweet was in response to a New York Post journalist who suggested Elon Musk ‘should just buy Wikipedia’ when he was criticizing us. I simply replied, ‘Not for sale.’ It was very popular, and indeed, it isn’t for sale. I like to imagine I’d decline a $30 billion offer from Elon if I personally owned the entire entity. But would I, truly? This hypothetical is moot because we are a charity; neither I nor the board receive payment. I firmly believe this financial independence is crucial for our autonomy. We simply don’t operate with profit in mind, nor are we interested in it.
The co-founder of Wikipedia, Larry Sanger, gave an interview to Tucker Carlson that’s getting a lot of attention on the right. In the past he’s called Wikipedia “one of the most effective organs of establishment propaganda in history,” and he believes Wikipedia has a liberal bias. In this interview, and on his X feed, he’s advocating what he’s calling reforms to the site, which include “reveal who Wikipedia’s leaders are” and “abolish source blacklists.” I just wonder what you make of it. I haven’t watched it; I can’t tolerate Tucker Carlson. So, I cannot comment on the specifics. However, the notion that every source holds equal validity, and that Wikipedia’s prioritization of mainstream media and reputable newspapers and magazines is somehow flawed, is not something I will apologize for. A core belief of mine is that Wikipedia must always be open to criticism and adaptation. Thus, if a critique asserts that Wikipedia exhibits bias and highlights systemic flaws, we should take it seriously. We should ask ourselves, ‘Is there a path to improving Wikipedia? Is our current mix of editors appropriate?’ Simultaneously, we are building for the long term, and the only way to sustain that longevity is not by capitulating to the fleeting outrage of the moment, but by steadfastly upholding our values and maintaining our trustworthiness. We will simply continue our work, doing it to the best of our ability. I don’t see what other option we have.
This interview has been carefully edited and condensed from two separate conversations. The video’s director of photography was Zackary Canepari.