Wikipedia may be the largest compendium of human knowledge ever created, but can it survive?
As the website turns 25, it faces myriad challenges from regulators, AI, the far right and Elon Musk
Darren Loucaides
The noise was rising in the main auditorium of Civic Hall near Union Square, Manhattan, as old friends and complete strangers greeted each other. I took a seat near the front. The crowd was eclectic: hipster-geeks with purple or green hair, young hackers in hoodies, boomer techies in tight T-shirts. More of them were male than female, but not by much. Everyone was excited, friendly and just a little awkward. It was like one part Star Trek convention, one part alternative music concert.
This was WikiCon North America, an annual gathering of Wikipedians, as those who contribute to the online encyclopedia are known, staged here last October to mark New York City’s 400th anniversary. Wikipedia’s own major milestone was round the corner: on January 15 2026, the website would turn 25.
Maryana Iskander, CEO of Wikimedia Foundation, the non-profit that runs Wikipedia, took to the stage. Dressed in a red trouser suit and white trainers, Iskander was two minutes into her speech when a tall young man draped in a multicoloured flag approached her on stage. He was holding something aloft, pointed at the ceiling.
“I’m an anti-contact, non-offending paedophile,” he addressed the audience. (“What the fuck!” came an outraged shout.) Raising his voice, the man went on, “I’m going to kill myself to protest Wikipedia’s ‘don’t ask, don’t tell’ policy . . . ” (Wikipedia’s policy is in fact that editors “who identify themselves as paedophiles will be blocked and banned indefinitely.”)
The penny dropped: what the man was holding above Iskander was a gun. Things happened quickly. An organiser on stage moved towards him. The pistol swept the crowd for an instant. Everyone ducked behind chairs. The organiser was joined by another and together they wrestled the man to the ground. They stayed wrapped around him, holding his still-firmly-gripped pistol away from the audience as we filed out. The gun, as later revealed by police, was loaded.
As we crammed into the next-door building, there were a few tears and at least one person having a panic attack, as well as some gallows humour: “That might be the only excitement here,” said a guy in a baseball cap. Later, someone pointed out that this is the sort of thing that gets updated in real time on Wikipedia; indeed within minutes there was a new edit about the incident on the event’s page.
The NYPD arrived. A bomb squad was on its way. When the gunman was brought out of the building and placed in the back of a police car, he looked impassive, almost emotionless. An editor who goes by Barkeep49 told me that it was inevitable that Wikipedia attracted people from extreme niches. “I think we have seen in the world that we’re living in, we’ve been reminded just how dangerous free information can be,” he added. We used to think of free information as a democratising force, innately good, but “that’s not the world that we have [now]”.
A couple of hours later, I sat down with Iskander. The 50-year-old, Cairo-born former COO of Planned Parenthood was tough and leaderly. She deflected my questions about how she was feeling. “Are you OK?” she asked, having noticed me sitting only a couple of rows away in the audience when the incident happened. “On the one hand, the thing that makes Wikipedia truly magical is that it’s open to anyone who shares our vision and our values,” opined Iskander, who is stepping down on January 20 and had quipped onstage about giving a speech on her way out the door. The incident is an example of the tension that can emerge “when a thing can belong to everyone and no one all at the same time”. It shouldn’t paint a whole community. Her phone lit up with the name of Wikipedia’s founder, Jimmy Wales. “Hey, Jimmy, I’m fine,” she said, excusing herself. “I am absolutely fine. Absolutely fine. Everybody, everybody is fine.”

This wasn’t, of course, the story I came to New York to report. As Wikipedia neared its quarter century, I wanted to investigate whether the website can survive myriad challenges from regulators, AI, the far right and Elon Musk. But the incident spoke to the febrile atmosphere in the US and beyond. As Wales himself points out in his new book The Seven Rules of Trust: A Blueprint For Building Things That Last, one in five Americans believe they may have to resort to violence to get their country back on track. The internet has made it feel like each of our tribes inhabits different, irreconcilable realities. And yet somehow, on Wikipedia, people manage to reach a consensus every day. How did that happen?
Before Wikipedia’s birth in 2001 there was Nupedia, and before Nupedia there was Bomis, a male-oriented web portal co-founded by Wales, a former options trader. Bomis was set up as an alternative to Yahoo, an index for the early internet, but it became a favourite for those seeking soft-porn content. In 2000, Wales founded Nupedia, an online encyclopedia, with funding from Bomis, and hired its first employee, the philosophy PhD student Larry Sanger, whom Wales knew from an Ayn Rand listserv he used to run. The idea was that it would be written by volunteers.
Nupedia was not a success. Volunteers had to submit a CV to prove they were experts on their topic and follow a seven-step process. A year in, Nupedia had just over 20 articles. Wales and Sanger decided to pivot, launching a “wiki” of Nupedia, a type of website that allows multiple users to create, edit and hyperlink pages. The expert volunteers weren’t keen, so the pair created a separate site called Wikipedia. This time people showed interest. By the end of the year, it had about 18,000 articles.
From the start, Wikipedia called for volunteers to use a neutral point of view, or NPOV. Wales and Sanger were not prepared for its vertiginous growth, though, and there were few other guidelines. Sanger, who oversaw the site’s day-to-day operations, lost the faith of many volunteers for seemingly overstepping his authority. When Bomis was hit hard by the dotcom bubble burst, Wales told Sanger that he was cutting staff and let him go. Wales decided the website didn’t need a manager. He became its “God-King”, as some Wikipedians ironically dubbed him, wading into debates only when strictly necessary, and ultimately becoming a sort of constitutional monarch.
In his book, Wales likens Wikipedia’s development to the four-century evolution of Britain towards an elected parliament with a head of state whose roles are largely ceremonial. (“So you’re the King Charles of Wikipedia?” I asked him when we met in November. “Well, I hope not. I go more for Elizabeth II.”) From Wikipedia’s start, Wales “deliberately decided I didn’t want to be the dictator of all human knowledge,” he told me. “That sounds terrible. I am not the boss of anybody, but I do think I can play a part and a role in reminding and coaching and cajoling.”
In its early years, Wikipedia was mocked for being unreliable. Unlike Nupedia, or Encyclopedia Britannica, which went online at around the same time, you didn’t need to be an expert to write or edit an article. Without even logging in, you could edit a page. As Wikipedia grew, it developed a rigorous system for what an article needed to contain. Debates in “talk” pages improved the quality, particularly on contentious issues like, say, abortion. From early on, a “talk” tab at the top of any Wikipedia article allowed readers to see the discussion between editors of the page about everything from the sources used to whether the language was objective, or what the title should be.
Today Wikipedia is, for many, a beacon of what the internet once promised to be: a democratising vehicle for information, bestowing free knowledge on one and all. If you Google something, the top result has long been a Wikipedia entry. Now, as people increasingly use AI tools like ChatGPT, the results they see are in no small part based on Wikipedia; today’s large language models have been trained on Wikipedia’s millions of articles. This has led to a decline in eyeballs on Wikipedia, Iskander tells me. Some longtime Wikipedians privately also worry about declining editor numbers. For now, though, Wikipedia remains in the top 10 most viewed websites, while eschewing the business model of the other top platforms: Google, Facebook, Instagram, YouTube and Amazon. This crowdsourced, non-profit website has become the largest compendium of human knowledge ever created. That doesn’t mean it will survive the most challenging moment in its history.
Not long after Elon Musk celebrated Donald Trump’s inauguration at the Capital One arena on January 20 2025, a revision appeared on the tech billionaire’s Wikipedia page. A user called PickleG13 added: “At a rally following the second inauguration of Donald Trump, Musk appeared to perform a Nazi salute.” Behind the scenes, PickleG13 acknowledged in a note to fellow editors that “this controversy will be debated”, but as news reports were already saying “Musk may have performed a Hitler salute,” PickleG13 felt the revision was warranted.
Two minutes later another user undid the revision. Debate ensued as to whether the incident was important enough to merit inclusion. A user called Johnny Rose 11 added a longer edit, writing that Musk made a gesture “widely viewed as being a Nazi salute”, only for another user to expand the paragraph with comment from the Anti-Defamation League claiming it was an “awkward” gesture and not antisemitic. Over the next few days, dozens of edits were made. Eventually a gif of Musk’s gesture was added to the article and a separate page created for the incident, with several thousand words spent in the talk page on what the article should be called. This was Wikipedia’s global wired knowledge machine in all its messy glory.
The day after the inauguration, Musk reposted a screenshot of the addition to his Wikipedia page, attacking the website for being “an extension of legacy media propaganda!” He also reposted a tweet about the site’s supposed leftwing bias, writing: “Defund Wikipedia until balance is restored!” Jimmy Wales replied to Musk’s tweets: “Is there anything you consider inaccurate in that description?” The Wikipedia entry wasn’t “MSM propaganda”, he insisted.
The two men have a history of sparring online. Back in 2022, before Musk’s acquisition of Twitter, Wales responded to a Musk post about Wikipedia losing its objectivity: “Reading too much Twitter nonsense is making you stupid,” he wrote. Last year, Musk launched Grokipedia, an AI-powered rival to Wikipedia, though early signs are that it’s awash with far-right disinformation and not a realistic alternative.
As both Wikimedia Foundation leaders like Iskander and ordinary Wikipedians put it to me, it’s not just their website that is under attack, but the entire information ecosystem: media outlets, universities, even libraries. The Heritage Foundation, which produced the Project 2025 plan for Trump’s second term, even mooted identifying and targeting Wikipedia editors it disagrees with using facial recognition. And beyond the US, the Wikimedia Foundation is currently in a legal battle in Indian courts over defamation, primarily linked to articles about the country’s largest newswire, Asian News International. The articles noted instances of anti-Muslim bias and a history of pro-government propaganda on ANI, which led Indian courts to demand that Wikimedia hand over the data of editors. Wikipedia was blocked in Turkey for almost three years, and remains blocked in China.
A global trend towards regulating online content risks making Wikimedia Foundation’s work untenable. Laws such as the EU’s Digital Services Act and the Online Safety Act in the UK fail to differentiate the website from for-profit platforms, Wikipedians claim. Where governments compel platforms to take more responsibility for content posted by users, they could force Wikipedia to go dark in their countries.
Wikipedians often use pseudonyms to protect their identity. The reasons vary, but editors have been harassed and, in some countries, prosecuted. As for the user who goes by ScottishFinnishRadish, it’s best he stays anonymous due to the sensitive nature of his work — as well as for his safety. The gunman’s interruption at WikiCon was a protest against ScottishFinnishRadish, who had blocked him eight months earlier (in a public statement, the latter would later criticise the Wikimedia Foundation for, in his view, failing to keep attendees safe). He’s not your typical Wikipedian. New England born and bred, he’s a big guy with a shaved head, beard, khaki cargo pants, heavy boots. He’s a gun owner, hunter and homesteader. He raises his own meat. “I call myself a small ‘l’ liberal,” he told me on day two of WikiCon. In his worldview, “Gays, good; trans, good; guns, good.” People are entitled to their rights, and corporations aren’t people, he says, deep-voiced and jovial. “But on the Wikipedia spectrum, that puts me way over to the right.”
ScottishFinnishRadish has spent much of his years of editing trying to address problems with articles on right-wing figures. “Most of the editors are liberal,” he doesn’t mind saying. “People are biased, and even if you’re aware of your own biases, it still will come out, especially in aggregate, over, you know, 1,000 people working on something.” Is an article on a particular political figure correct in calling them “far right”? Even if ScottishFinnishRadish personally believes the categorisation to be accurate, Wikipedia isn’t about personal feelings. Reliable sources must back up every assertion. If an opinion piece from the progressive magazine Mother Jones is your only source for someone being “far right”, that’s not going to cut it, he said.
Adherence to Wikipedia’s foundational neutrality was put to the test on September 10 last year with the shooting of Charlie Kirk, the right-wing activist and co-founder of Maga-aligned student organisation Turning Point USA. Kirk’s death led droves of people to his Wikipedia page, many trying to edit it. Right-wingers complained that it was biased — for highlighting that he had called Martin Luther King “awful”, perpetuated the Great Replacement theory, or repeatedly made offensive statements about Black people — while left-wingers turned up to disparage him.
The beauty of Wikipedia is that, in theory, anyone can edit anything without even having to log in. The downside is when there’s an avalanche of bad actors on a top-viewed page. As the debate raged in the days after the shooting and people kept trying to interfere with the article, ScottishFinnishRadish, who is a member of the website’s arbitration committee, twice changed the settings on Kirk’s page. First, he locked changes to all but administrators for 48 hours. Then he permitted edits only by “extended confirmed” users — that is, users who have had an account for 30 days and made more than 500 edits — due to “persistent vandalism” by autoconfirmed accounts.
It soon became apparent that Kirk’s shooting merited its own page, which itself attracted controversy. Initially entitled “Killing of Charlie Kirk”, it drew objections from some who thought it should instead be “murder”, as on the page “Murder of George Floyd”. After thousands of words of debate on the article’s talk page, it ended up being called “Assassination of Charlie Kirk”. In ScottishFinnishRadish’s personal view, this was inaccurate; reliable sources had so far avoided using the term, and given that the suspected assailant hasn’t been convicted yet, “assassination” seemed premature. “But it’s one of those situations where it’s not worth the argument,” he told me with a good-natured grin.
Kirk’s main page, which would become Wikipedia’s most-read article of 2025 with nearly 45 million views, was always going to be the focus of tensions. “We often jokingly say, ‘If it’s a problem in the real world, it’s probably a challenge on Wikipedia too,’” said Anne Clin, a longtime editor and administrator who goes by the username Risker. Over the years, the website has developed robust processes for limiting the impact of excessive zeal one way or the other, she said.
“A lot of energy and effort was taken to guide people and to make sure that bad things didn’t wind up in the article that were inappropriate,” Clin told me in a video call from Canada, where she lives. “I mean, we are talking about a human being who we have to take care of, not allowing somebody to harm their reputation unnecessarily . . . It was a horrendous, incredibly traumatic event for anybody who was in that crowd. Horrendous tragedy for the family.”
Despite the furore over Kirk’s articles, they ultimately demonstrated how well Wikipedia works. People with very different viewpoints managed to come together and agree on a middle ground. In 2019, researchers studying Wikipedia’s talk pages published a paper entitled “The Wisdom of Polarized Crowds” in the journal Nature Human Behaviour. They found that politically diverse groups of editors “create articles of higher quality than politically homogeneous teams”.
Wikipedia’s critics don’t see it that way. In October, Republican senator Ted Cruz sent a letter to Maryana Iskander about “ideological bias” on the website in his capacity as chairman of the commerce, science and transportation committee. This followed a letter in August from the House of Representatives’ Oversight Committee, which asked for “records” of Wikipedia’s arbitration committee in relation to studies and reports of “systematic efforts” to advance antisemitic and anti-Israel information on Wikipedia, as well as hostile nation-state actors manipulating the site to expose readers to pro-Kremlin and anti-Western propaganda. Even Sanger, questionably addressed as a Wikipedia “co-founder”, attacked the website for being biased against conservatives in an interview on Fox News. When I asked Iskander about this, she replied that the Foundation needs to do a much better job of explaining Wikipedia’s inner processes to the outside world.
Wikipedia’s most contentious topic area by far is Israel and Palestine. ScottishFinnishRadish had been “patrolling”, in Wikipedia jargon, the topic for more than a year before being elected in early 2025 to the arbitration committee, a kind of judicial body made up of trusted volunteers endowed with advanced permissions to protect the website. By then he had already pored over millions of words of discussion on talk pages and placed dozens of sanctions on users who had gone against Wikipedia’s rules. While his wife watched horror films, he would be on his phone, thinking to himself: “Yep, got to keep track of all this.”
The aftermath of Hamas’s October 7 attack on Israel made the work of administrators harder. “People would show up and say, you know, Israelis are evil and they deserve that, or Palestinians are evil and deserve that,” he recalled. He would immediately block such users. From the start, the word “genocide” was used by both sides.
ScottishFinnishRadish soon realised he needed to go to the arbitration committee to request they tighten rules and require that only “extended confirmed” users take part in page discussions. This went against Wikipedia’s policy of being open to all, but if nothing changed, he said, every discussion would be “95 per cent people who don’t know how Wikipedia works, 3 per cent people trying to explain to them how Wikipedia works. And 2 per cent Wikipedians arguing [in good faith] and getting lost in the noise.”
As with any other topic on the site, the ultimate question is whether you’re there to build an encyclopedia, or to right wrongs. If you’ve come to Wikipedia to push your point of view, you’ve come to the wrong place, said ScottishFinnishRadish. With a sensitive topic, if the person isn’t contributing to healthy discussion and — crucially — consensus building, they are barred from the topic. They can continue editing, just not in that topic area.
As people crowded in to claim that this or that was wrong, ScottishFinnishRadish tried to remind himself that Wikipedia had a convoluted, complicated system that newcomers inevitably struggled to understand, and that tensions were high. Few realise that there are literal novels’ worth of text behind articles. For Wikipedians, discussions are measured in “tomats”, the length of Hemingway’s The Old Man and the Sea, which is 26,500 words. Sometimes a single sentence can take three tomats to resolve in the talk pages.
One of the pages in the Israel-Palestine topic that attracted the most heat was one entitled “Gaza genocide”. And into this debate, founder Wales himself would eventually wade.
There was a dog on the tracks near Kilburn, west London, causing my Tube train to terminate early. I was trying to reach The Arts Club, a distinguished members’ club in Mayfair where I was due to meet Jimmy Wales. My lateness only added to the trepidation I already felt. The previous week, a video of Wales storming out of an interview after one minute went viral on X; the interviewer had kept insisting he was the “co-founder” rather than founder of Wikipedia, alluding to Sanger’s role (which Wales freely discusses in his book).
When I arrived, Wales was gracious and amiable. It was almost December, a few weeks since WikiCon in New York, and we were seated in the garden with heaters and fairy lights, surrounded by smartly dressed people. Wales, 59, who lives in London with his wife Kate Garvey, was wearing skinny black jeans, white trainers and a colourful shirt of red Paisley teardrops and floral patterns. (His membership of the Arts Club is honorary; unlike most of his early internet-era peers, Wales is far from a billionaire.)
He had an unassuming air that was rather endearing. At times he leaned forward and grew more animated, gesturing with one hand while the other remained in his lap, but he was generally mildly expressive and softly spoken. Flashing keen-eyed smiles from behind black specs, he tended to go off on tangents. His circuitous way of speaking reminded me of Wikipedia itself, as though he was amending things in real time, discarding, ending up somewhere unexpected, circling back. It made it a little difficult to talk about the site’s “Gaza genocide” article.
A few weeks before our meeting, Wales posted to the article’s talk page criticising the lede and overall presentation of the article for stating, in Wikipedia’s voice, that Israel was committing genocide. It was a “violation” of the website’s neutral point of view, he wrote, which “requires immediate correction”. Al Jazeera mistakenly reported (and later corrected) that Wales himself had locked editing on the page. The report seemed to misunderstand how Wikipedia really works. Whatever his personal feelings about Israel and Palestine, even Wales, the website’s founder, couldn’t force the wording to be changed: revisions could only be made through painstaking discussion by Wikipedia’s editors. An administrator had instead restricted the page to longtime “extended confirmed” editors on October 28 2025. The article remains restricted, with debate ongoing among users over issues with its content.
When I asked him about the page, Wales said its problems were a sign he needed to use his role more to emphasise Wikipedia’s neutrality. “I think that’s particularly true at a time where we’re being called ‘Wokipedia’,” he said. “And I’m really keen that we double down on neutrality in these times, because it’s part of what is so valuable and so trusted about Wikipedia, which is to say it doesn’t matter what your political views are, you can turn to Wikipedia and get a pretty straight thing.”
As he spoke, I was thinking about Wales’s self-avowed constitutional monarch role and how he determines when it’s suitable for him to intervene. Drawing him back to the question about the “Gaza genocide” article, I asked what exactly he saw his role as being when he got involved. “I just raised the question,” he replied. “I’m like, ‘This is not OK,’ right?” Recently Wales has been leading a “neutral point of view” working group with Wikimedia Foundation’s research team and Wikipedia community representatives to improve understanding and support for NPOV across a wide range of cases. It’s a conversation he thinks Wikipedians need to have: “If people feel like we’ve decided that we want to take sides on issues, it’s going to be a big problem in the long run,” Wales said, adding, “Nothing magically changes overnight, but I think we’ll get there. I’m always optimistic.”
The debate on the talk page will, indeed, continue for months. This is not one man dictating to his underlings. The bottom-up way Wikipedia works makes that largely impossible. Unlike other tech platforms, this is one built on incremental steps, even at the operational level. Wikimedia chief product and technology officer Selena Deckelmann told me that, due to extended consultation with the community, Wikipedia’s first new user interface in 10 years, which was given the name Vector 2022, couldn’t actually launch until 2023. Although a universal code of conduct was created in 2020, in-depth discussions meant that enforcement guidelines took more than two years to be negotiated and approved. It all reminded me of the Ents, the ancient tree people in The Lord of the Rings, taking forever to debate whether to go to war. But this laborious emphasis on process is what makes Wikipedia robust.
That doesn’t mean there aren’t risks, as we saw at WikiCon. With growing political violence in the US and elsewhere, Wales worries that people in positions of influence aren’t doing more to try to calm the rhetoric. Wikipedia is very important to a lot of people, firing up strong emotions just like the rest of the information ecosystem, but the only way we can move forward, Wales said, is to seek compromise. “There’s really big issues facing the world, and it’s not helping if we’re just at each other’s throats, and it definitely isn’t helping if there’s violence.”
At one point during WikiCon, longtime Wikipedian Rosie Stephenson-Goodknight told me that someone texted her after the incident with the gunman to say: “We’re just trying to write an encyclopedia!” She replied: “Amen.” The texter was Jimmy Wales.