The Right-Wing Attack on Wikipedia

Late last month, Elon Musk launched Grokipedia, an AI-generated encyclopedia with 855,279 articles, no human editors, and no way for users to request improvements beyond a suggestion box addressed to its eponymous chatbot author. The tech entrepreneur is eager, he has said, to “purge out the propaganda” that he argues afflicts Wikipedia, the venerable user-generated reference source. But some Grokipedia articles are near replicas of Wikipedia entries. Other articles in the new source seem conspicuously sanitized: The article about the U.S. government’s now-defunct foreign-aid agency fails to mention Musk, who boasted about his role in “feeding USAID into the wood chipper.”

The articles on Grokipedia are produced by Grok, Musk’s AI model, and they are roughly what you’d expect from replacing a dedicated community of human volunteer creators and editors with a chatbot. Grokipedia mistakes large-scale information retrieval for knowledge, and automation for neutrality. Yet Musk’s AI encyclopedia is also part of something broader: an escalating campaign to discredit Wikipedia and reshape what counts as a reliable source of basic information in the age of AI.

Whatever the potential flaws of a crowdsourced reference site, many users find Wikipedia more convenient, comprehensive, and reliable than any alternative. In a typical month, more than a billion people consult it. Over the past decade, Wikipedia has also become essential information infrastructure. It shapes what AI systems learn and what chatbots say. It’s used to provide context for YouTube videos, and it influences what AI-powered answer engines present as truth. Control what Wikipedia considers reliable, and you control what machines—and then people—learn about the world.

This is why Republicans in Congress have recently begun sending letters that accuse the nonprofit Wikimedia Foundation, which operates the encyclopedia, of ideological bias and demand the names of certain volunteer arbitrators who help address factual disagreements. It’s also why some of the most powerful people in the world are demanding “reforms” to Wikipedia—or launching their own copycats.

In sports, players who want more sympathetic treatment from game officials try to make them second-guess themselves, in some cases by loudly accusing them of making bad, or even biased, calls. This strategy is called “working the referees.” Politicians, particularly conservatives, have been using it against social-media companies for years. In 2016, Gizmodo published allegations by anonymous former Facebook contractors that editors of the social-media platform’s Trending Topics feature had been secretly “blacklisting” popular right-wing topics and domains while “injecting” mainstream news stories with less organic appeal. Material associated with Glenn Beck and Steven Crowder had allegedly been suppressed; mainstream coverage of the disappearance of a Malaysian airliner and the Charlie Hebdo attacks had been added. Bias allegations exploded across right-wing media and on Capitol Hill.

Facebook investigated and released an explainer of how Trending Topics worked: Editors could remove clickbait, hoaxes, or stories with insufficient sources—fake news, as it was once quaintly called, which credible studies would show disproportionately catered to conservative audiences. The company maintained that its editors had based their decisions on story validity and source reliability, not on their own political preferences.
Nonetheless, to avoid even the appearance of bias, and to placate angry critics, Facebook fired the humans who worked on Trending Topics, converting it to a fully algorithmic list—which quickly began amplifying conspiracy theories and untrustworthy outlets. The company let a useful feature be ref-worked into irrelevance.

From that point on, fear of appearing biased against conservatives shaped not only Facebook’s decisions about how to handle low-quality information on its platform, but other companies’ decisions as well. Once ref-working proved effective, Republican politicians began to accuse many social-media companies of anti-conservative bias, no matter how little evidence supported the claim.

When co-founders Jimmy Wales and Larry Sanger started Wikipedia in January 2001, the idea that it would become a front line in the war for reality a quarter century later would have been laughable. In a new book, The Seven Rules of Trust, Wales recounts a joke that the comedian Stephen Colbert made about Wikipedia in 2006: “Any user can change any entry and if enough other users agree with them, it becomes true,” Colbert said. This is better than reality, he went on. It’s “Wikiality.”

Colbert was being facetious, but Wikipedia does operate on the radical premise that people can collectively determine what’s true through reliable sourcing and methodical deliberation. Contributing is surprisingly easy: Go to a page, click into the editing window, and write. Registered accounts are optional; noncontroversial topics can be edited anonymously. Transparency is part of the ethos: When neutrality is disputed in a Wikipedia article, it is typically marked accordingly, right up at the top of the page. When an entry is thin on sources, a notice lets you know.

For controversial topics—abortion, the October 7 attacks—edits are limited to established users, to deter trolls from defacing pages. The Wikipedia community’s formal policies and guidelines emphasize collaboration and neutrality; an ideal entry should lay out multiple sides of a controversial issue. As Wales notes, however, volunteer writers and editors inevitably must make judgments about the reliability of information: “Clearly we don’t treat crackpot, random websites as being the equal of the New England Journal of Medicine, and that’s fine.” In fact, Wikipedia maintains detailed guidelines on reliable sources; there is a long list of sources with lengthy discussions of their suitability, and a top-level note that “context matters tremendously” when deciding which to use and when. (The Federalist, for example, is deemed suitable for attributed opinions but generally unreliable for facts.) Editing disputes that can’t be resolved through public discussions on a topic’s Talk page move through a series of community-deliberation mechanisms; allegations of more serious manipulation may go to the Arbitration Committee, which follows an elaborate public process for conducting investigations and making decisions.

Another of Wikipedia’s guiding principles is “assume good faith”—a courtesy its prominent critics are not extending. Musk and others have taken to calling it “Wokipedia.” Sanger, who has become an outspoken critic, argues that Wikipedia has adopted what he calls a “GASP” worldview—globalist, academic, secular, progressive.
To fix this, he has proposed reforms such as de-anonymizing arbitrators and others who hold power over the Wikipedia community, in the name of accountability, and abolishing the consensus model to allow parallel articles with declared viewpoints—separate “pro-life” and “pro-choice” entries for an abortion-related topic, perhaps. Arguing that partisan bias is what distinguishes the community’s acceptance of CNN and The Washington Post from its avoidance of right-wing outlets such as The Federalist and The Epoch Times, Sanger has also called on Wikipedia to eliminate what he calls “source blacklists,” and other conservatives have eagerly taken up that call.

Some of Sanger’s ideas reflect legitimate tensions in Wikipedia governance. Wikipedia’s source-assessment lists do treat some advocacy groups, such as GLAAD and the Anti-Defamation League, as reliable information sources on some issues. Any given article can be edited in ways that unfairly lionize, smear, or otherwise distort its subject. An investigation by the tech outlet Pirate Wires alleged that a ring of pro-Hamas editors had succeeded in reshaping articles to favor their point of view. However, the Arbitration Committee quickly responded by banning six of the offenders—seemingly an act of effective community correction. And nothing prevents right-leaning writers from contributing.

Musk, Sanger, and others have nonetheless advanced the argument that the site is systemically biased against conservatives, and that view has taken hold among Republicans in Congress.

In August, Representatives James Comer and Nancy Mace, both Republicans, sent a letter demanding answers about foreign influence on Wikipedia, asking whether hostile actors or “individuals at academic institutions subsidized by U.S. taxpayer dollars” were inserting bias into entries on politically charged topics. In a recent letter, Senator Ted Cruz demanded answers about the site’s “ideological bias,” source list, and policies for how editors are removed or banned. Like earlier ref-working campaigns against social-media platforms, the letter seems intended to push a private organization toward policies favoring the right.

Cruz’s letter helps explain the urgency of the campaign against Wikipedia. The site’s influence, he wrote, “extends even further in the age of artificial intelligence, as every major large language model has been trained on the platform. Wikipedia shapes what Americans read today and what technology will produce tomorrow.”

Musk’s Grokipedia may not be used to train large language models anytime soon, but it is an attempt to elbow Wikipedia out of its position of prominence. Theoretically, it can generate new articles far more quickly and thoroughly than Wikipedia’s volunteer writers and editors can, and it is not subject to Wikipedia’s elaborate process for adjudicating factual disagreements.

But this is also one of Grokipedia’s greatest weaknesses. The remarkably thorough article about me contains nonsense that conspiracy theorists entered into congressional proceedings—including claims that my former research team at Stanford Internet Observatory censored 22 million tweets during the 2020 presidential campaign. The article also hallucinates that we were involved in Twitter’s moderation of stories about Hunter Biden’s laptop. We weren’t, and the cited source did not even make that claim. And there’s no reliable way to correct such problems.
I reported these issues via the Suggest Edit tool included in Grokipedia’s user interface—so far, to no avail. On Wikipedia, I could appeal to an editor by dropping a note on a Talk page. But Musk’s version misses what gives Wikipedia authority: human consensus. (When I requested comment via the press account at xAI, the Musk-founded artificial-intelligence company that developed Grok, I received an automated response that said “Legacy Media Lies.”)

Musk’s X platform recognizes that human consensus can be helpful in Community Notes, its fact-checking feature. Like Wikipedia, Community Notes operates on the premise that legitimacy and trust ultimately come from people getting together to decide that an explanation is accurate, needed, and fair. Grokipedia abandons this entirely. It’s pure algorithmic output with no community, no transparency, and no clear process for dispute resolution.

The irony is striking: Even as Musk and his friends attack Wikipedia for supposed bias, he is building something far more opaque and unaccountable.