Wikipedia has long been celebrated for its stated mission of providing open, unbiased information to anyone with Internet access. Central to that purpose is the site’s neutral point of view (NPOV) policy, which requires articles to be “fairly” and “proportionately” written, without “editorial bias.” My new computational analysis of Wikipedia’s content, however, found that this worthy ideal is not always realized in practice.

My study examined the average sentiment—positive, negative, or neutral—associated with 1,628 politically charged terms in English Wikipedia articles. By analyzing how the site's articles describe hundreds of politicians, journalists, and other public figures, I sought to determine whether Wikipedia's coverage varies with a subject's political orientation.
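
For readers curious about the mechanics, the sketch below illustrates one way such a sentiment analysis could be set up. It uses NLTK's off-the-shelf VADER scorer and toy data; the library choice, the figure-to-mentions mapping, and the example sentences are illustrative assumptions rather than the exact pipeline used in my study.

```python
# Illustrative sketch: average sentiment of sentences mentioning each public figure.
# Assumes a lexicon-based scorer (NLTK's VADER) and a pre-built mapping of
# figure -> list of Wikipedia sentences; neither reflects the study's exact method.
from statistics import mean

from nltk.sentiment import SentimentIntensityAnalyzer  # requires: pip install nltk
                                                        # and nltk.download('vader_lexicon')

def average_sentiment(mentions_by_figure: dict[str, list[str]]) -> dict[str, float]:
    """Return each figure's mean VADER compound score across their mentions."""
    sia = SentimentIntensityAnalyzer()
    scores = {}
    for figure, sentences in mentions_by_figure.items():
        compounds = [sia.polarity_scores(s)["compound"] for s in sentences]
        scores[figure] = mean(compounds) if compounds else 0.0
    return scores

# Toy usage (illustrative only):
mentions = {
    "Senator A": ["Senator A championed a widely praised reform."],
    "Senator B": ["Senator B was criticized for a controversial vote."],
}
print(average_sentiment(mentions))
```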

My analysis found that Wikipedia was more likely to portray right-leaning figures negatively than their left-leaning counterparts. This “sentiment bias” was apparent across groups, including United States presidents, senators, representatives, governors, Supreme Court justices, and journalists. Notably, the disparity was not universal; I did not find significant sentiment bias, for example, in the site’s descriptions of U.K. Members of Parliament.

Still, these findings are concerning, particularly given the possibility of Wikipedia’s biases permeating endeavors beyond the site itself. Since the online encyclopedia’s content is used to train large language models (LLMs), which drive many cutting-edge AI systems such as ChatGPT, Wikipedia’s biases could influence AI-generated content. In fact, my analysis revealed preliminary evidence that this might already be happening: the sentiment associations in Wikipedia content and those in OpenAI’s word embeddings—a component of AI models that represents words as numerical vectors to capture their meanings and relationships—showed a slight similarity in political bias.
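
A comparison of that kind can be sketched very simply: once a sentiment score has been derived for each figure from Wikipedia text and, separately, from an embedding model, the two sets of scores can be correlated. The snippet below is a minimal, assumed version of such a check; the toy numbers and the choice of Spearman correlation are mine, not the study's.

```python
# Illustrative sketch: correlate Wikipedia-derived sentiment scores with
# embedding-derived scores for the same set of figures. The scores below are
# toy values; the correlation measure is an assumption, not the study's method.
import numpy as np
from scipy.stats import spearmanr

# Per-figure scores in the same order in both arrays (illustrative only).
wikipedia_scores = np.array([0.21, -0.34, 0.05, -0.12])  # e.g., mean article sentiment
embedding_scores = np.array([0.18, -0.25, 0.10, -0.05])  # e.g., sentiment signal from embeddings

rho, p_value = spearmanr(wikipedia_scores, embedding_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```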

As a supporter of Wikipedia’s mission, I view these findings as a call to refine, rather than undermine, the platform. While Wikipedia’s diverse and decentralized collection of volunteer editors is essential to its democratic function, such a network runs the risk of political bias. To mitigate that threat, the site could offer its editors better tools and training to identify deviations from the NPOV policy. Collaborative adversarial features, similar to X’s “Community Notes,” or AI-assisted bias-detection tools could also help identify and correct potentially biased content. The site could enhance public trust by making the article review and editing process more transparent, especially around politically sensitive topics.

These findings ought to inspire Wikipedia to live up to its principle of neutrality. The site has made an immense contribution to the goal of open knowledge, and its value as an educational resource is unparalleled. But if the platform is to maintain and enhance its credibility, the Wikimedia Foundation’s board of trustees and executive team, and Wikipedia’s editors and administrators, must recognize and rectify bias, ensure diverse perspectives are represented, and adhere strictly to the NPOV guidelines. Otherwise, the site risks falling short of its noble ideals.
