Wikipedia Reports Traffic Decline as A.I. Changes How Users Seek Information
It’s hard to imagine a world without Wikipedia, the online encyclopedia that for a quarter of a century has stood as one of the enduring staples of the internet. In recent months, however, the site’s foundation has been shaken by the rise of generative A.I., with Wikipedia attributing an 8 percent decline in human visitors this year to A.I. tools that answer questions directly rather than sending users to the site.
“These changes are not unexpected,” said Marshall Miller, senior director of product at the Wikimedia Foundation, in a recent blog post. “Search engines are increasingly using generative A.I. to provide answers directly to searchers rather than linking to sites like ours.”
Wikipedia first noticed the decline after, ironically, observing an unusually high number of seemingly human visitors this spring. The platform later discovered that the uptick was due to a surge in bots designed to evade detection. A subsequent revamp of its bot detection systems revealed that real human traffic over the past few months had fallen roughly 8 percent compared with the same period in 2024. “We believe that these declines reflect the impact of generative A.I. and social media on how people seek information,” said Miller.
One of the forces at play is Google’s AI Overviews feature, which was introduced last year to provide A.I.-generated summaries above search results, drawing information from across the web. According to a Pew Research Center report, the share of users likely to click on traditional search result links is halved when a Google AI Overview summary appears. Of 900 surveyed U.S. adults, only 1 percent clicked on a link within an AI Overview summary itself. Google, however, maintains that its A.I. integration has done little to reduce overall click volume to linked websites.
Content from Wikipedia—alongside YouTube and Reddit—dominates not only traditional Google search results but also the company’s new A.I. summaries. Together, the three websites accounted for roughly 15 percent of A.I. content and 17 percent of standard search content, according to Pew.
Wikipedia remains a prime target for data-hungry large language models (LLMs) that rely on vast troves of online information for training, often gathered through web-scraping bots. Nearly all LLMs train on Wikipedia datasets, according to Miller, who noted that the encyclopedia’s information is prioritized by search engines and social media platforms alike when they generate chatbot responses.
This means that even as site visits decline, Wikipedia’s content continues to be consumed. Still, fewer visitors pose an existential threat to the encyclopedia’s future, as the nonprofit relies on donations and volunteer editors to keep its vast trove of knowledge free and up to date.
How does Wikipedia use A.I.?
Wikipedia isn’t necessarily anti-A.I.—in fact, the encyclopedia has embraced A.I. to streamline tedious tasks for its editors. Its A.I. tools assist with workflows, giving volunteers more time for discussion and consensus building, while also automating translations of common topics and scaling up onboarding for new contributors.
To better manage A.I.’s risks, Wikipedia is now pursuing a range of new projects. These include developing a framework for content attribution and making it easier for volunteers to edit from mobile devices, said Miller. The foundation is also experimenting with new ways to reach younger audiences on platforms like TikTok and Roblox through mediums like videos, games and even A.I. chatbots.
“Twenty-five years since its creation, Wikipedia’s human knowledge is more valuable to the world than ever before,” said Miller. “As we call upon everyone to support our knowledge platforms in old and new ways, we are optimistic that Wikipedia will be here, ensuring the internet provides free, accurate, human knowledge, for generations to come.”
