The source of this article and its featured image is TechCrunch. The description and key facts are generated by the Codevision AI system.
Wikipedia has proposed a strategy to keep its website viable in the AI era: it is encouraging companies to use its paid API rather than scraping its content. The Wikimedia Foundation urged AI developers to attribute content properly and to access it through the Wikimedia Enterprise platform, an approach intended to reduce the strain on Wikipedia’s servers and support its nonprofit mission. The organization recently identified AI bots as a major source of its traffic and says this shift has contributed to a decline in human visitors. The post underscores the importance of transparency in information sourcing to maintain public trust in online content.
Key facts
- Wikipedia is urging AI companies to use its paid API instead of scraping its content.
- The Wikimedia Enterprise platform allows companies to access Wikipedia’s content at scale without overburdening its servers.
- Wikipedia recently discovered that AI bots were responsible for a significant portion of its traffic.
- The organization claims that human page views have declined by 8% year-over-year.
- Wikipedia emphasizes the importance of attribution to maintain trust in online information.
