New Platform Relies on Wikipedia Content Despite Criticism
Despite positioning itself as a superior alternative, the new AI encyclopedia relies heavily on Wikipedia content, with many articles appearing nearly identical to their Wikipedia counterparts. This dependency has sparked discussion about the irony of criticizing Wikipedia while simultaneously using it as a primary content source.
Content Copying Revealed
Investigations by technology journalists have revealed extensive reliance on Wikipedia content across the AI-powered platform. Disclaimers on some pages acknowledge that content is "adapted from Wikipedia, licensed under Creative Commons Attribution-ShareAlike 4.0 License," but the extent of this dependency contradicts claims of original AI-generated content.
Dataconomy reported that the platform's "AI-verified pages show little change from Wikipedia," with examination revealing "direct reliance on Wikipedia material rather than original development." Articles on topics including PlayStation 5, Lamborghini, and AMD chipsets were found to be "near-identical copies of the corresponding Wikipedia entries."
According to Heise Online, while the platform "runs under version number 0.1 and currently comprises approximately 885,000 English-language articles," many of these are "complete copies of their Wikipedia variants."
The revelation is particularly striking given the platform's marketing emphasis on AI-generated content that would "exceed Wikipedia by several orders of magnitude in breadth, depth and accuracy." Instead, much of the content appears to be Wikipedia articles with minimal or no modifications.
Wikimedia Foundation's Response
The Wikimedia Foundation, which oversees Wikipedia, issued a measured response emphasizing the foundational role of human-created knowledge in AI systems. Its statement highlighted a fundamental irony in the situation: an AI platform criticizing Wikipedia while simultaneously depending on it for content.
"Wikipedia's knowledge is—and always will be—human," the Foundation stated. "This human-created knowledge is what AI companies rely on to generate content; even [AI encyclopedias need] Wikipedia to exist."
The Foundation's response underscored that while AI can process and reformat existing information, it fundamentally relies on human-curated knowledge as its source material. This dependency persists even when AI systems are marketed as replacements for human-edited platforms.
Community reaction within Wikipedia has been mixed: some editors express frustration at the platform's appropriation of volunteer work, while others view it as validation of Wikipedia's value. The Creative Commons license permits such reuse, but the situation raises questions about proper attribution and the value added by AI processing.
The Paradox of Criticism
The situation presents a notable paradox: a platform launched explicitly to provide an alternative to Wikipedia turns out to be largely built on Wikipedia content. This raises fundamental questions about what "AI-generated" actually means when the underlying content is human-written.
Critics have pointed out that simply running Wikipedia articles through an AI system for minor reformatting or "fact-checking" doesn't constitute creating original content. The platform appears to be more of a Wikipedia mirror with AI-powered modifications than a genuinely independent encyclopedia.
Technology analysts have questioned the new platform's value proposition. If users are essentially reading Wikipedia content, what benefit does the AI layer provide? The accuracy concerns identified in recent days suggest the AI processing may actually decrease rather than improve content quality.
As Plagiarism Today noted in their analysis titled "How Not to Make an Encyclopedia," the approach represents "a weaker approach to citation and heavier reliance on AI" compared to Wikipedia's transparent sourcing and edit history.
Legal and Ethical Considerations
From a legal standpoint, the reuse of Wikipedia content appears to comply with Creative Commons licensing requirements. Wikipedia content is explicitly released under CC BY-SA 4.0, which permits commercial reuse and modification as long as proper attribution is provided and derivative works carry the same license.
However, several concerns arise around implementation:
- Inconsistent attribution: Some pages include proper Wikipedia attribution while others do not
- Unclear modifications: Users cannot easily identify what has changed from the Wikipedia source
- Credit to contributors: Wikipedia's thousands of volunteer editors receive no recognition for their work
- Misleading marketing: Promotional materials suggest original AI content when much is adapted from Wikipedia
The ethical questions extend beyond legal compliance. Is it appropriate to build a platform that criticizes Wikipedia while depending on Wikipedia for its content? Does AI reformatting add enough value to justify positioning the result as a superior alternative?
These questions become particularly relevant given that Wikipedia is created by unpaid volunteers who freely share their work. The new platform monetizes this volunteer labor through its connection to a for-profit company, raising questions about appropriate ways to build upon open knowledge resources.
As the encyclopedia landscape evolves with AI technology, the relationship between human-created and AI-processed knowledge will require clearer definition. This launch suggests that AI may be better suited to supplementing rather than replacing collaborative human knowledge creation.