English Wikipedia now has a formal policy, WP:NEWLLM, prohibiting editors from using LLMs to generate or rewrite article content. The exceptions are narrow: light copyediting and translation assistance, with human review required in both cases.

German Wikipedia got there first. A community vote that closed February 15, 2026 produced a similar ban. Two major language editions independently arriving at the same answer suggests the problem is real. And WikiProject AI Cleanup predates both policies, which means volunteers had already organized a named effort to repair damage that was already on the wiki.

The catch is enforcement. Wikipedia runs on volunteers, and there's no automated AI detector. The community explicitly acknowledged that available AI detection tools are too unreliable to deploy: too many false positives, and too easy to beat with minor paraphrasing. What that leaves is a human reviewer trying to decide whether an edit "feels like" AI output.

That works sometimes. It probably doesn’t work when someone is actively trying to slip something past a reviewer.

The real problem isn't tone or style. LLMs confabulate citations. They generate references that look completely legitimate: proper formatting, real-sounding journal names, plausible page numbers. And they are entirely invented. Wikipedia's whole value proposition is a traceable chain from claim to source. AI breaks that chain in a way that's hard to catch and easy to miss.
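To make the detection problem concrete, here is a minimal sketch of the kind of automated triage a reviewer could run. This is illustrative only, not a tool Wikipedia uses: it merely extracts a DOI-shaped identifier from a citation string. A real checker would then try to resolve that DOI against a registry like Crossref, since invented references often carry plausible-looking identifiers that simply don't resolve.

```python
import re

# DOIs start with "10.", a 4-9 digit registrant code, a slash, then a suffix.
DOI_PATTERN = re.compile(r"10\.\d{4,9}/\S+")

def extract_doi(citation: str):
    """Return the first DOI-shaped string found in a citation, or None."""
    match = DOI_PATTERN.search(citation)
    return match.group(0) if match else None

def is_checkable(citation: str) -> bool:
    """True if the citation carries a DOI we could attempt to resolve.

    Note: a well-formed DOI is NOT evidence the reference is real; the
    next step would be an HTTP lookup against https://api.crossref.org.
    """
    return extract_doi(citation) is not None
```

A citation with no identifier at all isn't proof of fabrication, and a well-formed DOI isn't proof of legitimacy. The point is only that the formatting-level checks are cheap, while the existence check requires going to the source, which is exactly the step confabulated references are designed to make you skip.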

The principle is sound. Whether it’s actually enforceable is a genuinely open question.


Why Wikipedia’s AI ban is the right call, and why it may not matter as much as they hope.