Wikipedia bans AI-generated content in its online encyclopedia

Brajeshwar 76 points 18 comments March 28, 2026
www.theguardian.com · View on Hacker News

Discussion Highlights (11 comments)

longislandguido

Will this open the door to editors' deletion of any content they dislike, under the guise that it might (or might not) be AI generated? Can't wait for the 80 page Talk threads.

ChrisArchitect

Source: https://en.wikipedia.org/wiki/Wikipedia:Writing_articles_wit...

slyall

This policy has been shared a lot by the anti-AI crowd over the last week, who are celebrating it as a major site saying no to AI. It seems a smaller "win" than most think: it only discourages wholesale rewriting and the creation of new articles using AI. Assistance with editing is explicitly allowed.

Eufrat

They seemed open to giving it a try if they were actively involved in the experiment. Instead, it feels like a lot of people don't really understand how Wikipedia is managed and thought they could use it as a freeform place to gain credibility or just test their pet projects. Like this attempt†, where the bot lectured users who were hostile toward it before eventually being banned. † https://en.wikipedia.org/wiki/User_talk:TomWikiAssist

swingboy

Over the past few months I've contributed a fair amount of primarily AI-generated content, which I mainly just edit for the usual AI tropes, and it's pretty much all still up.

jjmarr

This is the traditional "innovator's dilemma," where a skilled profession facing an imperfect technological threat decides not to adopt it until it is too late. AI-generated articles are, on balance, inferior, except for people who want simple, low-quality content. But LLMs are moving up the value chain with Deep Research. They can give explanations tuned to a reader's knowledge and viewpoints and provide interactive content Wikipedia doesn't support. That is a killer app for math/science topics. Wikipedia will win against a generic corporate encyclopedia on neutrality/oversight, but it'll lose badly on UX, which is what matters. I think the tipping point will be direct integration of academic sources into ChatGPT/Claude/Gemini and a "WikiLink"-type way to discover interesting follow-up topics. I can't trust AI answers for serious historical or social science topics because of the neutrality/oversight problem. And generally my chat with AI ends once I get the answer I need, because I can't get rabbitholed into other topics.

rox_kd

Well on time, tbh. Or at least some sort of better moderation, because there have really been some unfortunate cases, imo.

cozzyd

If you want an AI encyclopedia, that already exists.

56745742597

Even AI slop is too factual for the self-proclaimed arbiters of truth.

rose-knuckle17

This is about as intelligent and practical as banning school kids in the 80s from using calculators, based on the logic that "you won't always have one with you".

hallole

I've encountered AI contributions on Wikipedia, and, although I wonder how they'll enforce such a rule, I think this is the proper stance to take. I think readers take for granted how concise Wikipedia's prose tends to be. AI, in comparison, seems built to ramble, being overly specific where it doesn't need to be and lacking specificity where it ought to have it. When you think about it, "what should go on a thing's Wikipedia page?" is an interesting question; the answer certainly isn't "anything and everything." AI just doesn't have a good sense for what belongs, I feel.
