AI Translations Are Adding ‘Hallucinations’ to Wikipedia Articles


Wikipedia editors have implemented new policies and restricted a number of contributors who were paid to use AI to translate existing Wikipedia articles into other languages, after discovering that these AI translations introduced “hallucinations,” or errors, into the resulting articles.

The new restrictions show how Wikipedia editors continue fighting to keep the flood of generative AI content across the internet from diminishing the reliability of the world’s largest repository of knowledge. The incident also reveals how even well-intentioned efforts to expand Wikipedia are prone to errors when they rely on generative AI, and how such errors are remedied by Wikipedia’s open governance model.

The issue in this case starts with an organization called the Open Knowledge Association (OKA), a non-profit organization dedicated to improving Wikipedia and other open platforms.

https://web.archive.org/web/20260307182752/https://www.404media.co/ai-translations-are-adding-hallucinations-to-wikipedia-articles/


33 Comments

…to the surprise of no one with a brain.

Doesn’t make the technology bad. We should just remember that its strengths and weaknesses are logically connected: it’s fuzzy logic over next-token probabilities in a few attention spaces, so there will always be artifacts.
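The point above can be sketched with a toy next-token sampler (purely illustrative; the vocabulary and probabilities are invented, not from any real model): because the model always emits *some* plausible continuation, a wrong completion reads exactly as fluently as a right one.

```python
import random

# Invented toy distribution: after this prompt, a model trained on noisy
# web text might rank the famous city above the correct capital.
next_token_probs = {
    "The capital of Australia is": {"Sydney": 0.6, "Canberra": 0.4},
}

def sample_next(context, rng):
    """Sample one continuation token from the context's distribution."""
    dist = next_token_probs[context]
    tokens = list(dist)
    weights = [dist[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(0)
samples = [sample_next("The capital of Australia is", rng) for _ in range(1000)]
# The sampler never refuses or flags uncertainty -- it just answers,
# and under this toy distribution most fluent answers are simply wrong.
wrong_rate = samples.count("Sydney") / len(samples)
```

Nothing here distinguishes a “hallucination” from a correct answer except the ground truth, which the sampler never consults.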



Upfront: Here’s the Administrators’ Noticeboard discussion.


Okay, this one apparently slipped under my radar, though it seems like they’re pretty small and only started in 2022. Here’s their 2025 report.

It seems like their limited focus is on using LLMs for interwiki translation; to what extent its paid editors are capable of that, I have no idea. We maintain a list of paid editing companies here (usually undisclosed against policy).

OKA asserts:

For example, articles in topics such as Science, technology, engineering, and Finance are lacking compared to topics such as History, Geography, and Humanities.

I have no idea how they reached this conclusion or how they think they’re qualified to translate anything given the random “totally not a Central European language” capitalization of words like that.

Per 404:

A job posting for a “Wikipedia Translator” from OKA offers $397 a month for working up to 40 hours per week. The job listing says translators are expected to publish “5-20 articles per week (depending on size).”

Twenty articles of any reasonable size could not be adequately vetted by one person even in an 84-hour work week, for context, and $397 is $9.90/hour at 40 hours a week. (edit: wait, sorry, I read that as $397 per *week*; $397 per *month* would be less than $2.50/hour. What the *fuck*.)
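For what it’s worth, the arithmetic checks out under either reading (a quick sketch; the ~4.33 weeks/month figure is just the usual 52/12 approximation):

```python
# Hourly rate implied by the OKA job posting, under two readings of "$397".
pay = 397.0                  # USD
hours_per_week = 40
weeks_per_month = 52 / 12    # ~4.33 weeks in an average month

per_week_reading = pay / hours_per_week                       # ~$9.93/hour
per_month_reading = pay / (hours_per_week * weeks_per_month)  # ~$2.29/hour
```

Either way it is far below any plausible rate for careful human vetting of 5–20 translated articles a week.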

Overall, before reading the discussion, the people at OKA seem like disruptive morons.


Edit: Into the discussion we go:

C’mon man, the training guide instructs translators to create multiple email accounts to get around LLM usage caps… — ExtantRotations

…yes, and? — 7804j [OKA founder]

Jesus christ. 🤦

Edit 2: 7804j just cannot stop themself from transparently using an LLM to participate in the discussion.

Edit 3: “we ensure they are above the minimum wage in the countries where the editors reside” oh my fucking god


Anything AI is adding hallucinations to everything. If you can even call it that, since the term implies some sort of consciousness or agency, which LLMs definitely don’t have.


Comments from other communities

And then they’ll train their models on the hallucinated content, causing even more hallucinations

Now I understand how people in old times believed in witches and dragons….



We wouldn’t say a human translator who did this was hallucinating.

AI translations are wrong. The product is defective.


Not Wikipedia too! Is nothing safe from degenerative AI?


Looks like paper libraries are gonna need protection.


I think it’s that thing where they do it on purpose so someone corrects it for free and then they steal it.

What do you mean steal it? The article makes it clear that the errors are likely the result of AI hallucinations that arose during translation and that accounts that have submitted multiple articles with these errors will be subject to additional review or banning from this program. It seems likely that underpaid people using AI to translate things aren’t paying as much attention to the output as Wikipedia expects.

underpaid people

Nobody is paid for editing Wikipedia, the economic incentive is nonexistent.

(Except for shills/manipulators.)

Edit: I should’ve read the article.

Read the article.


You should have read the article:

[…] Open Knowledge Association (OKA), a non-profit organization dedicated to improving Wikipedia and other open platforms.

“We do so by providing monthly stipends to full-time contributors and translators,” OKA’s site says. “We leverage AI (Large Language Models) to automate most of the work.”

Or rather, they are abusing Wikipedia to promote AI.





Maybe. Wikipedia editors are their own breed of obsessed.


“Somebody is wrong on the Internet.”



AI is literally ruining everything for absolutely no reason.

No one asked for this shit.

No one asked for an easier way to translate articles into other languages? I’d imagine that’s something a lot of us would like. Tbf, I’m bilingual tho

Fine. Take your hallucinated translation and read it yourself. Just don’t commit back to the official page and pollute the website.


No one asked for brain dead AI to fuck it up. Language doesn’t translate word for word. Phrases don’t translate natively. But AI will still just translate word for word instead of structures of phrases.

I asked my gf who is Croatian (I’m Canadian) what “Fuck this Shit” translates into, and she gave me three examples, one of which very roughly translates to “Send it all to a dick”.
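The word-for-word failure mode is easy to demonstrate with a toy pipeline (the two dictionaries below are invented for illustration, using a well-known French idiom rather than the Croatian example above):

```python
# Translating a French idiom two ways: word-by-word vs. as a whole phrase.
# "il pleut des cordes" idiomatically means "it's raining cats and dogs".
word_dict = {"il": "it", "pleut": "rains", "des": "some", "cordes": "ropes"}
phrase_dict = {"il pleut des cordes": "it's raining cats and dogs"}

def translate_word_for_word(sentence):
    """Look each word up independently -- grammatical, but nonsense."""
    return " ".join(word_dict[w] for w in sentence.split())

def translate_phrases_first(sentence):
    """Check the phrase table before falling back to word-by-word lookup."""
    return phrase_dict.get(sentence) or translate_word_for_word(sentence)

literal = translate_word_for_word("il pleut des cordes")    # "it rains some ropes"
idiomatic = translate_phrases_first("il pleut des cordes")  # "it's raining cats and dogs"
```

Real translation systems are far more sophisticated than either dictionary, but the gap between the two outputs is exactly the gap the commenters are arguing about.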

Language doesn’t translate word for word. Phrases don’t translate natively. But AI will still just translate word for word instead of structures of phrases.

[emphasis mine]

I expected to read “traditional translation tools” here!

Old Google Translate was terribly unnatural, but now…[0]

AI Killed My Job: Translators (Brian Merchant, bloodinthemachine.com, August 21, 2025): “Few industries have been hit by AI as hard as translation. Rates are plummeting. Work is drying up. Translators are considering abandoning the field, or bankruptcy. These are their stories. […] In July 2025, Microsoft researchers published a study that aimed to quantify the ‘AI applicability’ of various occupations. In other words, it was an attempt to calculate which jobs generative AI could best do. At the bottom of the list: translators and interpreters. The paper itself was strange (historians and paralegals took the second and third slots), but it underlined a talking point that’s been roundly discussed in the media: that translation work is uniquely vulnerable to AI. To wit: when we put out the call for AI Killed My Job stories, we heard from a lot of translators, interpreters, and video game localizers.”

The new stuff is buggy in a new way[1], but overall the disruption is helpful for many… except human translators (who I don’t see any job programs for!) :-/


[0] Ancient article

[1] “output [is] often…homogeneous, blind to local details, or flat-out wrong”


You seem bitter af about AI. lol. Do you get this angry when people use Google Translate as well? Or are you mostly upset that Wikipedia is having to implement regulations?

Did you miss that this is the fuck_ai Lemmy? Brain rot is fucking real.

Fair enough. lol. No questions allowed. Got it. Nice community y’all are fostering here

I was just saying this doesn’t seem like a malicious thing. More like fallout. Like hating the dairy industry because someone spilled milk

No, it would be like hating the dairy industry because the industry forces us to drink half rotten milk all day telling us it’s better than it was before.








Reality is now officially just a series of popular misconceptions

🌎🧑‍🚀🔫🧑‍🚀 Always has been.


Reality is now officially just a series of popular misconceptions

That is a misconception, but not (yet) a common enough one to be listed here: https://en.wikipedia.org/wiki/List_of_common_misconceptions



Deleted by author


They won’t poison old books and such, the internet didn’t keep any promises anyway.



Same issue as PCWorld.


Wait, people don’t double-check the automatic translations? Lazy fucks.


