That’s why I specify that everything should be verified, in a later comment. My point is that LLMs, when properly guided, are better than other automatic translation services, and hallucination can easily be avoided with proper prompting.
Also worth mentioning that there are already massive differences in user-generated translations; some are well-meaning, while others, as in Israel’s case, aren’t.
I translate a lot of stuff for my work, and I don’t have any problems when I instruct it properly. I’m also there to verify. I never have to deal with hallucinations; mostly I just change a word or two because I don’t like how it sounds (it uses overly complex words at times).
This is more about certain users being shit and either not checking their work or doing work they have no business doing. They would exist no matter what tools they use; it’s not the tool’s fault.
Tbh, I work in research and we would never use Wikipedia for anything. We can’t cite it, and any time I find a good tidbit on it and try to find the source, I usually get a dead link or something altogether false that doesn’t represent what the user wrote. It probably depends heavily on the subject, but the sourcing isn’t very rigorous.
Bless them though; it’s an amazing site and they’re still doing a stellar job considering how big it is.

It’s talked about because Israel’s genocide is being done with our support and our weapons, and that support is clearly because of bribes and blackmail.