• jungle@lemmy.world
    22 hours ago

    Everything an LLM outputs is hallucinated — that’s simply how the model works. Sometimes the hallucination happens to match reality, sometimes it doesn’t.