• Beej Jorgensen@lemmy.sdf.org · +1 / −2 · 1 hour ago

    I often wonder about the stuff I write, what becomes of it. It’s a little disheartening since I love crafting it for best effect… But especially with computer books for beginners, people prefer to ask AI for the answers instead of studying.

    I also just bought 6 sci-fi books from an author I’d never heard of for cheap. I love supporting indie authors, the price was right, and they sold their books directly from their website, no middlemen and no DEI. Perfect.

    But was the author real? I actually did a bunch of research to find out their history and all that before pulling the trigger. I really don’t want to read AI stories. But I can see a future where the vast majority don’t care. Imagine an endless episode of Survivor or a soap opera, completely generated 24x7 forever. You know that shit would be massive.

    And there might only be a fringe that seeks human-generated content for the humanity of it.

  • SonOfAntenora@lemmy.world · +20 / −1 · 9 hours ago

    …AI companies than blog and social-media posts. (Ziff Davis is suing OpenAI for training on its articles without paying a licensing fee.) Researchers at Microsoft have also written publicly about “the importance of high-quality data” and have suggested that textbook-style content may be particularly desirable.

    If they want quality data, then don’t kill the people producing it. Secondly, if they want us as gig workers providing content for AI, they shouldn’t act surprised when people start feeding it gibberish. It’s already happening; LLMs are hallucinating a whole lot more than the earliest GPT-3 models did. That means something, they just haven’t thought about it long enough. If a reasoning model gets things wrong 30 to 50% of the time, with peaks of a 75% bullshit rate, it’s worthless. Killing good journalism for this is so dumb.

    • Phoenixz@lemmy.ca · +4 · 2 hours ago

      If you want quality data, then don’t kill them

      That is like telling cancer that if it wants to live it shouldn’t kill the host.

      You’re asking a lot from people without the ability to think about anything other than themselves.

    • FaceDeer@fedia.io · +1 · 2 hours ago

      Interestingly, I’m not seeing your quoted content when I look at this article. I see a three-paragraph-long article that says in a nutshell “people don’t visit source sites as much now that AI summarizes the contents for them.” (Ironic that I am manually summarizing it like that).

      Perhaps it’s some kind of paywall blocking me from seeing the rest? I don’t see any popup telling me that, but I’ve got a lot of adblockers that might be stopping that from appearing. I’m not going to disable adblockers just to see whether this is paywalled, given how incredibly intrusive and annoying ads are these days.

      Gee, I wonder why people prefer AI.

    • floofloof@lemmy.ca (OP) · +4 · edited · 4 hours ago

      If it gets things wrong often enough, people will stop using it. So it would be in the interest of AI companies to pay for good sources of data.

      Or at least you’d hope so. In actual fact they’ll be thinking: let’s keep stealing, because most people don’t know or care whether what the AI says is true. Besides, they can make money by turning it into a tool for disseminating the views of those who can pay the most.