Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • gkpy@feddit.org · 1 day ago (edited)

    Cheers for the explanation, had no idea that’s how it works.

    So it’s even worse than @danciestlobster@lemmy.zip thinks: the person creating the deepfake would have to have access to CP if they wanted to deepfake it!

    • swelter_spark@reddthat.com · 15 hours ago

      AI can generate images of things that don’t even exist. If it knows what porn looks like and what a child looks like, it can combine those concepts.

    • some_guy@lemmy.sdf.org · 17 hours ago

      There are adults with bodies that resemble underage people who could be used to train models. Kitty Yung has a body that would qualify. You don’t necessarily need to use illegal material for training to get illegal output.

    • Vinstaal0@feddit.nl · 20 hours ago

      You can probably do it with adult material and swap in the faces. It will most likely work best with models specifically trained on the person you selected.

      People have also put dot patterns on clothing to trick the brain into thinking someone is naked; you could probably fill those dots in with the corresponding body parts if you have a good enough model.