• pavnilschanda@lemmy.world · ↑14 ↓2 · 1 year ago

    There absolutely should be, and law and ethics experts who dabble in AI are probably in the middle of discussing this already.

  • Spaceman Spiff@lemmy.fmhy.ml · ↑11 · 1 year ago

    This isn’t just a matter of law, but of technology. Part of the point of these large language models is the massive corpus of raw data. It’s not supposed to mimic a specific person or work, but rather imitate ALL of them. Ideally, you wouldn’t even be able to pinpoint anyone or anything in particular.

    (If you’re asking about a different type of AI, then disregard)

    • SkyNTP@lemmy.ml · ↑1 · 1 year ago

      Likeness is just one aspect of copyright. Another side entirely involves protecting the inventive and productive work that went into creating the original training material, i.e. ChatGPT wouldn’t be possible without the work of hundreds of millions of writers writing things.

      Not so different from compensating a course provider for the learning materials they give you when you learn a new language.

  • LadyAutumn@lemmy.blahaj.zone · ↑13 ↓2 · edited · 1 year ago

    AI has only been capable of… half imitating those things for like a year and a half. Most uses are non-commercial, and there is no established case law for personal use of generative artificial intelligence. It would be hard to sue someone over something they aren’t profiting from, or that they aren’t using as a form of slander, libel, or harassment.

    The EU, the UK, and the US are all currently developing new laws surrounding the usage of AI. But this is all incredibly new and therefore in progress.

    • Scrubbles@poptalk.scrubbles.tech · ↑3 · 1 year ago

      Yup, we’re in the wild west of it now, where anything goes, but that’s only because there are no laws yet. Actors have already been fighting this in the movie industry, trying to work out “face rights” and “voice rights” after they die (so studios can do things like Leia in Rogue One), and now that the technology is available to anyone, I’m guessing we’re going to see laws start to take form very soon.

    • Atemu@lemmy.ml · ↑2 · 1 year ago

      Um, no. Copyright exists and it applies to everything that is copyrightable. It doesn’t go away when you slap AI onto your imitative ML model.

      • LadyAutumn@lemmy.blahaj.zone · ↑3 · 1 year ago

        Yes, and personal use exists so that people who are not profiting in any way off of someone else’s copyright are not chargeable under copyright law. That’s why I can download a jpg of an artist’s painting and that action alone is not breaking any law.

        If my AI is not being used to make money, and it is also not being used to slander or harass someone, then what law exactly would I be breaking? How would I be harming anyone by having an AI that I use solely personally and am in no way benefiting from? Can you sue someone for taking a DVD, ripping it, and then rearranging the scenes into a new video file? Have they broken a law solely by creating a derivative work from the original material, even if that work is not making them money and they are committing no other crime with it?

        • Atemu@lemmy.ml · ↑1 · 1 year ago

          personal use exists

          That would be news to me. Could you give a source for that? Because it’d mean I would be allowed to copy movies, music etc. for my “personal use” and also give them away because I’m not seeking profit, right?

          Copyright extends everywhere, no matter the use. There are specific limits such as licenses, fair use, and private copies, but outside of those, no, you may not make a copy, and yes, even loading a file into memory counts as “making a copy”.

          • LadyAutumn@lemmy.blahaj.zone · ↑1 · 1 year ago

            How could you save a png then? We’re talking about digital files, where you can absolutely copy a file without facing any legal repercussions for doing so. Copy-paste is the same as making a copy of a work, and under copyright law it would make sense for that to be illegal. But it’s not, because that would be ridiculous, and no one will ever be charged for going to Google image search and looking at a copyrighted picture on the screen. It’s also not illegal for humans to learn from the art of someone else and then create similar art.

            I don’t know how you could even detect that someone has trained an AI for this purpose. Training a modern AI on top of pre-made models to draw a specific person or character in a specific art style is trivial. It’s hard if you’re unfamiliar with the tools, but it requires very few training images to become very effective; ControlNet and LoRA have made this possible (a rough sketch of how little code this takes is at the end of this comment). So how could you even tell that someone has made such a model? They’ve only downloaded a dozen pictures, and there is no effective way to tell that they’ve then trained a model on them. It would have to come down to their creations being noticeable enough to charge them, either through profiting from it or by posting that content online.

            Personal use was a misnomer on my part; what I meant was fair use.
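
            For a sense of scale, here is a minimal sketch of the inference side using the diffusers library, assuming someone has already fine-tuned a small style LoRA on a handful of images; the base checkpoint ID, the LoRA path, and the "<style-token>" trigger word are illustrative placeholders, not a real model:

              # Minimal sketch: imitating a learned style by loading a small LoRA on top
              # of a public Stable Diffusion checkpoint (diffusers + PyTorch).
              # "./style_lora" and <style-token> stand in for a hypothetical LoRA
              # fine-tuned on roughly a dozen reference images.
              import torch
              from diffusers import StableDiffusionPipeline

              pipe = StableDiffusionPipeline.from_pretrained(
                  "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
              ).to("cuda")

              # Attach the (hypothetical) style LoRA -- typically only a few megabytes.
              pipe.load_lora_weights("./style_lora")

              # A single prompt is enough to produce images in the imitated style.
              image = pipe("a portrait of a knight, <style-token> style").images[0]
              image.save("imitation.png")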

            • Atemu@lemmy.ml · ↑1 · 1 year ago

              How could you save a png then?

              A PNG of what? Copyrighted material? Then, unless something like fair use, a private-copy exception (in some places), a license, or perhaps a select few other factors applies, you would not be allowed to download it.

              where you can absolutely copy a file without facing any legal repercussions for doing so.

              The law and its enforcement are two separate topics. I don’t really care to discuss the latter.

              It’s also not illegal for humans to learn from the art of someone else and then create similar art.

              Depends on how similar it is. If the work the human does to the original is transformative in some way, it falls under fair use and is therefore legal. If they just apply some Instagram filter or something, that would likely not be considered transformative, and distribution would not be legal without the permission of the copyright holder.

              This is the crux in all of this imitative AI art discussion. Read https://en.wikipedia.org/wiki/Derivative_work and (closely related) https://en.wikipedia.org/wiki/Threshold_of_originality

              Either through profiting from it or by posting that content online.

              Note that profit has no bearing on any of this, except that it makes the infringing party more noticeable to law enforcement. The law itself does not care whether or not you make a profit from a copyright infringement.

      • kromem@lemmy.world · ↑1 · 1 year ago

        Yes, but training is not infringement under current law.

        If you learn to draw by tracing Mickey Mouse, but then professionally draw original works, you haven’t infringed copyright.

        If you subsequently draw Mickey Mouse, you’ll hear from Disney’s lawyers.

        So yes, AI producing IP protected material when prompted results in infringement, just as any production would.

        The things people seem to be up in arms about are copying style (not protected) or using for training (not infringing).

        If anything, all of the discussion around AI over the past year has revealed just how little people understand about IP law. They complain that there need to be laws for things that are already protected and prohibited, and they complain that companies are infringing over things that are neither protected nor prohibited.

        For example, in relation to the OP’s question, in the US there is no federal protection of IP rights for voices, and there is case law to the contrary.

        • Atemu@lemmy.ml · ↑1 · 1 year ago

          If you learn to draw by tracing Mickey Mouse, but then professionally draw original works, you haven’t infringed copyright.

          Tracing Mickey is already copyright infringement. It’s insane but that’s how it works.

          I think copyright is a weird idea in this day and age, where you’re almost always standing on the shoulders of giants too, but that doesn’t make it go away. The best we can do is hack it (copyleft).

          copying style (not protected)

          Not copyrightable but I’m not sure whether a style could be trademarked instead.

          using for training (not infringing)

          Whether that’s infringing or not is a hotly debated topic. The key point here is whether training falls under fair use. If it doesn’t, training on copyrighted material without a license would be infringing in most jurisdictions.

          in the US there is no federal protection of IP rights for voices, and there is case law to the contrary.

          Because US == world. I’d love to leave this implication behind on Reddit.

          • kromem@lemmy.world · ↑0 · 1 year ago

            You think tracing for educational use, which is never distributed and so could not have a negative impact on market value, is infringement?

            What the generative AI field needs moving forward is a copyright discriminator that identifies infringing production of new images (one rough sketch of what such a check could look like is at the end of this comment).

            But I’ll be surprised if cases claiming infringement on training in and of itself end up successful under current laws.

            And yeah, most of the discussion around this revolves around US law. If we put aside jurisdiction entirely, then there is no conversation to be had. Or we could choose arbitrary jurisdictions to support a position, for example Israel and Japan, which have already said training is fair use.
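
            One naive shape such a discriminator could take, sketched here purely for illustration with CLIP image embeddings (the file paths and the similarity threshold are made-up placeholders, and similarity to known works is of course only a rough proxy for actual infringement):

              # Naive sketch of a "copyright discriminator": embed a generated image with
              # CLIP and flag it if it sits too close to anything in a reference set of
              # protected works. Paths and the threshold are illustrative placeholders.
              import torch
              from PIL import Image
              from transformers import CLIPModel, CLIPProcessor

              model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
              processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

              def embed(path: str) -> torch.Tensor:
                  inputs = processor(images=Image.open(path), return_tensors="pt")
                  with torch.no_grad():
                      features = model.get_image_features(**inputs)
                  # Normalise so a dot product equals cosine similarity.
                  return features / features.norm(dim=-1, keepdim=True)

              # Hypothetical reference set of protected images.
              reference = {p: embed(p) for p in ["mickey_01.png", "mickey_02.png"]}

              def looks_infringing(generated_path: str, threshold: float = 0.92) -> bool:
                  gen = embed(generated_path)
                  # Compare the candidate against every reference embedding.
                  return any(float(gen @ ref.T) > threshold for ref in reference.values())

              print(looks_infringing("model_output.png"))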

            • Atemu@lemmy.ml · ↑1 · 1 year ago

              You think tracing for educational use, which is never distributed and so could not have a negative impact on market value, is infringement?

              That’s not what I think, that’s what the law says.

              I said what I think in the second paragraph. Sorry if I wasn’t being extra clear on that.

              What the generative AI field needs moving forward is a copyright discriminator that identifies infringing production of new images.

              Good luck with that.

              But I’ll be surprised if cases claiming infringement on training in and of itself end up successful under current laws.

              Depends. If the imitative AI imitates its source material too closely, that could absolutely be laid out as a distribution of copyrighted material.
              Think about it like this: if I distributed a tarball of copyrighted material, that would be infringement, even though you’d need tar to unpack it. Whether you need a transformer or tar to access the material should make no difference, in my layman’s interpretation.

              • kromem@lemmy.world · ↑1 · 1 year ago

                That’s not what I think, that’s what the law says.

                No, it doesn’t. The scenario outlined squarely falls under fair use, particularly because of the non-distribution combined with research/education use. Fair use is not infringement.

                Good luck with that.

                We’ll see.

                Depends. If the imitative AI imitates its source material too closely, that could absolutely be laid out as a distribution of copyrighted material.

                I mean, if we’re talking about hypothetical models that only produce infringing material, you might be right.

                But if we’re talking about current models, which have no ability to reproduce the entire training set and only limited, edge-case reproducibility of training images with extensive prompt effort, I stand by being surprised (and by my view that your tar metaphor is a poor and misleading one).

                If we’re going with poor metaphors, I could offer up the alternative of saying that distributing or offering a cloud-based Photoshop isn’t infringement even though it can be used to reproduce copyrighted material. And much like diffusion-based models, and unlike a tarball, Photoshop requires creative input and effort from a user in order to produce infringing material.

  • m-p{3}@lemmy.ca · ↑7 · 1 year ago

    Laws are usually a reaction to a behavior we want to eliminate; they’re rarely made to prevent something we anticipate.

  • Ziggurat@sh.itjust.works · ↑6 · 1 year ago

    There are already laws regarding impersonation and the right to your own image.

    So I’m not sure why AI would make this any different from a costume or a drawing.

  • Moghul@lemmy.world · ↑2 · 1 year ago

    Basically, governments and laws can be fairly slow to adapt. Consider that it wasn’t that long ago that you could absolutely tell something was made by AI, whether from stilted, unnatural speech or from mangled fingers in fake pictures of people. In a few short months we’ve gone from dreamlike hallucinations of real things to almost passable renditions. AI is just advancing too fast for most governments to make a decision and formulate a coherent, mostly future-proof set of laws around it.

  • kromem@lemmy.world · ↑2 · 1 year ago

    Because this stuff was already extremely complicated before AI came along.

    For example, the thing you are actually dealing with here isn’t copyright or trademark, but “right of publicity”, which covers the right to one’s likeness for commercial purposes.

    That isn’t protected federally in the US and comes down to a state-by-state basis.

    There were human impersonators mimicking the voice or likeness of others for years before AI. There were even cases of old material being re-edited, like Isaac Hayes’ voice for Chef after his falling out with the South Park creators over their Scientology episode, where they subsequently had his character fully voiced as he joined a pedophile club.