• HiddenLayer555@lemmy.ml
    3 days ago

    You can’t put toothpaste back in the tube. The only question going forward is how AI will be developed and who will control it.

    Fair enough, but even if the model is open source, you still have no control over or knowledge of how it was developed or what biases it might have baked in. AI is by definition a black box, even to the people who made it; it can’t even be decompiled like a normal program.

    It’s funny that you’d bring up the drug analogy because you’re advocating a war on drugs here.

    I mean, China has the death penalty for drug distribution, which is supported by the majority of Chinese citizens. They do seem more tolerant of drug users compared to the US (I’ve never done drugs in China nor the US so I wouldn’t know), so clearly the decision to have zero tolerance for distributors is a very intentional action by the Communist Party. As far as I know, no socialist country has ever been tolerant of even the distribution of cannabis, let alone hard drugs, and they have made it pretty clear that they never will.

    Personally, I have absolutely no problem with that if the model is itself open and publicly owned. I’m a communist, I don’t support copyrights and IP laws in principle. The ethical objection to AI training on copyrighted material holds superficial validity, but only within capitalism’s warped logic. Intellectual property laws exist to concentrate ownership and profit in the hands of corporations, not to protect individual artists.

    I never thought of it in terms of copyright infringement, but in terms of reaping the labour of proletarians while giving them nothing in return. I’m admittedly a far less experienced communist than you, but I see AI as the ultimate means of removing workers from their means of production. It scrapes all of humanity’s intellectual labour without consent to create a product that is inferior to humans in every way except for how much you have to pay it, and it’s only getting the hype it’s getting because the bourgeoisie see it as a replacement for the very humans it exploited.

    For the record, I give absolutely no shits about pirating movies or “stealing” content from any of the big companies, but I personally hold the hobby work of a single person in higher regard. It’s especially unfair to the smallest content creators, who are most likely making literally nothing from their work, since the vast majority of personal projects are uploaded for free on the public internet. It’s therefore unjust (at least to me) to appropriate their free work into something whose literal purpose is to get companies out of paying people for content. Imagine working your whole life on open source projects only for no company to want to hire you because they’re using AI trained on your open source work to do what they would have paid you to do. Imagine writing novels your whole life and putting them online for free, only for no publisher to want to pay for your work because they have a million AI monkeys trained on your writing typing out random slop and essentially brute forcing a best seller. Open source models won’t prevent this from happening; in fact, they will only make it easier.

    AI sounds great in an already communist society, but in a capitalist one, it seems to me like it would be deadly to the working class, because capitalists have made it clear that they intend to use it to eliminate human workers.

    Again, I don’t know nearly as much about communism as you so most of this is probably wrong, but I am expressing my opinions as is because I want you to examine them and call me out where I’m wrong.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.mlOP
      3 days ago

      Fair enough, but even if the model is open source, you still have no control over or knowledge of how it was developed or what biases it might have baked in. AI is by definition a black box, even to the people who made it; it can’t even be decompiled like a normal program.

      You can actually tune models for specific outputs. There are even projects exploring ways to make models adapt and learn over time: https://github.com/babycommando/neuralgraffiti
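      To make that concrete, here’s a toy sketch in plain Python of what “tuning a model for specific outputs” means mechanically. This is not a real neural network or the neuralgraffiti approach, just the core idea: gradient descent nudges a pretrained model’s weights until its outputs match the behaviour you want.

```python
# Toy illustration: fine-tuning nudges pretrained weights toward
# desired outputs. The "model" here is just y = w * x.

def predict(w, x):
    return w * x

def fine_tune(w, data, lr=0.01, epochs=200):
    """Adjust weight w by gradient descent so predictions match targets."""
    for _ in range(epochs):
        for x, target in data:
            error = predict(w, x) - target  # prediction error
            grad = 2 * error * x            # derivative of squared error wrt w
            w -= lr * grad                  # step against the gradient
    return w

# "Pretrained" weight produces y = 2x; the tuning data wants y = 3x.
pretrained_w = 2.0
data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]
tuned_w = fine_tune(pretrained_w, data)
print(round(tuned_w, 2))  # converges close to 3.0
```

      Real fine-tuning does the same thing over billions of weights, which is why the behaviour of an open model can be steered even if its internals are opaque.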

      The fact that it’s a black box is not really a showstopper in any meaningful way. We don’t know the minds of other people, yet we can clearly collaborate effectively to solve problems despite that.

      I mean, China has the death penalty for drug distribution, which is supported by the majority of Chinese citizens.

      Sure, there are tough laws against drugs in China as well as other countries, but that has not eliminated drug use entirely. Meanwhile, there is no indication that any state would ban the use of AI, and it would be self-defeating to do so because it would make that state less competitive against the states that don’t. The reality is that there are huge financial incentives to develop this technology, for both private companies and state-level actors. This tech is here to stay, and I don’t think it makes any sense to pretend otherwise. The question is how it will evolve going forward and how it will be governed.

      I never thought of it in terms of copyright infringement, but in terms of reaping the labour of proletarians while giving them nothing in return.

      I don’t see it that way at all. Open-source AI models, when decoupled from profit motives, have the potential to democratize creativity in unprecedented ways. They enable a nurse to visualize a protest poster, a factory worker to draft a union newsletter, or a tenant to simulate rent-strike scenarios. This is no different from fanfiction writers reimagining Star Wars or street artists riffing on Warhol. It’s just collective culture remixing itself, as it always has. The threat arises when corporations monopolize these tools to replace paid labor with automated profit engines. But the paradox here is that boycotting AI in grassroots spaces does nothing to hinder corporate adoption. It only surrenders a potent tool to the enemy. Why deny ourselves the capacity to create, organize, and imagine more freely, while Amazon and Meta invest billions to weaponize that same capacity against us?

      And I have a concrete example I can give you here because AI tools like ComfyUI are already being used by artists, and they’re particularly useful for smaller studios. These tools can streamline the workflow, and allow for a faster transition from the initial sketch to a final product. They can also facilitate an iterative and dynamic creative process, encouraging experimentation and leading to unexpected results. Far from replacing artists, AI expands their creative potential, enabling smaller teams to tackle more ambitious projects.

      https://www.youtube.com/watch?v=envMzAxCRbw

      Imagine working your whole life on open source projects only for no company to want to hire you because they’re using AI trained on your open source work to do what they would have paid you to do.

      Right, I would not like a company to build a proprietary model using my open source work. However, I’d have absolutely no problem with an open model being trained on my open source code. As long as the model is distributed under an open license, anybody can benefit from it and use it in whatever way makes sense to them. I see it exactly the same as open sourcing code.

      I do think capitalists will use this technology to harm workers; that’s been the case with every advance in automation. However, it will be a far better scenario if this tech is open and can be used by workers on their own terms. The worst possible outcome is corporations running models as subscription services, with people having to use them like serfs. I see open source models as workers owning the means of production.