• MaggiWuerze@feddit.org · +83/−4 · 1 month ago

    In contrast to stuff like AI training or crypto, chips at least fulfill an actually useful function, so I don’t see the issue with their manufacturing consuming a lot of energy. Or should we make the same comparison for cars or medicine?

  • shalafi@lemmy.world · +62/−2 · 1 month ago

    entire countries

    Not even going to read this horseshit. Which countries? Brazil or Vatican City?

    Fuck these headlines. If they have valid points to make, I’ll never see them. Grow the fuck up and be journalists or I don’t have time.

  • TimeSquirrel@kbin.melroy.org · +40 · 1 month ago

    Okay. What are we supposed to do, not use chips? They’re kind of a main character of the 21st century.

    This would be a great application of those nuke plants fuckin’ Google and Amazon want to build.

    • Ech@lemm.ee · +16 · 1 month ago

      What are we supposed to do[…]?

      All of these articles treat energy usage like a massive crime, but miss/ignore that the world’s energy use needs to go up as we increasingly turn to electric alternatives. The problem truly lies in how we generate electricity, not how we use it.

      So the actual answer to your question is intense and rapid investment in sustainable, non-carbon energy production. An infrastructure revamp to rival any other in history. It would’ve been far better to do so decades ago, but that’s no longer an option. Anything else is just half measures we can’t afford.

    • leisesprecher@feddit.org · +13/−4 · 1 month ago

      We could start by not requiring new chips every few years.

      For 90% of users, there hasn’t been any actual gain in the last 5-10 years. Older computers work perfectly fine, but artificial slowdowns and bad software make laptops feel sluggish for most users.

      Phones haven’t really advanced either. But apps and OSes are too bloated and the hardware is impossible to repair, so a new phone it is.

      Every device nowadays needs WiFi and AI for some reason, so of course a new dishwasher has more computing power than an early Cray, even though none of that is ever used.

      • Bassman1805@lemmy.world · +7 · 1 month ago

        Tech companies are terrified of becoming commodities, even though a good chunk of them basically are at this point.

        Intel would probably be in a better spot if they’d just leaned into that rather than trying to regain the market dominance they once had.

  • lnxtx@feddit.nl · +9 · 1 month ago

    Are 7 nm chips more energy-intensive to manufacture than older 100 nm ones?
    Or is it just scale: more chips to manufacture, more energy needed?

    • n3m37h@sh.itjust.works · +15 · 1 month ago

      Cutting-edge chips consume more electricity to manufacture, as there are a crapload more steps than at older nodes. All chips are made on the same size of silicon wafer regardless of the fabrication process.

      Gamers Nexus has some good videos about chip manufacturing if you are interested.

      • sugar_in_your_tea@sh.itjust.works · +2 · 1 month ago

        I’d be interested in a “payback period” for modern chips, as in, how long the power savings of a modern chip take to pay back its manufacturing cost. Basically, calculate performance/watt with some benchmark and compare that to the manufacturing cost (perhaps excluding R&D to simplify things).
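
        A rough sketch of that calculation in Python, with entirely made-up numbers, since (as the reply below notes) per-wafer manufacturing energy isn’t public:

        ```python
        # Hypothetical payback calculation: hours of use until the energy
        # saved by a more efficient chip covers the energy spent making it.
        # All figures are invented for illustration, not real fab data.

        def payback_hours(manufacturing_kwh: float,
                          old_watts: float,
                          new_watts: float) -> float:
            """Assumes both chips deliver equal benchmark performance,
            so the saving is simply the difference in power draw."""
            saved_kw = (old_watts - new_watts) / 1000.0
            if saved_kw <= 0:
                raise ValueError("new chip must draw less power for equal work")
            return manufacturing_kwh / saved_kw

        # e.g. ~1000 kWh to manufacture, 150 W old vs 100 W new:
        hours = payback_hours(1000, 150, 100)
        print(f"{hours:.0f} hours (~{hours / 24 / 365:.1f} years of 24/7 use)")
        ```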

        • n3m37h@sh.itjust.works · +2 · 1 month ago

          Honestly, if you go through all the node changes, you could do the math and figure it out. N3 to N2, for example, is a 15-20% performance gain at the same power usage.

          It wouldn’t be exact, but I doubt any company will tell you how much power goes into the creation of a single wafer.
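
          Compounding those per-node gains is straightforward; a toy version in Python, with illustrative node names and a flat 15% gain per step:

          ```python
          # Compound a per-node efficiency gain across several transitions.
          # The node list and flat 15% figure are illustrative assumptions.
          nodes = ["N7", "N5", "N3", "N2"]
          gain_per_step = 0.15  # 15% more performance at the same power

          perf_per_watt = 1.0
          for old, new in zip(nodes, nodes[1:]):
              perf_per_watt *= 1 + gain_per_step
              print(f"{old} -> {new}: {perf_per_watt:.2f}x perf/W vs {nodes[0]}")
          ```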

    • Valmond@lemmy.world · +8/−1 · 1 month ago

      Older chips definitely consume more watts per unit of processing power; newer ones are usually better on top of that too.

      Talking about usage, not construction.

  • partial_accumen@lemmy.world · +2 · 1 month ago

    Are the largest power-consuming steps of semiconductor production happening 24/7? Could we simply align manufacturing times with useful solar production hours? Then there would be no need to store all the solar power; the idea is to consume most of it immediately for manufacturing. Then pass a rule that semiconductor fabs have to build out their own solar arrays to cover most of their power consumption.
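
    A back-of-envelope version of that idea, assuming a flat 24/7 fab load and an invented hourly solar curve:

    ```python
    # How much of a constant fab load could solar cover directly, with no
    # storage? The load figure and hourly solar curve are made-up numbers.
    fab_load_mw = 100.0  # assume a flat 24/7 load

    solar_mw = [0, 0, 0, 0, 0, 5, 20, 45, 70, 90, 105, 115,
                115, 105, 90, 70, 45, 20, 5, 0, 0, 0, 0, 0]  # one day, hourly

    direct = sum(min(fab_load_mw, s) for s in solar_mw)  # MWh used as produced
    total = fab_load_mw * 24                             # MWh consumed per day
    print(f"solar covers {direct / total:.0%} of daily load without storage")
    ```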

    • Kidplayer_666@lemm.ee · +3 · 1 month ago

      Chances are yes. Such an astronomical amount of money and energy goes into building the machines that even if electricity costs a bunch more during dead hours (which for businesses it probably does), it’s probably still worth running them, just to keep the fab at maximum capacity.