You can play it in your browser here.

  • Dark Arc@social.packetloss.gg · 6 months ago

    It’s an invented problem. A program takes what a program takes. Everyone cares way more about the code being legible, the code being fast enough, and the code not using a ton of memory (and even that last one is kind of shrugged off depending on context).

    Applications that take 3 MB take 3 MB because they do next to nothing, or because they do it with a bunch of shared libraries … which is a whole other dependency-management mess and still costs a few MB on the drive anyway.

    There’s also a huge difference between being wasteful with something that pollutes the planet en masse and isn’t renewable, like gasoline (which is the only reason you’d be upset about that now), and wasting a few MB on a drive.

    Complaining about 3 MB vs 200 MB is like complaining about a person taking a trip to the grocery store… It’s insignificant and often necessary.

    You can say that program does way more than you need, but … nobody is catering to “only what you specifically need”, and the larger program almost certainly covers your needs.

    Furthermore, like I already said, making things smaller often makes them slower… Since CPU speed is more expensive to improve than storage, of course things are bigger; that’s what more people care about. Some video games take that to an extreme with uncompressed files and 250 GB install footprints … but 200 MB?

    • I Cast Fist@programming.dev · 6 months ago

      Everyone cares way more about the code being legible, the code being fast enough, and the code not using a ton of memory (and even that last one is kind of shrugged off depending on context).

      And then you look at real life and notice that code everywhere is slow, bloated and inefficient. But hey, it’s “legible”! To one or two devs, hopefully.

      Complaining about 3 MB vs 200 MB is like complaining about a person taking a trip to the grocery store

      Terrible analogy. A better equivalent is someone renting a garage to store stuff inside and now, because they have so much space, there’s that urge to fill it, whether it makes sense to or not.

      making things smaller often makes them slower

      It’s usually the other way around. As a rule of thumb, less code = smaller size = faster execution. In theory, 1k lines of code will require less computation, less processing, than 10k.

      • Dark Arc@social.packetloss.gg · 6 months ago

        And then you look at real life and notice that code everywhere is slow, bloated and inefficient.

        That’s not true in practice. I mean, that code does exist. However, the vast majority of code is reasonably performant.

        Not everyone is an expert at optimization and that’s fine … we’d have a lot less software in general if only the best of the best were allowed to author it.

        It would be great if more things went back to native (or at least away from “I need an entire web browser for my app to function”); that, to me, is wasteful… But a few hundred MB for a program as large, complicated, and feature-rich as LibreOffice is not.

        Terrible analogy. A better equivalent is someone renting a garage to store stuff inside and now, because they have so much space, there’s that urge to fill it, whether it makes sense to or not.

        No, that’s … just wrong. It’s not like people are writing code that sits around doing nothing except increase the program’s size, or are actively trying to fill the drive.

        It’s usually the other way around. As a rule of thumb, less code = smaller size = faster execution. In theory, 1k lines of code will require less computation, less processing, than 10k.

        That’s not inherently true; it’s a common misconception/oversimplification. When you do things like inlining, you increase code size (because the compiler copies that function’s code into a bunch of call sites), but the increased locality speeds things up. There’s a reason -Os and -O3 are not the same option.
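
        A minimal sketch of that trade-off (an illustrative example, not from the original comment): a tiny helper the compiler is free to copy into each call site. With GCC or Clang, -O3 will typically inline square and unroll or vectorize the loop, producing larger but usually faster machine code, while -Os favors a smaller binary instead.

        ```c
        /* Build both ways and compare the generated code:
         *   cc -O3 -o sum_o3 sum.c   (speed: likely inlines and unrolls, bigger code)
         *   cc -Os -o sum_os sum.c   (size: smaller code, sometimes slower)
         */
        #include <stddef.h>
        #include <stdio.h>

        /* Small helper the compiler may duplicate (inline) at every call site.
           Each copy grows the binary but removes call overhead and improves locality. */
        static inline long square(long x) { return x * x; }

        long sum_of_squares(const long *v, size_t n) {
            long total = 0;
            for (size_t i = 0; i < n; i++) {
                total += square(v[i]);  /* -O3 typically inlines this; -Os keeps the footprint small */
            }
            return total;
        }

        int main(void) {
            long v[] = {1, 2, 3, 4, 5};
            printf("%ld\n", sum_of_squares(v, 5));  /* prints 55 */
            return 0;
        }
        ```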

        Now sure, executing fewer instructions is generally better than executing more, better-localized code (though even that can be wrong given processor caches and relative instruction speeds). Lots of programs have features you might never use, but that doesn’t really “hurt you”; it’s not the source of your program’s or your computer’s slowness, it’s just some bytes on the drive.

        We’re a long way from the Unix-style “everything is a small program that gets piped into other programs to do interesting things” days. That paradigm just doesn’t work for GUI software. Nobody does that because … normal folks would rather have one office program than go shopping for 275 programs so they can have separate programs to edit the document, print the document, convert it to PDF, update calculations in their spreadsheet, run macros, etc. (which, if you used all or most of them, would likely cost more disk space anyway).