Alt account of @Badabinski

Just a sweaty nerd interested in software, home automation, emotional issues, and polite discourse about all of the above.

  • 0 Posts
  • 172 Comments
Joined 1 year ago
Cake day: June 9th, 2024

  • Arch is a pretty good one if you want control and the ability to tinker. I have personally found it to be very reliable over the years, and the AUR is exceptionally powerful (although you NEED to review your PKGBUILDs; there’s nothing stopping someone from putting malware on the AUR again). The packaging format is so simple and easy that I actually build a few performance-critical packages locally so I can tweak compiler flags (gimme that -march=native); there’s a rough sketch of that workflow at the end of this comment.

    Nix is cool and kinda crazy, but honestly? I’d hold off until you’re comfortable with Arch. Same with Gentoo.
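
    For anyone curious what that local-rebuild workflow looks like, here’s a rough sketch. The flags and the package name are placeholders, and a PKGBUILD can override or disable custom flags, so this won’t apply to every package:

    ```sh
    # --- /etc/makepkg.conf (or ~/.makepkg.conf): flags makepkg hands to builds ---
    CFLAGS="-march=native -O2 -pipe -fno-plt"
    CXXFLAGS="${CFLAGS}"
    MAKEFLAGS="-j$(nproc)"

    # --- then, in a shell: grab the PKGBUILD ("some-package" is a placeholder) ---
    git clone https://aur.archlinux.org/some-package.git
    cd some-package
    less PKGBUILD     # actually read it before you build it
    makepkg -sirc     # sync deps, build, install, remove build deps, clean up
    ```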


  • Yep, this is why we use the GPL! Using a permissive license is like lending money to a friend: you should never, ever expect to get it back. “Good” companies aren’t altruistic, they’re ruthlessly self-interested. They’re not going to give back to your project unless there’s a damn good reason for them to do so. There are times when permissive licenses are totally fine (like when writing some kinds of libraries), but if you care about the freedom of an application then you should stay the fuck away from MIT, Apache, BSD, or any other permissive license. Just use the GPL, folks.

    edit: Using the GPL from the get-go would have prevented this atrocity: https://github.com/coredevices/libpebble3/commit/35853d45cd0ec51cb732be866f6f72467653a613

    They couldn’t have relicensed the project without the consent of every contributor if it had been using a copyleft license in the first place.

    Also, fuck off with your fucking AGPL-plus-copyright-transfer-CLA bullshit. I’d love to see a new version of the AGPL that expressly prohibits copyright transfers. Never let a company take your rights away from you. A copyright-transfer CLA makes even the GPL effectively meaningless if the company wants to rug pull at a later date.


  • We’ve had the template for this for decades. Put the solar panels in space, where the thick soupy gunky spunky atmosphere doesn’t stop the little energy things from the sun. Collect the power in orbit. You just do that up there in orbit, okay? And then you fucking beam the power down to the surface, you numpty fucks. Use a maser to send the power down and you can pick a frequency that isn’t affected by the gunky spunky, and then the receivers on the ground can pick it up and send the power through these things called wires to a building that uses the power, and the building can use this neat little thing called CONVECTION to more efficiently remove the heat from the things using the electricity. Wow.

    Or just, y’know, use less power and make use of ground-based solar. We don’t need fucking AI data centers in space. Don’t get me wrong, I think it might be useful to, say, have some compute up in geostationary orbit that other satellites could punt some data to for computation. You could have an evenly spaced ring of the fuckers so the users up there can get some data crunching done with an RTT of like 50ms instead of 700ms. That seems like a hard sell, but it at least seems a bit tenable if you needed to reduce the data you’re sending back to Earth to a more manageable amount with some preprocessing. That is still not fuckass gigawatt AI data centers. Fuck.
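
    For anyone who wants to sanity-check those numbers, here’s a rough back-of-the-envelope sketch in Python. The 7,500 km separation is just an assumption that happens to work out to ~50ms; the real figure depends entirely on the orbit and how many nodes are in the ring, and this ignores processing and queuing delays:

    ```python
    # Back-of-the-envelope light-travel times (speed of light only; ignores
    # processing, queuing, and protocol overhead).
    C_KM_S = 299_792        # speed of light in km/s
    GEO_ALT_KM = 35_786     # geostationary altitude above the surface in km

    def rtt_ms(one_way_km: float, hops: int = 1) -> float:
        """Round-trip light time in milliseconds for `hops` one-way legs each way."""
        return 2 * hops * one_way_km / C_KM_S * 1000

    # Classic bent-pipe GEO internet: ground -> GEO -> ground and back again.
    # ~477 ms of pure light time, which is why real-world numbers land around 600-700 ms.
    print(f"ground relay via GEO: {rtt_ms(GEO_ALT_KM, hops=2):.0f} ms")

    # A satellite handing data to a compute node ~7,500 km away in orbit:
    # roughly 50 ms of pure light time.
    print(f"in-orbit compute:     {rtt_ms(7_500):.0f} ms")
    ```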


  • ngl, I do wish it was still used. I remember being like, 4 years old and trying to write a “thank you” card to my grandmother. I spent what felt like an hour going through the alphabet, trying to find the letter that makes the “th” sound. Apparently my mom found me lying on the floor sobbing and repeating the alphabet, which is both funny and sad lol

    Many years have passed, but a tiny grain of resentment at the English language remains. The thorn would have prevented that.