• 2 Posts
  • 41 Comments
Joined 1 year ago
Cake day: June 9th, 2023

  • Only if those device makers are willing to use it. And that has always been the tightrope Linux has walked.

    Its very history as an x86 platform means it has had to develop drivers where hardware vendors did not care. So that code needed to run on closed hardware.

    It was bloody rare in the early days that any manufacturer cared to help. And even today it is rare to find hardware that needs no non-free firmware.

    Free hardware is something I’ll support. But it is Stallman et al.’s fight, not the Linux kernel developers’. They started out having to deal with patented hardware before anyone cared.


  • proprietary

    Well, “related to the owner” is the very definition of proprietary. So as far as upstream vs. not-available-for-upstream is concerned, that is what the term is used for in Linux.

    So yep, by its very definition, while a manufacturer is using a licence that other distributions cannot embed with their code, marking it proprietary is how the Linux kernel tree was designed to handle it (see the sketch at the end of this comment).

    EDIT: The confusion sorta comes from the whole history of IBM and the PC.

    Huge amounts of PC hardware (and honestly all modern electronics) are protected by hardware patents. It’s built into the very history of IBM’s BIOS being reverse engineered in the 1980s.

    So Linux, for all its huge hardware support base today, was originally designed as an x86 (IBM PC) compatible version of Unix.

    As such, when Stallman created GPLv3, in part as a way of trying to end hardware patents, Linux was forced to remain on GPLv2 simply because it cannot exist under GPLv3’s freedom-orientated restrictions.

    The proprietary label is not meant as an insult, but simply an indication that the code is not in the control of the developers labelling it.
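
    A minimal sketch of that mechanism, assuming a generic out-of-tree module (the module itself is hypothetical): the kernel tree asks every module to declare its licence, and any string that is not GPL-compatible taints the kernel and locks the module out of GPL-only symbols.

    ```c
    /* demo_module.c - hypothetical out-of-tree module, for illustration only */
    #include <linux/init.h>
    #include <linux/kernel.h>
    #include <linux/module.h>

    static int __init demo_init(void)
    {
            pr_info("demo module loaded\n");
            return 0;
    }

    static void __exit demo_exit(void)
    {
            pr_info("demo module unloaded\n");
    }

    module_init(demo_init);
    module_exit(demo_exit);

    /* Anything other than a GPL-compatible string here marks the module as
     * proprietary: loading it taints the kernel and blocks access to symbols
     * exported with EXPORT_SYMBOL_GPL. */
    MODULE_LICENSE("Proprietary");
    MODULE_DESCRIPTION("Illustration of the proprietary licence marker");
    ```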






  • Heavy Blender users tend to avoid AMD for the reasons you point out.

    This leads to fewer updates, because AMD users are not that interested in the community.

    It is an issue without any practical solution. Because as I need a long-overdue update, again Nvidia seems the only real choice.

    Everyone is sorta forced to do that unless we can convince AMD users to just try out Blender and submit results.

    So hi, any AMD users who don’t care about Blender:

    Give it a try and submit performance data, please.






  • Cool. At the time, it was one of the best. Although I also liked SunOS.

    I also worked with VMS a lot after uni. Hated using it. But had to respect the ideals behind it.

    But watching the growth of Linux has been fantastic. In 2024 it does seem to have out-evolved all the others. (Evolved, defined as having developed the ability to survive by becoming so freaking useful.)

    I am starting to think it is time for a microkernel version, though.



  • In the late 1990s my uni had Unix workstations running HP-UX.

    So all projects etc. were expected to be done on those. Linux at the time was the easy way to do that from home.

    By the time I left uni in ’98, I was so used to it that Windows was a pain in the butt.

    For most of the time since, I have been almost 100% Linux, with just a dual boot to sort out some hardware/firmware crap.

    Ham radio is still like that to this day: many products can only do updates with Windows.


  • Just off the top of my head, discovered today.

    Not a new GUI, as one exists, but a more configurable one, as the existing one is crap for the visually impaired.

    The rpi-imager GUI does not take theme indications for font size etc. Worse, it has no configuration to change such things.

    Making it pretty much unusable for anyone with poor vision.

    Also, it varies for each visually impaired individual, but dark mode is essential for some of us.

    So if you’re looking for small projects, you’d at least make me happy ;)



  • Yep, pretty much, but on a larger scale.

    First, please do not believe the bull that there was no problem. Many folks like me were paid to fix it before it became an issue. So other than a few companies, few saw the result, not because it did not exist, but because we were warned. People make jokes about the over-panic, but if that had not happened it would have taken years to fix, not days, because without the panic most corporations would have ignored it. Honestly, the panic scared shareholders, so boards of directors had to get experts to confirm the systems were compliant. And so much dependent crap was found running that it was insane.

    But the exaggerations of planes falling out of the sky etc. were also bull. Most systems would have failed, but BSODs would have been rare; code would crash, some would error out and shut down cleanly, and some failures would have gone undiscovered until a short while later, as accounting or other errors showed up.

    As others have said, the issue was that since the 1960s computers were set up to treat years as 2 digits, so they had no way to handle 2000 other than to assume it was 1900 (a small illustration of the classic trap is at the end of this comment). While from the early 90s most systems were built with ways to adapt to it, not all were, as many teams were only developing top-layer stuff, and many libraries etc. had not been checked for this issue. Huge amounts of the world’s IT infrastructure ran on legacy systems, especially in the financial sector where I worked at the time.

    The internet was a fairly new thing, so often stuff had been running for decades with no one needing to change it, or having any real knowledge of how it was coded. So folks like me were forced to hunt through code, or often replace systems, that were badly documented or more often not documented at all.

    A lot of modern software development practices grew out of discovering what a fucking mess can grow if people accept an “if it ain’t broke, don’t touch it” mentality.
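
    For anyone who never ran into it, here is a tiny sketch of that classic two-digit-year trap in C (a made-up example, not code from any real system I worked on):

    ```c
    /* y2k_demo.c - illustration of the classic two-digit-year assumptions */
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);
        struct tm *t = localtime(&now);

        /* Buggy pattern 1: assume the century is always "19".
         * tm_year counts years since 1900, so in the year 2000
         * this prints "19100", not "2000". */
        printf("buggy:     19%d\n", t->tm_year);

        /* Buggy pattern 2: keep only the last two digits and compare.
         * "00" sorts before "99", so 2000 looks older than 1999. */
        int two_digit = (1900 + t->tm_year) % 100;
        printf("two-digit: %02d\n", two_digit);

        /* Fixed pattern: carry the full four-digit year. */
        printf("correct:   %d\n", 1900 + t->tm_year);
        return 0;
    }
    ```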