I spent 40 years in the computer industry. I learned one thing very early on.
The only standard in the computer industry is that there isn’t one.
Why don’t we just make one unifying standard? That can’t possibly go wrong, right?
https://m.xkcd.com/927/
There are now sixteen competing standards.
Beginner in IT:
“The problem is that there isn’t one”
Expert:
“The problem is that there isn’t one”
No way, it’s a MASSIVE pile of standards. The entire internet, and networking in general, only functions because of standards. HTML5’s main benefit was standardizing a ton of BS everyone was playing around with.
What isn’t standard is the handful of higher-level frameworks and BS people are playing around with, but saying that’s all of the computer industry is like that old meme of Homer getting pulled most of the way up the mountain by Sherpas in a sleeping bag…
Pretty sure it was a reference to this problem specifically: https://m.xkcd.com/927/
Yes, I know the joke. My point is that it’s only true at the developing front of engineering. Everyone is still, in fact, standing on a mountain of well-established and widely followed standards while debating the future.
Even USB-C is a nightmare. There’s 3.0, 3.1, and 3.2, all retroactively rebranded as “3.2 Gen X” with some stupid stuff there as far as what speed each name actually means.
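For reference, the renames shake out roughly like this; a minimal lookup sketch (current branding on the left, spec-maximum speeds, older names for the same thing):

```python
# Rough map of the USB 3.x marketing renames to their max data rates.
# Same underlying specs, up to three names each: that's the whole problem.
USB3_NAMES = {
    "USB 3.2 Gen 1":   {"aka": ["USB 3.0", "USB 3.1 Gen 1"], "gbit_per_s": 5},
    "USB 3.2 Gen 2":   {"aka": ["USB 3.1 Gen 2"], "gbit_per_s": 10},
    "USB 3.2 Gen 2x2": {"aka": [], "gbit_per_s": 20},
}

for name, info in USB3_NAMES.items():
    aka = ", ".join(info["aka"]) or "no earlier name"
    print(f"{name}: {info['gbit_per_s']} Gbit/s (formerly {aka})")
```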
Then it can do DisplayPort as well. There used to be an HDMI alt mode too!
An Intel computer might have Thunderbolt over the same cable, tunneling PCIe signals over it so you can plug in an external graphics card or other devices.
Then there’s USB4, which works like Thunderbolt 3 but isn’t restricted to Intel devices.
Then there’s the Extended Power Range profile, which lets you push up to 240 W (48 V at 5 A) through a USB-C port.
For a while, the USB-C connector was on graphics cards as VirtualLink, which was supposed to be a one-cable standardized solution for plugging in VR headsets. Except that no headsets used it.
Then there’s Nintendo. The Switch has a Type-C port, but does its own stupid thing for video, so it can’t work with a normal dock because it’s a freak.
So you pick up a random USB-C cable with no information on what it may be capable of and plug it into a port whose capabilities you also don’t know. Its speed may be anywhere between 1.5 Mbit/s (USB 1.0 low speed) and 80 Gbit/s (USB4 Version 2.0), and it may provide anywhere between 5 and 240 W of power.
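In practice the link falls back to the lowest capability shared by the port, the cable, and the device. A minimal sketch of that logic (the numbers are illustrative, not a real PD or alt-mode model):

```python
# Minimal sketch: a USB-C link runs at the lowest capability shared by the
# host port, the cable, and the device. All numbers here are illustrative.
def negotiated_link(host, cable, device):
    # each argument: (data rate in Gbit/s, power in W)
    speed = min(host[0], cable[0], device[0])
    power = min(host[1], cable[1], device[1])
    return speed, power

# A 40 Gbit/s port, a cheap "charge-only" cable (USB 2.0 data, 60 W),
# and a fast external SSD:
speed, power = negotiated_link((40, 100), (0.48, 60), (10, 15))
print(f"link runs at {speed} Gbit/s, up to {power} W")  # 0.48 Gbit/s, 15 W
```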
Every charger has a different power output, and sometimes that leads to a stupid situation like the Dell 130 W laptop charger. In theory, 130 W is way more than what most phones will charge at. But it only offers that at (I think) 20 V, which my phone can’t take. So in practice, your phone will charge at the base 5 W over it.
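Roughly why that happens: a PD source advertises a list of fixed voltage/current offers and the sink picks the best one it supports, so if nothing above 5 V matches, you get the floor. A sketch with made-up numbers (this is not Dell’s actual offer list):

```python
# Hypothetical sketch of USB-PD style negotiation: the charger advertises
# (voltage, max amps) offers and the phone takes the best one it can accept.
# The offer list and phone limits are made up to mirror the anecdote above.
charger_offers = [(5.0, 1.0), (20.0, 6.5)]  # 20 V * 6.5 A = the 130 W rating
phone_accepts = {5.0, 9.0}                  # a phone that tops out at 9 V

def negotiate(offers, accepted_voltages):
    usable = [(v, a) for v, a in offers if v in accepted_voltages]
    return max(usable, key=lambda va: va[0] * va[1])  # highest-wattage match

volts, amps = negotiate(charger_offers, phone_accepts)
print(f"negotiated {volts} V x {amps} A = {volts * amps:.0f} W")  # 5 V x 1 A = 5 W
```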
Dell also has a laptop dock for one of their laptops that uses TWO Type-C ports, for more gooderness or something, I don’t know. Meaning it will only fit a laptop with ports exactly that far apart.
The USB chaos does lead to fun discoveries, such as when I plugged the power port of a Chromecast with Google TV into a laptop dock and discovered that it actually supports USB inputs, which is cool.
And Logitech still can’t make a USB-C dongle for their mouse.
At least it’s not a bunch of proprietary barrel chargers. My parents have a whole box of orphaned chargers with oddly specific voltages from random devices.
“But it only offers that at I think 20 V, which my phone can’t take”
This is actually a big part of many of the high-speed charging standards phones use: they charge at a higher voltage to lower the amperage. I don’t know off the top of my head whether USB-PD does this on phones, but I know the old Qualcomm Quick Charge standard did it a lot. I think it went as high as 24 V, if I remember correctly.
Then of course, for a while, lots of phones supported competing quick-charging standards, and nobody allowed anyone else to use the same branding, so identifying compatible chargers for your phone’s specific type of quick charge was a royal pain in the butt.
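For what it’s worth, the point of the higher voltage: power is volts times amps, so delivering the same wattage at a higher voltage needs less current, and the heat wasted in the cable scales with current squared. A quick worked sketch (the 0.2 Ω cable resistance is an assumption for illustration):

```python
# Same 18 W delivered at two voltages; cable loss scales as I^2 * R.
R_CABLE = 0.2  # ohms, assumed round-trip cable resistance (illustrative)

for volts in (5.0, 9.0):
    amps = 18.0 / volts          # P = V * I  =>  I = P / V
    loss = amps ** 2 * R_CABLE   # resistive heating in the cable
    print(f"{volts} V: {amps:.2f} A, {loss:.2f} W lost in the cable")
# 5 V needs 3.60 A (2.59 W lost); 9 V needs 2.00 A (0.80 W lost).
```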
Yes, and USB-C is barely a decade old. A single standard, for a port that’s universal in name only. Computer science, y’know, the thing it’s all meant to benefit, is over a century old. There is a MOUNTAIN of standards we’re all standing on, whether or not you care to admit it.
Umm, that is my point. Due to the massive pile of “standards,” there really isn’t one standard in any part of the industry, since it all changes within months.
No, you misunderstand the scope of the standards I’m referencing. Computer science goes back over a century, yet you’re trying to tell me there are no established standards based on something that’s barely a decade old as a consumer product.
Not that deep, it was a joke.
A flippant joke based on a perspective I’m not even referencing does not in any way whatsoever mean I am wrong.
The great thing about standards is that there are so many to choose from.
At least it’s not as persistent as RAM DIMMs and PCIe.