USB-C will be around for a long time; it’s a strong standard. Wireless inductive charging won’t take over any time soon because it’s limited in speed, and WiFi/Bluetooth are much slower for data transfer.
4-5 years ago I stopped buying products that had micro-USB, Lightning, or any other form of port that wasn’t USB-C.
Last week I was looking at a gadget and it had micro-fucking-USB and was produced in early ’25! What the fuck?!
And there are those gadgets that have a USB-C socket but don’t have the correct circuitry, so they only work with a USB-A to C cable.
It is cheaper to have manufactured & is very much a “known”, but I’m right there with you. If it’s not USB-C, fuck 'em.
Yeah it’s usually a sign that there is no competition in the space since the manufacturer doesn’t want to redesign the item if they don’t have to
Or it’s just a very cheap item. I recently bought a rechargeable disposable cannabis vape while out of town for work. I asked the dude for the cheapest dispo they had, bought it, and it had a fucking micro USB on it.
Some homeless guy’s problem though, not mine. I probably only used 50mg of the 1g cart and didn’t have to recharge it, so I gave it to some homeless dude before I left.
No exposed hardware ports seems to be the direction it’s been moving towards
You’ll have to convince the EU to change the USB-C rule.
Not unless they’ve managed to fix the wireless charging problem. Namely that it barely functions.
Almost all of the energy goes into heat, it’s ridiculously inefficient.
I’ve been charging my phones exclusively wirelessly (not counting plugging in cars or emergencies) for about 15 years. From Lumia, to OnePlus to Pixel now. Zero issues.
Even iPhones finally invented wireless charging a few years ago and it works very well.
I realize it’s not as efficient, but I charge overnight and we’re probably talking about $10/year in losses.
You mean sealed black boxes
“Introducing! The internet!”
USB-C has done something truly amazing. You used to be able to tell, within reason, what the capabilities of USB were by the connector or the color of the port. Now there are dozens of options and there’s hardly any way for you to tell which cables and ports support which features.
Maybe your port and charger can put out 20 volts at 3.5 amps. Maybe they can put out 20 volts at 6 amps (Dell). Maybe your device doesn’t negotiate correctly and they tell you to only use an A-to-C cable.
Don’t get me wrong, I love the port. Multidirectional, doesn’t really wear out, does have a tendency to get a little dirty though. Lightning was a little more forgiving on dirt.
Labeling on the ports is all vague; labeling on the cables is non-uniform or nonexistent.
But truth is, they’ve probably come up with half a dozen specs for USB-C that half your kit doesn’t support. And they’ll probably come out with God knows how many more before they make a new connector.
I don’t agree about the good ol’ days. Beyond the blue connectors of USB 3, there was no way of telling if a cable was charge-only or data+charge, no way to tell if it was USB 1 or 2, or if it was standard 0.5 amp or “fast charge” up to 3 amps. There were a lot of different plugs: regular, mini, micro, A and B types.
I agree with everything you say about USB-C tho.
It was extremely easy to determine if a cable carried data. If there were four wires/metal strips at the end it had data. If it was only the two fat ones it was power only.
Yeah. It was already happening circa USB3. It’s not because of the connectors, but the broadening spectrum of requirements of client devices.
Maybe USB-C was a missed opportunity to address it, but it certainly didn’t “start the fire”.
It wasn’t better, but it was readable. I don’t want to go back, I want them to fix what we have now to be readable.
Here’s an idea, all C cables supporting any level of PD must have the specs stamped on both plugs.
I’m down with it, but it’s a lot
Wattage / transfer speed / DisplayPort / Thunderbolt / PD
Even the current icons don’t tell you more than speed these days
Terrible news for all my USB-C cables less than 2 & 1/2 inches long!
:) yeah good point & thanks for the pic
People can hate all they want on USB-C with all these details that may be technically true, but the only issue I’ve had in years is with chargers varying in power output and occasionally that means I try to charge something that either takes forever or never charges. It’s an edge case and I consider it a charger issue, not a cable one…
Life is definitely simpler now with USB-C being pretty standard, and Lightning cables can burn in hell. Those anti-standard bullshits have caused me to buy a dozen of them for friends and test devices (I’m a web dev) and yet I’ve never owned a mobile Apple product and never would. Fuck Lightning – cannot possibly say it enough. I’m glad the EU agrees.
that means I try to charge something that either takes forever or never charges.
That’s a pretty significant failure IMO.
I don’t want to go back, but I want shit to be labeled and to work. If you go to bed and wake up to a 7-year-old on a trip with a dead device, you’re going to have a bad day.
It’s not a failure to default to a safe level if it can’t negotiate properly. That’s a feature.
But it’s not really a failure of the cable (typically; I know there are edge cases, but I don’t think I’ve run into them recently). In a perfect world, plugging it in would mean it works as expected, I guess, but I think it’s a better tradeoff to expect users to know that some devices require a bit more power, and have a plug that still works universally. “This charger doesn’t have enough power” is easy enough to be understood by a 90 yo, I would think.
Chargers should be labelled with the output they provide (mine are), but you are right, devices probably should be labelled better with what they require.
Admittedly, I’ve only had a couple of cable problems, and one of those would likely have been labeled wrong, as they were cheap cables with a max wattage programmed.
One of my users came to me with a C-to-C phone cable that they had used between a MacBook Pro and a Mac charger. “This got really hot and started to stink when I used it.”
“Uhh, crap, here, have this” (handed them a 6-foot 100W cable) “and please throw that old one away. I’ll give you a new phone cable too.”
My other problem is with QC on random cables. I have some 6’ that won’t pass 25w. I have a Klein tester that will enumerate wattage. I throw away cables that won’t at least support fast charging.
I also have a couple of Samsung trio wireless multi-device chargers, and they’re inconsistent. If I use the 45 watt Samsung brick that came with it, it works fine. If I use any other brick, it refuses to use anything but a full-on 65 watt charger. I don’t know if they ignore the spec, or enumerate differently, or maybe give a different output at 9V than others, but this overall issue with compatibility and semi-functional usage just feels janky.
I hope that eventually, with GaN and other tech, everything will just do 100W, or maybe all devices will go down to only needing 45 watts. It would be super nice if everything just played well together.
won’t pass 25w
Reminded of what looked to be a great deal on USB-C cables from a major manufacturer (Anker I think).
Fine print: not for MacBook Pro yaddayadda! Yeah the things can’t hang for it. Gotta spend way more.
I wonder what the best cable labeling solution is for new cables purchased with known specs. Bet a handful of well-resourced geeks do their own printing right on them.
Did you know that USB C cables can be unidirectional? As in, they only work plugged in in one direction. You know how I know that? Cause I’ve soldered usb-c cables myself. I own one that only plugs in in one direction (and works)! I’m honestly very surprised you’ve only had issues with charging, do you not need them for data? So many of my cables are charging only, they literally do not function for data at all. It’s a nightmare.
I’ve had exactly 5x more failures of USB-C ports than I’ve had of micro (that’s 5 and 1), and I’ve had way more micro devices over a much longer period (and still have some). It may technically be a “better” port, but my experience doesn’t reflect that.
I have to label cables and chargers because some C devices today still don’t support all charging specs, so I have to verify a device charges on a charger to know for sure.
Why shouldn’t a device be able to negotiate down to the lowest charge capability, instead of not charging at all? That the spec permits this to happen is a major failure.
It’s fantastic that C is the convergence standard, but let’s not act like it’s close to perfect. I have to verify with every device I use whether the charger actually works for it, and not just “is the charger powerful enough”, but “does it actually charge even though I know it should, because it supports all the same capabilities as the device”.
Yeah, but at least you’re not SOL when you’re at an Apple house with an Android device at 10% battery any more. If you need a cable with very specific capabilities, that’s on you to do the research, IMO. The alternative is making every cable more expensive when most people don’t need it.
I saw that one back in the day. Never seen any of it in the wild yet.
Would be nice to get TB/DP on there too.
And you’ll never see it on an Apple product. Though you’ll never find an Apple product that doesn’t support the full standard, it’s still a problem when you’re trying to find a cable and aren’t sure if you’re using a real Apple one or a lookalike.
To solve the issue of identifying the capabilities of the cable: CaberQ.
Though a bit expensive for what it is: https://caberqu.com/home/20-43-c2c-caberqu-746052578813.html#/27-with_or_without_case-with_case It’s not awful for the price, but there are more complete testers like Treedix: https://treedix.com/
What bothers me is all these testers assume you are a USB hardware wizard and know which pin combo supports which USB standard.
I want something that tells you how fast and how much power the wire can handle.
The newer cables have chips that talk to chargers so the power ratings aren’t exceeded. Why can’t these chips or testers also tell you what speed the wire can handle?
I thought I was smart going back to a video that featured two USB-C cable testers. I only watched the video and didn’t check or pay attention to what the brands were.
They are, in fact, these exact two brands.
Would have been nice to have some kind of forethought on a labeling system.
But there are so many combinations now of power, data, audio, and video, and subclasses of Thunderbolt, DisplayPort, HDMI. Even if you put a 4-digit code on every cable listing exactly what it supports, people would never be able to understand and track down backward compatibility.
I’d be surprised in the next port change if we don’t end up with some fiber optic in there.
Ikea PD-PPS charger + Ikea 100W labeled cable = done.
Well, that covers my phone, but then 45 watts won’t run my laptop, and if I plug in my phone and my laptop, they only get 22 watts each.
Then the cable: Can it be used for data transmission? What speeds does it cover? Will it transmit data through a DisplayPort or HDMI? If I unplug it from the power and plug it into the USB-C on my monitor, will I get video?
There are so many features, and it’s not like you can just go “ohh, I’ll get this USB-4_g cable” and know what it does. Even the webpage for the Rundhult has no mention of what features are supported other than 100W.
The whole spec is complicated AF. You could spend $100 on a brick/cable that can do either 100W or high speed, but if you only need part of the equation, you can spend $30 on a brick and cable. What they support is almost never enumerated, even on the packaging.
Your laptop will charge at 22 or 45W. Easy!
The cable will work for data at USB 2.0, as it says on the packaging. So it won’t work for video alt modes. Easy!
My monitor has an input cable that allows for maximum video resolution and maximum power delivery. I never need to take it out. Easy!
Which cable are you talking about? This one supports data and mentions nothing about alt modes. https://www.ikea.com/us/en/p/rundhult-usb-c-to-usb-c-black-white-20581106/
The one that came with my monitor. Easy!
If I ever need a new one for such an extremely specific task, I’ll make sure to spend a few minutes to make sure I buy the right one. Takes a minute, but easy!
The Ikea one says it only supports 480Mbps, so that’s a no-no for video. Sad, but easy!
lol, no it’s not easy. You saying “easy” doesn’t mean it is in any sense of the word. Like the person you’ve been responding to before said, usb c can support many things and not support others. For example, USB C cables can literally be unidirectional! That sure isn’t listed because it’s assumed to work bidirectionally, but it’s not a requirement. I literally have a unidirectional usb c cable in fact.
Just cause it says 480 doesn’t mean jack when:
- You weren’t talking about this cable originally; you were making a claim about a cable that literally wasn’t mentioned in the article. I gave an example of a cable that directly disproved your comment in a facetious manner.
- No consumer should be expected to know USB-C standards (that’s literally the point of this conversation).
- 480 Mbps has nothing to do with supporting video. This Reddit thread explains it way better than I can, but support for a feature in the cable has absolutely nothing to do with data transfer rate. https://www.reddit.com/r/UsbCHardware/comments/ji87mc/usb_32_gen_2_typec_monitor_compatibility/j5dohy5/
Everyone around the world is benefiting from the EU common charger law: https://commission.europa.eu/news-and-media/news/eu-common-charger-rules-power-all-your-devices-single-charger-2024-12-28_en
Dear Europe. Please take me in. Do you have any English speaking countries? Your laws seem to be geared towards benefiting people. Not tyrants and corporations.
They did have one heavily English speaking country, but those guys peaced out a few years back. Now it’s just Ireland and Malta (where English is an official language).
I think the Netherlands has the highest amount of L2 English speakers.
In the Netherlands, the English language can be spoken by the vast majority of the population, with estimates of English proficiency reaching 90%[1] to 97%[2] of the Dutch population.
https://en.wikipedia.org/wiki/English_language_in_the_Netherlands
It’s not the official language though so all documents and legal stuff would be in Dutch.
Europeans from which country get upset when they hear their fellow countrypeople speak English poorly?
Was it Germans, because there’s compulsory English education in schools?
100% of Irish people can speak English and do so without sounding as ridiculous as the Dutch do.
Well kinda
:(
Chin up, Nederlander. I don’t think you’re ridiculous.
:)
I’m moving to Sweden soon, just about everyone there speaks English! And also Swedish is such a pretty language, I’m really excited to be immersed in it.
Any Scandinavian country should have a population ranging from proficient to fluent in English.
Ireland speaks mostly English as far as I know.
Ireland, but housing is shite right now.
Is housing shit because the homes need repair? Or are they shit because a single room shack is owned by corporate interests and costs 8 billion dollars a month?
housing is shit because there are no houses
Oh, that’s ok. I can just live with you. I’ll sleep in the master bedroom. And you can sleep… somewhere, I imagine.
USB-Cya
US-Blyat
USB… USB not going to work here anymore anyway.
What a fantastic reference
I spent 40 years in the computer industry. I learned one thing very early on.
The only standard in the computer industry is that there isn’t one.
Beginner in IT:
“The problem is that there isn’t one”
Expert:
“The problem is that there isn’t one”
Why don’t we just make one unifying standard? That can’t possibly go wrong, right?
There are now sixteen competing standards.
The great thing about standards is that there are so many to choose from.
No way, it’s a MASSIVE pile of standards. The entire internet and networking in general only functions because of standards. HTML5’s main benefit was standardizing a ton of BS everyone was playing around with.
What isn’t standard are the few higher level frameworks and BS people are playing around with, but saying that’s all of the computer industry is like that old meme of Homer getting pulled most of the way up the mountain by sherpas in a sleeping bag…
No way, it’s a MASSIVE pile of standards.
Pretty sure it was a reference to this problem specifically:
Yes I know the joke. My point is that it is only true for the developing front of engineering. Everyone is still, in fact, standing on a mountain of well established and followed standards while debating the future.
Even USB-C is a nightmare. There’s 3.0, 3.1, and 3.2, which were rebranded as “3.2 Gen X” with some stupid stuff there as far as what speed it supports.
Then it can do DisplayPort as well. There used to be an HDMI alt mode too!
An Intel computer might have Thunderbolt over the same cable, and can send PCIe signals over the cable to plug in a graphics card or other devices.
Then there’s USB 4 which works like Thunderbolt but isn’t restricted to Intel devices.
Then there’s the extended power profile which lets you push 240 W through a USB C port.
For a while, the USB-C connector was on graphics cards as VirtualLink, which was supposed to be a one-cable standardized solution for plugging in VR headsets. Except that no headsets used it.
Then there’s Nintendo. The Switch has a Type-C port, but does its own stupid thing for video, so it can’t work with a normal dock because it’s a freak.
So you pick up a random USB C cable and have no information on what it may be capable of, plug it into a port where you again don’t know the capabilities. Its speed may be anywhere between 1.5 MBit/s (USB 1.0 low speed) and 80 GBit/s (USB 4 2.0) and it may provide between 5 and 240 W of power.
Every charger has a different power output, and sometimes it leads to a stupid situation like the Dell 130 W laptop charger. In theory, 130 W is way more than what most phones will charge at. But it only offers that at I think 20 V, which my phone can’t take. So in practice, your phone will charge at the base 5W over it.
Dell also has a laptop dock for one of their laptops that uses TWO Type-C ports, for more gooderness or something, I don’t know. Meaning it will only fit that laptop with ports exactly that far apart.
The USB chaos does lead to fun discoveries, such as when I plugged a Chromecast with Google TV’s power port into a laptop dock and discovered that it actually supports USB inputs, which is cool.
And Logitech still can’t make a USB-C dongle for their mouse.
At least it’s not a bunch of proprietary barrel chargers. My parents have a whole box of orphaned chargers with oddly specific voltages from random devices.
Yes, and USB-C is barely a decade old. A single standard for a port that’s universal by name. Computer science, y’know, the thing it’s all meant to benefit, is over a century old. There is a MOUNTAIN of standards we’re all standing on, whether or not you care to admit it.
Umm, that is my point. Due to the massive pile of “standards”, there really is not one standard in any part of the industry, as it will change within months, etc.
No, you misunderstand the scope of standards I’m referencing. Computer science goes back over a century, yet you’re attempting to tell me there are no established standards based on something that’s barely even a decade old as a consumer product standard.
Not that deep, it was a joke.
A flippant joke based on a perspective I’m not even referencing does not in any way whatsoever mean I am wrong.
At least not as persistent as RAM-DIMMs and PCIe
USB-D!😎 Shaped like a crescent moon! Also, we’re going back to brick phones to accommodate the shape.
And they should be secured with thumb screws, like an old parallel cable.
Funnily enough, USB c can also do that.
Once you’ve used one you’ll be angry that they’re not standard, at least on desktop/kiosk devices. USB-A also had that as an option.
I use Goop adhesive for cables I don’t want falling out.
Only takes a tiny amount at the connector shoulder, it’ll never fall out but is still removable by hand.
Or if I really want it to stay, I use a lot of goop, and I know that fucker is never coming out without a sharp knife.
Nope. Micro USB-C.
Let’s not skip Mini USB-C along the way
But first… 5 different standards of normal USB-C and if you use the wrong type then it will damage your battery. Fuck you Nintendo.
That already happened and… oh, you already said fuck Nintendo.
Usb-barrel jack
In all seriousness, this would make the most sense in terms of progression in design. It already sorta’ is just that, but it’s been stepped on.
Hah! Just hit me, if I’d seen a USB-C plug back in the day, my first thought would’ve been “oof, someone isn’t playing bootleg cartridge games for a while!”
My guess is USB-D will be one-dimensional, i.e. a cylinder pin, like the good old headphone jack. You can plug it in with your eyes closed or in the dark.
It would be hilarious if they made it exactly like a headphone jack, but the USB-D slot wouldn’t support headphones.
I can only dream of FireWire’s return
Not unless they want to go bigger. The USB-C pin pitch is too closely spaced for the lowest tier of printed circuit boards from all major board houses.
You might have some chargers get deprecated eventually because there are two major forms of smart charging. The first type is done in discrete larger steps like 5V, 9V, 15V, or 20V. But there is another type that is not well advertised publicly in hype marketing nonsense, and it is somewhat hit or miss whether the PD controller actually has the mode. That mode is continuously adjustable.
The power drop losses from something like 5V to 3.3V require a lot of overbuilding of components for heat dissipation. The required linear regulator may only have a drop of 0.4-1.2 volts from input to stable output. Building for more of a drop is just waste heat. If the charge controller can monitor the input quality and request only the required voltage for the drop, with a small safety margin, components can be made smaller and cheaper. The mode to support this in USB-C exists; I think it is called PPS, if I recall correctly. A month or two back I watched someone build a little electronics bench power supply using this mode of USB-C PD.
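To put rough numbers on that, here’s a tiny back-of-the-envelope sketch in Python. The 3.3V rail, 2A load, and 0.4V dropout margin are just illustrative assumptions I picked, not values from any spec sheet:

```python
# Rough sketch of why requesting "just enough" voltage (PPS-style) shrinks the
# heat a linear regulator has to dump. All numbers here are illustrative guesses.

def ldo_heat_w(v_in: float, v_out: float, i_load_a: float) -> float:
    """Power a linear regulator dissipates as heat: (Vin - Vout) * Iload."""
    return (v_in - v_out) * i_load_a

# Fixed 5V input vs. a PPS-style request of 3.7V (3.3V rail + 0.4V dropout margin):
print(ldo_heat_w(5.0, 3.3, 2.0))  # 3.4 W of waste heat to get rid of
print(ldo_heat_w(3.7, 3.3, 2.0))  # 0.8 W, so smaller parts and less heatsinking
```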
What’s this about a pin pitch? Or drop losses. It sounds interesting but I don’t understand ☹️
Pin pitch is pin size and/or spacing. With physical plugs, you start to hit limitations with how small the wires can get while still being durable enough to withstand plugging/unplugging hundreds of times.
Drop losses. (I am keeping this at an ELI5 [more like ELI15, TBH] level and ignoring some important stuff.) Every electronic component generates heat from the power it uses. More power used usually means more heat. Heat requires physical space and lots of material to dissipate correctly. Depending on the materials used to “sink” (move; direct; channel) heat, you may need a significant amount of material to dissipate the heat correctly. So, you can use more efficient materials to reduce the amount of power that is converted to heat, or improve how heat is transferred away from the component. (If you are starting to sense that there is a heat/power feedback loop here, it’s because there can be.) Since a bit of power is converted to heat, you can increase the power to your device to compensate, but this, in turn, generates more heat that must be dissipated.
In short, if your device runs on 9V and draws a ton of power, you need to calculate how much of that power is going to be wasted as heat. You can Google Ohm’s Law if you would like, but you can usually measure a “voltage drop” across any component. A resistor, which resists electrical current, will “drop” voltage in a circuit because some of the electrical energy carried by the current (measured in amperage) is converted to heat.
I kinda smashed a few things together related to efficiency and thermodynamics in a couple of paragraphs, but I think I covered the basics. (I cropped a ton of stuff about Ohm’s Law and why that is important, as well as how/where heat is important enough to worry about. Long story short: heat bad.)
Pin pitch is ultimately the spacing between traces. The traces are not as big of an issue as the actual spaces between the traces. This clearance is where things get tricky with making printed circuit boards. The process of masking off some circuit is not that hard. The way the stuff you want to keep is isolated from the copper you want to remove is the hard part. One of the issues is that you need an acid to take away the copper, but not the mask, but copper has a thickness. As the copper is etched away the acid moves sideways into the thickness too. Copper never etches completely uniformly either. The larger areas of open copper that need to be removed will etch much faster than a bunch of thinly spaced gaps. One of the tricks to design is finding ways to etch consistently with the process you build.
If you want to make super tiny traces that still have the right amount of copper and have all the gaps etched away consistently, the etching toolchain becomes more expensive. You will need a stronger acid with a very good way of removing the etchant that is close to the copper and already loaded with copper. This is usually done with a stream of small bubbles, but it is risky because it could impact the adhesion of the masking material over the traces you want to keep. The stronger, hotter, and now agitated acid requires that the copper clad board is extremely clean, and the photoresist used to mask the stuff you want to keep must be very high quality. Also, the resolution of this photoresist requires a much more precise form of UV exposure and development (about like developing old film photos).
So you need a better mask development toolchain, better quality photoresist. You might get away with not using photoresist at all in some other cheaper low end processes. You need the highest quality copper clad that etches more evenly, and you need a stronger acid to etch quicker straight down because a slower acid will move further sideways and ruin the thin traces to keep.
The pic has old school DIP chips in static-resistant foam. Those are the classic standard 1/10th inch (2.54mm) pin pitch. The easiest types of boards to make yourself are like the island soldering style board with the blue candy soldered on. That is a simple Colpitts oscillator for testing crystals. Then there are protoboards like the homemade Arduino Uno pictured. Then you get into the etched boards. Some of these were done with a laser printer toner transfer method. That is like the least accurate DIY and somewhat analogous to the cheapest boards from a board house. Others were made using photoresist. This method is more accurate but involved and time consuming. One of the boards pictured is a little CH340 USB-to-serial board with a USB micro connector. That is getting close to my limits for etching easily. Another board has a little LCD and text. There is a small surface mounted chip pictured on the foam, and that is a typical example of what kinds of pin pitches are common for the cheapest level of board production. Now there are two USB-C female connectors pictured. One has a larger pin pitch and is made for USB 2.0 connections and power. However, that other one with all those tiny tiny connections at the back: that is a full USB-C connector. That thing is a nightmare for tiny pin pitch. There is also a USB-C male connector with a little PCB attached. These are the types of solutions people have tried to come up with, where only some small board is actually of a much higher resolution. It is not the best example, but I’m not digging further through stuff to find a better one.
The actual pins on the little full USB-C connector are inverted to be able to flip the connector. There is a scheme present to make this a bit easier to match up the connections but it is still a pain in the ass to juggle everything around. All of the data trace pairs are differential too, which basically means they must be the same length between the source and destination. So any time they are not equal, the shorter trace must zigzag around in magic space you need to find just to make them even.
Pin pitch means how tiny the physical pins in the connector can be spaced apart.
IR drop losses happen because a wire has resistance; it isn’t a perfect conductor. 28AWG wire has about 0.22 ohm/m. Given a 2 meter cable, you might expect to see 0.44 ohm one-way. Current is also travelling back, so the circuit “sees” another 0.44 ohm. That’s a total of 0.88 ohm.
A wire will cause a voltage drop following Ohm’s law: V = I × R. So for 1A of current, you will see 0.88V lost.
Say you’re trying to charge at 15W (5V 3A), your phone is only going to ‘see’ 2.36 volts, and 7.9W are wasted in the cable.
For a 100W device (20V, 5A), 4.4V are lost, also meaning 22W are wasted.
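If anyone wants to play with the numbers, here’s the same arithmetic as a few lines of Python, using the 0.22 ohm/m 28AWG figure and 2 m cable from above (illustrative values, not measurements):

```python
# Same IR-drop arithmetic as above: 28AWG at ~0.22 ohm/m, 2 m cable, and the loop
# resistance is twice the one-way resistance (current goes out and comes back).

def cable_loss(v_source: float, current_a: float,
               ohms_per_m: float = 0.22, length_m: float = 2.0):
    r_loop = 2 * ohms_per_m * length_m   # ~0.88 ohm for a 2 m cable
    v_drop = current_a * r_loop          # Ohm's law: V = I * R
    p_wasted = current_a ** 2 * r_loop   # P = I^2 * R dissipated in the wire
    return v_source - v_drop, p_wasted

print(cable_loss(5, 3))    # ~ (2.36 V at the phone, ~7.9 W wasted in the cable)
print(cable_loss(20, 5))   # ~ (15.6 V at the device, 22 W wasted)
```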
(For others reading this, this is a perfect followup to my comment here explaining the “why”, while this is an excellent view into the “how” and picks up the bits I dropped about Ohms Law.)
Yeah, Programmable Power Supply mode can be programmed (in realtime) to deliver from 3.3 to 21 volts in 20mV steps. For current I’m not totally sure how it works; I think you can set a limit.
There is an issue of some kind where the current limit is not reliable and requires additional circuitry. I think GreatScott on YouTube was the one who went into that.
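For anyone curious what “adjustable in 20mV steps” looks like in practice, here’s a trivial illustration. The 3.3-21V window and 20mV step come straight from the comment above, but the function itself is just a made-up sketch, not anything from a real PD stack:

```python
# Toy illustration of a PPS-style voltage request: ask for anything in the
# 3.3-21 V window, quantized to 20 mV steps. Purely illustrative, not a PD library.

PPS_MIN_V, PPS_MAX_V, PPS_STEP_V = 3.3, 21.0, 0.02

def pps_request_v(target_v: float) -> float:
    """Clamp to the PPS window and round to the nearest 20 mV step."""
    clamped = min(max(target_v, PPS_MIN_V), PPS_MAX_V)
    return round(round(clamped / PPS_STEP_V) * PPS_STEP_V, 2)

print(pps_request_v(3.7))    # 3.7   (e.g. a 3.3 V rail plus dropout margin)
print(pps_request_v(8.512))  # 8.52  (snapped to the nearest 20 mV step)
print(pps_request_v(25.0))   # 21.0  (clamped to the top of the window)
```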
The connector itself is perfectly fine, which is incredible, and exactly what people wanted. Plus there’s a ton of room for technological improvement under the hood, if needed. USB/Thunderbolt standards spread for a whole range of specs now, all under the Type-C connector
This implies you have at least 100 chargers and 1 isn’t USB C
Had to scroll way too far to find this.
And that can only mean one thing!
He’s got 99 chargers and USB C ain’t one.
Yeah, the planes I’ve been flying on have had USB in the seats now. The planes are old I’m sure, but it’s just in time for me to have USB-C to C cables and still not be able to use them haha