Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with ?
3) No spam
Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed; please direct them to either [email protected] or [email protected].
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?'-type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email [email protected]. For other questions check our partnered communities list, or use the search function.
6) No US Politics.
Please don't post about current US Politics. If you need to do this, try [email protected] or [email protected]
Reminder: The terms of service apply here too.
It’s been more of a pain in the arse than initially expected.
Most motherboards (for example) only have 2-4 USB-C ports, meaning that I still need to employ A-C and C-C cables for peripherals etc.
My main gripe is that the standard just tries to do too many things without clear delineation/markings:
Is it a USB 2.0 (480Mbit), 5Gbit, 10Gbit or 20Gbit cable? Can’t really tell from the plug alone.
More importantly, for charging devices: how the heck do I determine the maximum wattage I can run?
For all its faults, at least the blue colour of a USB 3.0 plug (or the additional connectors for B/Micro) made it easy to differentiate!
Now I’m eyeing up a USB Cable tester just to validate and catalogue my growing collection! 🤦🏻♂️
Great idea, and then:
I was actually thinking of coloured O-rings to define specs, but that still means I'd need to have a colour guide somewhere too..
..yours might be a more practical solution. 🤔
You could fit some key numbers and letters on those O-rings. I like it!
I wonder about this too. Can I plug my laptop's USB-C charger into my phone? Or is that a big no-no?
Yes, you can. The charger and the device negotiate what they can each support and pick the highest level they both agree on.
E.g. my laptop charger charges my MacBook at full speed (100W), but my iPhone at only 20W.
That bit is pretty straightforward and transparent to end users (there are a few rare cases where devices don't agree on the fastest mode and fall back to a slower one); the bigger issue is cables that lack sufficient wire gauge, or that are missing the connections the charger and device need to communicate their full capabilities.
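If you're curious, the gist of the negotiation looks something like this (a simplified sketch in Python, not the real PD message exchange; all the profile numbers are made-up examples):

```python
# Simplified sketch of USB PD-style power negotiation (illustrative only;
# the real protocol exchanges capability/request messages over the CC wire).

# (volts, amps) profiles each side supports - hypothetical example values
charger_profiles = [(5, 3), (9, 3), (15, 3), (20, 5)]   # up to 100 W
phone_profiles = [(5, 3), (9, 2.22)]                     # up to ~20 W

def negotiate(source, sink):
    """Pick the highest-wattage voltage both sides offer, at the lower
    of the two current limits; otherwise fall back to basic 5 V USB."""
    best = (5, 0.5)  # plain 5 V fallback
    for v_src, i_src in source:
        for v_snk, i_snk in sink:
            if v_src == v_snk:
                candidate = (v_src, min(i_src, i_snk))
                if candidate[0] * candidate[1] > best[0] * best[1]:
                    best = candidate
    return best

volts, amps = negotiate(charger_profiles, phone_profiles)
print(f"{volts} V x {amps} A = {volts * amps:.0f} W")  # 9 V x 2.22 A = 20 W
```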
Should be okay; USB PD will detect the correct voltage and current.
I charge my Bluetooth headphones' pod with my Steam Deck charger and it seems to be OK.
The Deck charger uses USB PD. It will charge anything that supports the standard as fast as possible (up to its rated 65 W) and uses normal 5 V USB for everything else.
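Roughly, its behaviour boils down to this (hypothetical numbers, just to illustrate the PD-versus-fallback split):

```python
# Rough sketch of a PD charger's behaviour: negotiate up to its rating with
# PD-capable devices, fall back to plain 5 V USB for everything else.
CHARGER_RATING_W = 65  # e.g. the Deck charger

def charge_power(device_supports_pd: bool, device_max_w: float) -> float:
    if device_supports_pd:
        return min(CHARGER_RATING_W, device_max_w)  # PD: negotiate up to the rating
    return min(device_max_w, 5 * 1.5)               # plain 5 V USB, ~7.5 W available

print(charge_power(True, 45))    # 45  -> the Deck itself
print(charge_power(False, 2.5))  # 2.5 -> e.g. a headphone charging pod
```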
Yes.
It's even more annoying that there are different possible pinouts in the port itself without clear labelling. So always use the one cable that came with the peripheral, or you risk frying it.
For the power question, you don't have to. The device being charged, the charger, and the cable sort it out between them.
If you mean the maximum wattage that will actually be used: it's the highest level that the charger, cable, and device all support, so look at their specs. Whichever has the lowest maximum is what the others will match.
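As a toy example (hypothetical numbers), the actual power really is just the lowest of the three maxima:

```python
# Whichever link in the chain has the lowest maximum sets the actual power
# (numbers below are hypothetical).
charger_max_w = 100   # e.g. a 100 W GaN charger
cable_max_w = 60      # e.g. a non-e-marked C-to-C cable (20 V x 3 A)
device_max_w = 96     # e.g. a laptop that accepts up to 96 W

print(min(charger_max_w, cable_max_w, device_max_w))  # 60 -> the cable is the bottleneck
```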
USB PD defines a protocol for the device and charger to determine the maximum safe power. If the cable is replaceable (not hard-wired to the charger), it needs an e-marker chip to tell the charger it can safely carry more than the default 3 amps; without one, charging tops out at 60 W (20 V at 3 A).
USB PD chargers only output what has been negotiated as safe. That's why I can use my 65 W Steam Deck charger to charge my phone if I want to. It just outputs normal USB charger power if the device on the other end can't confirm it can handle more.
It's also why my Steam Deck charger is what I use to fast charge my phone: the phone can talk to it over the USB PD protocol and request the voltage and current it needs to fast charge.
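And on the cable side, it's the e-marker chip that unlocks anything past the default 3 A; a rough sketch of the idea:

```python
# Sketch of how the cable's e-marker caps the negotiated current
# (simplified; real e-markers also report voltage class, data speed, etc.).
def cable_amp_limit(has_emarker: bool, emarker_amps: float = 5.0) -> float:
    return emarker_amps if has_emarker else 3.0  # spec assumes 3 A for plain cables

negotiated_volts = 20
for marked in (False, True):
    watts = negotiated_volts * cable_amp_limit(marked)
    print(f"e-marker={marked}: up to {watts:.0f} W")  # 60 W plain, 100 W e-marked
```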
To clarify: I have a 100W Ugreen Nexode 4 Port USB Charger that I use to charge my laptop (~60W), Steam Deck (~40W), iPhone (~20W) and AirPods (~5?W).
The problem is when my original product cable has gone walkabout temporarily and I need a random one to stand in: there is no clear way of telling whether I'm accidentally using a cheap 5W-max cable to try and keep my laptop charged while working.
Obviously there are some context clues like cable thickness, but with cosmetic braiding becoming so common, even that's getting harder to rely on.
Some kind of cable labeling would be nice.
Just recently I had a tech store guy gently but repeatedly insist to me that a certain USB cable was a USB 3 cable because it was type C on both ends. I didn't wanna argue with him, but the box clearly said "480 Mbit", so it was just a type C charging cable.
Of course the box designers were hoping you'd make that mistake, so they didn't write USB 2 on there, just the speed. And most boxes won't even have that; you'll just have to buy it and see.
But I mean if someone who spent their whole life fixing computers can get something that basic wrong, then it's really a hopeless situation for anyone who isn't techy.
And of course once it's out of the box it's anyone's guess what it is. It's a real mess for sure.