this post was submitted on 24 Nov 2024
1514 points (92.4% liked)
Technology
Fuck firewire. Glad it's dead. USB C is the best thing to happen to peripherals since the mouse.
I would agree with you if there were a simple way to tell what the USB-C cable I have in my hand can be used for without knowing beforehand. Otherwise, for example, I don't know whether the USB-C cable will charge my device or not. There should have been a simple way to label them for usage that was baked into the standard. As it is, the concept is terrific, but the execution can be extremely frustrating.
Hey that's a fair point. Funny how often good ideas are kneecapped by crap executions.
I’m pretty sure the phrase “kneecapped by crap executions” is in the USB working group’s charter. It’s like one of their core guiding principles.
If anyone disagrees with this: the original USB spec called for a reversible connector, and the only reason we didn't get to have that the whole time was that they wanted to increase profit margins.
USB has always been reversible. In fact you have to reverse it at least 3 times before it'll FUCKING PLUG IN.
That’s the reason Apple released the Lightning connector. They pushed for several features for USB around 2010, including a reversible connector, but the USB-IF refused. Apple wanted USB-C, but couldn’t wait for the USB-IF to come to an agreement so they could replace the dated 30-pin connector.
Burn all the USB-C cables with fire except PD. The top PD cable does everything the lower cable does.
IDK I’ve had PD cables that looked good for a while but turns out their data rate was basically USB2. It seems no matter what rule of thumb I try there are always weird caveats.
No, I’m not bitter, why would you ask that?
There are many PD cables that are bad for doing data.
Correct. The other commenter is giving bad advice.
Both power delivery and bandwidth are backwards compatible, but they are independent specifications on USB-C cables. You can even get PD capable USB-C cables that don’t transmit data at all.
Also, that’s not true for Thunderbolt cables. Each of the five versions has its own specific minimum and maximum data and power delivery specifications.
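To make that independence concrete, here's a tiny Python sketch (the type and the numbers are illustrative, not from any USB-IF spec) treating a cable's power ceiling and data rate as unrelated attributes:

```python
from dataclasses import dataclass

@dataclass
class UsbCCable:
    max_watts: int     # highest power-delivery level the cable is built for
    data_gbps: float   # signaling rate its data wires actually support

# A charge-focused cable: full 100 W PD, but only USB 2.0 data (0.48 Gbps).
charge_only = UsbCCable(max_watts=100, data_gbps=0.48)

# A full-featured cable: 240 W and 40 Gbps behind the same connector.
full_featured = UsbCCable(max_watts=240, data_gbps=40.0)

# Nothing about one attribute constrains the other: a high-wattage
# cable can still be stuck at USB 2.0 speeds.
assert charge_only.max_watts >= 100
assert charge_only.data_gbps < 1.0
```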
You forgot Thunderbolt and USB4 exist now
You can buy a single cable that does 40 Gbps over USB4 and charges at 240 W.
Buying a basic, no-frills USB-C cable from a reputable tech manufacturer all but guarantees that it'll work for essentially any purpose. Of course the shoddy pack-in cables included with a cheap device purchase won't work well.
I replaced every USB-C-to-C or USB-A-to-C cable and brick in my house and carry bag with very low cost Anker ones (except the ones that came with my Google products; those are fine), and now anything charges on any cable.
You wouldn't say that a razor sucked just because the cheap replacement blades you bought at the dollar store nicked your face, or that a pan was too confusing because the dog food you cooked in it didn't taste good. So too it is not the fault of USB-C that poorly manufactured charging bricks and cables exist. The standard still works; in fact, it works so well that unethical companies are flooding the market with crap.
Don't all USB-C cables have the capability to do Power Delivery? I thought it was up to the port you plug it into to support it?
Nope. My daughter is notorious for mixing up cables when they come out of the brick. Some charge her tablet, some are for data transfer, some charge other devices but not her tablet. It's super confusing. I had to start labeling them for her.
Come to think of it, all the USB C cables I have are from phone and device chargers so I just took it for granted. Good to know. Thanks for sharing some knowledge with me
USB-C cables can vary drastically. Power delivery alone ranges from less than 1 amp at 5 volts to 5 amps at 20 volts. That's 5 watts of power on the low end to 100 watts on the high end, and sometimes more. When a cable meant to run at 5 watts has over 100 watts run through it, the wires get really hot and could catch fire. The charger needs to talk to a very small chip (an e-marker) in the high-power cables so the cable can say, yes, I can handle the power. Really cheap chargers might just push that power out regardless. So while the USB-C form factor is the one plug to rule them all, the actual execution is a fucking mess.
I agree with USB-C, but there are still a million USB-A devices I need to use, and I can't be bothered to buy adapters for all of them. And a USB hub is annoying.
Plus, having 1-2 USB-C ports only is never gonna be enough. If they are serious about it, why not have 5?
Yeah, I'd love at least one USB A type cause most of the peripherals I own use that.
It’s not that bad
I bought some adaptors in China for around $0.50 each. It really isn't that big of a deal
I hated when mice became the primary interface to computers, and I still do.
tell me you use i3 without telling me you use i3
I agree with OP and I haven't used a tiling WM in years (used XMonad BTW; i3 was okay). I currently use KDE Plasma 6 because it doesn't have many drawbacks (used GNOME until Wayland worked properly on KDE), and I can use it pretty well w/o a mouse.
Is this for real?
Even for like 20 years after mousing became the primary interface, you could still navigate much faster using keyboard shortcuts / accelerator keys. Application designers no longer consider that feature. Now you are obliged to constantly take your fingers off home position, find the mouse, move it 3cm, aim it carefully, click, and move your hand back to home position, an operation taking a couple of seconds or more, when the equivalent keyboard commands could have been issued in a couple hundred milliseconds.
I love how deeply nerdy Lemmy is. I'm a bit of a nerd but I'm not "mice were a mistake" nerd.
I don't think mice were a mistake, but they're worse for most of the tasks I do. I'm a software engineer and I suck at art, so I just need to write, compile, and test code.
There are some things a mouse is way better for, but for almost everything else, I prefer a keyboard.
And while we're on a tangent, I hate WASD, why shift my fingers over from the normal home row position? It should be ESDF, which feels way more natural...
Thanks, but I've got you beat on ESDF because I'm an RDFG man, ever since playing Counter-Strike 1.6. With WASD they usually put crouch or something on Ctrl, but my pinky has a hard time stretching down there; on RDFG my pinky has easy access to QW, AS, ZX, and Tab, Caps, and Shift with a little stretch. It's come in handy when playing games with a lot of keybinds.
Pfff, minutes after trying to minimize your nerdiness, you post this confession.
What pisses me off even more is that many games bind to the letter instead of the physical key position (i.e. the scancode), so alternative layouts get a big middle finger. I use Dvorak, and I've quit fighting and just switch to QWERTY for games.
I don't have a problem with hitting Ctrl (I guess I have big hands), but I totally agree that default key binds largely suck. I wish games came with a handful of popular ones, and bound to physical key positions so us Dvorak users (or international users) didn't have to keep switching to QWERTY.
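The difference between binding by character and binding by physical position is easy to demonstrate with a toy Python model (the layout tables below are partial and purely illustrative):

```python
# Map a few physical key positions (scancode-style names) to the character
# each layout produces there. Partial tables, just enough to show the point.
QWERTY = {"KEY_W": "w", "KEY_A": "a", "KEY_S": "s", "KEY_D": "d"}
DVORAK = {"KEY_W": ",", "KEY_A": "a", "KEY_S": "o", "KEY_D": "e"}

def char_bound_keys(layout: dict, chars: str) -> list:
    """Find which physical keys a character-based binding lands on."""
    return [key for key, ch in layout.items() if ch in chars]

# Binding by physical position: the same keys under any layout.
physical_binding = ["KEY_W", "KEY_A", "KEY_S", "KEY_D"]

# Binding by the characters "wasd": fine on QWERTY, scattered on Dvorak,
# where only "a" still sits in the home cluster.
assert char_bound_keys(QWERTY, "wasd") == physical_binding
assert char_bound_keys(DVORAK, "wasd") == ["KEY_A"]
```

A game that stores `physical_binding` works identically for both players; a game that stores `"wasd"` forces the Dvorak player to switch layouts.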
I always rebind to ESDF if the game doesn't do stupid things preventing it from being practical. The addition of the 1QAZ strip being available to the pinky is a killer feature all on its own. I typically use that for weapon switching, instead of having to stretch up to 1234 and take my fingers off the movement keys.
Tablets are better than mice at drawing, modelling, and photo editing. Mice are good for first-person shooters. Game controllers are better for most other games. You can mouse in dired-mode, I guess, if you're a casual.
It's also an age thing. My visual processing is getting worse and worse. My disorientation facing a busy screen with literally thousands of objects that can be interacted with by mouse is a cognitive drain compared to a textual interface where I do most of the work abstractly without having to use visual processing at all. Like reading a book vs watching a movie.
I probably have a lot more experience using pre-mouse era computers than most people. It's like being asked to start using a different language when you are 20. Yeah, you'll become perfectly fluent for a couple decades... but you'll also lose that language first when you get old.
I have noticed that millennials navigate multilayer mouse interfaces (like going down a few chained drop-down menus) way faster than I ever did. And zoomers use touch screen keyboards almost as well as I ever touch-typed. Brains are only plastic to a degree, and it just plain feels good to use all those neurons that you first laid down when you were young and your mind was infinite.
I just use a mouse to type in stuff using the on screen keyboard. It's annoying having to take the ball out and clean it, but you get used to it.
Hey they made new technology where you can just yell at the computer and it'll understand 60% of what you're saying.
When I'm "computering" for efficiency, I don't take my hands off the keyboard. Half of my job is on a standard keyboard, and so familiarizing myself with all the shortcuts and whatnot saves a lot of time versus having to travel back and forth to a mouse or track pad.
When I am just satisfying the dopamine urges, it's mouse all the way.
That functionality (keyboard navigation: first necessary, then required by guidelines, then expected, and then still usual) disciplined UI designers to make things doable in a clear sequence of actions.
Now they think any ape can make a UI if it knows the new shiny buzzwords like "material design" or "air" or whatever. And they do! Except humans can't use those UIs.
BTW, about their "air": one can look at ancient UI paradigms, specifically SunView, OpenLook, and Motif (I'm currently excited about Sun history again), Windows 3.*, and also Win9x (with WinXP being more or less in the same paradigm). Of these, only Motif had anything resembling their "air". And Motif is generally considered clunky and less usable than the rest (I personally consider OpenLook the best), but compared to modern UIs even Motif does that "air" part in a way that seems to make some sense, and feels less clunky, which makes me wonder how that's even possible.
FFS, modern UI designers don't even think it's necessary to clearly and consistently separate buttons and links from text.
And also - freedom in Web and UI design has proven to be a mistake. UIs should be native. Web browsers should display pages adaptively (we have such and such blocks of text and such and such links), their appearance should be decided on the client and be native too, except pictures. Gemini is the right way to go for the Web.
To an extent. In the early 90's I could navigate WordPerfect in DOS faster than I've ever been able to work in MS Word, because it was all keyboard, even before I learned proper home-key ten-finger typing in high school. Technically my first word processor was WordStar on one of those Osborne "portable" computers with the 5-inch screen when I was a young kid, but WordPerfect was what I did my first real word processing on when I started using it for school projects. So I might just be the old "how do you do, fellow kids" guy in this sort of discussion.
To this day, I still prefer mc (Midnight Commander, a Linux-flavored recreation of Norton Commander that also has a Windows port; YMMV on that port) to navigate filesystems for non-automated file management.
I've been thoroughly conditioned for mouse use since the mid-late 90s (I call it my Warcraft-Quake era, we still used keyboard only for Doom 1/2 back in the early days), and I feel like it's a crutch when I'm trying to do productive work instead of gaming. When I spend a few days working using remote shells, I definitely notice a speed increase. Then a few days later I lose it all again when I'm back on that mouse cursor flow brain.
You have passed the test. We can be friends.