Intel is not fine on servers - ARM servers are roughly 20% faster in raw performance and about 40% better in performance-per-dollar. Since switching is literally just selecting a different option in a dropdown menu (assuming your software runs well on ARM, which it probably does these days), why would anyone choose Intel on a server?
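For anyone who hasn't seen what "a different option in a dropdown" looks like outside the console, here's a minimal sketch assuming AWS, where the "g" instance families are Graviton (ARM); the AMI ID is a hypothetical placeholder and you'd need credentials configured:

```python
import boto3

# Launching an ARM server instead of an x86 one is just a different
# instance type string. Assumes AWS credentials are configured; the
# AMI ID below is a hypothetical placeholder - use a real arm64 image
# for your region in practice.
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical arm64 AMI
    InstanceType="m7g.large",         # "g" = Graviton (ARM); the x86 pick would be e.g. m7i.large
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```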
And they're not fine on laptops either - unplugged my ARM Mac from the charger seven hours ago... and I'm at 80% charge right now. Try that with an Intel laptop with an i9 processor and a discrete NVIDIA GPU (you'd need both of those to get similar performance).
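Back-of-envelope on those figures, assuming a roughly linear drain rate (a simplification; real drain varies with load):

```python
# Extrapolating the battery claim above: 100% -> 80% over 7 hours.
drained_pts = 100 - 80        # percentage points used
hours = 7
rate = drained_pts / hours    # ~2.86 points/hour
total_runtime = 100 / rate    # ~35 hours, if the rate held
print(f"{rate:.2f} pts/hour -> ~{total_runtime:.0f} hours extrapolated")
```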
They're only really doing well in two segments: desktop PCs, which is a small market, and buyers who can't be bothered switching to a new architecture, which is a big market but one that is going away.
When you say 20% faster - faster per what metric? Is that per watt of power consumption, or per dollar of cost?
If it's per either of those, that's pretty impressive; that's a massive difference.
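Worth noting the two metrics can diverge, so the answer matters; a toy comparison with made-up numbers (purely hypothetical, just to show a chip can win one metric and lose the other):

```python
# Hypothetical benchmark scores, power draw, and prices -
# chip_a wins perf-per-dollar while chip_b wins perf-per-watt.
chips = {
    "chip_a": {"score": 100, "watts": 250, "dollars": 10_000},
    "chip_b": {"score": 120, "watts": 180, "dollars": 14_000},
}
for name, c in chips.items():
    perf_per_watt = c["score"] / c["watts"]
    perf_per_kusd = c["score"] / c["dollars"] * 1_000
    print(f"{name}: {perf_per_watt:.3f} perf/W, {perf_per_kusd:.2f} perf per $1k")
```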