this post was submitted on 09 Apr 2024
82 points (87.3% liked)

Technology

[–] [email protected] 10 points 7 months ago* (last edited 7 months ago) (1 children)

Intel is not fine on servers: ARM servers are about 20% faster in outright performance and about 40% better in performance per dollar. Since switching is literally just selecting a different option in a dropdown menu (assuming your software runs well on ARM, which it probably does these days), why would anyone choose Intel on a server?
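For what it's worth, those two figures together imply a price gap. A quick sketch of the arithmetic, using the claimed round numbers (20% faster, 40% better perf/$) as hypothetical inputs:

```python
# Hypothetical ratios taken from the claim above, not measured data.
perf_ratio = 1.20             # ARM performance relative to x86
perf_per_dollar_ratio = 1.40  # ARM performance-per-dollar relative to x86

# perf/$ = perf / price, so the implied price ratio is perf / (perf/$).
price_ratio = perf_ratio / perf_per_dollar_ratio

print(f"Implied ARM price vs x86: {price_ratio:.0%}")
# prints: Implied ARM price vs x86: 86%
```

In other words, if both numbers hold, the ARM instance would also be priced roughly 14% lower than the comparable x86 one.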

And they're not fine on laptops either: I unplugged my ARM Mac from the charger seven hours ago, and I'm at 80% charge right now. Try that with an Intel laptop with an i9 processor and a discrete NVIDIA GPU (you'd need both to get similar performance).

They're only really doing well on desktop PCs, which is a small market, and with people who can't be bothered switching to a new architecture (a big market, but one that is going away).

[–] [email protected] 6 points 7 months ago

When you say 20% faster, by what metric? Per watt of power consumption? Per dollar of cost?

If it's per either of those, that's pretty impressive; that's a massive difference.