Technology
This is the official technology community of Lemmy.ml for all news related to the creation and use of technology, and to facilitate civil, meaningful discussion around it.
Ask in a DM before posting product reviews or ads; otherwise such posts are subject to removal.
Rules:
1: All Lemmy rules apply
2: Do not make low-effort posts
3: NEVER post naziped*gore stuff
4: Always post article URLs or their archived versions as sources, NOT screenshots. This helps blind users.
5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)
6: No advertisement posts unless verified as legitimate and non-exploitative/non-consumerist
7: Crypto-related posts, unless essential, are disallowed
Too late to the grift...
Why even collab with that clown, when Apple has money to hire talent and buy NVIDIA cards to do the job...
Apple and Nvidia hate each other due to some failed business history between them. Apple didn't like Nvidia chips failing and hurting the MacBook's reputation. Nvidia didn't like being thrown under the bus when the MacBook problems started to surface.
Since 2008/09 they have refused to even joke about doing business together.
Learn something new every day... well, Apple better tuck that dick and buy these GPUs then, it seems; I doubt their own design can compete, tbh... if it did, Nvidia would not be running a train on the global GPU market.
Considering how long Apple has been putting neural cores in all their chips, and the speed at which their in-house chips have outpaced competitors (the M series, for example), I feel like Apple will not only beat Nvidia at this but do so by a decent amount.
That said, Nvidia will continue to sell worldwide in this market, as Apple will keep their chips in only their own hardware; so if you're not running a Mac/iOS device, you'll be using Nvidia chips.
Either way, even if Apple just keeps up, competition is still best for everyone, so I welcome this development.
The M chips are only good for inference, not for training. For training, CUDA is still unparalleled. Pun intended.
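For context, here's a minimal PyTorch sketch (assuming PyTorch 1.12+ with the MPS backend available) of the split in practice: the same model code runs on Apple Silicon for local inference, while training pipelines typically assume a CUDA device:

```python
import torch
import torch.nn as nn

# Pick the best available backend: CUDA (Nvidia), MPS (Apple Silicon),
# or CPU as a fallback. Most training stacks assume CUDA; MPS is mostly
# used for local inference on M-series Macs.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

model = nn.Linear(128, 10).to(device)
x = torch.randn(32, 128, device=device)

# Inference works fine on MPS; large-scale training is where the
# CUDA ecosystem (kernels, multi-GPU tooling) still dominates.
with torch.no_grad():
    y = model(x)

print(y.shape, device)
```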
I still don’t understand how an open source alternative with better hardware support hasn’t happened yet.
The problem has two sides: software and hardware. You can open source the software side all you want; it's not gonna go very far when it has to fight against the hardware instead of working with it.
ROCm is open source, but it's AMD. Their hardware has historically not been as powerful, and therefore not as attractive to the target audience, so progress has been slow.
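For what it's worth, the ROCm builds of PyTorch deliberately expose AMD GPUs through the CUDA API surface so existing code runs unchanged. A sketch, assuming a ROCm build of PyTorch on a supported AMD GPU:

```python
import torch

# On a ROCm build of PyTorch, HIP masquerades as CUDA:
# torch.cuda.is_available() returns True on a supported AMD GPU, and
# existing "cuda" device strings work unmodified. torch.version.hip is
# a version string on ROCm builds and None on CUDA builds.
print(torch.version.hip)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

x = torch.randn(1024, 1024, device=device)
y = x @ x  # on ROCm this matmul dispatches to rocBLAS rather than cuBLAS
print(y.device)
```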
Yes... Apple is holding back their market-leading chips just so they don't capture a new market...