this post was submitted on 24 Jun 2024
741 points (95.6% liked)

There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple's claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won't be able to use it. There's a memory requirement for Predictive Code Completion in Xcode 16, and it's the closest thing we'll get from Apple to an admission that 8GB of memory isn't really enough for a new Mac in 2024.

[–] [email protected] 127 points 6 months ago (4 children)

And now all the fanboys and fangirls will go out and buy another MacBook. That's planned obsolescence for ya.

[–] [email protected] 60 points 6 months ago (1 children)

Someone who is buying a MacBook with the minimum specs probably isn’t the same person that’s going to run out and buy another one to get one specific feature in Xcode. Not trying to defend Apple here, but if you were a developer who would care about this, you probably would have paid for the upgrade when you bought it in the first place (or couldn’t afford it then or now).

[–] [email protected] 18 points 6 months ago (1 children)

Well no, not this specific scenario, because of course devs will generally buy machines with more RAM.

But there are definitely people who will buy an 8GB Apple laptop, run into performance issues, then think "oh I must need to buy a new MacBook".

If Apple didn't purposely manufacture ewaste-tier 8GB laptops, that would be minimised.

[–] [email protected] 8 points 6 months ago* (last edited 6 months ago)

I wouldn't be so sure. I feel like many people would not buy another MacBook if it were to feel a lot slower after just a few years.

This feels like short-term gains vs. long-term reputation.

[–] [email protected] 23 points 6 months ago (3 children)

And that's why they solder the RAM, or even worse, make it part of the SoC.

[–] [email protected] 47 points 6 months ago (2 children)

There are real-world performance benefits to RAM being as close as possible to the CPU, so it's not entirely without merit. But that's what CAMM modules are for.

[–] [email protected] 23 points 6 months ago (4 children)

But do those benefits outweigh being able to double or triple the amount of RAM by simply inserting another stick that costs a few dozen dollars?

[–] [email protected] 17 points 6 months ago (1 children)

That's extremely dependent on the use case, but in my opinion, generally no. However, CAMM has been ratified as an official JEDEC standard, and it does a good job of being a middle ground between repairability and speed.

[–] [email protected] 17 points 6 months ago (1 children)

It's an officially recognized spec, so Apple will ignore it as long as they can, until they find a way to make money from it or spin the marketing as if it's some miraculous new invention of theirs, for something that should just be how it's done.

[–] [email protected] 9 points 6 months ago

Parts pairing will do. That's what Apple is known for: kneecapping consumer rights.

[–] [email protected] 5 points 5 months ago (3 children)

Yes, there are massive advantages. It’s basically what makes unified memory possible on modern Macs. Especially with all the interest in AI nowadays, you really don’t want a machine with a discrete GPU/VRAM, a discrete NPU, etc.

Take, for example, a modern high-end PC with an RTX 4090. Those only have 24GB of VRAM, and that VRAM is only accessible through the (relatively slow) PCIe bus. AI models can get really big, and 24GB can be too little for the bigger ones. You can spec an M2 Ultra with 192GB of RAM, and almost all of it is accessible by the GPU directly. Even better, the GPU can access it without copying data back and forth over the PCIe bus, so there's literally zero overhead.
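
To make the zero-copy point concrete, here's a minimal sketch using Metal's public API (the buffer size and the compute-pass usage are illustrative assumptions, not anything from the comment above): a single `.storageModeShared` allocation is visible to both the CPU and the GPU, so there's no upload step at all.

```swift
import Metal

// On Apple silicon, a .storageModeShared buffer is one allocation
// in unified memory, visible to both the CPU and the GPU.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

let count = 1_000_000  // arbitrary size, for illustration only
guard let buffer = device.makeBuffer(
    length: count * MemoryLayout<Float>.stride,
    options: .storageModeShared
) else {
    fatalError("Buffer allocation failed")
}

// The CPU writes straight into the buffer's memory...
let values = buffer.contents().bindMemory(to: Float.self, capacity: count)
for i in 0..<count { values[i] = Float(i) }

// ...and a GPU compute pass can bind those very same pages with
// encoder.setBuffer(buffer, offset: 0, index: 0). No staging copy,
// no PCIe transfer. On a discrete-GPU system, the same data would
// first have to be uploaded into VRAM.
```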

The advantages of this multiply when you have more dedicated silicon. For example, if you have an NPU, it can use the same memory pool and access the same shared data as the CPU and GPU with no overhead. The M series also has dedicated video encoder/decoder hardware, which again can access the unified memory with zero overhead.

For example: you could have an application that replaces the background of a video using AI. It takes a video and decompresses it using the video decoder; the decompressed frames are immediately available to all the other components. The GPU can then pre-process the frames, the NPU can use the processed frames as input to some AI model and generate new frames, and the video encoder can immediately access the result and compress it into a new video file.
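
As a rough sketch of that decoder-to-GPU handoff (my own illustration, not Apple sample code; the `.bgra8Unorm` pixel format and the `gpuTexture` helper name are assumptions, though `CVMetalTextureCache` is the real CoreVideo mechanism): a decoded `CVPixelBuffer` can be wrapped as a Metal texture without duplicating its pixels, so the GPU pass reads the decoder's output pages directly.

```swift
import CoreVideo
import Metal

// Wrap decoded video frames (CVPixelBuffers) for the GPU without
// copying them: the texture cache exposes a frame's existing
// unified-memory pages as an MTLTexture.
let device = MTLCreateSystemDefaultDevice()!
var textureCache: CVMetalTextureCache?
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)

func gpuTexture(for frame: CVPixelBuffer) -> MTLTexture? {
    var cvTexture: CVMetalTexture?
    // .bgra8Unorm is an assumed format; in a real pipeline it must
    // match the decoder's actual output format.
    CVMetalTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, textureCache!, frame, nil,
        .bgra8Unorm,
        CVPixelBufferGetWidth(frame),
        CVPixelBufferGetHeight(frame),
        0, &cvTexture)
    return cvTexture.flatMap(CVMetalTextureGetTexture)
}
```

On a discrete-GPU machine, the equivalent step would be an explicit upload of every frame across PCIe, which is exactly the overhead the comment above is describing.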

The overhead of just copying data for such an operation on a system with non-unified memory would be huge. That’s why I think that the AI revolution is going to be one of the driving factors in killing systems with non-unified memory architectures, at least for end-user devices.

[–] [email protected] 1 points 5 months ago

I feel like this is an argument for new specialized computers at best. At worst, it shows that this AI crap is even more harmful to the end user.

[–] [email protected] 1 points 5 months ago

That’s a fantastic explanation! Thank you!

[–] [email protected] -3 points 5 months ago

Bus goes Vrrrroom vrrooom. Fuck AI.

[–] [email protected] 1 points 5 months ago

And even if the out-of-the-box RAM is soldered to the machine, it should still be possible to add supplementary RAM that isn't soldered for when the system demands it. Other computers have worked like this in the past: RAM soldered on board, plus a socket to add more.

[–] [email protected] 0 points 5 months ago

It’s highly dependent on the application.

For instance, I could absolutely see LPCAMM expandability on certain models as a great move for Apple, particularly in the pro segment, so they're not capped by whatever they can cram into their monolithic SoCs. But for most consumer applications (that is, for non-engineer/non-developer users), I don't see them making it expandable.

Or more succinctly: they should absolutely put LPCAMM in the next generation of MBPs, in my opinion.

[–] [email protected] 7 points 6 months ago

Apple's SoC long predates CAMM.

Dell first showed off CAMM in 2022, and it only became a JEDEC standard in December 2023.

That said, if Dell can create a really good memory standard and get JEDEC to make it an industry standard, so can Apple. They just chose not to.

[–] [email protected] 8 points 6 months ago* (last edited 6 months ago) (2 children)

In this particular case, the RAM is part of the chip in an attempt to squeeze out more performance. Processors have become so fast that the speed is wasted if the rest of the system can't keep up. The traditional memory architecture has become a bottleneck, the same way HDDs were before the introduction of SSDs.

You’ll see this same trend extend to Windows laptops as they shift to Snapdragon processors too.

[–] [email protected] 3 points 5 months ago (1 children)

People do like to downplay this, but SoCs are the future. There's no way to get this kind of performance over a traditional system bus anymore.

[–] [email protected] 1 points 5 months ago (1 children)
[–] [email protected] 0 points 5 months ago (1 children)

Funny that within the first minute, they state the exact same problem.

[–] [email protected] 1 points 5 months ago

If you actually watch past the first minute of the video, they explain that LPCAMM solves that problem...

[–] [email protected] -5 points 6 months ago (1 children)

BUT BUT you'll get 5% fasTEr SpeED!!! And MOrE seCuRiTy!!!

[–] [email protected] 7 points 6 months ago

Well, the claim they made still holds true, despite how much I dislike this design choice. It is faster, and more secure (though attacks on NAND chips are hard and require a skill level most attackers won't possess).

And add one more: it saves power by using LPDDR5 rather than DDR5. For a laptop, where battery life matters a lot, I agree that's important. However, I have no idea how much standby or active time is gained by using LPDDR5.

[–] [email protected] 2 points 5 months ago (4 children)

And the Apple haters will keep making this exact same comment on every post from their 3rd laptop in ten years, while I'm still using my 2014 MacBook daily with no issues.

Be more original.

[–] [email protected] 6 points 5 months ago (1 children)

Nice attempt to justify planned obsolescence. You'd have to be a fool to think Apple hasn't done this time and time again.

[–] [email protected] -4 points 5 months ago* (last edited 5 months ago) (2 children)

👍

-posted from my ten-year-old MacBook, which shows no need for replacement

[–] [email protected] 3 points 5 months ago

And it's what, 3 or 4 operating system versions behind because it's obsolete?

[–] [email protected] 2 points 5 months ago (1 children)

At what point did Apple decide your MacBook was too old to be usable and stop giving it updates or allowing new software to run on it?

[–] [email protected] 2 points 5 months ago* (last edited 5 months ago)

Still gets security updates. All the software I need to run on it runs on it.

My email, desktop, and calendar all still sync with my newer desktop. I can still play StarCraft. I can join Zoom meetings while running Roll20. I can even run Premiere and do video editing… to a point.

I guess if you need the latest and greatest then you might have a point, but I don’t.

This whole thread is bitching about software bloat, and Apple does that to stop software bloat on older machines, but noooo, that's planned obsolescence. 🙄

[–] [email protected] 4 points 5 months ago (1 children)

They will keep making the same comment as long as it keeps being true.

  • Typed from my 2009 ThinkPad

Meanwhile your 2014 MacBook stopped receiving OS updates 3 years ago.

[–] [email protected] -1 points 5 months ago (1 children)

Weren’t you just complaining about software bloat?

[–] [email protected] 1 points 5 months ago
[–] [email protected] 2 points 5 months ago (1 children)

This is pretty much it. For the past 2-3 years, people have really just wanted to find reasons to hate Apple. You're right, though: your Mac can easily run for 10+ years. You're good basically until web browsers no longer support your OS version, which is more in the 12-15 year range.

[–] [email protected] 2 points 5 months ago

In fairness, most computers built after around 2014-2016 last way longer; performance started to level off not long after that. After all, devs write software for what people have: if everyone had 128 gigs of RAM, we'd load everything we could think of into memory, and you'd need it to keep up.

Macs did have some incredible build quality, though; the newer ones aren't holding up anywhere near as well. I'm still using a couple of 2012 Macs to play videos. They're slow as hell when you interact, but once a video is playing, it still looks and sounds good.

[–] [email protected] 1 points 5 months ago* (last edited 5 months ago) (1 children)

I still have a fully functioning Windows 95 machine.

My daily driver desktop is also from around 2014.

[–] [email protected] 1 points 5 months ago

That’s pretty sick actually

[–] [email protected] -3 points 5 months ago (1 children)

These were obsolete the minute they were made, though... So it's not really planned obsolescence. I got one for free (MacBook Air), and it's always been trash.

[–] [email protected] 3 points 5 months ago

I have an M2 MBA and it's the best laptop I've ever owned or used, second only to the M3 Max MBP I get to use for work. It's silent, the battery lasts all week, the interface is fast, and it runs all my dev tools like a charm. Zero issues with the device.