this post was submitted on 22 Mar 2024
626 points (98.9% liked)

The attack has been dubbed GoFetch: https://gofetch.fail/

top 50 comments
[–] [email protected] 208 points 7 months ago (8 children)

This requires local access, and presently an hour or two of uninterrupted processing time on the same CPU as the encryption algorithm.

So if you're like me, using an M-chip based device, you don't currently have to worry about this, and may never have to.

On the other hand, the thing you do have to worry about hasn't been patched out of any algorithm:

https://xkcd.com/538/

[–] [email protected] 170 points 7 months ago* (last edited 7 months ago) (2 children)

The second comment on the page sums up what I was going to point out:

I'd be careful making assumptions like this; the same was true of exploits like Spectre until people managed to get it efficiently running in JavaScript in a browser (which did not take very long after the Spectre paper was released). Don't assume that because the initial PoC is time-consuming and requires a bunch of access that it won't be refined into something much less demanding in short order.

Let's not panic, but let's not get complacent, either.

[–] [email protected] 31 points 7 months ago

That's the sentiment I was going for.

There's reason to care about this but it's not presently a big deal.

[–] [email protected] 15 points 7 months ago (1 children)

I mean, unpatchable vulnerability. Complacent, uncomplacent, I'm not real sure they look different.

[–] [email protected] 11 points 7 months ago

Can't fix the vulnerability, but can mitigate by preventing other code from exploiting the vulnerability in a useful way.

[–] [email protected] 29 points 7 months ago (2 children)

Sure. Unless law enforcement takes it, in which case they have all the time in the world.

[–] [email protected] 27 points 7 months ago (1 children)

Yup, but they're probably as likely to beat you up to get your passwords.

[–] [email protected] 16 points 7 months ago

No way! Even the evil ones will try to avoid jail.

Meanwhile, they might have a friggin' budget for the GrayKey or the Stingray.

I definitely believe rights are more likely to be violated when they can just plug in or power on without getting their gloves dirty.

[–] [email protected] 1 points 7 months ago (2 children)

It still requires user-level access, which means they have to bypass my login password first, which would give them most of that anyway.

Am I missing something?

[–] [email protected] 16 points 7 months ago

Ah yes, good old rubber-hose cryptanalysis.

[–] [email protected] 7 points 7 months ago

I want to say "passkeys" but if I'm honest, that too is susceptible to this attack.

[–] [email protected] 6 points 7 months ago (1 children)

So if someone somehow gets hold of the device then it is possible?

[–] [email protected] 10 points 7 months ago (1 children)

It depends. Some M-chip devices are iOS and iPadOS devices, which would have this hardware issue but don't have actual background processing, so I don't believe it's possible to exploit them the way described.

On a Mac, if they have enough access to your device to set this up, they likely have other, easier-to-manage ways to get what they want than going through this exploit.

But if they had your device and two hours of uninterrupted access, then yes.

Someone who understands it all more than I do could chime in, but that's my understanding based on a couple of articles and discussions elsewhere.

[–] [email protected] 11 points 7 months ago* (last edited 7 months ago) (2 children)

So it's been a while since I had my OS and microcomputer architecture classes, but it looks like GoFetch could be a real turd in the punch bowl. It appears to be on par with the Intel vulns of recent years.

which would have this hardware issue but don’t have actual background processing

So I've read the same about iOS only allowing one user-space app in the foreground at a time, but... that still leaves the entirety of kernel-space processes allowed to run at any time they want. So it's not hard to imagine a compromised app running in the foreground, all the while running GoFetch to mine for key material, while the OS might be shuffling crypto keys in the background on the same processor cluster.

The other thing I'd like to address is that you're assuming this code would necessarily require physical access to compromise a machine. That is certainly one vector, but I'd posit there are other, simpler ways to do the same. The two that come to mind right away are (1) a compromised app distributed via official channels like the App Store, or, even more scary, (2) malicious JavaScript hidden on compromised websites. The white paper indicates this code doesn't need root; it only needs to execute on the same cluster that the crypto keys happen to pass through. So either of these vectors seems like a very real possibility to me.

Edit to add:

I seem to recall reading a paper claiming the stock TikTok app was actually polyglot, in that it would download a binary after installation, such that what's executed on an end user's machine is not what went through the App Store scanners. I've read the same about other apps using a similar technique for mini-upgrades, which is a useful way to avoid going through App Store approval every time you need to roll out a hotfix or the latest minor feature.

If these mechanisms haven't already been smacked down by Apple/Google, or worse, aren't detectable by Apple/Google, this could be a seriously valuable tool for state-level actors able to pull off the feat of hiding it in plain sight. I wonder if this might be part of what Congress was briefed about recently, and why it was a near-unanimous vote to wipe out TikTok. "Hey congresspeople, all your iPhones are about to be compromised... your Tinder/Grindr/OnlyFans kinks are about to become blackmail fodder."

[–] [email protected] 2 points 7 months ago (1 children)

Doesn't it require a separate process to be using the cryptographic algorithm in the first place in order to fill the cache in question?

If it's done in-process of a malicious app that you're running, why wouldn't the app just steal your password and avoid all of this in the first place?

An efficient and fast version of this in JavaScript would be worrisome. But as is, it's not clear whether this can be optimized to take less than 1-2 hours of uninterrupted processing, so hopefully that doesn't end up being the case.

[–] [email protected] 1 points 7 months ago

Doesn’t it require a separate process to be using the cryptographic algorithm in the first place in order to fill the cache in question?

Yes, that's my understanding. I haven't looked at the code, but their high-level explanation sounds like their app makes calls to an API that could result in the under-the-hood crypto "service" pulling the keys into the cache, and there's an element of luck in whether they snag portions of the keys at that exact moment. So it seems the crafted app doesn't have the ability to manipulate the crypto service directly, which makes sense if this is only a user-land app without root privileges.

why wouldn’t the app just steal your password and avoid all of this in the first place?

I believe it would be due to the app not having root privileges, and so being constrained to going through layers of abstraction to get its crypto needs met. I do not know the exact software architecture of iOS/macOS, but I guarantee there's a notion of needing to call an API for these types of things. For instance, if your app needs to push/pull an object it owns in/out of iCloud, you'd call the API with a number of arguments to do so. You would not have the ability to access the keys directly and perform the encrypt/decrypt all by yourself. Likewise with any passwords: you would likely make an API call instead, and the backing code/service would have that isolated/controlled access.
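To make that boundary concrete, here's roughly the shape I mean (a purely hypothetical sketch; the names are invented for illustration and are not Apple's actual APIs):

```c
/* Entirely hypothetical sketch of the privilege boundary described above.
 * None of these names are real Apple APIs. The point is only that the app
 * passes buffers across an API and the key never leaves the service side. */
#include <stdint.h>
#include <stdio.h>

/* --- "service" side: key material lives here and is never handed out --- */
static const uint8_t secret_key[4] = { 0x13, 0x37, 0xca, 0xfe };

int crypto_service_decrypt(const uint8_t *ct, size_t len, uint8_t *pt_out)
{
    /* Stand-in for a real cipher (just XOR with the key); what matters is
     * that the caller only passes buffers in and out. */
    for (size_t i = 0; i < len; i++)
        pt_out[i] = ct[i] ^ secret_key[i % sizeof secret_key];
    return 0;
}

/* --- "app" side: all it can do is ask for the operation --- */
int main(void)
{
    const uint8_t ciphertext[4] = { 0x5b, 0x58, 0xeb, 0x9a };
    uint8_t plaintext[4];

    crypto_service_decrypt(ciphertext, sizeof ciphertext, plaintext);
    printf("app got %zu plaintext bytes back without ever touching the key\n",
           sizeof plaintext);
    return 0;
}
```

In the real OS this presumably happens over IPC to a privileged service or inside the Secure Enclave rather than as a plain function call, but the key point is the same: the app only ever sees the API, never the key material.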

[–] [email protected] 1 points 7 months ago

Fetching remote code isn't allowed on the Play Store, at least, though I'm not sure how well they're enforcing that.

That's the reason Termux isn't updated in the Play Store anymore, IIRC; it has its own package manager that downloads and runs code.

[–] [email protected] 3 points 7 months ago

Yeah I don't think this is a big-ish problem currently. But by having this vulnerability to point to, other CPU vendors have a good reason not to include this feature in their own chips.

[–] [email protected] 2 points 7 months ago

What I'm worried about is Apple overreacting and bottlenecking my M3 Pro because "security". We already saw how fixes for these types of vulnerabilities on Intel and AMD silicon affected performance; no thank you.

[–] [email protected] 55 points 7 months ago (1 children)
[–] [email protected] 18 points 7 months ago (1 children)
[–] [email protected] 21 points 7 months ago (1 children)

Apple's gonna need you to get ALL the way off their back about this.

[–] [email protected] 3 points 7 months ago

Oh sorry, lemme get off that thing!

[–] [email protected] 54 points 7 months ago (2 children)

Apple is not a secure ecosystem.

[–] [email protected] 82 points 7 months ago (11 children)

No system is free from vulnerabilities.

[–] [email protected] 13 points 7 months ago

This issue is extremely similar to problems found in both Intel and AMD processors (see: Meltdown, Spectre).

[–] [email protected] 38 points 7 months ago

“Govt-mandated backdoor in Apple chips revealed”

There, fixed that for you.

[–] [email protected] 36 points 7 months ago* (last edited 7 months ago) (6 children)

Wow, what a dishearteningly predictable attack.

I have studied computer architecture and hardware security at the graduate level—though I am far from an expert. That said, any student in the classroom could have laid out the theoretical weaknesses in a "data memory-dependent prefetcher".

My gut says (based on my own experience having conversations like this) that the engineers knew there was an "information leak" but management did not take it seriously. It's hard to convince someone without a cryptographic background why you need to {redesign/add a workaround/use a lower-performance design} because of "leaks". If you can't demonstrate an attack, they will assume the issue isn't exploitable.

[–] [email protected] 10 points 7 months ago* (last edited 7 months ago)

So the attack is (very basically, if I understand correctly)

Setup:

  • I control at least one process on the machine where the target process is running
  • I can send data to the target process, and the target will decrypt it

Attack:

  • I send data that in some intermediate state of decryption will look like a pointer
  • This "pointer" contains some information about the secret key I am trying to steal
  • The prefetcher does its thing, loading the data "pointed to" into the cache
  • I can observe via a cache side channel what the prefetcher did, giving me this "pointer" containing information about the secret key
  • Repeat until I have gathered enough information about the secret key

Is this somewhat correct? Those speculative execution vulnerabilities always make my brain hurt a little.
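For the "observe via a cache side channel" step, the attacker's side boils down to timing loads of candidate addresses to see which one the prefetcher pulled in. A heavily simplified sketch (the array layout, threshold, and timer are placeholders I made up, not the GoFetch authors' code; a real attack on Apple silicon would use eviction sets and a much finer-grained timer):

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

#define NCANDIDATES 256
#define HIT_THRESHOLD_NS 60        /* made-up cutoff between cache hit and miss */

/* One page of spacing per candidate value of the secret-dependent
 * "pointer" we hope the DMP dereferenced. */
static uint8_t probe[NCANDIDATES][4096];

static uint64_t time_load(volatile uint8_t *p)
{
    struct timespec a, b;
    clock_gettime(CLOCK_MONOTONIC, &a);
    (void)*p;                                   /* the measured load */
    clock_gettime(CLOCK_MONOTONIC, &b);
    return (uint64_t)(b.tv_sec - a.tv_sec) * 1000000000ull
         + (uint64_t)(b.tv_nsec - a.tv_nsec);
}

int main(void)
{
    /* 1. Evict every probe line (a real attack walks an eviction set per line). */
    /* 2. Send the victim a crafted input and let it decrypt on the same cluster;
     *    if some intermediate value looks like &probe[k][0], the DMP prefetches it. */
    /* 3. Reload each line and see which one comes back suspiciously fast.          */
    for (int k = 0; k < NCANDIDATES; k++) {
        if (time_load(&probe[k][0]) < HIT_THRESHOLD_NS)
            printf("candidate %d was cached -> leaks a few bits of the guess\n", k);
    }
    return 0;
}
```

Repeat that loop over many crafted inputs and the recovered bits add up to the full key, which is presumably where the hour-or-two figure comes from.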

[–] [email protected] 27 points 7 months ago

newly discovered side channel

NSA: "haha yeah... new..."

[–] [email protected] 20 points 7 months ago

This is the best summary I could come up with:


A newly discovered vulnerability baked into Apple’s M-series of chips allows attackers to extract secret keys from Macs when they perform widely used cryptographic operations, academic researchers have revealed in a paper published Thursday.

The flaw—a side channel allowing end-to-end key extractions when Apple chips run implementations of widely used cryptographic protocols—can’t be patched directly because it stems from the microarchitectural design of the silicon itself.

The vulnerability can be exploited when the targeted cryptographic operation and the malicious application with normal user system privileges run on the same CPU cluster.

Security experts have long known that classical prefetchers open a side channel that malicious processes can probe to obtain secret key material from cryptographic operations.

This vulnerability is the result of the prefetchers making predictions based on previous access patterns, which can create changes in state that attackers can exploit to leak information.

The breakthrough of the new research is that it exposes a previously overlooked behavior of DMPs in Apple silicon: Sometimes they confuse memory content, such as key material, with the pointer value that is used to load other data.


The original article contains 744 words, the summary contains 183 words. Saved 75%. I'm a bot and I'm open source!

[–] [email protected] 17 points 7 months ago

I get to this part and feel like I'm being trolled.

"meaning the reading of data and leaking it through a side channel—is a flagrant violation of the constant-time paradigm."

[–] [email protected] 16 points 7 months ago (1 children)
[–] [email protected] 3 points 7 months ago

Nope since it's an intended feature.

[–] [email protected] 4 points 7 months ago

So does this mean it’s being actively exploited by people? How screwed is the average person over this?
