Actual article <a href="https://arstechnica.com/security/2024/03/hackers-can-extract-secret-encryption-keys-from-apples-mac-chips/" rel="nofollow">https://arstechnica.com/security/2024/03/hackers-can-extract...</a>
Google, in 2021 [0]:<p>> While the PoC demonstrates the JavaScript Spectre attack against Chrome 88's V8 JavaScript engine on an Intel Core i7-6500U 'Skylake' CPU on Linux, Google notes it can easily be tweaked for other CPUs... It was even successful on Apple's M1 Arm CPU...<p>And Augury [1] in 2022 also affected Apple's A14 and M1 chips.<p>So have Apple been attempting to mitigate and failing, or ignoring the issue?<p>Surely chip manufacturers can't keep ignoring these fundamental flaws<p>[0] <a href="https://security.googleblog.com/2021/03/a-spectre-proof-of-concept-for-spectre.html" rel="nofollow">https://security.googleblog.com/2021/03/a-spectre-proof-of-c...</a><p>[1] <a href="https://www.prefetchers.info/" rel="nofollow">https://www.prefetchers.info/</a>
The title of the article ("secret keys") had me thinking that this vuln might be a path to extracting the private keys from the secure enclave.<p>I'm not sure, but after a bit more reading, it sounds like private keys or symmetric keys can be extracted by other user-space (or possibly kernel-space) code execution, and NOT from the secure enclave.<p>Just for what it's worth.
Unfortunately, I don't think the real-world applications of this exploit are explained anywhere. From skimming the paper, it looks like the attacker needs to be able to (a) run code on the victim's machine and (b) trigger the encryption process ("For our cryptographic attacks, we assume the attacker runs unprivileged code and is able to interact with the victim via nominal software interfaces, triggering it to perform private key operations.")<p>So for (a) it might be sufficient to run javascript, and for (b) of course there are ways to inject data into server processes; processing data submitted by clients is what servers are for.<p>But (a) happens on clients (web browsers) and (b) would be a way to extract encryption keys from servers. In what case can an attacker run code on a machine where they can also trigger the encryption (constantly for an hour, as in the demonstration)? The only thing that comes to my mind would be a server-side code-execution sandbox that runs SSL termination on the same machine.<p>edit: Maybe stealing client certificate keys?
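To make that attacker model concrete: GoFetch-style attacks craft inputs so that an intermediate value in the victim's computation becomes pointer-like only when a guess about part of the key is correct, which turns the prefetcher into a one-bit oracle that is queried over and over. Here is a toy Python sketch of that iterative recovery; the oracle, key size, and all names are hypothetical stand-ins (the real oracle is a cache-timing measurement against a crafted decryption input, not a function call):

```python
# Toy model: recover a secret bit-by-bit from a one-bit side-channel oracle.
# In the real attack, each oracle query is thousands of crafted decryptions
# plus cache-timing measurements; here it is an idealized function.

SECRET_KEY = 0b10110  # hypothetical 5-bit "key" the attacker wants
KEY_BITS = 5

def dmp_oracle(guess_bits):
    """Stand-in for the side channel: returns True iff the guessed key
    prefix is correct, as if a crafted input made an intermediate value
    pointer-like (and hence observably prefetched) only on a match."""
    n = len(guess_bits)
    true_prefix = SECRET_KEY >> (KEY_BITS - n)
    guess = int("".join(map(str, guess_bits)), 2)
    return guess == true_prefix

def recover_key():
    bits = []
    for _ in range(KEY_BITS):
        # Tentatively guess 0 for the next bit; if the oracle rejects
        # the extended prefix, the bit must have been 1.
        bits.append(0)
        if not dmp_oracle(bits):
            bits[-1] = 1
    return int("".join(map(str, bits)), 2)

print(bin(recover_key()))  # 0b10110
```

The point of the sketch is the query count: key recovery is sequential and needs sustained access, which is why the demonstrated attack runs for on the order of an hour.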
Actual paper:<p><a href="https://gofetch.fail/files/gofetch.pdf" rel="nofollow">https://gofetch.fail/files/gofetch.pdf</a>
Wow, didn't this happen with Intel? I think the mitigations there caused a noticeable drop in performance.<p>This is probably worse given people were trying to experiment with local LLMs on CPU. It's not like they even offer Nvidia.
Clickbait. How can someone lacking the real docs for the CPU claim that this “can’t be patched”? How could they possibly know what chicken bits exist to disable what features?
[dupe]<p>Discussion on the actual vulnerability post: <a href="https://news.ycombinator.com/item?id=39779195">https://news.ycombinator.com/item?id=39779195</a>
some people's real world take <a href="https://www.reddit.com/r/MacOS/comments/1bkd3m4/unpatchable_vulnerability_in_apple_chip_leaks/" rel="nofollow">https://www.reddit.com/r/MacOS/comments/1bkd3m4/unpatchable_...</a>
Another day, another speculative execution vuln..
IMHO: all this speculation is a local maximum, and it shows we have a fundamental issue with how we design 'computers'
> The threat resides in the chips’ data memory-dependent prefetcher, a hardware optimization that predicts the memory addresses of data that running code is likely to access in the near future.<p>Are we nearing any sort of consensus that any form of speculation is bad? Is there a fundamentally secure way to do it?
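For context on why this particular optimization is so damaging: a data memory-dependent prefetcher scans data that has just been loaded for values that look like valid pointers and prefetches the memory they point to. That means even perfectly constant-time code (no secret-dependent branches or addresses) can leak, because the secret-dependent *value* itself gets treated as an address. A toy Python model of that behavior, with a hypothetical "pointer-like" address range and a set standing in for what a real attacker would infer from cache timing:

```python
# Toy model of a data memory-dependent prefetcher (DMP).
# The DMP inspects loaded data; any value that looks like a valid
# pointer gets prefetched, and prefetches are observable to an
# attacker via the cache side channel (modeled here as a set).

VALID_RANGE = range(0x1000, 0x2000)  # hypothetical pointer-like addresses

def dmp_load(buffer, prefetched):
    """Simulate loading `buffer` into cache: the DMP prefetches
    every value that falls in the pointer-like range."""
    for value in buffer:
        if value in VALID_RANGE:
            prefetched.add(value)  # observable side effect

def victim(secret_bit):
    # Constant-time style: the intermediate is computed by arithmetic
    # select, with no branch on the secret. The leak comes purely from
    # the DMP dereferencing the resulting *value*.
    intermediate = 0x0042 + secret_bit * (0x1800 - 0x0042)
    prefetched = set()
    dmp_load([intermediate], prefetched)
    return prefetched

# The attacker recovers the bit just by checking whether anything
# was prefetched: pointer-like intermediate <=> secret_bit == 1.
print(1 if victim(1) else 0)  # 1
print(1 if victim(0) else 0)  # 0
```

This is also why the answer to "is there a secure way to speculate" is murky: classic Spectre defenses (constant-time coding, no secret-dependent branches) assume only addresses and control flow leak, and the DMP breaks that assumption by leaking data values directly.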
As usual, nobody cares about the "average user". This is a flaw, a very high-risk issue for everyone, and it should be treated as a big problem by Apple, but the "average user" is not important anymore...
If this is confirmed, I'm really interested in how exactly Apple will deflect this and make it vanish, like they somehow always manage to do with the myriad of issues they face over and over
Security through obscurity is really a bad idea, and Apple is no exception. In the long run, this will likely drive the adoption of RISC-V as a better alternative.