The video of the talk is online now too: <a href="https://www.youtube.com/watch?v=7VWNUUldBEE" rel="nofollow">https://www.youtube.com/watch?v=7VWNUUldBEE</a>
That's pretty astonishing. The MMIO abuse implies either that the attackers have truly phenomenal research capabilities, or that they hacked Apple and obtained internal hardware documentation (more likely).<p>I was willing to believe it might just be a massive NSA-scale research team right up until the part about the custom hash function's sbox. Apple appears to have known that the feature in question was dangerous: they deliberately hid it, whatever it is, and then went further and protected it with a sort of (fairly weak) digital signing scheme.<p>As the blog post points out, there's no obvious way to find the right magic knock to operate this feature short of doing a full silicon teardown and reverse engineering it (impractical at these process nodes). That leaves hacking the developers to steal their internal documentation.<p>The way it uses a long chain of high-effort zero-days only to launch an invisible Safari that then starts from scratch, loading a web page that uses a completely different chain of exploits to re-hack the device, also points to a massive organization with truly extreme levels of internal siloing.<p>Given that the researchers in question are Russians at Kaspersky, this pretty much has to be the work of the NSA or maybe GCHQ.<p><i>Edit: misc other interesting bits from the talk: the malware can enable ad tracking, and it can detect the cloud iPhone hosting services often used by security researchers. The iOS/macOS malware platform seems to have been in development for over a decade and actually runs ML on the device, doing object recognition and OCR on photos locally so that only ML-generated labels are uploaded rather than image bytes. They truly went to a lot of effort, but all of it was no match for a bunch of smart Russian students.<p>I'm not sure I agree with the speaker that security through obscurity doesn't work, however.
This platform has been in the wild for ten years and nobody knows how long they've been exploiting this hidden hardware "feature". If the hardware feature was openly documented it'd have been found much, much sooner.</i>
Steve Weis on Twitter described it best:<p>“This iMessage exploit is crazy. TrueType vulnerability that has existed since the 90s, 2 kernel exploits, a browser exploit, and an undocumented hardware feature that was not used in shipped software”<p><a href="https://x.com/sweis/status/1740092722487361809?s=46&t=E3U2EI7EXIhlBQmxg6oZ2g" rel="nofollow">https://x.com/sweis/status/1740092722487361809?s=46&t=E3U2EI...</a>
For those interested in the talk by the Kaspersky researchers, the cleaned video isn't uploaded yet but you can find a stream replay here:<p><a href="https://streaming.media.ccc.de/37c3/relive/a91c6e01-49cf-4227-baae-aece190e9de5" rel="nofollow">https://streaming.media.ccc.de/37c3/relive/a91c6e01-49cf-422...</a><p>(talk starts at minute 26:20)
As it's about a 37C3 presentation, here's a comment from Fefe¹ (in German):
<a href="https://blog.fefe.de/?ts=9b729398" rel="nofollow">https://blog.fefe.de/?ts=9b729398</a><p>According to him, the exploit chain was likely worth in the region of an 8-digit dollar figure.<p>¹ <a href="https://en.wikipedia.org/wiki/Felix_von_Leitner" rel="nofollow">https://en.wikipedia.org/wiki/Felix_von_Leitner</a><p>I guess somebody is going to get fired.
CoreSight is not some backdoor; it's a debug feature of all ARM CPUs. This looks like a necessary extension to CoreSight to work with Apple's memory protection mechanisms.<p>Even though no public documentation exists, I'm sure thousands of Apple engineers have access to a modded gdb or other tooling to make use of it.
iMessage can be disabled by local MDM for supervised devices, via free Apple Configurator in macOS app store, <a href="https://support.apple.com/guide/deployment/restrictions-for-iphone-and-ipad-dep0f7dd3d8/web" rel="nofollow">https://support.apple.com/guide/deployment/restrictions-for-...</a><p><pre><code> For Wi-Fi–only devices, the Messages app is hidden.
For devices with Wi-Fi and cellular, the Messages app is still available, but only the SMS/MMS service can be used.
</code></pre>
SMS/MMS messages and non-emergency cellular radio traffic can be disabled by a SIM PIN, e.g. when using device for an extended period via WiFi.
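For reference, the restriction in question is (to the best of my knowledge) the supervised-only <code>allowChat</code> key in the <code>com.apple.applicationaccess</code> payload of a configuration profile. A minimal fragment of the payload dictionary might look like this; treat the exact surrounding plist structure as illustrative:

```xml
<key>PayloadType</key>
<string>com.apple.applicationaccess</string>
<key>allowChat</key>
<false/>
```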
Notice that the hash value for a data write of all zeros is zero...<p>And for a single-bit write, the hash is a single value from the sbox table. That means this hash algorithm could plausibly have been reverse engineered without internal documentation.
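If the hash really is linear in that way (the XOR of one table entry per set bit), the whole sbox falls out of 64 single-bit probes. Here's a toy sketch in Python; the table, the 20-bit width, and the XOR structure are all invented stand-ins for the real (undocumented) hardware hash, purely to show why those two observations matter:

```python
import random

random.seed(1)
SBOX = [random.getrandbits(20) for _ in range(64)]  # hypothetical hidden table

def hw_hash(qword):
    """Toy XOR-linear hash: each set bit i of the input contributes SBOX[i]."""
    h = 0
    for i in range(64):
        if (qword >> i) & 1:
            h ^= SBOX[i]
    return h

# An all-zero write hashes to zero, matching the observation above:
assert hw_hash(0) == 0

# Treating the hardware as an oracle, 64 single-bit probes dump the table:
recovered = [hw_hash(1 << i) for i in range(64)]
assert recovered == SBOX
```

The point is not that Apple's hash is exactly this, only that a hash with those two observable properties leaks its own table to anyone who can query it.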
What are the chances this MMIO register could have been discovered by brute force, probing every register address?<p>Mere differences in timing could have indicated that an address was valid, and then the hash could perhaps have been brute forced too, since it is effectively only a 20-bit hash.
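A 20-bit hash gives only 2^20 (about a million) candidates per write, which is a trivial search if the hardware acts as an oracle and nothing rate-limits you. A toy sketch, with the device's accept/reject check stubbed out (the real check lives in silicon and its details are unknown):

```python
HASH_BITS = 20

def write_accepted(guess, secret_hash):
    """Stand-in for the MMIO write: the device commits it only on a hash match."""
    return guess == secret_hash

secret_hash = 0x5A5A5  # arbitrary example value within 20 bits

# Exhaustive search over all 2**20 possible hash values:
found = next(g for g in range(1 << HASH_BITS)
             if write_accepted(g, secret_hash))
assert found == secret_hash
```

Whether this is practical in reality depends on whether a failed guess has observable side effects (or crashes the machine), which the talk doesn't settle.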
The extra hardware registers might have been discovered by examining the chip itself. One could find where the registers were on it, and notice some extra registers, then do some experimenting to see what they did.
Philip Zimmermann a while back was working on a secure phone product called the Black Phone. I tried to convince him that a secure phone should not contain any microphones of any kind. That sounds a bit weird for a phone, but it's ok, if you want to make a voice call, just plug a headset into it for the duration of the call. He wasn't convinced, but this iphone exploit makes me believe it more than ever.
Maybe I'm too dumb to find it on this page but if you are looking for the actual recording instead of a calendar entry in the past, it's here (a stream dump for now, fast forward to 27 mins):<p><a href="https://streaming.media.ccc.de/37c3/relive/11859" rel="nofollow">https://streaming.media.ccc.de/37c3/relive/11859</a>
>Hardware security very often relies on “security through obscurity”, and it is much more difficult to reverse-engineer than software, but this is a flawed approach, because sooner or later, all secrets are revealed.<p>The latter only holds when you are not as big as Apple. When you are as big as Apple, you are a very hot target for attackers. There is always an effort-vs-reward trade-off in exploiting vulnerabilities, and the amount of effort that goes into all this is worth thousands of dollars even if someone is doing it just for research. If this were some random AliExpress board, it would be worth nothing, nobody would really care, and obscurity would hold up.<p>But I wonder what Apple is thinking when they rely on obscurity, because people must start working on exploiting new hardware from day one; you can literally get an iPhone on every corner in a city these days. Hardware security by obscurity would be fine for, say, cards Nvidia sells only to a few cloud customers, which are assumed obsolete within a few years; even if someone gets those on eBay, the reward is very low. iPhones, on the other hand, are a mass consumer device, and people hang on to them for a very long time.
I see that one of the steps in exploit was to use GPU registers to bypass kernel memory protection. Does it mean that the vulnerability cannot be fixed by an update and existing devices will stay vulnerable?
I didn't hear anyone mention fuzzing once. I guess there was probably very specific insider knowledge being made use of, and they wanted to point a finger, which is fair enough. I'm just a bit surprised it has not come up so far in the discussion. Anyhow, it seems an allow-list approach by Apple would have been better than a deny-list approach: the hardware literally wasn't checking for writes outside the expected bounds!
> <i>If we try to describe this feature and how the attackers took advantage of it, it all comes down to this: they are able to write data to a certain physical address while bypassing the hardware-based memory protection by writing the data, destination address, and data hash to unknown hardware registers of the chip unused by the firmware.</i><p>Did the systems software developers know about these registers?
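As the quoted paragraph describes it, the primitive is a three-register handshake: payload, physical destination, then a matching hash that arms the write. The following toy simulation only illustrates that flow; the register offsets, the hash, and the device model are invented placeholders, not the real undocumented hardware:

```python
REG_DATA, REG_DEST, REG_HASH = 0x00, 0x08, 0x10  # hypothetical offsets

def toy_hash(addr, qword):
    return (addr ^ qword) & 0xFFFFF  # stand-in for the real 20-bit sbox hash

class ToyMmioBlock:
    """Simulated peripheral: commits the DMA write only when the supplied
    hash matches, silently dropping it otherwise."""
    def __init__(self):
        self.regs = {}
        self.phys_mem = {}

    def write_reg(self, offset, value):
        self.regs[offset] = value
        if offset == REG_HASH:  # writing the hash register triggers the operation
            addr = self.regs[REG_DEST]
            data = self.regs[REG_DATA]
            if value == toy_hash(addr, data):
                self.phys_mem[addr] = data  # memory protection bypassed

dev = ToyMmioBlock()
dev.write_reg(REG_DATA, 0x41414141)
dev.write_reg(REG_DEST, 0x800000000)
dev.write_reg(REG_HASH, toy_hash(0x800000000, 0x41414141))
assert dev.phys_mem[0x800000000] == 0x41414141  # write landed

dev.write_reg(REG_DATA, 0x42424242)
dev.write_reg(REG_HASH, 0)  # wrong hash: write is silently ignored
assert dev.phys_mem[0x800000000] == 0x41414141
```

Since the firmware never touches these registers, the "signing" is the only thing standing between a kernel read/write primitive and physical memory, which is presumably why the attackers needed the sbox in the first place.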
See also the article from Ars Technica in June 2023: <a href="https://arstechnica.com/information-technology/2023/06/clickless-ios-exploits-infect-kaspersky-iphones-with-never-before-seen-malware/" rel="nofollow">https://arstechnica.com/information-technology/2023/06/click...</a>
I'm curious to know from experts if there's anything Apple can do to create a step-change in terms of security of iPhones? Like if the going rate for a zero day is $1 million, is there anything Apple can do that can drive that up to $2 or $3 million? Or is it just going to be a perpetual cat and mouse game with no real "progress"?
At least the first version of the recording is now up:
<a href="https://media.ccc.de/v/37c3-11859-operation_triangulation_what_you_get_when_attack_iphones_of_researchers" rel="nofollow">https://media.ccc.de/v/37c3-11859-operation_triangulation_wh...</a>
Knowing more about the exfiltration component, where it sends data to a remote server, would be helpful. According to the article it sends large microphone audio recordings. I would assume a company like Kaspersky explicitly denies all outgoing network connections and then approves them one by one.
Are hashes of the data ever used in known chip debugging features?<p>Since they're supposed to be disabled in production, what would be their point?<p>I'm not an electronics engineer, but isn't it best for debug features to be fast and simple, to reduce the chance that they cause interference themselves?<p>And isn't it highly unlikely that an attacker in the supply chain (TSMC??) would be able to reliably plant this in all Apple chips from the A12 to the A16 and the M1??
This made me laugh:
"Upon execution, it decrypts (using a custom algorithm derived from GTA IV hashing) its configuration [...]"<p>From <a href="https://securelist.com/triangulation-validators-modules/110847/" rel="nofollow">https://securelist.com/triangulation-validators-modules/1108...</a>
That’s going to be a Chinese tool. Knowing the hardware that intimately and having all these convenient undocumented areas to play with is exactly the kind of thing you can put in place if you control the manufacturing.
Related:<p><i>4-year campaign backdoored iPhones using advanced exploit</i> - <a href="https://news.ycombinator.com/item?id=38784073">https://news.ycombinator.com/item?id=38784073</a><p>(We moved the comments hither, but the article might still be of interest)
Reminder that Lockdown Mode helps reduce the attack surface of your iPhone. It also helps tremendously with detection.
<a href="https://support.apple.com/en-us/105120" rel="nofollow">https://support.apple.com/en-us/105120</a>
Isn't the most obvious answer that Apple, like other US tech firms such as Google, simply creates these wild backdoors for the NSA/GCHQ directly? Every time one's patched, three more pop up. We already know Apple and Google cooperate with the spy agencies very eagerly.
Years ago I argued about the danger of PDFs with another account and was told not to be a paranoid nutjob.<p>Told you so.<p>edit: The fact that this obvious statement gets upvoted above the Apple backdoor at 22:40 of the talk also says a lot.<p>edit1: <a href="https://imgur.com/a/82JV7I9" rel="nofollow">https://imgur.com/a/82JV7I9</a>
>This attachment exploits vulnerability CVE-2023-41990 in the undocumented, Apple-only TrueType font instruction ADJUST for a remote code execution. This instruction existed since the early 90’s and the patch removed it.<p>This is getting ridiculous. How many iMessage exploits have there now been via attachments? Why aren't Apple locking down the available codecs? Why isn't BlastDoor doing its job?<p>This is really disappointing to see time and time again. If a simple app to send and receive messages is this hard to get right, I have very little hope left for software.
It’s quite unfortunate that Apple doesn’t allow users to uninstall iMessage, it seems to be the infection vector for advanced threats like this, NSO group, etc. Presumably it’s to avoid the support burden, but they could gate it behind having Lockdown Mode enabled for a week or something to shake out the vast majority of mistaken activations.
Asking who had the motive to target Russian government officials, knowledge of the attack vectors, a history of doing so, and the technical and logistical ability to pull it off leads Kaspersky, and me, to the only rational conclusion: that Apple cooperated with the NSA on this exploit. I assume they only use, and potentially burn, these valuable methods in rare and perhaps desperate instances. I expect the Russian and Chinese governments' bans on the use of iPhones will not be lifted and will expand to other governments. Just as the sanctions have backfired, this tactic will also backfire by reducing trust in Apple, which is the core of their value proposition.
Now I am thinking Kaspersky should not have published this information. What a bad decision. Instead they should have sold it to the Russian government, which I am sure could find a lot of interesting uses for these "debugging features" and would offer a good reward.
>The resulting shellcode, in turn, went on to once again exploit CVE-2023-32434 and CVE-2023-38606 to finally achieve the root access required to install the last spyware payload.<p>Why isn't Apple detecting the spyware/malware payload? If only apps approved by Apple are allowed on an iPhone, detection should be trivial.<p>And why has no one bothered to ask Apple or ARM about this 'unknown hardware'?<p>>If we try to describe this feature and how the attackers took advantage of it, it all comes down to this: they are able to write data to a certain physical address while bypassing the hardware-based memory protection by writing the data, destination address, and data hash to unknown hardware registers of the chip unused by the firmware.<p>And finally, does Lockdown Mode mitigate any of this?
It’s kind of simple imo. Apple is an American company, and after Jobs died, Apple quickly signed up to working with the NSA and enrolled in the PRISM programme.<p>Apple, like any other US company, has to abide by the law and do what it is told. If that means hardware backdoors, software backdoors, or giving the NSA a heads-up about a vulnerability during the time it takes to fix it (so the NSA can make good use of it), then they will.<p>Only someone with great sway (like Jobs) could have resisted something like this without fear of the US Govt coming after him. His successor either didn’t have that passion for privacy or the courage to resist working with the NSA.<p>Anyone, anywhere with an iPhone will be vulnerable to the NSA breaking into their phone anytime they please, thanks to Apple. And with Apple now making their own silicon, the hardware itself will be even more of a backdoor.<p>Almost every single staff member at Apple will be none the wiser about this, obviously, and unable to do anything about it even if they knew; their phones are just as fair game to tap whenever the spies want.<p>I am speculating. But in my mind, it’s really quite obvious. Just like how PRISM made me win an argument I had with someone who was a die-hard Apple fan and thought they would protect privacy at all costs... 6 months later, Snowden came along and won me that argument.