The fact that Apple clearly cares a <i>lot</i> about software security from at least two angles--both securing the device for the user from external attackers (I know they truly care about this, even though they frankly seem to focus only on users in the West and have been known to throw people from the East under the bus without a fight: see their attempt to downplay Google Project Zero's critical discoveries a few years ago) and (sadly, but critically, as it establishes increased direct incentives) securing the software running on the device written by themselves and other content providers from the ostensible "owner" of the hardware (whether for digital rights management or anti-competitive purposes)--and has a centralized <i>dedicated</i> team that can learn from experience (to avoid making the same mistakes over and over across different releases, which was the norm at companies like Samsung for a long time), a team made up of incredibly smart people (including mercenary turncoats from the jailbreak community who either cared more about working on fun problems than fighting for the user or simply needed the cash so badly that their morals had to lose out) that has been staring at this problem now for well over a decade and that is on the cutting edge of deploying "mitigations" throughout both their software <i>and hardware</i>, even if said mitigations cause a multi-percent loss of performance for the user and even if they require changes to some <i>or all</i> (omg) existing software, AND YET--even if they have certainly made progress (the jailbreak ecosystem has been severely affected... I have given much longer talks on the status--such as the final segment of my "Hindsight can be 50/50" talk from 360|iDev a couple of years ago--but the important thing to appreciate is that the time from having high-quality exploits for specifically modern devices to when the vulnerabilities they rely on have been patched has gone from averaging months to averaging <i>negative</i> months as of a few years ago)--it SOMEHOW is STILL the case that people continue to find ways (even if they are highly costly to pull off and even if the resulting abilities are "limited": you really don't need much to do a denial of service or exfiltrate data... we focus a lot on what is required to build a high-quality, easy-to-use software modification stack, because of alternatives to the App Store like my Cydia, but that is "overkill" to someone merely trying to be malicious) to hack and slash through all of these defenses (even remotely!! the yearly iMessage exploit has almost become a <i>trope</i>) should be taken as a <i>visceral</i> demonstration that our industry simply seems <i>incapable</i> of developing secure software, and we either need an industry-wide "come to Jesus" moment whereby we reboot the entire thing with higher/safer abstractions written by fewer/smarter developers working in languages and with tools that allow us to <i>prove</i> the security of what we build (and no: Rust <i>isn't enough</i>... I give lectures at both college courses and hackathons on how real-world jailbreak exploits have worked for both iOS and Android, and part of what makes them fun even for beginning developers is how many of the bugs are "conceptual"... 
it is absolutely true that memory safety <i>helps</i>, but we need to be striving for near-perfection here as the stakes are so high, and yet, somehow, we <i>often</i> manage to develop software that fails to provide safety for users without even a single incorrect use of memory storage primitives), which is going to require groundbreaking research that honestly might simply prove the impossibility of the task, or, maybe, we just need to <i>give up</i> and make sure that everyone everywhere lives in the same constant <i>and healthy</i> state of fear that those of us who "know better" already do: the fear that every single computer we are surrounded by is a liability that could turn on us at any moment... and one day, it <i>is</i> going to happen: it isn't going to be a freedom fighter like Charlie Miller who figures out how to remotely disable the brakes on every Jeep Cherokee on the road simultaneously--a <i>true story</i> that I believe every software developer should be taught in school in the same way that physicists are routinely taught about tragic historical mistakes in the handling of radioactive materials, to make sure no one casually deploys something that ties people's safety so directly to bugs in centralized servers--but it is going to be a "rogue nation state", "terrorist group", or, at this rate and with our luck over the last few years, someone we might best describe as a "supervillain" (a movie where someone like George Hotz is the antagonist would be absolutely amazing: if anyone writes that script and needs a science advisor to help make the technical details ridiculously accurate, hit me up) that gives civilization a serious and deadly lesson in cyber-security. :| :/ :(
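
To make the "conceptual bug" point above concrete, here is a minimal sketch--an invented example, not drawn from any of the lectures, exploits, or products mentioned earlier; the serve_file handler, route, and paths are all hypothetical--of a file-serving routine written in completely safe Rust that is nonetheless trivially insecure, because the flaw is a path-traversal mistake in the logic rather than any misuse of memory:

    use std::fs;
    use std::path::PathBuf;

    // Hypothetical handler for something like GET /files/<name>: every line
    // is memory-safe Rust with zero `unsafe`, yet the routine leaks files,
    // because the bug is conceptual, not a misuse of any memory primitive.
    fn serve_file(requested: &str) -> std::io::Result<String> {
        let base = PathBuf::from("/var/app/public");
        // Joining attacker-controlled input without rejecting ".." means a
        // request for "../../etc/passwd" walks right out of `base`.
        let path = base.join(requested);
        fs::read_to_string(path) // happily discloses any readable file
    }

    fn main() {
        // No memory error anywhere, yet the program exfiltrates a file it
        // was never supposed to serve.
        match serve_file("../../etc/hostname") {
            Ok(contents) => println!("{contents}"),
            Err(e) => eprintln!("{e}"),
        }
    }

The usual fix--canonicalize the joined path and verify it still lies under the base directory before opening it--is a statement about <i>intent</i>, which is exactly the kind of property that no borrow checker or bounds check will ever supply on its own.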