(Context: I teach computer security at Princeton and have a paper at this week's Usenix Security Symposium describing and analyzing a protocol that is similar to Apple's: <a href="https://www.usenix.org/conference/usenixsecurity21/presentation/kulshrestha" rel="nofollow">https://www.usenix.org/conference/usenixsecurity21/presentat...</a>.)<p>The proposed attack on Apple's protocol doesn't work. The user's device adds randomness when generating an outer encryption key for the voucher. Even if an adversary obtains both the hash set and the blinding key, they're merely in the same position as Apple: able to decrypt only if there's a hash match. The paper could do a better job explaining how the ECC blinding scheme works.
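To make the blinding argument concrete, here is a toy sketch of the DDH-style idea. This is not Apple's actual implementation: it uses exponentiation in a multiplicative group mod a prime in place of elliptic-curve scalar multiplication, ignores the cuckoo-table lookup, and all names are illustrative. The point is that the client's fresh randomness `r` means the voucher key can only be recomputed by someone holding the server secret `s` <i>and</i> seeing a matching hash.

```python
# Toy sketch of DDH-style blinding (assumption: the real protocol uses
# elliptic curves and a cuckoo table; mod-p exponentiation stands in here).
import hashlib
import secrets

P = 2**255 - 19  # a convenient prime modulus for the toy group


def hash_to_group(data: bytes) -> int:
    """Map a (Neural)hash to a group element; illustrative only."""
    e = int.from_bytes(hashlib.sha256(data).digest(), "big")
    return pow(2, e, P)


# Server setup: blind each database hash with the secret exponent s,
# then publish only the blinded entries to devices.
s = secrets.randbelow(P - 1) + 1
db_hash = b"known-image-neuralhash"
blinded_entry = pow(hash_to_group(db_hash), s, P)

# Client: for its own image hash y, pick fresh randomness r (the "outer"
# randomness), send r*H(y), and encrypt the voucher under r*(blinded entry).
y = b"known-image-neuralhash"  # matching case
r = secrets.randbelow(P - 1) + 1
client_point = pow(hash_to_group(y), r, P)   # included in the voucher
client_key = pow(blinded_entry, r, P)        # = H(db_hash)^(s*r)

# Server: applies s to the client's point; the keys agree iff the hashes do.
server_key = pow(client_point, s, P)         # = H(y)^(r*s)
assert server_key == client_key
```

Note what the sketch shows about the proposed attack: an adversary who steals `blinded_entry` and even `s` still cannot compute `client_key` for a non-matching image, because `r` never leaves the device except inside `client_point`, and recovering `r` from it is a discrete-log problem.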
It won't be long until these types of systems are mandated. Combined with a hardware root of trust, it's not inconceivable that modifying your hardware so it doesn't report home will also be made a crime. And it never stops with CSAM: soon it's terrorism, and then whatever vague new definition they use.<p>The focus on CSAM seems extremely hypocritical when authorities make so little effort to stop ongoing CSA. I would encourage everyone to research the Sophie Long case. Unless there is image or video evidence, the police make little effort to investigate CSA because it's resource-intensive.
Regardless of whether this attack works, you'd assume this scheme widens the attack surface against pictures in iCloud and against iCloud users. One attack I could imagine is a hacker uploading child porn to a compromised device to trigger immediate enforcement against its owner. Sure, maybe there are more controls involved, but would you carry around a very well-protected, well-designed hand grenade in your wallet just because it's only supposed to explode if you're bad?
For some reason, after reading the initial reporting on this system, I thought it ran against <i>any</i> photos on your iPhone, but now that I've read the actual paper, it seems it only applies to photos destined to be uploaded to iCloud. So users can opt out by not using iCloud?
The question presumes the database leak also comes with the server-side secret for blinding the CSAM database, which is unlikely (that’s not how HSMs work) and would be a general catastrophe (it would leak the NeuralHashes of photos in the NCMEC database, which are supposed to remain secret).
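To see why leaking the blinding secret would be catastrophic, here is a toy sketch (assumption: a multiplicative group mod a prime stands in for the real elliptic curve, and the names are illustrative): with the secret exponent in hand, every published blinded entry can be unblinded back to the hash's group element.

```python
# Toy sketch: leaking the blinding secret s unblinds the published table.
# Illustrative mod-p group, not Apple's actual elliptic-curve construction.
import hashlib
import math
import secrets

P = 2**255 - 19  # toy prime modulus


def hash_to_group(data: bytes) -> int:
    e = int.from_bytes(hashlib.sha256(data).digest(), "big")
    return pow(2, e, P)


# Pick s invertible modulo the exponent group order so unblinding is defined.
while True:
    s = secrets.randbelow(P - 1) + 1
    if math.gcd(s, P - 1) == 1:
        break

entry = hash_to_group(b"some-neuralhash")
blinded = pow(entry, s, P)        # what devices normally see

# With the leaked secret, invert the blinding exponent and recover the entry.
s_inv = pow(s, -1, P - 1)
recovered = pow(blinded, s_inv, P)
assert recovered == entry         # every table entry is exposed this way
```

This is exactly why the secret belongs in an HSM: the published table alone reveals nothing, but the table plus the secret reveals the hashed database contents.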
Pretty soon, hosting your own infra and not using the mandated govt phone could be made a crime.<p>But think of the children and the security of society. Couple that with constant monitoring of your car, and you can be monitored anywhere.
Why does Apple even bother with encryption? They should just skip all of the warrant requirements, use their iCloud keys to unlock our content, and store it unencrypted at rest.<p>Maybe they can also build an API so that governments can easily search for dissidents without the delays that due process of law causes.