Pretty clear cut for me: iMessage either offers robust end-to-end encryption, or it doesn't. (Intentional backdoors place it in the latter, of course.)
I want true end-to-end encryption.
I want to ignore most of this article to complain about an inaccuracy that keeps coming up.

> More broadly, they said the change will break end-to-end encryption for iMessage, which Apple has staunchly defended in other contexts.

But... it wouldn't. The iMessage feature doesn't expose the contents of your message to anyone else under any circumstance.

If you're a child under 13 and your parents have opted in to this feature, you get a choice: view a nude image that's been sent to you and have your parents notified that you chose to, or decline to view it with no notification to anyone. (And once you're 13 or older, no notifications occur at all.)

There *are* potential issues with this, mostly relating to abusive families being controlling. They'd have to do weird things like forcing their teenaged children to keep making new under-13 accounts to actually take advantage of it like that, though. And none of these issues affects the e2e status of iMessage in any way.

Apple really screwed up the PR by launching the iMessage feature alongside the scanning-and-reporting iCloud Photos feature. There's so much confusion out there about this.

(The breaking-e2e concern *does* apply to the iCloud Photos scanning... not that iCloud Photos is currently e2e, of course.)
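To make those conditions concrete, here's a minimal Swift sketch of the decision flow as described above. The names (Account, parentIsNotified, etc.) are invented for illustration and are not Apple's API; the point is just that a notification only fires in one narrow case.

```swift
// Illustrative sketch only, not Apple's code: a parent is notified only when an
// under-13 child, on an account where the parent opted in, chooses to view a
// flagged image anyway.

struct Account {
    let ageInYears: Int
    let parentOptedIn: Bool
}

enum ChildChoice {
    case viewImage
    case declineImage
}

/// Returns true only in the single case the comment describes:
/// under 13, parent opted in, image flagged, and the child chose to view it.
func parentIsNotified(account: Account, imageFlaggedAsNude: Bool, choice: ChildChoice) -> Bool {
    guard imageFlaggedAsNude else { return false }       // nothing detected, nothing happens
    guard account.ageInYears < 13 else { return false }  // 13+: warnings maybe, no notification
    guard account.parentOptedIn else { return false }    // feature is opt-in per family
    return choice == .viewImage                           // declining notifies no one
}

// A 12-year-old on an opted-in account who declines triggers no notification.
print(parentIsNotified(account: Account(ageInYears: 12, parentOptedIn: true),
                       imageFlaggedAsNude: true,
                       choice: .declineImage))  // false
```

In every branch, the message contents stay on the device; nothing here involves decrypting or exfiltrating the conversation.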
So, what parts of my phone are actually being scanned by this system? It seems like it's rummaging through any images that touch iCloud, which would include:

- iMessage
- WhatsApp
- Camera
- Matrix (?)
- Any other application that you give 'photo' permission to?
Is aljazeera.com the only outlet willing to cover this aspect of Apple's decision? It seems like the US media is effectively ignoring the complaints.
After seeing all these posts here and not hearing much about it outside of this space, I'm starting to think Apple's cost-benefit analysis went like this: the profit from these features very likely convincing parents to spend a little more to get their children an iPhone as their first phone (and probably locking those children into the ecosystem) was greater than the number of users who would switch to something else, and any blowback would be diminished by "what about the children."
Siri, a few years into the future: Dave, I noticed that when you sent your last text message, my AI determined with 99% probability that you were driving. I'll be notifying the local authorities, and you will be receiving a citation for that. Please note that after the citation has been paid and you have been through rehabilitation, they'll send me an unlock code so I can re-enable your SMS account.

Or:

Hi Dave, I've noticed that you have been discussing Covid vaccines with your friends. Apple thinks your position on this causes public harm, so we will be adding additional information to each of your messages that discuss this subject. Thank you for being part of our team.
Is there a possibility that this could be some kind of Dual_EC type of backdoor they are trying to insert under the guise of something "positive" -- i.e., fighting CSAM? As others have said elsewhere, it seems so out of the blue, as if they were pressured from somewhere, which I default to assuming is government. Reading "The Hacker and the State" by Ben Buchanan made me even more aware of how much depth and penetration there is in the private-government partnership in the US.
I'm not sure if I'm missing something here, but from what I understand, what Apple is proposing is an enormous privacy boon that seems to be completely misunderstood.

All cloud providers are required to, and do, make these scans.

The proposal provides a way for Apple to comply with that requirement while at the same time allowing them to completely lock themselves out of being able to decrypt your data in the cloud, except in this specific case, which is secured and controlled by your device.

This means they *couldn't comply with a nation-state's demand to secretly decrypt your data*, even if they were legally compelled to, unless they changed how the cryptography at play here works.

Which, I expect, would not go unnoticed.

Am I missing something?
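For anyone unfamiliar with the "locked out unless the device cooperates" part: Apple's published design described NeuralHash matching, private set intersection, and threshold secret sharing over per-image "safety vouchers". The Swift sketch below only illustrates the threshold piece of that idea: each on-device match contributes one Shamir share of a per-account secret, and with fewer than the threshold number of shares the server can recover nothing. The prime, the threshold of 3, and all function names here are simplifications I made up for illustration, not Apple's implementation.

```swift
import Foundation

// Minimal Shamir secret sharing over a small prime field, to illustrate the
// threshold idea only. Real systems use larger fields and vetted libraries.

let p = 2147483647  // prime modulus (2^31 - 1), small enough to avoid overflow in Int math

func mod(_ x: Int) -> Int { ((x % p) + p) % p }

// Evaluate a polynomial (constant term = secret) at x, Horner's method.
func evalPoly(_ coeffs: [Int], at x: Int) -> Int {
    var result = 0
    for c in coeffs.reversed() {
        result = mod(result * x + c)
    }
    return result
}

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int, threshold: Int, count: Int) -> [(x: Int, y: Int)] {
    var coeffs = [secret]
    for _ in 1..<threshold {
        coeffs.append(Int.random(in: 0..<p))
    }
    return (1...count).map { x in (x: x, y: evalPoly(coeffs, at: x)) }
}

// Modular exponentiation; inverse via Fermat's little theorem (p is prime).
func modPow(_ base: Int, _ exp: Int) -> Int {
    var result = 1, b = mod(base), e = exp
    while e > 0 {
        if e & 1 == 1 { result = mod(result * b) }
        b = mod(b * b)
        e >>= 1
    }
    return result
}
func modInverse(_ a: Int) -> Int { modPow(a, p - 2) }

// Lagrange interpolation at x = 0 recovers the constant term (the secret).
func reconstruct(_ shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var num = 1, den = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = mod(num * mod(-sj.x))
            den = mod(den * mod(si.x - sj.x))
        }
        secret = mod(secret + mod(si.y * mod(num * modInverse(den))))
    }
    return secret
}

// Example: pretend each share is released only when an image matches on-device.
// The "account key" becomes recoverable only once 3 matches exist.
let accountKey = 123456789
let shares = makeShares(secret: accountKey, threshold: 3, count: 5)
print(reconstruct(Array(shares.prefix(3))))  // 123456789
print(reconstruct(Array(shares.prefix(2))))  // wrong value: two shares reveal nothing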
Well-intentioned program, but with ultimately unforeseeable bad consequences in other areas.

Good for Apple for stopping, and good for society for pressuring them to stop. Seems like a win-win all around.