The clear distinction is that iCloud Mail scanning doesn't happen on device.<p>For my part, all Apple needs to do is move CSAM scanning to the cloud. No service provider can be expected to keep images of child abuse on their servers. Apple would join myriad cloud service providers in scanning for and reporting such material.<p>My problem is the use of my own device to run the scan. It's a waste of system resources. Presumably, a trivial software update down the line could expand its ambit to locally stored files. And then there's the issue of Apple being compelled to run searches for non-CSAM hashes.
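<p>To make that last point concrete, here's a rough sketch (hypothetical names, and ordinary SHA-256 standing in for Apple's NeuralHash perceptual hashing): the matching code just compares digests against whatever list it's handed, so nothing in it is specific to CSAM, and pointing it at locally stored files is only a matter of which paths get fed in.

    import CryptoKit
    import Foundation

    // Hypothetical sketch: the matcher has no notion of what the
    // blocklist represents; it flags anything whose digest is on
    // the list supplied by whoever controls that list.
    struct HashMatcher {
        let blocklist: Set<Data>  // digests provided by the list operator

        func flags(_ fileData: Data) -> Bool {
            let digest = Data(SHA256.hash(data: fileData))
            return blocklist.contains(digest)
        }
    }

    // Example usage with a made-up entry; scanning local files vs.
    // photos queued for upload is just a question of what gets passed in.
    let blocklist: Set<Data> = [
        Data(SHA256.hash(data: Data("example-known-bad-content".utf8)))
    ]
    let matcher = HashMatcher(blocklist: blocklist)
    print(matcher.flags(Data("some local file contents".utf8)))  // false unless the digest is listed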
"Privacy is a human right" really does seem like it was only an advertising slogan.<p>When Apple says "privacy" they seem to have only meant from advertisers and hackers. I'm surprised and disappointed.
Something wasn’t squaring between the claims that Apple has already been scanning iCloud Photos for years, that Apple reports hundreds of instances of CSAM while Facebook reports millions, and that Apple knows they have a major CSAM problem. Good to find out the problem was with the first one.<p>I wonder if the “you know they already do this server-side, right?” people feel the slightest bit chastened.
It definitely makes sense to scan things that are passed around - mail is an example of that, and Gmail [and also Facebook and Twitter] scans for this, along with scanning for computer viruses and, in some cases (public posts), copyrighted content.<p>It makes absolutely no sense to scan photos that people aren't sharing with anyone else. Why bother scanning people's photo backups at all?<p>With the exception of "shared photo albums", I don't see why they're doing this.
Apple doesn't give a shit about the children; this is such a simple and transparent play. If they did, the kids wouldn't still be assembling their phones in China.