My opinion:

1. Every pedophile knows about the existence of this system, so I don't think it will be useful for fighting those monsters, except maybe marginally.

2. Anyway, is that even legal? Even if some crazy person stores material on his Apple hardware, isn't that an illegal search, unusable in a court of law?

3. Child abuse is often used as a Trojan horse to introduce questionable practices. What if:

- the system is used to look for dissidents: I look for people who have a photo of the Tiananmen Square protests on their PC, for example;

- it's used for espionage: I have the hash of some documents of interest, so every PC holding that kind of document becomes a valuable target;

- it's used for profiling people: you have a computer virus sample on your PC -> security researcher/hacker.

I think the system is prone to all kinds of privacy abuse (a sketch of how easily the matching engine can be retargeted follows below).

4. This could be part of the previous point, but because I think it's the final and real reason for this system's existence, I'll give it its own section: fighting piracy. I think one of the real reasons is to discourage the exchange of illegal multimedia material in order to enforce copyrights.

For the listed reasons, I think this is a bad idea. Let me know what you think.
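To make point 3 concrete, here is a minimal sketch of hash-blocklist scanning in Python. The hash value and paths are hypothetical placeholders, and SHA-256 stands in for Apple's perceptual NeuralHash; the point is only that the matching logic is content-agnostic, so whoever controls the hash database controls what is searched for.

    # Minimal sketch of hash-blocklist scanning (hypothetical values).
    # SHA-256 stands in for a perceptual hash here; the content-agnostic
    # matching logic is the same either way: swap the database, and the
    # very same scanner hunts for leaked documents, protest photos, or
    # malware samples instead of CSAM.
    import hashlib
    from pathlib import Path
    from typing import Iterator

    # Whoever supplies this set decides what "contraband" means.
    # (Placeholder digest: SHA-256 of the empty file.)
    BLOCKLIST: set[str] = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def scan(root: Path) -> Iterator[Path]:
        """Yield every file under `root` whose SHA-256 digest is blocklisted."""
        for path in root.rglob("*"):
            if path.is_file():
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                if digest in BLOCKLIST:
                    yield path

    if __name__ == "__main__":
        for match in scan(Path.home() / "Pictures"):
            print(f"match reported: {match}")

Nothing in the scanner knows or cares whether the blocklist contains CSAM, Tiananmen photos, or leaked cables; only the party curating the database does.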
An Apple recruiter recently reached out to me. I am in a fortunate position to turn down opportunities, so I made sure to explain that I am not interested in working for a company that is at the forefront of enabling further infringement on people's privacy. If you are able to push back, do it in any small way that you can.
I am beginning to wonder if this was the plan all along. Back in 2013, via the Snowden leaks, it was revealed that Apple was associated with the NSA domestic surveillance program, PRISM. It appears they (NSA and Apple, et al.) pulled out due to the level of negative PR.

After 8 years, the intelligence community and tech companies figured out they could sell their surveillance through a thinly veiled effort to "protect X group" (in this case, children).
Let me tell you my perspective as a "screeching minority" long-time Apple user. You may be right. We professionals who evangelized a lot of people to Apple have no power or influence over the core target market of today's Apple. Yep.
But I can assure you that I personally, as will my colleagues, will do everything in our power to hurt Apple's public image and brand, to give real information to our clients, friends, and families, and to educate people about why smartphone convenience is slavery to the Tech Lords, and how in the future all this data will shape a Digital ID in a Social Credit System that will render people's freedom obsolete.

To all apologists, Apple employees, and shareholders who hold their stock after this, I have a simple message: F*ck you. No, seriously. Go to hell.
You created and supported the monster that will eat you in the end.
Pushing a Spyware Engine is literally abuse of all people, including children, and it's much worse than any problem they claim to fight, even if you believe them.

By this move, people are indoctrinated with the idea that being watched by someone big and powerful is OK. They learn to accept such abuse, and what could be worse for anyone's safety than learning that? If one is serious about safety, one should learn to walk away from such abuse first, just as with any other abuse.

This is an attempt to legalize the installation of such a Spyware Engine. Nothing more. The story exists only to sell the move through the emotional response of naive people, because *high emotions* are when people do the poorest thinking about long-term consequences. Think about vendettas and the consequences they bring.

Those people should be educated about what the real abuse is, and they should teach their children to recognize it, because the abuse by Apple is already here, and it is much worse than the problem they claim to be trying to solve. People need to understand that it will only get worse with time.
This was the last straw for me. I've started looking at alternatives (for iCloud Photos there are literally no decent options so far for syncing 400 GB of photos) and have replaced iCloud files with just a simple NAS at home plus Tailscale.

I doubt I'll buy a new iPhone next time.
By the same reasoning, there are other criminals using iDevices, so why stop at this? Even pre-crime police work may be feasible with AI and advanced pattern recognition.
There is another information leak that I have not seen mentioned in any news report. If an image hash matches the CSAM database, the image is sent to Apple, encrypted and accompanied by a "safety voucher". Apple can decrypt the image only if they receive enough vouchers, and so they claim that they have no information about the user when the number of vouchers is below the threshold.

But actually they do have information: they know that a user has a specific number of images which are perceptually similar to known CSAM material. This information is not conclusive, but it's also not nothing. For example, could a court order Apple to release the unencrypted iCloud backups of all users who had at least one match?
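A toy illustration of the threshold idea, using plain Shamir secret sharing in Python. This is not Apple's actual protocol (which layers threshold sharing under private set intersection), only the core mechanism; the threshold of 30 is the ballpark figure Apple reportedly announced.

    # Toy Shamir (t-of-n) secret sharing to illustrate the voucher threshold.
    # NOT Apple's protocol; their design wraps this idea in private set
    # intersection and synthetic vouchers. The point of the sketch: below
    # the threshold the server cannot recover the key, but with a naive
    # scheme like this it still sees HOW MANY shares (matches) it holds --
    # exactly the metadata leak described above.
    import secrets

    PRIME = 2**127 - 1  # a Mersenne prime; any prime > secret works for a toy

    def make_shares(secret: int, threshold: int, n: int) -> list[tuple[int, int]]:
        """Split `secret` into n shares; any `threshold` of them reconstruct it."""
        coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
        def poly(x: int) -> int:
            acc = 0
            for c in reversed(coeffs):  # Horner's rule
                acc = (acc * x + c) % PRIME
            return acc
        return [(x, poly(x)) for x in range(1, n + 1)]

    def reconstruct(shares: list[tuple[int, int]]) -> int:
        """Lagrange-interpolate the hidden polynomial at x = 0."""
        total = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * -xj % PRIME
                    den = den * (xi - xj) % PRIME
            total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
        return total

    key = secrets.randbelow(PRIME)            # per-user decryption key
    shares = make_shares(key, threshold=30, n=100)

    assert reconstruct(shares[:30]) == key    # at threshold: key recovered
    assert reconstruct(shares[:29]) != key    # below threshold: garbage...
    print(f"...but the server still knows it holds {len(shares[:29])} matches")

For what it's worth, Apple's technical summary claims that synthetic vouchers are injected precisely to obscure the below-threshold match count, so whether this metadata actually leaks turns on the details of their protocol, not on the threshold idea itself.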
A question for the legal experts out there: if a potential CSAM match is found during client-side scanning, but the match has not yet been confirmed by an Apple employee to actually be CSAM, does Apple have the option, legally speaking, to SIMPLY DELETE the "gray-area" content in place (just like a regular virus scanner), instead of sending it to Apple for further analysis?

Someone performs "an implication by malicious actors" attack on your iPhone/iPad, and the injected content simply gets deleted. You take a (false-positive) photo with your iPhone/iPad, and it simply disappears (making you retake it). No private content is ever sent anywhere, no horrible accusation is ever made, no CSAM ever gets uploaded to iCloud. Simple.

Why doesn't Apple do that?
This is Apple speaking out of both sides of its mouth, because they will be punished by China if they don't implement this.

China can submit hashes of political images Xi disagrees with to obtain lists of enemies.

The only way to sell this tool to the West is under the banner of protecting innocent children; this is their best shot at squaring the circle.
EFF conflates the CSAM detection and the iMessage safety features in the first paragraph. Disappointing that they can’t make their case with the facts.
This is effectively a virus scanner. Files are hashed (in a fancy way), compared against known hashes, and matches are reported. Your Windows desktop has Windows Defender.

The EFF lost a lot of credibility with me after the Best Buy case. They made a big fuss about how Geek Squad employees were agents of the state for reporting CSAM found on a customer's hard drive during a requested file recovery. When searched, the defendant had CSAM on 5 different devices. The case was dismissed on a technicality. Never did the EFF mention this. Never did they say they were defending a gynecologist who had CSAM on 5 devices. Nope, it was spin city.

Now here we are. Apple has made a privacy-preserving anti-virus scanner. It does not upload unknown files as Windows Defender does, and it does not scan everything. It scans your photos, for known CSAM images, when you are using iCloud backups, in order to comply with the law that they must scan their hosting services for CSAM. It has a narrower scope than an anti-virus scanner, and a bigger societal benefit.

We seem to have taken the idea that bad things are sometimes promoted through "think of the children" to mean we must oppose anything involving the protection of children. Our greatest fear here is the government using a national security letter to search for banned ISIS memes? Let's address that slippery slope when we come to it, and let's note that we do not see Windows Defender or similar tools doing the same. This is great; I hope it puts a bunch of pedos behind bars.
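For reference, "hashed in a fancy way" means a perceptual hash: unlike a cryptographic hash, visually similar images produce nearby hash values, so matching is done by Hamming distance rather than exact equality. A minimal sketch using difference hashing (dHash); NeuralHash is a neural-network-based hash, so this only illustrates the class of technique, and the known-hash value below is a placeholder.

    # Minimal perceptual-hash sketch using dHash (difference hash). Stands
    # in for NeuralHash, which is a learned hash; the shared idea is that
    # small edits (resizing, recompression) barely change the hash, so
    # matching uses Hamming distance instead of exact equality.
    from PIL import Image  # pip install Pillow

    def dhash(path: str, size: int = 8) -> int:
        """64-bit hash: each bit records whether a pixel is brighter than its right neighbor."""
        img = Image.open(path).convert("L").resize((size + 1, size))
        px = list(img.getdata())
        bits = 0
        for row in range(size):
            for col in range(size):
                left = px[row * (size + 1) + col]
                right = px[row * (size + 1) + col + 1]
                bits = (bits << 1) | (left > right)
        return bits

    def hamming(a: int, b: int) -> int:
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    # Hypothetical usage: flag photos within a few bits of a known hash.
    known_hashes = {0x3C3E1E1E0E0E0606}  # placeholder value
    if any(hamming(dhash("photo.jpg"), h) <= 4 for h in known_hashes):
        print("match reported")

Matching by distance rather than equality is what lets the system survive recompression and resizing, but it is also why false positives and deliberately crafted collisions are possible in a way they are not for an exact hash like SHA-256.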