TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


Apple enabling client-side CSAM scanning on iPhone tomorrow

1182 points by robbiet480, almost 4 years ago

101 comments

3pt14159 almost 4 years ago
I'm really conflicted about this.

For context, I deeply hate the abuse of children, and I've worked on a contract before that landed 12 human traffickers in custody who were smuggling sex slaves across borders. I didn't need to know details about the victims in question, but it's understood that they're often teenagers or children.

So my initial reaction when reading this Twitter thread was "let's get these bastards," but on serious reflection I think that impulse is wrong. Unshared data shouldn't be subject to search. Once it's shared, I can make several cases for an automated scan, but a cloud backup of personal media should be kept private. Our control of our own privacy matters. Not for the slippery slope argument or for the false positive argument, but for its own sake. We shouldn't be assuming the worst of people without cause or warrant.

That said, even though I feel this way, a not-small-enough part of me will be pleased if it is deployed, because I want these people arrested. It's the same way I feel when terrorists get captured even if intelligence services bent or broke the rules. I can be happy at the outcome without being happy at the methods, and I can feel queasy about my own internal, conflicted feelings throughout it all.
ajsnigrutin almost 4 years ago
So if I understand correctly, they want to scan all your photos, stored on your private phone that you paid for, and check whether any of the hashes match hashes of child porn?

So... will all your hashes be uploaded to the cloud? How do you prevent them from scanning other stuff (memes, leaked documents, trump-fights-cnn-gif, ... to profile the users)?

Or will a huge database of child porn hashes be downloaded to the phone?

Honestly, I think it's one more abuse of terrorism/child porn to take away people's privacy and mark everyone opposing the law as terrorists/pedos.

...also, as in the thread at the original URL, crafting false positives and spreading them around (think 4chan mass e-mailing stuff) might cause a lot of problems too.
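Both deployment models the comment asks about (hashes uploaded to the cloud, or a hash database downloaded to the phone) come down to the same core operation: a set-membership check against known-bad hashes. A minimal on-device sketch, with a purely hypothetical blocklist entry, might look like the following; note that real systems use perceptual hashes rather than SHA-256, precisely so that resized or re-encoded copies still match:

```python
import hashlib

# Hypothetical blocklist of known-bad image hashes shipped to the device.
# A real deployment would hold perceptual hashes and tolerate small edits;
# SHA-256 is exact-match only, so this is purely illustrative.
BLOCKLIST = {hashlib.sha256(b"example-known-bad-bytes").hexdigest()}

def is_flagged(image_bytes: bytes) -> bool:
    """Check an image's hash against the local blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in BLOCKLIST
```

With an exact cryptographic hash, changing a single byte of the image defeats the check entirely, which is why deployed systems use perceptual hashing instead, and why false positives become possible at all.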
SXX almost 4 years ago
It's funny to see that anyone here could find this acceptable. I wonder what the comments would be after Apple starts to scan phones for anti-censorship or anti-CCP materials in China. Or for some gay porn in Saudi Arabia.

Because, you know, in some countries there are materials that the local government finds more offensive than mere child abuse. And once surveillance tech is deployed, it's certainly gonna be used to oppress people.
secondo almost 4 years ago
It's quite easy to extrapolate this and in a few steps end up in a boring dystopia.

First it's iPhone photos, then it's all iCloud files; that spills into Macs using iCloud, then it's client-side reporting of local Mac files, and somewhere along the way all the other Apple hardware I've filled my home with has received equivalent updates and is phoning home to verify that I don't have files, or whatever data they can see or hear, that some unknown authority has decided should be reported.

What is the utopian perspective of this which counterbalances the risks, for this to be a path worth taking?
implying almost 4 years ago
A new aspect of this is that because this is self-reported, and the end goal is to involve the criminal justice system, there is now (essentially) an API call that causes law enforcement to raid your home.

What would be the result of 'curl'ing back a few random hashes as positives from the database? Do I expect to be handcuffed and searched until it's sorted out? What if my app decides to do this to users? A malicious CSRF request, even?
etaioinshrdlu almost 4 years ago
Also, if they send perceptual hashes to your device, it's possible images could be generated back from those hashes. These aren't cryptographic hashes, so I doubt they are very good one-way functions.

Another thought: notice that they say "if too many appear". This may mean that the hashes don't store many bits of information (and would not be reversible) and that false positives are likely, i.e., one image is not enough to decide you have a bad actor; you need more.

But at Apple's scale, statistically, some law-abiding users would likely get snagged with totally innocent images.

Just a bad idea all around.
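To make the "not many bits of information" point concrete, here is a toy average-hash, one of the simplest perceptual hashes (the 8x8 pixel data is synthetic and the scheme is a stand-in, not Apple's actual algorithm): each image is reduced to just 64 bits, small edits barely move the hash, and similarity is judged by Hamming distance.

```python
def average_hash(pixels):
    """64-bit perceptual hash of an 8x8 grayscale image:
    one bit per pixel, set when the pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# A synthetic 8x8 "image", a slightly brightened copy, and its negative.
img = [(i * 37) % 256 for i in range(64)]
near_copy = [min(255, p + 2) for p in img]
negative = [255 - p for p in img]
```

Here `hamming(average_hash(img), average_hash(near_copy))` stays near zero while the negative lands near the maximum of 64, which illustrates both why re-encoded copies still match and why, across billions of photos, unrelated images can occasionally collide into the same 64-bit neighbourhood.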
keyme almost 4 years ago
Dear humans,

1) You willingly delegated the decision of what code is allowed to run on your devices to the manufacturer (2009). Smart voices warned you of today's present even then.

2) You willingly got yourselves irrevocably vendor-locked by participating in their closed social networks, so that it's almost impossible to leave (2006).

3) You willingly switched over essentially all human communication to said social networks, despite the obvious warning signs (2006-2021).

4) Finally, you showed no resistance to these private companies when they started deciding what content should be allowed or banned, even when it got purely political (2020).

Now they're getting more brazen. And why shouldn't they? You'll obey.
bennyp101 almost 4 years ago
Will this work differently depending on what country you are in? For instance, back in 2010 there was that ruling in Australia that naked cartoon children count as actual child porn. [1]

It's perfectly legal elsewhere (if a bit weird) to have some Simpsons/whatever mash-up of sexualised images, but if I flew on a plane to the land down under, would I then be flagged?

Edit: If this is scanning stuff on your phone automatically, and you have WhatsApp or whatever messenger set to save media automatically, then mass-texting an image that is considered "normal" in the sender's country but "bad" in the recipient's could get a lot of people flagged just by sending a message.

[1] https://arstechnica.com/tech-policy/news/2010/01/simpsons-powerpuff-girls-porn-nets-jail-time-for-australian.ars
choeger almost 4 years ago
Sorry to say it, but stuff like this has to happen at some point when people don't own their devices. Currently, nearly no one owns their phone, and at least EU legislation is underway to ensure that it stays this way. The next step will be to reduce popular services (public administration, banking, medicine) to access through such controlled devices. Then we are locked in.

And you know what? Most people *deserve* to be locked in and subject to automatic surveillance. They will wake up when their phone creates a China-style social score automatically, but then it will be far too late. It's a shame for those people that fought this development for years, though. But the "I have nothing to hide" crowd deserves to wake up in a world of cyber fascism.
somuchlan almost 4 years ago
Well, this really debunks my common phrase "Apple is a Privacy company, not a Security company."

I can't say I'm surprised they are implementing this (if true) under the radar. I can't imagine a correct way or platform for Apple to share this rollout publicly. I'm sure nothing will come of this, the press will ignore the story, and we all go back to our iPhones.
SrslyJosh almost 4 years ago
This will go great with zero-click iMessage exploits like this one: https://9to5mac.com/2021/07/19/zero-click-imessage-exploit/

Edit: Actually, this won't even require an exploit if they also scan media for people who have enabled "iMessage in iCloud".

Just send someone an image in the DB (or an image that's been engineered to generate a false positive) and wait for them to get raided.
deanclatworthy almost 4 years ago
The terrifying part about this is the potential for abuse. We have seen people arrested for having child porn in their web cache just from clicking on a bad link. I could inject your cache with any image I want using JS.

Presumably the same could apply to your phone. Most messengers save images automatically. I presume the images are scanned against the hashes immediately once saved, and the report is made immediately once it passes the reporting threshold. There's no defence against this. Your phone number is basically public information and probably in a database somewhere. You have no protection from abuse here if you're a normal citizen. I bet most people don't even turn the auto-save setting off on WhatsApp.
robbiet480 almost 4 years ago
This has worrying privacy implications. I hope Apple makes a public announcement about this but wouldn’t be surprised if they don’t. I also would expect EFF will get on this shortly.
eitland almost 4 years ago
OK. I can say this since I don't have anything to hide (edit: 1. that I am aware of, and 2. yet).

I switched to the Apple ecosystem 2 years ago and have been extremely happy.

I couldn't see a single reason to switch back.

Today that reason came. What goes on on my phone is my business.

I guess Fairphone next.

Again, I think I have nothing to hide now, so I can say this loud and clear now. Given what recent elections have shown us, we cannot know whether I will have something to hide in a few years (political, religious? Something else? Not that I plan to change, but things have already changed extremely much since I was a kid 30 years ago.)
gizdan almost 4 years ago
I'm gonna go out on a limb here.

At the end of the day, laws are relative, so to speak. The thought behind such a system is noble indeed, but as we've seen, anything any government gets their hands on, they will abuse. The classic example is PRISM et al. In theory it's great to be able to catch the bad guys, but it was clearly abused. This is from countries that are meant to be free, forward-thinking, etc., not authoritarian regimes.

People in this thread are asking what Saudi Arabia, China, etc. will do with the power Apple is adding; you bet your ass that they'll use it for their own gain.

I want to believe in such systems for good. I want child abusers caught. But a system that can equally be abused by the wrong people (and I guarantee you that will include Western countries too) ain't it.
skee_0x4459 almost 4 years ago
How the fuck am I supposed to know if that image I downloaded from some random subreddit is of a girl who is 17.98 years old? How long until we just use a NN to identify images of children automatically? She looks pretty young, so I guess you will get disemboweled alive in prison? What is stopping someone from planting an image on your phone, or a physical picture somewhere on your property? I'm so tired of this fucking dogma around child porn. You can always identify the presence of dogma by the accompanying vacuum of logic that follows in its wake. A teenage girl can go to jail for distributing pictures that she took of herself. Do I even need to say more?
figassis almost 4 years ago
And with this, the fear politics are in effect. Just from reading the comments, it seems one can no longer be 100% sure their phone is clean. So people will live in constant fear that on some random Tuesday the cops will come knocking; your reputation will be destroyed, and in the end, when you're cleared, you will have incurred incredible financial and mental costs. And this is all aside from the fact that your phone should be your phone, and no one else should be allowed in.
srmarm almost 4 years ago
You demo this tech working with child porn, and it maybe shows its worth with some ISIS training videos, but before long China will be demanding access on their terms as a condition of accessing their markets.

And at that point the well-meaning privacy advocate who worked hard to get some nice policies in place to protect users is booted off the project, because you can hardly tell the shareholders and investors who own the company that you're going to ignore $billions in revenue or let your rival get ahead because of some irrelevant political movement on the other side of the world.

It's happened plenty of times before and it'll happen again.
viktorcode almost 4 years ago
What I find disturbing is that almost all commenters here took that rumour for a fact. There's nothing to substantiate it, there's no evidence of scanning actually happening, and there's no historical precedent of a similar thing being done by Apple. And yet people working in tech, with supposedly developed critical thinking, took the bait.

Why? Is it simply because it fits their world view?
rinron almost 4 years ago
This matches up with how I view Apple's corporate thinking: "we know what's best", "the consumer is not to be trusted". Apple limits access to hardware and system settings; they block apps that don't meet moral standards, are "unsafe", or just might cause Apple to not make as much money. They do not significantly care what people say they want; after all, they know best.

A lot of people love not having options and having these decisions made for them.

I would never want a device like that, or with something that scans my device, but I think the vast majority of their customers, if they even hear about it, will think "I trust Apple, they know what's best, it won't affect me."

I'm OK with Apple doing it because I think most Apple users will be OK with it. I would not be OK with it if all Android devices started doing it, though.
ulfw almost 4 years ago
I'm a little bit confused here and hope maybe some of you can clear this up.

My parents took lots of photos of me as a baby/small child. Say, lying naked on a blanket, or a naked 2-year-old me in a kiddie pool in the summer in our backyard. Those are private photos, and because it was the 1970s they were just taken with a normal non-digital camera. They were OBVIOUSLY never shared with others, especially outside the immediate family.

Transform that into the 2020s, and today these types of pictures would be taken with your iPhone. Would they now be classified as child pornography, even though they weren't meant to be shared with anyone, nor were they ever shared with anyone? Just your typical proud-parent photo of your toddler.

Sounds a bit like a slippery slope, but maybe I am misunderstanding the gravity here. I'm specifically highlighting private "consumption" (a parent taking a picture of their child, who happens to be naked, as 1-year-olds tend to be sometimes) vs "distribution" (a parent or even a nefarious actor taking a picture of a child and sharing it with third parties). I 100% want to eliminate child pornography. No discussion. But how do we prevent "false positives" with this?
makach almost 4 years ago
Well, this is very problematic for a privacy-concerned company. Under no circumstances do I want Apple to scan my private files/photos, especially so if it means an alarm can let someone determine whether it is a positive or a false positive.

Also, this functionality isn't something they should be able to implement without telling their end users.

It is also problematic because it will just make cyber criminals more technically aware of what countermeasures they must take to protect their illegal data.

The consequence is very bad for the regular consumer: the cyber criminal will be able to hide, and the government has the ability to scan your files. The end consumer loses, again.
gorgoiler almost 4 years ago
Every so often I feel a wave of revulsion that the computer I use the most — my iPhone — is an almost completely closed system controlled by someone else.<p>Contrast this with my desktop where, in the press of a few buttons, I am presented with the source code for the CPU frequency scaling code.<p>Bring on the Linux phones.
devwastaken almost 4 years ago
This will be used for anti-piracy, government censorship, and targeted attacks, as always. There's no such thing as "we're only scanning for CP". By creating the tool, the company can be compelled to use the tool in other ways by the U.S. or foreign governments. Apple already complies with anti-LGBT countries and will change their App Store to suit each one of them. What happens when they're required to also scan for LGBT materials? They'll comply, because Apple doesn't actually have morals.

On top of this, it gives Apple far too much power. What happens when someone they don't like owns an iPhone? They can pull an FBI and put the content onto the device, and have it then be "automatically detected".
tomaskafka almost 4 years ago
Saudis: We want a list of everyone who ever shared a photo of Khashoggi (no matter in which app).

Apple: Say no more, here they are. Hope you won't imprison all of them, as that would decrease our services revenue substantially, lol.

Also Apple: Privacy is a human right, buy more iPhones.
fencepost almost 4 years ago
For those not familiar with the acronym, CSAM = Child Sexual Abuse Material.
nbzso almost 4 years ago
Since Snowden, I use my phone in a minimalist way. Phone calls. Minimal texting. No games. Banking apps if necessary.

Treat your phones as an enemy. Use real computers with a VPN and software like Little Snitch when online. Use cameras for photography and video.

The benefits of this approach are immense. I have a long attention span. I don't have fear of missing out.

If governments want the future to be painted by tracing and surveillance mediated towards people through big tech, let's make it mandatory by law. And since big tech will reap benefits from the big data, they must provide phones for free. :)
Shank almost 4 years ago
I was under the impression that one of the reasons these tools aren't available for public download is that the hashes and the system can be used to design defeat mechanisms. Doesn't this mean that someone who has an image and a jailbroken device can just watch the system, identify how the photo is detected, and modify it so that it doesn't trip the filter?

PhotoDNA and systems like it are really interesting, but it seems like client-side scanning is a dangerous decision, and not just from the privacy perspective. Giving a CSAM detector and hashes to people is a really risky idea, even if it's perfect and does what it says it does without violating privacy.
chimen almost 4 years ago
These days you can trample on anyone's (right to) privacy, freedom, or general rights for that matter with just two keywords:

- terrorism
- child pornography

Try to protest it and you will be prompted with a nice "do you have anything to hide?" question by the masses.

The advertised intention of these tools could not be farther from the truth, and people happily fill their pockets at each new launch.
sebyx07 almost 4 years ago
4 good reasons why you should buy an iPhone:

- (Check) fragile

- (Check) best monopoly app store

- (Check) high price

- (Check) phones the police in case of a SHA collision
vincnetas almost 4 years ago
This might be an unpopular opinion, but catching people sharing CP images is like catching end users of drugs. Yes, it's illegal, but the real criminals are the ones producing the drugs. It's very difficult to get to them, though, so you just arrest end users.

Another side note is about the near future when someone comes up with synthetic CP images: will those also be criminalised?
at_a_remove almost 4 years ago
I have so many questions about the implementation details.

1) Does this work only on iPhones, or will it be iPads as well?

2) Is this part of a system software update? I wonder if that will show up in the notes and how it would be spun. "In order to better protect our users ..."

3) If it is part of the system software update, will they be trying to make it run on older iDevices?

4) Is it just photos in your photo bin and iCloud, or does it start grabbing at network drives it attaches to? I could see the latter being prone to blowing up in their proverbial faces.

The Four Horsemen of the Infocalypse ride again!
VoodooJuJu almost 4 years ago
Even when this reaches its final conclusion, policing copyrighted and political content, people will still be content to use their i-spy-devices. The future is grim; it&#x27;s now.
zomglings almost 4 years ago
How do they determine whether an image is child porn? My wife has an iPhone and we take pictures of our baby daughter on it, sometimes in diapers and sometimes naked. Our intentions are not pornographic, but now I am worried about Apple's algorithm flagging them as such.

Has Apple published its training data somewhere?
yosito almost 4 years ago
What happens when a theocracy demands that Apple check for hashes of images that disrespect their prophet? To me this sounds potentially more scary and dystopian than surveillance in China. But if I'm honest, I don't know that China isn't scanning citizens' devices for illegal hashes.
js2 almost 4 years ago
I'm not worried about China. I'm worried about the U.S. This is a step along the path to the Buttle/Tuttle dystopia that *Brazil* warned us about.
h_anna_h almost 4 years ago
It is lovely when your "own" device is working against you to catch whether you are in possession of illegal numbers (https://en.wikipedia.org/wiki/Illegal_number). And surely we can trust Apple that it will only be used for this kind of content, instead of, for example, government leaks.
jl2718 almost 4 years ago
I would like to hear the strongest case for the privacy trade-off. How many more children will be physically recovered versus existing methods? What is the reduction in money flow to abduction activities?<p>This might be naive, but I would guess that the best way to fight this kind of thing is to let people know more of the case details. People would protect themselves, find the crimes, and stop unwittingly supporting them. For instance, if it can be shown that cryptocurrency or encrypted messengers are used to a significant extent, the community will either find a technical solution, or stop using it.
lenkite almost 4 years ago
This is terrifying. The possibilities for extraordinary abuse are endless. What's surprising to me is the complete lack of media focus on this topic. Why isn't this being hotly debated on TV? Scanning people's photos is just OK now?

Back to an Android phone, once I confirm this story is true.
akersten almost 4 years ago
If you like this, I have some other innovations that you may be interested in:

* A car that automatically pulls over when a police cruiser attempts to intercept you

* A front door that unlocks when a cop knocks

* A camera that uses AI to detect and prevent the photography of minors, police, and critical infrastructure

* A Smart TV that counts the number of people in your living room to ensure you aren't performing an unauthorized public broadcast of copyrighted content

Surely at least one of those sounds ridiculous to you. As well-intentioned as this scanning may be, it violates a core principle of privacy and human autonomy. Your own device should not betray you. As technologists, just because we *can* do something doesn't mean we *should*.
max_ almost 4 years ago
On the lighter side, perceptual hashing is actually very interesting technology.

If you're interested, I suggest you also have a look at PhotoDNA [0].

[0]: https://www.microsoft.com/en-us/photodna
chobytes almost 4 years ago
I've already been itching to de-cloud and de-tech my life. If we're already getting to this stage of surveillance, I guess that's just another sign I should be getting on top of it.

Today it's CSAM. Tomorrow, "misleading information". Etc.
r721 almost 4 years ago
Follow-up thread: https://twitter.com/matthew_d_green/status/1423091097933426692
Angostura almost 4 years ago
So many questions that make this Tweet look odd. It's a "client-side tool". So what? An app you install? That law enforcement can install? That Apple can silently install?

It lets "Apple scan"? So Apple is going to proactively scan your photos using a tool they then install?

So many questions about this. It doesn't add up.
DavideNL almost 4 years ago
This is just horrible… the people who actually abuse children and download such photos will now stop using Apple devices, and the rest of us are left vulnerable to misuse/abuse/corruption.

Instead of specifically targeting suspects, everyone is surveilled by default. Welcome to a world of mass surveillance.
trangus_1985 almost 4 years ago
This is horrifying. Does this only affect iMessage, or the Photos library? Is it remote? Does it require physical access?

As I understand it: it's a tool (that sends a command of some sort) that compels an iPhone to perform the hash-matching operation and output results. Is that correct? Does it notify the user?

If I had to build it within Apple's privacy framework, that'd probably be my approach: a remote command causes a sepOS unlock of the Photos library (even running the job on sepOS?) to do the photo scanning, and sepOS returns the hashes that match.
hughrr almost 4 years ago
Ah I’m done now. That’s one step too far. When something overreaches into my data and risks integrity and privacy problems it’s game over.<p>So which Linux desktop sucks the least at the moment?
anonymouswacker almost 4 years ago
Soon enough this will scope-creep into anything containing what a puritanical neoliberal corporation considers contraband. Time for a serious look at Linux phones.
vslira almost 4 years ago
I won't get into the CSAM discussion, but for anyone who has a stash of non-DRMed content, I think it's a good idea to look into alternatives to Apple devices. Sooner rather than later, the same kind of system will be auto-deleting, or alerting authorities about, copyrighted material, and I doubt that too much care will be taken to ensure that you didn't actually have a right to those copies.
u4918 almost 4 years ago
In Apple's public comments they repeatedly say things like "Apple can't access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account."

If they are only concerned with iCloud accounts, then... why not scan in the cloud? Can anyone explain to me why client-side scanning is actually needed here? As far as I'm aware, Apple only E2E-encrypts iMessages, not iCloud photos or backups.

US politicians (to say nothing of countries with fewer individual freedoms) already openly pressure tech companies to censor specific content. And tech companies will do so, even to the point of blocking direct, private, one-to-one messages sharing specific news. In that light, Apple's crossing the line to client-side scanning seems deeply concerning.

I don't see how keeping this as narrowly targeted as advertised would ever be possible, or is even intended.
Xamayon almost 4 years ago
Isn't there the potential for abuse of this to track things like who you talk to in private? Even if the images on your phone do not contain CSAM, the hashes of all your images would need to be shared with Apple, the NCMEC, and who knows what other quasi-gov't agencies. All it would take to build massive graphs of who talks to whom, etc., is to match up those hashes. It doesn't matter if they have no idea what image the hashes correspond to... If they then take the simple step of generating hashes for common images found online, they could even track what sites you browse and such. Ignoring the potential for false positives and other negative side effects of the main goal, this is a horrific breach of privacy. If you honestly think the gov't won't abuse any and all data they collect this way, I don't know what to say...
vegetablepotpie almost 4 years ago
How likely are perceptual hashes to give a false positive? If I take a picture of a tree, how likely is it that a few pixels line up just right in the hashing algorithm and it says the picture might be child porn? How likely is it that law enforcement will understand the limitations of this technology? How likely is it that the judicial system will?

I can see law enforcement showing up at my door one day with a search warrant demanding to have a look around; I would have no idea why they're there, but they'll want to look through all my personal belongings.

Worse yet, I might come home from work one day, see my windows broken, see that my place has been ransacked and my computers are missing. I would call the police to report a burglary, only to hear that I'm under investigation and they need me to give them the key to decrypt my hard drives.
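The per-photo and per-library risks are different questions. If one assumes, purely hypothetically, an independent false-positive probability per photo, the chance that an innocent library crosses a reporting threshold follows a simple binomial model (the rate and threshold below are invented numbers, not anything Apple has published):

```python
from math import comb

def prob_at_least(n_photos, p, threshold):
    """P(at least `threshold` false matches among n_photos photos,
    each wrongly matching with independent probability p)."""
    return 1 - sum(comb(n_photos, k) * p ** k * (1 - p) ** (n_photos - k)
                   for k in range(threshold))

# Hypothetical numbers: 10,000 photos, one-in-a-million per-photo rate.
p_single = prob_at_least(10_000, 1e-6, 1)  # any single false match: ~1%
p_thresh = prob_at_least(10_000, 1e-6, 5)  # crossing a 5-match threshold
```

This is why a threshold changes the picture so much: under these assumptions a lone false match is likely somewhere across millions of users, while five independent ones on the same account are astronomically rare. The independence assumption, though, is exactly what an adversary crafting near-collision images would violate.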
JackeJR almost 4 years ago
The slippery slope argument is that the use of this method on private files, i.e. files not shared with anyone except the service provider, can legitimise the expansion of such scanning scopes.

While this has indeed happened in other instances, the argument is akin to saying that we should not give anyone any power to do anything, because it is a slippery slope they can use to do bad things.

What, then, distinguishes a slippery slope from a non-slippery one? Checks and balances. The history of the USA has shown that these can rein in the worst instincts of any single entity. History has of course also shown when they failed, and those failures should serve not as a reason to reject the idea of checks and balances, but as an acknowledgement of its imperfection and a prompt to think of ways to mitigate it.
akerstenalmost 4 years ago
I really have to wonder why Apple chose to do this.

As far as I know, this kind of scanning is not legally mandated. So either they think that this will truly make the world a better place and are doing it out of some sense of moral responsibility, or they've been pressured into it as part of a sweetheart deal on E2E ("we won't push for crypto backdoors if you'll just scan your users' phones for us"). Either way, it doesn't thrill me as a customer that my device is wasting CPU cycles and battery life under the presumption that I might possess data my current jurisdiction deems illegal.

For all the acclaim privacy-forward measures like GDPR get here, I'm surprised there isn't more outright repudiation of this frankly Orwellian situation.
alsetmusicalmost 4 years ago
I scanned the comments to find out who this person is and how they would have any inside info, and found nothing. Why is this person's claim being taken at face value? Before debating the merits of Apple scanning photos / hashes, why does anyone believe this is true?
jpe-210almost 4 years ago
How does one own/use an iPhone and help mitigate any issues from this? How does one help prevent this kind of sneaky photo crawling? I feel like in order to prevent people from spying on me I have to change _everything_ I do on my phone/computer.
tinus_hnalmost 4 years ago
According to this, this has been a thing for quite a while:

https://fightthenewdrug.org/apple-fights-child-porn-by-scanning-users-uploaded-icloud-photos/
derefralmost 4 years ago
Huh. I always took the cynical view and assumed that this was something every proprietary OS was already doing, and that this was part of why dark-web die-hards were so insistent on using TAILS. Guess "not yet."

On another note: OSes may only be starting to do this, but that same cynicism still leads me to presume that arbitrary closed-source third-party apps, or even closed-source binary distributions of open-source apps (e.g. the App Store versions of FOSS software), could have been quietly scanning the files people are passing them for CSAM for years now, without telling users about it. It always seemed to me like the kind of thing it'd make sense to quietly slip into a media player like VLC.
option_greekalmost 4 years ago
This will end up being a nightmare due to false positives, especially for parents with kids.
vincnetasalmost 4 years ago
Another real case: how will they handle photos of my own naked kids, of which I have plenty, because it's quite natural for my kids to run around and play naked? I want to capture the moments, not the nudity. I also have very close friends who visit us with their kids, and for us it's fine to see each other's children playing naked. We sometimes even share photos of nice moments with our kids, where the kids sometimes happen to be naked. Is this already CP? Our kids are 3 and 5 years old.
somedude895almost 4 years ago
Will they have to update their EULA or something before they have it communicate with their servers? I hate everything about this and would like to know when it actually happens. So far it's just a rumor.
spectoalmost 4 years ago
I think a lot of people are missing why Apple is doing this now. They're doing it because they have a fairly secure ecosystem. They have also created a proxy that makes it difficult (impossible, according to them) to identify the client. More than likely this was implemented so that when it does go to Congress, they can say: look, we implemented a system. Otherwise the DOJ will continue to push for no encryption or backdoored encryption. There's no winning here.
thepimp32almost 4 years ago
Great. I'm curious what would happen if you have auto-save for received images enabled in WhatsApp and someone were to spam you with child pornography images.
kyralisalmost 4 years ago
Is there any actual evidence presented here? This is someone who's repeating "But it's important to remember that this is the same company that won't encrypt your iCloud backups because the FBI put pressure on them", for which there is *also* no evidence and quite a lot of testimony to the contrary. His credibility seems questionable in the absence of evidence.
protomythalmost 4 years ago
So, the Chinese government could force Apple to add the hash of the Tiananmen Square tank man picture and find it on all iPhones?
jhaywardalmost 4 years ago
I wonder when I gave Apple permission to do this.
webmobdevalmost 4 years ago
Wow, this explains so much about why the Indian government "withdrew" a letter seeking Apple's compliance with new IT surveillance rules: https://thewire.in/government/centre-withdrew-letter-seeking-apples-compliance-with-new-it-rules

The Indian government has recently introduced new laws that give it power to dictate terms to many online platforms and broaden its surveillance powers over online social media and messenger platforms. One of the laws dealing with messenger platforms requires the platform to track shared content, especially the "origin of content" (first originator) of anything shared through its network. (Facebook / WhatsApp has already gone to court to challenge this, claiming it would need to break end-to-end encryption and that the rule thus violates Indian privacy laws.)

Apple's iMessage platform has more than 25 million users, and thus should come under the ambit of this law. But strangely, the Indian government seems to have given them an "exception"... and now we know why.
SrslyJoshalmost 4 years ago
Betcha the Chinese government already has their own DB of hashes that they want to scan for.
hamilyon2almost 4 years ago
So, while everyone discusses how horrible the future dystopia will be, I worry about the method itself: would a simple firewall be effective against this? One that forbids any communication except with a few hosts, like a pair of them?
JanneVeealmost 4 years ago
So what would the process be if, for example, an unwitting parent or relative has pictures on their iPhone that are perceptually similar to images of their loved one being molested at day care, or something like that?
andy_pppalmost 4 years ago
How easy is it to generate an image that has the same "perceptual hash", or whatever they're calling it? My guess is it has to be easier than cracking a non-fuzzy hash. Do we know the algorithm they are using?
hexoalmost 4 years ago
Thanks for letting me know. Now I know I'm not getting an Apple smartphone.
mikkelamalmost 4 years ago
Honestly, it's probably about time to switch to GrapheneOS or LineageOS.
LinuxBenderalmost 4 years ago
Does this not encourage a new arms race? New or modified apps that randomly change the hashes of multimedia files as they are stored? If the CSAM DB is just simple hashes like SHA-256 or MD5/MD4, then evading detection would be trivial. Or would Apple block applications that could rewrite randomized data into files? People don't have to be against CSAM to dislike something scanning their devices, and many developers love puzzle challenges. I assume, perhaps incorrectly, that whatever app is doing the scanning could also accept additional hash DBs, allowing Apple to enable detection categories per region. One of the iPhone emulators should facilitate reverse engineering the application.
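If the database really were plain cryptographic hashes, the evasion described above would indeed be trivial, since any single-bit change yields a completely different digest. A quick sketch (using stand-in bytes rather than a real image file):

```python
import hashlib

# Stand-in for an image file's bytes; any real file behaves the same way.
original = b"\x89PNG...pretend these are image bytes..."
modified = bytearray(original)
modified[-1] ^= 0x01  # flip a single bit in the last byte

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(modified)).hexdigest()

# A cryptographic hash gives no partial credit: the two digests share no
# useful similarity, so trivially re-encoding files would defeat an
# MD5/SHA-256 database. This is why CSAM scanning systems use
# *perceptual* hashes instead, which are designed to survive exactly
# these kinds of small modifications.
assert h1 != h2
```

So the arms race would be fought at a different level: perturbing images enough to move the perceptual hash while keeping the content visually intact.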
abstractbaristaalmost 4 years ago
This is terrifying. I'd be pissed if I owned any Apple hardware. Encrypt your systems and run your own private cloud.

Privacy is about the only thing keeping innocent people free in today's world.
bitcuriousalmost 4 years ago
Remember earlier this year when bing.com returned an empty page for the image search of tank man? We’re now moving towards a world where your phone can show you that same blank page.
vmceptionalmost 4 years ago
Sucks, since we pay for iCloud backups and get the lesser service.
pcdoodlealmost 4 years ago
I am done buying Apple products. This is the final straw.
bitLalmost 4 years ago
OK, I guess the temperature is high enough after ~20 years and the frogs are boiled; it's time to move in and consolidate power everywhere.
shucklesalmost 4 years ago
The comments here suggest very few on HN have run a public service that allows arbitrary upload of photo and video content.
croesalmost 4 years ago
Can we get statistics on how many people switch to Android tomorrow, separated by profession? Politicians, for instance.
f6valmost 4 years ago
What about people storing their children's photos on the iPhone? How would a system differentiate those?
intunderflowalmost 4 years ago
Apple has separate operations in China. How long until anti-party content is on the hash list over there?
maCDzPalmost 4 years ago
I really hope Apple has addressed hash collisions. Otherwise we are going to have a bad time.
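A back-of-envelope birthday-bound calculation gives a feel for the random-collision risk, under the unrealistic assumption that the perceptual hash behaves like a uniform random function. The bit width and photo count below are guesses, and real perceptual hashes are deliberately non-uniform (similar images are supposed to collide), so true rates would be higher:

```python
import math

bits = 96          # hypothetical hash width (assumption, not Apple's spec)
n_photos = 1.5e12  # rough guess at the number of photos in scope
space = 2.0 ** bits

# Birthday bound: P(at least one collision) ~ 1 - exp(-n^2 / (2 * 2^bits))
p_collision = 1.0 - math.exp(-(n_photos ** 2) / (2.0 * space))
print(f"P(at least one collision) ~= {p_collision:.2e}")
```

Even if random collisions are rare at this scale, non-uniformity plus adversarially crafted near-collisions are the harder problem, which is presumably why a human-review step would matter.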
fumaralmost 4 years ago
Can activist shareholders mitigate changes like CSAM scanning?
vmceptionalmost 4 years ago
That tweet thread says it will scan for hashes client-side and upload the result, circumventing E2E encryption, but then says they're just going to do it on your iCloud backups because those don't have E2E encryption. So which is it?

All of the above?
cptskippyalmost 4 years ago
CSAM also means Cyber Security Asset Management.
wayneftwalmost 4 years ago
What if I just don't want this feature using my battery time or my network bandwidth?

They have to download the hashes in order to compare them. I wonder if a Pi-hole could help here?
sydthrowawayalmost 4 years ago
Why do people think *this* will result in immediate abuse by corrupt governments, as opposed to any other Apple service? Just because anime-avatar Twitter says so?
ostenningalmost 4 years ago
Now Pegasus can really cause some damage!
oeveralmost 4 years ago
General Failure reading drive A:
AHOHAalmost 4 years ago
>Twitter for iPhone Classic
egbertsalmost 4 years ago
Soviet Apple sees you.
Cyril_HNalmost 4 years ago
Mac up next, I bet.
edge17almost 4 years ago
I'm assuming Android/Google must do this already?
croesalmost 4 years ago
iPhone only? Why not iPad and iMac too?
layoutIfNeededalmost 4 years ago
If you really believe this will stop at child porn, then I have a bridge to sell you.
bississippialmost 4 years ago
Been scanning the answers for a solution. I am 100% in the Apple ecosystem and I am all for protecting little children, but as others have pointed out, nothing is stopping them from colluding with the government in the future to, say, scan for political dissent:

- weaponizing memes against politicians as "hate speech"
- memes against a government suppressing dissent
- and we can look at what happened to the Uighurs

Can Samsung phones that are de-googled work? I am specifically interested in a new phone that Samsung launches: can it be de-googled?
pokot0almost 4 years ago
Everyone is worried about Apple, but Apple is a company and follows the laws of the countries it operates in. If this is legal, or even required by government agencies, Apple will comply, and there is nothing wrong with that. If you don't like the laws of your country, you need to work on that, not just go after Apple on social media.
KirillPanovalmost 4 years ago
Could we get a more useful link here, people?

CSAM? According to Google that's Confocal Scanning Acoustic Microscopy. Or something.

And what's with the tweeters? I think my laptop just gave me thigh burns from the CPU bloat that clicking on that link caused. Eight seconds to render the page? Why do folks still use this twerker website?
BluSynalmost 4 years ago
I understand the hesitation here, but fundamentally this is like trying to close Pandora's box. If something is technically possible to do AND governments demand it be done, it will be done. If not by Apple, by someone else.

Rather than complain about it, I am interested in what alternative solutions exist, or how concerns regarding privacy and abuse of this system could be mitigated.