
Apple plans to scan US iPhones for child abuse imagery

372 points by alexanderklein · almost 4 years ago

63 comments

partiallypro · almost 4 years ago

I really don't see how this is going to end well. There could be perfectly innocent photos on someone's phone of their own children doing perfectly normal things that kids do, like a kid running butt naked around the house, or a photo of something like a rash that is sent to a nurse friend for advice on what it is, etc.

I'm all for protecting children from being abused, but how are they going to filter what is normal and what is abuse without human intervention? And at that point, isn't that a vast invasion of privacy to the perfectly innocent? It's different if it's online, because that puts it in the public realm or on devices/servers owned by Apple, etc.
imroot · almost 4 years ago

https://towardsdatascience.com/black-box-attacks-on-perceptual-image-hashes-with-gans-cc1be11f277?gi=cfb3a66600e5

I mean, who's to say that my hash is actually what they think it is...

...and if it's not uploaded to iCloud for someone to manually review what the image is, is that going to be enough to get a search warrant for my phone?

"AUSA SOMEONE: But, Your Honor, the Defendant's phone has an image that matched a hash of a CSAM picture!"

"DEFENDANT: I'm not going to give you my phone's password."

"THE COURT: Bonk. Stay in jail for contempt."

I understand why they're doing it. I don't agree with it. At all, and I say that as a survivor of childhood sexual abuse.
Someone1234 · almost 4 years ago

If it matches, am I guilty until found innocent?

I have no illicit images, but false positives are always going to be a problem, and even at sub-1% rates, if you're scanning literally every image stored on an iOS device, that could still be thousands of wrong matches. If I lose the false-positive lottery, am I going to have the police calling and my mugshot on the evening news for "CP on their device", particularly given my state's months-long backlog for digital forensics? Law enforcement aren't exactly known for their understanding of technical evidence like hash matching.

I'm very sympathetic/supportive of anti-CP initiatives in general, but this still seems problematic.

PS - And before someone replies with "they aren't contacting the police," don't forget the "yet." Once this is in place they'll immediately get pressured by politicians and well-intentioned people alike to flag any positive detections to authorities, who will overreact (as law enforcement has a history of doing, particularly around this topic).
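To make the false-positive arithmetic above concrete, here is a rough back-of-envelope sketch; every number in it is an illustrative assumption, not a figure from the article or from Apple:

```python
# Rough expected-false-positive estimate for per-image hash matching.
# All three inputs are assumptions chosen only to illustrate the scale.

photos_per_user = 10_000        # assumed size of an average photo library
per_image_fp_rate = 0.0001      # assumed 0.01% chance a benign photo matches the database
us_ios_users = 100_000_000      # assumed rough count of US iOS users

expected_fp_per_user = photos_per_user * per_image_fp_rate
expected_fp_total = expected_fp_per_user * us_ios_users

print(f"Expected false matches per user:   {expected_fp_per_user:.2f}")
print(f"Expected false matches nationwide: {expected_fp_total:,.0f}")
```

Even with a per-image rate far below 1%, population-scale scanning still produces a very large absolute number of matches, which is the commenter's point about backlogs and overreaction.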
KingMachiavelli · almost 4 years ago

This could backfire in a huge way. I think a lot of people, even EFF/privacy-focused people, are fine with a private company doing CSAM detection (or anything really) on *uploaded* user content, since it is actually voluntary.

Now that CSAM detection is being deployed to user devices, including those previously sold without it, there's a motivation to circumvent/block it. Unless Apple is doing CSAM scanning 100% inside a 100% secure processor, eventually someone will figure out the specifics of the current perceptual hash algorithm. There are already methods to make perceptual hash collisions. All it takes is one security researcher to publish a tool that creates millions of false positives for the current dataset/model used by Apple & others to be useless.

Sure, eventually they will improve the model/dataset, but it just starts a cat & mouse game between Apple/governments & honest citizens/users, which actually benefits the people committing these crimes.
devwastaken · almost 4 years ago

Reminder that even accusations of abuse imagery will get you in cuffs and your children taken away (see article at the end).

Linux truly is the last bastion of sanity. I am so glad that Linux desktops took off and work as well as they do. The last thing I should be worrying about is whether some Microsoft or Apple employee's "automated system with no errors" decides that a hash collision or a picture of a baby is close enough to actual abuse imagery. There are plenty of people who believe even a picture of a naked baby in a bathtub is abuse imagery. Anyone remember the Walmart case? They took their kids away "pending investigation".

https://jonathanturley.org/2009/09/18/arizona-couple-sues-wal-mart-after-store-calls-the-police-on-them-for-developing-pictures-of-this-children-in-a-bathtub-and-the-children-are-taken-by-the-state/
laputan_machine · almost 4 years ago

Ah, the classic 'think of the children' approach to increasing surveillance. If you're opposed to it, what are you, a nonce?
Rebelgecko · almost 4 years ago

In a world where all police were incorruptible and all laws were just, maybe I'd be OK with this. But since we live in *this* world, I feel like it's only a matter of time until the "think of the children" technology is co-opted to scan for other "bad" images.

I wonder how long it'll take for China to compel Apple to add winnie_the_pooh.jpg to the set of naughty images for users in China and HK? And as we saw when Bing altered the search results for "tank man" even in countries like the US, at that point we'd be one configuration error away from those hashes leaking into the US dataset.
mikece · almost 4 years ago

Scan for child abuse today; scan for wrong-think tomorrow. I'm liking the idea of GrapheneOS more and more each day... or just going phone-free entirely.
dehrmann · almost 4 years ago

> ...ongoing demands from governments, law enforcement agencies and child safety campaigners for more assistance in criminal investigations, including terrorism and child pornography.

I get why they're making these demands, but this is in the US. This would clearly be an unreasonable search by Fourth Amendment standards if it were done by the government, so I'm not sure why government agencies think they can make demands of a private company that they, themselves, can't legally execute. To be clear, it's legal for Apple to do this, and legal for the government to ask them to. The government is just laundering its door-to-door search.
smnrchrds · almost 4 years ago

When it comes to Apple and privacy, it's important to differentiate two different types of privacy: (1) privacy against snooping by the state, and (2) privacy against snooping by all others. As we have seen with Apple and China, it's only the second type of privacy that Apple cares about. Sharing your information with the state is not out of character for Apple. In fact, had they done the opposite, it would have been out of character.
itake · almost 4 years ago
WhatsApp automatically saves photos sent to the phone to the Photo Album. Does this mean they will get a knock on their door if someone sent a bad image?
1-6 · almost 4 years ago

Apple should perhaps dogfood their system on their own employees first: https://www.losaltosonline.com/news/hills-man-arrested-on-child-porn-charges/article_9bbe51f7-aaa4-5814-98eb-a9a6b0f51a42.html
mistercool · almost 4 years ago

> Apple intends to install software on American iPhones to scan for child abuse imagery

> Apple's neuralMatch algorithm will continuously scan photos that are stored on a US user's iPhone and have also been uploaded to its iCloud back-up system

Why is there any need for Apple to install software on the iPhone if they are isolating the algorithm to run only on cloud storage, not local images? Not a programmer, so maybe there's a simple explanation.

Also, if this is isolated to iCloud, won't criminals just start using a different cloud backup provider?
sandworm101 · almost 4 years ago

As any of the lawyers here can attest, the legal definitions here are open to huge debate. Some material is unambiguous; other images reside in extensive grey areas. The legality of material can even change based on context. Something that is perfectly legal in one area can be absolutely illegal in another. Example: if I had to list the most iconic photographs in world history, one of them is of a totally nude child (Vietnam). That isn't an illegal image, but try explaining that to a computer. At the moment these nuances are handled by cops, prosecutors and, in extreme cases, the courts. Is Apple going to put a marker down on a particular universal definition? Will that definition become the de facto world standard?
bastawhiz · almost 4 years ago

> Apple's neuralMatch algorithm will continuously scan photos that are stored on a US user's iPhone and have also been uploaded to its iCloud back-up system. Users' photos, converted into a string of numbers through a process known as "hashing", will be compared with those on a database of known images of child sexual abuse.

It's weird and invasive that they'd search users' devices, but this makes it sound like they're just searching for files with certain hashes, which (to me) is fairly uninteresting. What I don't understand is why this is called "neuralMatch". Is it some sort of perceptual hashing (versus cryptographic hashing)? What exactly makes it "neural"?

If neuralMatch is to Google's reverse image search as Apple Maps was to Google Maps when it launched, this is going to have so many false positives.

> According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a "safety voucher" saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities

Is this separate? Or is the article just confusing iCloud storage for on-device storage? Regardless, there's some very lazy journalism going on here.
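A toy illustration of the perceptual-versus-cryptographic distinction the parent is asking about; this is a generic average-hash sketch, not Apple's neuralMatch, whose internals the article does not describe:

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when the pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming(a, b):
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 8x8 grayscale "image" (values 0-255) and a copy with one pixel nudged by one level.
original = [17, 200, 34, 180, 90, 12, 240, 66] * 8
tweaked = original.copy()
tweaked[0] += 1

print("SHA-256 identical:", hashlib.sha256(bytes(original)).hexdigest()
      == hashlib.sha256(bytes(tweaked)).hexdigest())   # False: every bit can change
print("Perceptual bits changed:", hamming(average_hash(original),
                                          average_hash(tweaked)))   # 0 here: near-duplicates stay close
```

The one-level pixel change flips the cryptographic hash entirely but leaves the toy perceptual hash untouched; that robustness to re-encoding is the appeal of perceptual matching, and it is also exactly where collisions and false positives come from.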
TerntUp · almost 4 years ago

https://archive.ph/ys5Q5#selection-1501.0-1501.54
r00fus · almost 4 years ago

I bet Apple understands this is a bad idea (it dilutes their "privacy" brand image). I wonder what forces have pushed Apple to this point.
analyte123 · almost 4 years ago

Things that would reduce child sexual abuse more than scanning everyone's phones on a daily basis:

- Encourage marriage and remove benefits penalties for two-parent households, as children are substantially more likely to be abused when their biological father is not in the household

- Fund undercover police work and informants, which is how most large busts already happen now. Ideally highly independent from existing courts and law enforcement to deal with Epstein- or Dutroux-type situations. The worst offenders are not swapping child porn on Facebook or Dropbox (or now iMessage and Apple Photos)

- Enforce obscenity laws against incest and other extreme forms of pornography

- Institute the death penalty for child sexual abuse and make sure it is performed swiftly and publicly

Since these aren't under consideration, we can tell this isn't about reducing child abuse. It's about setting up a permanent and ever-expanding digital system for political control.
julkali · almost 4 years ago

https://news.ycombinator.com/item?id=28068741

This topic already made it to the current frontpage.
throwaway287391 · almost 4 years ago

> Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.

Haven't Apple previously claimed that their encryption works such that Apple themselves can't decrypt it no matter what? (IIRC this came up during the San Bernardino attack, when the FBI requested that Apple decrypt/unlock a suspect's iPhone and Apple said they couldn't.) Is this not a change? Maybe I'm confusing iCloud encryption with iPhone encryption, and the former doesn't have any of those guarantees?
marricks · almost 4 years ago

Isn't Apple going to start auto-scanning the text in your images too, for searching/easy copying? How many years until the hash of the text in your images is compared to a database of... god knows what. Sensitive information? Dangerous text? I honestly can't think where it's most likely headed, but I don't like it.
smlss_sftwr · almost 4 years ago

I have yet to deep-dive into the technical details of Pegasus, so this is pure speculation, but if Pegasus demonstrates the feasibility of deploying and executing arbitrary payloads on remote devices without the owner's knowledge or consent, wouldn't that raise a blackmail vector with this new program? Again, I'm not familiar with the specific exploits used by Pegasus, but is there anything stopping another malicious actor from relying on the same exploits to, say, install a bootloader that downloads images that would trigger a positive scan?
morpheuskafka · almost 4 years ago

> Apple's neuralMatch algorithm will continuously scan photos that are stored on a US user's iPhone and have also been uploaded to its iCloud back-up system. Users' photos, converted into a string of numbers through a process known as "hashing", will be compared with those on a database of known images of child sexual abuse.

This doesn't really make any sense. So it's not a Neural Engine thing but merely hash matching, and it also only applies to iCloud Backup? That is already non-e2e encrypted, so Apple can already access those files whenever it wants.
yepthatsreality · almost 4 years ago

This will enable Apple to send hashes of matching fingerprint and face scans to their online database. It also completely undermines their argument that their product is safe because the data remains encrypted on your device. Instead of passing the data to the spy service, they are sending a report on the data to the cloud, which is allowed to be much more opaque and misleading.

I just ordered an iPhone 12 Mini to replace an iPhone SE, but now I'm going to return it. No amount of cool tech in a product is worth enabling this slippery slope against human privacy rights.
bitwize · almost 4 years ago

This just means child molesters will seek out PinePhones, Purism Librems, etc. And those phones will acquire reputations as "pedophones", and anyone buying or using one will be immediately considered sus. People with nothing to hide won't mind an OS with surveillance features.
seph-reed · almost 4 years ago

I'm too old for this, but I really wonder what it's going to be like for high-school kids and nudes 10 years on.

I mean, if you get a really good nude from someone, you're going to want to keep it. But for how long? Is there now a social rule for how long to keep nudes? Is 21 too old, and you should delete them? What if you're still with the person?

Personally, I'm glad I don't have to figure it out, but teens sending each other nudes just has all sorts of confusing ethical quandaries to it.
nullifidian · almost 4 years ago

While they are at it, they should up the ante and also add COVID misinformation filters, so that a phone refuses to save it and/or doesn't send it anywhere. Would be very topical. There is also racism and other almost universally recognized wrongthink. If you decide to police offline activity as if it's a cloud platform, why not implement all of the online platforms' automatic content filtering/censorship? /s
kingsloi · almost 4 years ago

I had a very sick 8-month-old baby who I obviously had a million photos of. Unfortunately, my lil girl had 9 surgeries and her little body had gone through more than people can think of. She passed in May, and it's really difficult to look back at old pictures because of how much stuff was going on: various levels of critical ICU care, pre/post surgery, etc. But it wasn't until a few months after she was born that I went to do something in Google and realised my little girl's very severe diaper rash, open sternum, chest tubes, etc., etc., etc. photos were backed up and tagged in Photos.

This is sort of the premise of why I've de-Googled/de-clouded myself: I didn't want my little girl's pain and suffering to train some computer's image detection AI.

How would this work for a parent of a sick child who sends their doctor or S/O possibly graphic and personal photos of their own child? I was in close contact with my daughter's doctor and would send him all sorts; I can only imagine the same is true for most parents of chronically ill children.
mkoryak · almost 4 years ago

Hopefully they are using phash to match the phone's images to known child abuse image hashes, because if they are using ML, I can only imagine what a false positive will do to someone's life.
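For readers unfamiliar with the term, "phash" here refers to perceptual hashing. A minimal difference-hash (dHash) sketch follows; it is a generic textbook variant using Pillow, not whatever algorithm Apple actually ships, and the file paths in the usage comment are placeholders:

```python
from PIL import Image  # pip install Pillow

def dhash(image: Image.Image, hash_size: int = 8) -> int:
    """Difference hash: shrink to grayscale, then compare each pixel to its right neighbour."""
    small = image.convert("L").resize((hash_size + 1, hash_size))
    pixels = list(small.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | int(left > right)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Bits that differ between two hashes; small distances mean visually similar images."""
    return bin(a ^ b).count("1")

# Usage sketch (paths are placeholders): a resized or re-encoded copy typically lands
# within a few bits of the original, and a "match" is declared under some threshold.
# h1 = dhash(Image.open("photo.jpg"))
# h2 = dhash(Image.open("photo_resized.jpg"))
# print(hamming_distance(h1, h2) <= 5)
```

The imagehash package on PyPI implements this and related hashes (aHash, pHash, wHash) if you want to experiment with real photos.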
InTheArena · almost 4 years ago

On one hand, I applaud this: CSAM is a massive issue, and the internet community has never taken it as seriously as it should have.

As many others have noted, the potential for misuse is rife. Also, realize that Apple has a single set of infrastructure for their phones and Macs, especially now that the M1 exists. I've read accounts where disaffected people "planted" porn on other people's systems, then called the police.

Imagine a web server with a page and a hash-matching image with a frame size of 1px. It's now in your cache.

Also, it's almost inevitable that this is a slippery slope. Today it may be pre-hashed images. Tomorrow it may be AI detection. The day after it may be vaccine falsehoods. The day after that it may be other speech the government deems bad.

This is needed. This is a major, major problem. But I hope someone is thinking of how to mitigate the potential for abuse.
mrkramer · almost 4 years ago

Imagine if Microsoft did this with Windows; lots of people would get arrested, and rightly so.

But like some people say, this can be dangerous because evidence can be planted on your device. I remember reading articles about how files could be planted through torrent clients, but I never actually saw it in the wild.
simion314 · almost 4 years ago

Do we know why Apple put in the effort to create this? Who forced them to do it? It is a move that is bad for PR and also does not make them money, so I suspect some big government is demanding it, but maybe I am too simple in my thinking and this makes sense.
flibble12345 · almost 4 years ago

So you'll be able to ruin someone's life by sending them a live photo where one frame has child abuse. I find the law enforcement part of these well-intentioned anti-child-abuse initiatives absolutely terrifying, because the standard is not that you were proven to have been a party to child abuse, but that an offending file was found on your device.
therealjumbo · almost 4 years ago

Along with this: https://www.reuters.com/technology/exclusive-facebook-tech-giants-target-manifestos-militias-database-2021-07-26/

Just like the Patriot Act and the no-fly list, it's only a very short matter of time until this is expanded to Android and, more importantly, non-CP content. The only difference is it's the other party doing it this time; in both cases it's "to catch bad guys" and "for your own good".
coldacid · almost 4 years ago
Today kiddy porn, tomorrow wrongthink.
kwhitefoot · almost 4 years ago
Better not take any more pictures of the kids splashing in the bath for granny then.
throwaway97904 · almost 4 years ago

This article raises serious privacy concerns. Just building the infrastructure for on-device scanning and reporting is extremely troubling. A slippery slope, or a break in the dam, as others have said. My view is that we've been on that slope for a long time, and this changes little. We just have to trust Apple, as has always been the case.

When I look at the technical details, it seems to me to be a reasonable compromise. It allows Apple and government to do something about the worst offenders, whereas it has no impact on anyone else.

Three technical reasons for this:

- The "neuralMatch" name suggests some sort of computer vision, whereas the article talks about matching against hashes of known images. My guess is that the actual technology is something like Microsoft's PhotoDNA (https://news.microsoft.com/on-the-issues/2018/09/12/how-photodna-for-video-is-being-used-to-fight-online-child-exploitation/). Hash collisions aside, this should only produce matches against images that are already in a government database. It will match manipulated images (e.g., rotated or cropped), but it won't match new images.

- As described here, the scanning only applies to images also uploaded to iCloud ("[the] algorithm will continuously scan photos that are stored on a US user's iPhone and have also been uploaded to its iCloud back-up system"). While we don't know whether this description is accurate, it suggests that if you don't back up images to iCloud, your device won't do any scanning locally. Apple already has the keys to your iCloud backups, so if you value privacy you're probably not backing up to iCloud anyhow.

- It sounds like it doesn't flag a single image, but requires multiple hash hits.

So, if you want to feel better about this, understand that it is a system that will flag people who are downloading and storing known child exploitation images on their devices and naively backing those up to iCloud.

This is the sort of privacy compromise that works in practice. Serious offenders are either caught or diverted to other channels. Minor offenders are probably not caught. The risk to non-offenders is zero or close to it.

Apple has the ability to do all sorts of invasive things to our privacy. They could be scanning and reporting all kinds of things already, and we might not even know. Or they could start doing so tomorrow. From this point of view we're already trusting them to do right by us as their customers, and this feature doesn't change that.
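The "safety voucher" threshold described in the article can be sketched roughly like this; the hash function, threshold value, and database contents below are all illustrative assumptions, since neither the article nor Apple specifies them here:

```python
import hashlib

# Illustrative stand-in for a database of hashes of known images. An exact-match set is
# used for simplicity; a PhotoDNA-style perceptual system would instead compare each
# photo's hash against database entries under a distance threshold.
KNOWN_IMAGE_HASHES: set[str] = set()   # placeholder: would be populated from the database

FLAG_THRESHOLD = 10  # assumed: an account is only surfaced after this many suspect photos

def suspect(photo_bytes: bytes) -> bool:
    """Per-photo 'safety voucher': True means the photo matched the database."""
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_IMAGE_HASHES

def account_flagged(uploaded_photos: list[bytes]) -> bool:
    """An account crosses the line only when enough individual photos are marked suspect."""
    return sum(suspect(p) for p in uploaded_photos) >= FLAG_THRESHOLD
```

The threshold is what "multiple hash hits" means in practice: a single collision is not, on its own, supposed to surface an account for review.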
verifex · almost 4 years ago

Non-paywalled site: https://www.msn.com/en-gb/money/technology/apple-plans-to-scan-us-iphones-for-child-abuse-imagery-ft/ar-AAMYKxe
DSingularity · almost 4 years ago

To everyone here who wishes we do nothing about this: you should rethink your position. There is a difference between "let's be careful and design the right solution" and "let's do nothing to maximize individual liberty".

In 2017 there was a 369% increase in CSAM reports. The only explanation is that platforms now facilitate the sharing, and a growing problem continues to grow.

In raw percentages the data is clear: too many children are being abused and the numbers are growing.

https://storage.googleapis.com/pub-tools-public-publication-data/pdf/b6555a1018a750f39028005bfdb9f35eaee4b947.pdf
bxjaii · almost 4 years ago

Related to this, I love Canva.com, but then I noticed they were scanning my designs. Some warning popped up.

This idea of perpetually renting services and then monitoring their use is something that should be criticized.
greatgib · almost 4 years ago

Just for a laugh, imagine the society of 30 years ago and what would have happened if someone had suggested doing something like this:

i.e., have a private company give itself the right to look through your belongings and your home because you might be a criminal.

It is now well known that most citizens of most countries are criminals, so no one deserves privacy anymore...

But sadly, it looks like intelligence and common sense have generally decreased in the population.

By the way, the Apple TV and iPhone that are always listening to your conversations in your home, to detect bad keywords and then denounce you to the police, are not yet ready? Nazi collaborators, from their tombs, are still waiting for this nice feature!

Anyway, iPhone owners, let me tell you that you did a good job by stupidly giving your money and power to Apple!
doe88 · almost 4 years ago

What if, under a secret order, Apple is instructed to insert non-CSAM photos, to match whatever a relevant jurisdiction wants to match? Effectively, you now have a black box operating on your own computer. This is deeply disturbing and not going to end well. And by the way, this is like all the DNA stuff: you're not only at risk from your own computer, but also from all the libraries your picture happens to be in.
olliej · almost 4 years ago

Standard "think of the children" logic. Plenty of countries will put up hashes of LGBT content, "incorrect" religious content, etc.
Cycl0ps · almost 4 years ago

> The proposals are Apple's attempt to find a compromise between its own promise to protect customers' privacy and ongoing demands from governments, law enforcement agencies and child safety campaigners for more assistance in criminal investigations, including terrorism and child pornography.

The 1st and 4th Amendments tell the fed to go pound sand; Apple should follow suit.
smsm42 · almost 4 years ago

So, how long before the same technology is used to fight "misinformation" and "domestic terrorism" (e.g. memes that disagree with the official point of view)? How long until it's used to ban "hateful speech", or any speech unapproved by the gov^H^H^HReal Government - the Almighty Apple?
JohnFen · almost 4 years ago

Wow, this seems like a terrible, terrible idea on so many counts. It would certainly keep me from buying an iPhone -- the last thing I need is to get a visit from the cops because of false positives.

It also means that you can't have any trust about your data safety even if you never put it in the cloud.
forgingahead · almost 4 years ago

https://archive.ph/ys5Q5
mancerayder · almost 4 years ago

Will Google by some miracle be the company that offers freedom from forced snooping? It seems not, so we have a choice between this and Google's always-track-you setup.

I'm sure no one here has any illegal pictures, but the precedent is terrifying. What about copyrighted images or downloaded videos without DRM?
underseacables · almost 4 years ago

This is extremely alarming. I don't care what the reason is, I don't want Apple or anyone else looking through my personal photos. It's none of their business, and the sheer chance of false positives ruining people's lives is far too great to let this go forward.
agilob · almost 4 years ago

Will Erdogan use this to remove pictures where he is presented in a bad light?
tibbydudeza · almost 4 years ago

The dumbest idea ever - so glad that I am an Android user.
xkcd-sucks · almost 4 years ago

So is the phash code actually going to run locally (i.e., you can dump it and figure out the function), or is it only running on iCloud?
Animats · almost 4 years ago
Next, something that scans for pictures of cops, to detect people taking pictures of law enforcement.
jcadam · almost 4 years ago
So now anyone can frame you by sending unsolicited child porn to your iPhone.
wayneftw · almost 4 years ago

> the screening is done on the phone.

Where is my right to refuse to dedicate my device's CPU, battery time and network bandwidth to this activity?

Did I agree to this already?

(Sorry for asking a question! I'll try to simply trust Apple at all times from now on...)
micah94 · almost 4 years ago

So everyone here just has an FT subscription? I don't see anything but a paywall.
rajacombinator · almost 4 years ago

Privacy, except for thoughtcriminals.
kwhitefoot · almost 4 years ago
Are they going to compensate the owners for the CPU time, electricity and bandwidth this will cost?
cptskippy · almost 4 years ago

Does this paywalled article have any further details? Aside from a cryptic Twitter post that mentions CSAM, are there any further details?

CSAM also means Cyber Security Asset Management.
josh_today · almost 4 years ago

I've been meaning to set up a home cloud for pictures on top of a Raspberry Pi. Looks like that will get done sooner than I planned.
MeinBlutIstBlau · almost 4 years ago

Who's to say what is abuse? Are Cesar Millan tactics on a child abuse? This is outright dangerous beyond belief. Nobody is gonna wanna be a parent if you have to worry about your kid being mad at you and then posting something like that.

IIRC there was a teenage girl who lied about her father molesting her back in the 2000s. No evidence, just pure hearsay. The guy was locked up for like 10 years. Now imagine that on a greater scale, with several videos and photos taken out of context.
headShrinker · almost 4 years ago

Apple: An Apple NeuralMatch Agent has positively matched images on device "Apple iPhone 12 (Red™)" geotagged at the coup attempt against Prime Minister Trump occurring on January 30th, 2024. Images and documents have been submitted to the Thump Bureau of Investigation for further evaluation. Your Apple account has been permanently suspended.

You can request a review of Apple's actions; however, we are receiving higher than normal requests at this time. This means we may be unable to review your account.

Sorry for any inconvenience.
runawaybottle · almost 4 years ago

It's 2035, and a major terrorist attack occurs. Apple scans all phones for location and images near the area over the last 12 months and sends the list over, since the government has declared martial law.

You are now suspicious until proven not suspicious. The real terrorists don't even use iPhones. But here we are.

Edit:

Can any legal-heads explain how this does not constitute unlawful search? Apple is not the government, but once they hand the information over, isn't the government indirectly committing unlawful search? Don't you need a warrant for this?

Second Edit:

Wanted to reply to another comment but I'm being rate limited:

*As of June 2016, the Terrorist Watch List [No-Fly List] is estimated to contain over 2,484,442 records, consisting of 1,877,133 individual identities.*

The number of Muslims in America:

*A 2017 study estimated that 3.35 million Muslims were living in the United States, about 1.1 percent of the total U.S. population.*

Lol. Jesus Christ, you do the math.