One Bad Apple

1484 points by cwmartin, almost 4 years ago

56 comments

tgsovlerkhgsel, almost 4 years ago
NCMEC has essentially shown that they have zero regard for privacy and called all privacy activists the "screeching voices of the minority". At the same time, they're at the center of a highly opaque, entrenched (often legally mandated) censorship infrastructure that can and will get accounts shut down irrecoverably and possibly people's homes raided, on questionable data:

In one of the previous discussions, I've seen claims about the NCMEC database containing a lot of harmless pictures misclassified as CSAM. This post confirms this again (ctrl-f "macaque").

It also seems like the PhotoDNA hash algorithm is problematic (to the point where it may be possible to trigger false matches).

Now NCMEC seem to be pushing for the development of a technology that would implant an informant in every single one of our devices (mandating the inclusion of this technology is the logical next step, which seems inevitable if Apple launches this).

I'm surprised, and honestly disappointed, that the author seems to still play nice instead of releasing the whitepaper. NCMEC seems to have decided to position itself directly alongside other Enemies of the Internet, and while I can imagine that they're also doing a lot of important and good work, at this point I don't think they're salvageable and would like to see them disbanded.

Really curious how this will play out. I expect attacks either sabotaging these scanning systems by flooding them with false positives, or exploiting them to get the accounts of your enemies shut down permanently by sending them a picture of a macaque.

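The perceptual-hash false matches mentioned above are easier to reason about with a toy example. Below is a minimal average-hash ("aHash") sketch in Python using Pillow; it illustrates the general idea only and is not PhotoDNA or NeuralHash, which are different, proprietary algorithms. The file paths in the usage comment are hypothetical.

    # Minimal perceptual-hash sketch (average hash, "aHash"), for illustration only.
    # It shows why perceptual hashes match "similar-looking" images rather than
    # exact bytes, and therefore why near-collisions (false matches) can happen.
    from PIL import Image

    def average_hash(path, size=8):
        # Downscale to a tiny grayscale thumbnail, then threshold each pixel
        # against the mean brightness to get a 64-bit fingerprint.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p >= mean else 0)
        return bits

    def hamming(a, b):
        # Number of differing bits; a small distance means "probably the same picture".
        return bin(a ^ b).count("1")

    # Two resized/recompressed copies of the same photo usually land within a few
    # bits of each other; occasionally an unrelated photo does too, which is the
    # false-match risk discussed above. (Paths are hypothetical.)
    # print(hamming(average_hash("a.jpg"), average_hash("b.jpg")))
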
defaultname, almost 4 years ago
Good article, however:

"Due to how Apple handles cryptography (for your privacy), it is very hard (if not impossible) for them to access content in your iCloud account. Your content is encrypted in their cloud, and they don't have access. If Apple wants to crack down on CSAM, then they have to do it on your Apple device."

I do not believe this is true. Maybe one day it will be true and Apple is planning for it, but right now iCloud service data is encrypted only in the sense that it is stored encrypted at rest and in transit; Apple holds the keys. We know this given that iCloud backups have been surrendered to authorities, and of course you can log into the web variants to view your photos, calendar, etc. Not to mention that Apple has purportedly been doing the same hash checking on their side for a couple of years.

Thus far there has been no compelling answer as to why Apple needs to do this on device.

cromwellian, almost 4 years ago
I don't see many people pushing back on the child pornography laws themselves, which are the cause of this. I'm stepping into a hornets' nest by even bringing this up, because any criticism of the laws on the books makes one look like a pedo, so I'll preface by saying: child pornography (filmed with actual kids) is vile and disgusting, but it is the production of it that is evil and should be fought and suppressed, not the possession of it. Remember, in the 90s we had to deal with the Communications Decency Act I & II, because every time they want to crack down on the internet, the excuse is always "it's for the children!" And pedophilia is the go-to excuse in a lot of circumstances.

The CPPA had bans on virtual child porn (e.g. using look-alike adult actresses or CGI); that was overturned by SCOTUS, and then Congress responded with the PROTECT Act, which tightened up those provisions. These laws on possession are practically unenforceable with modern technology: peer-to-peer file sharing, onion routing, and encrypted hard drives.

Thus, in order to make them enforceable, the government has to put surveillance at all egress and ingress points of our private/secure enclaves, whether it's at the point of storing content locally or the point of uploading it to the cloud.

While I agree with the goal of eliminating child porn, should it come at the cost of an omnipresent government surveillance system everywhere? One that could be used for future laws that restrict other forms of content? How about anti-vax imagery? Anti-Semitic imagery? And with other governments of the world watching, especially authoritarian governments, how long until China, which had a similar system with Jingwang Weishi (https://en.wikipedia.org/wiki/Jingwang_Weishi), starts asking: hey, can you extend this to Falun Gong, Islamic, Hong Kong resistance, and Tiananmen Square imagery? What if Thailand passes a law that requires Apple to scan for images insulting to the Thai monarch; does Apple comply?

This is a very bad precedent. I liked the Apple that said no to the FBI instead of installing backdoors. I'd prefer Apple get fined and battle all the way to the Supreme Court to resist this.

antman, almost 4 years ago
There are a lot of articles about Apple's hash algorithm, and for me they are mostly irrelevant to the main problem.

The main problem is that Apple has backdoored my device.

More types of bad images or other files will be scanned, since Apple no longer has plausible deniability to push back on any of the government's requests.

In the future, a false(?) positive that happens to match a political file that crept into the list can point a future dictator wannabe at specific people.

It's always about the children or terrorism.

heavyset_go, almost 4 years ago
As a fan of this blog for longer than I can remember, it's refreshing to hear this particular author's take on this issue, especially considering their background.

I'm glad these issues were addressed in a much more elegant way than I would have put them:

> Apple's technical whitepaper is overly technical -- and yet doesn't give enough information for someone to confirm the implementation. (I cover this type of paper in my blog entry, "Oh Baby, Talk Technical To Me" under "Over-Talk".) In effect, it is a proof by cumbersome notation. This plays to a common fallacy: if it looks really technical, then it must be really good. Similarly, one of Apple's reviewers wrote an entire paper full of mathematical symbols and complex variables. (But the paper looks impressive. Remember kids: a mathematical proof is not the same as a code review.)

> Apple claims that there is a "one in one trillion chance per year of incorrectly flagging a given account". I'm calling bullshit on this.

Dah00n, almost 4 years ago
> 18 U.S.C. § 2258A is specific: the data can only be sent to NCMEC. (With 2258A, it is illegal for a service provider to turn over CP photos to the police or the FBI; you can only send it to NCMEC. Then NCMEC will contact the police or FBI.) What Apple has detailed is the intentional distribution (to Apple), collection (at Apple), and access (viewing at Apple) of material that they strongly have reason to believe is CSAM. As it was explained to me by my attorney, that is a felony.

I'm not sure, after reading the article, who has the more insane system: Apple or NCMEC.

IronWolve, almost 4 years ago
There were parents arrested over bath-time and playing-in-the-yard-sprinkler photos being processed at photo labs; will the same thing happen with Apple mistakenly reporting parents?

When I worked in telecom, we had an md5sum database to check for this type of content. If you emailed/SMSed/uploaded a file with the same md5sum, your account was flagged and sent to legal to confirm it.

Also, if police were involved, the account was burned to DVD in the datacenter, and only a police officer would touch the DVD; no engineer touched or saw the evidence. (Chain of evidence maintained.)

Things have probably changed since I haven't worked in telecom in 15 years, but one thing I've read for years is that the feds know who these people are, where they hang out online, and have even run some of the honeypots. The problem is they leave these sites up to catch the ring leaders; the feds are aware, and they have busts of criminal rings almost every month. Twitter has had accounts reported that stay up for years.

I don't think finding the criminals is the problem. It seems like every time this happens, the people involved have been persons of interest for years; there just isn't enough law enforcement dedicated to investigating this.

For all the "defund the police" talk, I think moving some police from traffic duty to Internet crimes would have more of an impact on actual cases being closed. Those crimes lead to racketeering and other organized crime anyway.

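The md5sum workflow described above amounts to an exact-match lookup against a set of known hashes. A minimal sketch in Python follows; the hash entries, file path, and review handler are hypothetical placeholders.

    # Minimal sketch of exact-hash matching against a known-content database,
    # roughly as described above. Hash entries and paths are placeholders.
    import hashlib

    KNOWN_BAD_MD5 = {
        "5d41402abc4b2a76b9719d911017c592",  # placeholder entry, not a real database hash
    }

    def md5_of_file(path, chunk_size=1 << 20):
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    def should_flag(path):
        # Exact-match lookup: changing a single byte changes the MD5, which is
        # why this catches only verbatim copies (unlike perceptual hashes).
        return md5_of_file(path) in KNOWN_BAD_MD5

    # Example: route flagged uploads to a manual/legal review queue.
    # if should_flag("upload.jpg"):
    #     send_to_legal_review("upload.jpg")  # hypothetical handler
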
jolux, almost 4 years ago
> To reiterate: scanning your device is not a privacy risk, but copying files from your device without any notice is definitely a privacy issue.

Not a lawyer, but I believe this part about legality is inaccurate, because they aren't copying your photos without notice. The feature is not harvesting suspect photos from a device; it is attaching data to all photos before they are uploaded to Apple's servers. If you're not using iCloud Photos, the feature will not be activated. Furthermore, they're not knowingly transferring CSAM, because the system is designed only to notify them when a certain "threshold" of suspect images has been crossed.

In this way it's identical in practice to what Google and Facebook are already doing with photos that end up on their servers; they just run the check before the upload instead of after. I certainly have reservations about their technique here, but this argument doesn't add up to me.

robnewman7900, almost 4 years ago
Wow, 20% of images being misclassified as CP would be terrifying. I never thought I'd say it, but my next phone will definitely not be an iPhone or any device that uploads my information to another server without my permission for any reason.

Is there even any evidence that arresting people with the wrong bit pattern on their computer helps stop child rape/trafficking? If so, why aren't we also going after people who go to gore websites? There's tons of awful material out there, easily accessible, of people getting stabbed, shot, murdered, butchered, etc. Do we not want to find people who are keeping collections of this material on their computers? And if so, what about people who really like graphic horror movies like Saw or Hostel? Obviously it's not real violence, but it's definitely close enough, and if you like watching that stuff, maybe you should be on a list? If your neighbor to the left has videos of naked children, and your neighbor to the right has videos of people getting stabbed and tortured to death, should only one be arrested and put on a list?

This is all not even taking into account that someone might not even realize they are in possession of CP because someone else put it on their device. I've heard there are tons of services marketed on the dark net where you pay someone $X00 in bitcoin and they remotely upload CP to any target's computer.

It seems like we are going down a very scary and dangerous path.

cletus, almost 4 years ago
I'm honestly shocked that Apple is buying into this, because it's one of those well-intentioned ideas that is just incredibly bad. It also goes to show you can justify pretty much anything by saying it fights terrorism or child exploitation.

We went through this 20+ years ago, when US companies couldn't export "strong" encryption (meaning stronger than 40 bits, if you can believe that). Even at the time that was ridiculously low.

We then moved on to cryptographic back doors, which seem like a good idea but aren't, for the obvious reason that if a backdoor exists, it will be exploited by someone you didn't intend or used by an authorized party in an unintended way (parallel construction, anyone?).

So these photos exist on Apple servers, but what they're proposing, if I understand it correctly, is that the data will no longer be protected on their servers. That is, human review will be required in some cases. By definition that means the data can be decrypted. Of course it'll be by (or intended to be by) authorized individuals using a secured, audited system.

But a backdoor now exists.

Also, what controls exist on those who have to review the material? What if it's a nude photo of an adult celebrity? How confident are we that someone can't take a snap of that on their own phone and sell it or distribute it online? It doesn't have to be a celebrity either, of course.

Here's another issue: in some jurisdictions it's technically a case of distributing CSAM to have a naked photo of yourself (if you're underage) on your own phone. It's just another overly broad, badly written statute thrown together in the hysteria of "won't anybody think of the children?", but it's still a problem.

Will Apple's system identify such photos and lead to people being prosecuted for their own photos?

What's next after this? Uploading your browsing history to see if you visit any known CSAM trafficking sites or view any such material?

This needs to be killed.

anonymouse008, almost 4 years ago
This feels like missing the forest for the trees. Steve Jobs said many times, to the effect of: 'it doesn't matter how any of this stuff happens, gigahertz, RAM, speeds; it only matters that the user gets what they want.'

Right now Apple's biggest unhappy user is the DOJ. As it stands, with the legislation coming down the pipe and both previous administrations building on a keenness to 'get something done' about big tech, Apple will do as they've done in China and 'obey the laws in each jurisdiction.'

Right now there are a lot of unwritten laws that say Apple had better play along or lose quite a bit more.

So, how it's getting done is a side show.

That said, it wasn't long ago that they stood toe to toe with the FBI, but there also weren't wonderfully strong 'sanctions' on the horizon then.

Klonoar, almost 4 years ago
I appreciate just about everything about this post, but this part keeps getting lost in everything I see written about it:

> As noted, Apple says that they will scan your Apple device for CSAM material. If they find something that they think matches, then they will send it to Apple. The problem is that you don't know which pictures will be sent to Apple.

It's iCloud Photos. Apple has explicitly said it's iCloud Photos. If it's being synced to iCloud Photos, you know it's getting scanned one way or another (server side, currently, or client side, going forward).

The post notes privacy issues, but... iCloud syncs by default. You wouldn't do the kind of work they're talking about (e.g., investigation) and store that kind of material where it could be synced to a server to begin with.

Everyone keeps proclaiming that Apple is scanning your entire device, but that's not what's happening with this change. It's not even comparable to A/V in this respect; it would be a very different story if that were the case. The wording and explanation matter.

initplus, almost 4 years ago
This reads like a failure of the NCMEC, and of the legal system surrounding it.

It is insane that using perceptual hashes is likely illegal. Since the hashes are actually somewhat reversible, possession of the hash is itself a criminal offence. It just shows how twisted up in itself the law is in this area.

One independent image analysis service should not be beating the reporting rates of major service providers. And NCMEC should not be acting like detection is a trade secret. Wider detection and reporting is the goal.

And the law as set up prevents developing detection methods. You cannot legally check the results of your detection (which Apple are doing), as that involves transmitting the content to someone other than NCMEC!

zmmmm, almost 4 years ago
There is so much focus on the technical aspects, such as the probability of a mismatch, etc.

For me the risk is much more that, through some mechanism outside my control, real CSAM material becomes present on my device. Whether it's a dodgy web site, a spam email, a successful hack attempt or something else like that, I feel like there's a significant chance that some day I'll end up with this stuff injected onto my phone without me knowing. So I'm not at all concerned about the technical capacity to accurately match CP. In fact, I'm even more worried if it's really accurate, because then I know that when this unfortunate event happens I face a huge risk of being immediately flagged before I even know about the content, and then spending years extricating myself from a ruined reputation and a legal system that treats evidence like this with far more trust than it deserves.

NKosmatos, almost 4 years ago
Really nice explanation from someone who knows a thing or two about images/photos (Dr. Neal Krawetz is the creator of https://fotoforensics.com and specializes in computer forensics).

g5095, almost 4 years ago
I've been a FOSS dev for 25 years, and I remember when everyone else I worked with was an avid Linux/FreeBSD user because 'we didn't trust the big end of town'. Over the years I've watched the vast majority of devs move to Apple devices for all sorts of 'just works', 'shinier' reasons that just boil down to 'convenience is more important than privacy'.

Perhaps this is just the benefit of longevity, but from my POV it was engineer early adoption and advocacy that made Apple, Google Search, etc. what they are, and it will be engineer early adoption and advocacy that dethrones these problematic companies from controlling the ecosystem.

Twenty years ago, before the community filled with $_$ dollar-struck startup founders, software was built by people who wanted to use it rather than sell it. There are still some people doing this now; look at the Matrix network, for instance.

What will it take for a grass-roots software industry to start building privacy-first apps and systems that don't suck, based on decentralised, distributed principles? We have the skills to build highly polished alternatives to these things, but it takes a determination to step away from convenience for a period of time for the sake of privacy.

How bad does it have to get before the dev community realises this? Or are we in a frog-boiling-slowly scenario and it's hopeless?

twoodfin, almost 4 years ago
The author of this article purports to have done a ton of research into this system, but appears to have missed basic information that I've acquired from a few podcasts.

Namely, the "1-in-a-trillion" false positive rate per account per year is based on the likelihood of multiple photos matching the database (Apple doesn't say how many are required to trip their manual screening threshold).

gscott, almost 4 years ago
Kim Dotcom did this, and it led to the RIAA saying it obligated the file-sharing service to look for all copyrighted materials. This opens Pandora's box for Apple.

ksec, almost 4 years ago
> Think of it this way: Your landlord owns your property, but in the United States, he cannot enter any time he wants. In order to enter, the landlord must have permission, give prior notice, or have cause. Any other reason is trespassing. Moreover, if the landlord takes anything, then it's theft. Apple's license agreement says that they own the operating system, but that doesn't give them permission to search whenever they want or to take content.

Yes, but that analogy only covers Apple's software, not the hardware. In Apple's view they are selling you an experience, so they are more like a hotel: you don't own the hotel room, the bed, the TV or anything else inside that room. And in a hotel, they can do room cleaning any time they want.

robertwt7, almost 4 years ago
> Apple then manually reviews each report to confirm there is a match,

This is always the terrifying part for me. They will access your personal photos or data without telling you. I'm surprised that is even legal given all the laws that already exist. Are they immune to the laws cited in the blog?

Also, what happens when they launch this in the EU, AU, etc., with different privacy laws?

borland, almost 4 years ago
The "legal" section talks about local scanning, and possible transmission of CSAM from devices to Apple in pursuit of verification; however, Apple have made clear that the scanning happens only for files that are being uploaded to iCloud Photo Library -- in which case they are not deliberately transmitting the CSAM but rather flagging something the user already sent them.

Likewise the copyright issue: the user has already sent these files to Apple themselves by enabling iCloud Photo Library, and Apple are not making any additional copies that I am aware of.

It also says "The problem is that you don't know which pictures will be sent to Apple." But we do know exactly which pictures will and will not be sent to Apple: the ones that are already sent by iCloud Photo Library.

[To be clear, I don't like the precedent/slippery slope that this kind of technique might lead towards in the future, but it doesn't seem like all the criticisms of it today are valid.]

dmitryminkovsky, almost 4 years ago
> However, nothing in the iCloud terms of service grants Apple access to your pictures for use in research projects, such as developing a CSAM scanner. (Apple can deploy new beta features, but Apple cannot arbitrarily use your data.) In effect, they don't have access to your content for testing their CSAM system.

> If Apple wants to crack down on CSAM, then they have to do it on your Apple device.

I don't understand... Apple can't change their TOS, but they can install this scanning service on your device?

DethNinja, almost 4 years ago
There is one particular thing I don't understand about this Apple policy:

You can buy a SIM card and send images to your enemies/competitors through WhatsApp, and these images automatically get downloaded to the iPhone and potentially uploaded to iCloud.

What precautions is Apple taking against such actions? Or will it be some kind of exploitable implementation where you can easily swat any person you want and let them go to court to prove their innocence?

xvector, almost 4 years ago
The engineers that worked on this should honestly be ashamed of themselves. We need some sort of oath of ethics in computer science.

kemayo, almost 4 years ago
> To reiterate: scanning your device is not a privacy risk, but copying files from your device without any notice is definitely a privacy issue.

I think the article is wrong about this. Or right but situationally irrelevant. As far as I can tell from Apple's statements, they're doing this only to photos which are being uploaded to iCloud Photos. So any photo this is happening to is one that you've already asked Apple to copy to their servers.

> In this case, Apple has a very strong reason to believe they are transferring CSAM material, and they are sending it to Apple -- not NCMEC.

I also suspect this is a fuzzy area, and anything legal would depend on when they can actually be said to be certain there's illegal material involved.

Apple's process seems to be: someone has uploaded photos to iCloud, and enough of their photos have tripped this system that they get a human review; if the human agrees it's CSAM, they forward it on to law enforcement. There is a chance of false positives, so the human review step seems necessary...

After all, "Apple has hooked up machine learning to automatically report you to the police for child pornography with no human review" would have been a much worse news week for Apple. :D

IshKebab, almost 4 years ago
> Apple claims that there is a "one in one trillion chance per year of incorrectly flagging a given account". I'm calling bullshit on this.

Why? You can get any false positive rate you want if you don't care about the false negative rate.

It seems likely that this was a design criterion, and they just tweaked the thresholds and number of hits required until they got it.

The last analysis on HN about this made the exact same mistake, and it's a pretty obvious one, so I'm skeptical about the rest of their analysis.

It is nice to have some actual numbers from this article, though, about how much CP they report, the usefulness of MD5 hashes, etc.

Edit: reading on, it seems like he just misread. It sounds like he thinks they're saying there's a 1 in a trillion chance of a false positive on a photo, but Apple are talking about an account, which requires multiple photo hits. The false positive rate per photo might be 1 in 1000, but if you need 10 hits then it's fine.

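A rough way to sanity-check that reasoning is to combine a per-photo false-positive rate with a per-account threshold, assuming matches on different photos are independent (a simplification). The numbers below are illustrative, not Apple's actual parameters.

    # Back-of-the-envelope: how a per-photo false-positive rate and a per-account
    # match threshold combine, assuming independent matches. Illustrative only.
    from math import comb

    def p_account_flagged(n_photos, p_photo, threshold):
        # P(at least `threshold` false matches among n_photos): complement of the
        # binomial CDF, summing only the small-k terms to avoid huge intermediates.
        p_below = sum(
            comb(n_photos, k) * p_photo**k * (1 - p_photo)**(n_photos - k)
            for k in range(threshold)
        )
        return 1 - p_below

    # 10,000 photos in a year, 1-in-1,000 per-photo false match, 10 hits required:
    print(p_account_flagged(10_000, 1e-3, 10))  # ~0.54: the threshold barely helps here
    # Same library and threshold with a 1-in-10,000 per-photo rate:
    print(p_account_flagged(10_000, 1e-4, 10))  # ~1e-7: seven orders of magnitude better
    # Reaching 1-in-a-trillion per account needs a far lower per-photo rate and/or
    # a higher threshold, which is presumably the kind of tuning being described.

The point the sketch makes is the same one as the comment above: the account-level figure is extremely sensitive to the per-photo rate and the threshold, so it can be engineered to almost any target.
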
celeritascelery, almost 4 years ago
This made me think of a good point regarding Apple's claimed error rate of "one in a trillion". If they are so confident that there won't be false positives, why bother sending matches to Apple for manual review? Clearly they think the error rate will be high enough that a human has to double-check everything. But that is not what they are saying.

therealmarv, almost 4 years ago
If this is true and the content of pictures can be reconstructed from the hash into a 26x26 image, this is ethically a total nightmare. Nobody wants to carry a phone with child porn on it; disgusting if you think about it.

I don't even want to imagine what very religious people and countries will think about the iPhone then.

judge2020, almost 4 years ago
> As it was explained to me by my attorney, that is a felony.

Apple could argue they were already going to receive the photo (since this algorithm only affects photos destined for iCloud Photos), and thus "when in the upload process it was classified" is simply technological semantics.

> This was followed by a memo leak, allegedly from NCMEC to Apple:

Well, we certainly are the minority. If a majority of people knew and were mad, we'd have protests in major cities.

mlindner, almost 4 years ago
That PhotoDNA is reversible is ridiculously shocking to me...

easterncalculus, almost 4 years ago
I'm just wondering how long it will take before average people, jokingly or otherwise, imply that buying an Android phone makes you a criminal.

vpmpaul, almost 4 years ago
Maybe you can answer this: what proportion of the CP pictures are actual CP? Are teens posting selfies of themselves in swimsuits or revealing clothes being submitted? People with their kids in the bathtub/pool? Are most of these pictures real, give-you-nightmares CP?

It seems insane to me that anyone would knowingly upload CP to a forensics site on purpose, much less several times a day.

hdjjhhvvhga, almost 4 years ago
> If someone were to release code that reverses NCMEC hashes into pictures, then everyone in possession of NCMEC's PhotoDNA hashes would be in possession of child pornography.

Please correct me if I'm wrong, but wouldn't it be more correct to say they "would be in possession of images recognized by PhotoDNA as child pornography" rather than actual CP?

Vorh, almost 4 years ago
https://web.archive.org/web/20210808233609/https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html (I can't seem to access archive.is, so archive.org)

starchild_3001, almost 4 years ago
Bad Apple it is. Also hypocritical Apple, with their fake (or at the very least misleading) privacy stance against internet advertising.

andrewmcwatters, almost 4 years ago
1 in 1 trillion of what?

https://i.imgur.com/E1YRsXQ.jpg

SPBS, almost 4 years ago
> Apple's license agreement says that they own the operating system, but that doesn't give them permission to search whenever they want or to take content.

If this is true, how has this gotten past Apple's legal team? Are they not aware it would be a flagrant violation of the law?

charles_f, almost 4 years ago
> So where else could they get 1 trillion pictures?

That's the real kicker in my opinion. Unless they get training data from NCMEC, I struggle to understand how they're training their model. Unless it's entirely algorithmic and not based on ML?

jscipione, almost 4 years ago
What assurances do we have that this system will not be used to flag "extremist content" (i.e. memes) and to send that information to law enforcement in the future?

blunte, almost 4 years ago
Somewhat tangential, but people like this author amaze me with their deep knowledge AND their ability to communicate it well.

Also, none of this topic is something I would want to deal with.

mistermarcus, almost 4 years ago
You believe CP is evil. You believe privacy is sacred. You are Tim Cook. What do you do?

If this is a sincere effort, clearly Apple has failed to thread the needle. This announcement could also be a fumbled attempt to reframe what is already a common practice at the company, to get ahead of a leak. I heard from a friend who's related to an Apple employee that Apple already scans the mountains of data on its servers for "market research". The claims otherwise... marketing gambits.

ChrisMarshallNY, almost 4 years ago
Here's the gist of a post I made a couple of days ago (I removed one sentence that someone considered inflammatory):

Having known many victims of sexual violence and trafficking (seriously, I deal with them several times a week, and have for decades), I feel for the folks that honestly want that particular kind of crime to stop. Humans can be complete scum. Most folks in this community may think they know how low we can go, but you are likely being optimistic.

That said, law enforcement has a nasty habit of having a rather "binary" worldview. People are either cops or uncaught criminals.

With that worldview, it can be quite easy to "blur the line" between child sex traffickers and traffic ticket violators. I remember reading a The Register article about how anti-terrorism tools were being abused by local town councils to do things like find zoning violations (for example, pools with no CO).

Misapplied laws can be much worse than letting some criminals go. This could easily become a nightmare if we cede too much to AI.

And that isn't even talking about totalitarian regimes, run by people of the same ilk as child sex traffickers (only wearing Gucci and living in palaces).

"Any proposal must be viewed as follows. Do not pay overly much attention to the benefits that might be delivered were the law in question to be properly enforced, rather one needs to consider the harm done by the improper enforcement of this particular piece of legislation, whatever it might be." -Lyndon B. Johnson

[EDITED TO ADD]: And I 100% agree that, if we really want to help children, and victims of other crimes, then we need to start working on the root causes of the issues.

Poverty is, arguably, the #1 human problem on Earth today. It causes levels of desperation that ignore things like climate change, resource shortages, and pollution. People are so desperate to get out of the living hell that 90% of the world experiences daily that they will do anything (like sell children for sex), or are angry enough to cause great harm.

If we really want to solve a significant number of world problems, we need to deal with poverty, and that is not simple at all. I have members of my family that have been working on that for decades. I have heard all the "yeah... but" arguments that expose supposedly simple solutions as... not simple.

Of course, the biggest issue is that the folks in the 0.0001% need to loosen their hands on their stashes, and that ain't happening anytime soon. I don't know if the demographic represented by the tech scene is up for that, since the 0.0001% are our heroes.

HALtheWise, almost 4 years ago
Given that the weights for the NeuralHash algorithm are being shipped to every iOS device, and neural network adversarial attacks are pretty well studied in the literature, it should be trivial to make a website or Android camera app that tweaks a few pixels of an image to make it have a hash collision with Apple's CSAM database. If anyone seriously wants to kill this initiative, widely distributing a few memes with hash collisions could go a long way.

Little_John, almost 4 years ago
I hear a siren outside. Someone has a serious problem. Don't just shine it on.

chris_wot, almost 4 years ago
If they file even one false claim, then it ruins lives.

netr0ute, almost 4 years ago
To help fight back against false positives, why not just repeatedly trigger the code that sends the data to NCMEC (per the article's claimed legal requirements) and create a DoS attack?

max_entropy, almost 4 years ago
Imagine being a victim of CP, knowing that all over the world people hold in their hands devices storing artifacts of your terror.

fspacef, almost 4 years ago
Great title.

ikhanhmai, almost 4 years ago
awesome product!

ikhanhmai, almost 4 years ago
cool

a-dub, almost 4 years ago
So maybe I'm confused, but I thought it worked like this.

Pre this thing:

* Before syncing photos to iCloud, the device encrypts them with a device-local key, so they sit on Apple's servers encrypted at rest, and Apple cannot look at them unless they push an update to your device that sends them your key or uploads your photos unencrypted somewhere else.

After this thing:

* Before syncing photos to iCloud, the device encrypts them, but there are essentially two keys: one on your device, and one that can be derived on their servers under special circumstances. The device also hashes the image, but using one of these fancy hashes that are invariant to crops, rotations, translations and noise (like Shazam, but for pictures).

* The encrypted photo is uploaded along with the hash (in a special crypto container).

* Their service scans all the hashes, but uses crypto magic that does the following:

1) It does some homomorphic encryption thing where they don't actually see the hash, but they get something like a zero-knowledge proof if the image's hash (uploaded along with the image in the "special crypto container") is in their list of bad stuff.

2) If enough of these hit, then a key pops out of this process that lets them decrypt the images that were hits.

3) The images get added to a list where a room full of unfortunate human beings look at them and confirm that there's nothing good going on in those photos.

4) They alert law enforcement.

A couple of points of confusion for me:

1) I'm assuming they get their "one in a trillion" thing based on two factors: one being the known false positive rate of their perceptual hashing method, and the other being their tunable number of hits necessary to trigger decryption. So they regularly benchmark their perceptual hash thing and compute a false positive rate, and then adjust the threshold to keep the overall system false positive probability where they want it?

2) All the user's photos are stored encrypted at rest; it seems that this thing isn't client-side scanning, but rather assistance for server-side scanning of end-to-end encrypted content put on their servers.

First off, I think it's actually pretty cool that a consumer products company offers end-to-end encrypted cloud backup of your photos. I don't think Google does this, or anyone else; they can just scan images on the server. Second, this is some pretty cool engineering (if I understand it correctly). They're providing more privacy than their competition, and they've given themselves a way to ensure that they're not in violation of the law by hosting CSAM for their customers.

But I guess the big question is: if people don't like this, can't they just disable iCloud?

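The "a key pops out once enough hashes hit" step in point 2 is, at its core, a threshold scheme. Apple's actual construction (threshold private set intersection with "safety vouchers") is more involved, but the underlying k-of-n idea can be sketched with plain Shamir secret sharing; the key value and parameters below are arbitrary placeholders.

    # Minimal k-of-n threshold secret sharing sketch (Shamir's scheme) illustrating
    # the "a decryption key only appears after enough matches" idea. This is NOT
    # Apple's actual construction, just the underlying k-of-n principle.
    import random

    PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy secret

    def make_shares(secret, k, n):
        # Random polynomial of degree k-1 with constant term = secret.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
        def f(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, n + 1)]

    def recover(shares):
        # Lagrange interpolation at x = 0; needs at least k distinct shares.
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    key = 123456789                          # stand-in for a decryption key
    shares = make_shares(key, k=10, n=30)    # imagine one share released per "match"
    print(recover(shares[:10]) == key)       # True: 10 matches reveal the key
    print(recover(shares[:9]) == key)        # False: 9 matches reveal nothing useful

Below the threshold, the individual shares carry no information about the key, which is the property that lets a server hold them without being able to decrypt anything.
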
eh9, almost 4 years ago
I really HATE the usage of "CP"; it's abuse and has nothing to do with pornography.

neilv, almost 4 years ago
I'm concerned about the personal security risks of this move by Apple (e.g., swatting-like attacks).

Just wasted the weekend on this, and now asking for options: https://news.ycombinator.com/item?id=28111995

drivingmenuts, almost 4 years ago
I think Apple may have really screwed the pooch with this move. They're going to catch hell for being able to take images from your device without warning or consent, and they'll catch hell if they remove the feature.

Worse, they've also opened the door to government censorship of images and content, and propped that door wide open.

cratermoon, almost 4 years ago
Question: who is, or will be, making money on this deal? Answer that ("follow the money") and then I think we'll have a handle on what's really going on.

iJohnDoe, almost 4 years ago
I haven't read all the details, articles, and comments. My personal thoughts on the whole situation are the following.

If you are a parent and lose a child, you would want every possible avenue taken to find your child. You would be going mad wanting to find them. If there is a way to match photos to known missing children, then I say it should at least be tried.

I equate this to Ring cameras. They are everywhere. You cannot go for a walk without showing up on dozens of cameras, and we know Amazon ("god mode") and law enforcement abuse their access privileges. However, if a crime happened to you and a Ring camera captured it, I know almost everyone would certainly want that footage reviewed. Would you ignore the Ring footage possibility just because you despise Ring cameras? Probably not.

It's all an invasion of privacy until you're sitting on the other side of the table, where you have a vested interest in getting access to the information.