
Show HN: 59a34eabe31910abfb06f308 – NeuralHash Collision Demo

339 points by mono-bob over 3 years ago

17 comments

smt88 over 3 years ago
This simple site is a far better demo and explanation of the extreme danger of Apple's proposal than any of the long articles written about it.

Thank you for caring enough to put this together and publish it.
umvi over 3 years ago
Seems like this CSAM tech could be super useful in China for detecting Winnie the Pooh or other evidence of thought crime against the regime. Even if Apple doesn't end up rolling it out, I'm sure Huawei is taking careful notes.
firebaze over 3 years ago
This page directly links to the EFF: https://act.eff.org/action/tell-apple-don-t-scan-our-phones

Please spend a few bucks on supporting them.

A bit of background on *why* Apple did this (this was flagged, but I don't know why): https://news.ycombinator.com/item?id=28259622
nullc over 3 years ago
I think Apple may have figured out that the best way to get people to accept backdoored encryption is simply to not call it backdoored, and claim that it's a privacy feature...

...as if having a trillion-dollar corporation playing Batman and going on a vigilante crusade to scan your private files is a situation we should already be comfortable with.
drexlspivey over 3 years ago
Cool! Now do one where the user uploads the image and it tweaks it to find a collision on the fly.
CodesInChaos over 3 years ago
Since the target image is chosen, this is a (second) preimage, not merely a collision.
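The distinction matters for attack cost: a collision only needs a birthday search over roughly 2^(n/2) candidates, while a (second) preimage against a fixed target needs roughly 2^n. A minimal sketch with a deliberately weak 16-bit toy hash (not NeuralHash; all names here are illustrative):

```python
import hashlib
import itertools

def toy_hash(data: bytes) -> bytes:
    """Deliberately weak 16-bit hash: the first two bytes of SHA-256."""
    return hashlib.sha256(data).digest()[:2]

def find_collision():
    """Birthday search: any two distinct inputs sharing a hash (~2^8 tries)."""
    seen = {}
    for i in itertools.count():
        msg = str(i).encode()
        h = toy_hash(msg)
        if h in seen:
            return seen[h], msg
        seen[h] = msg

def find_second_preimage(target: bytes) -> bytes:
    """Brute-force a *different* input matching a chosen target (~2^16 tries)."""
    want = toy_hash(target)
    for i in itertools.count():
        msg = str(i).encode()
        if msg != target and toy_hash(msg) == want:
            return msg

a, b = find_collision()
assert a != b and toy_hash(a) == toy_hash(b)

m = find_second_preimage(b"dog.png")
assert m != b"dog.png" and toy_hash(m) == toy_hash(b"dog.png")
```

Because the demo matches a chosen target image, it is doing the harder second-preimage variant, which is stronger evidence of weakness than a mere collision would be.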
spoonjim over 3 years ago
Imagine hiring a young-looking 18-year-old model to duplicate the photos in the database and create a hash collision. Now you have a photo which is perfectly legal for you to possess but can rain down terror on anyone you can distribute this file to.
advisedwang over 3 years ago
The argument against this tech is a slippery-slope argument: that this technology will eventually be expanded to prevent copyright infringement, censor obscenity, limit political speech, or other areas.

I know this is a controversial take (in HN circles), but I no longer believe this will happen. This kind of tech has existed for a while, and it simply hasn't happened that it's been misapplied. I now think that this technology has proved to be an overall net good.
spullara over 3 years ago
So 30+ images get flagged and they run it against the real CSAM database and it doesn't match? Or let's say someone is able to somehow make an image that gets flagged by both, and someone looks at the image and it isn't CSAM. Nothing happens.
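The point can be made concrete. Apple's published design adds a second, independent server-side hash check plus human review before anything is reported; the sketch below is hypothetical and substitutes an exact cryptographic hash for that second check (the real system uses a different perceptual hash), with all names invented:

```python
import hashlib

def perceptual_hash(data: bytes) -> int:
    """Toy stand-in for a weak on-device perceptual hash: collides easily."""
    return sum(data) % 256

def exact_hash(data: bytes) -> str:
    """Stand-in for the independent server-side check."""
    return hashlib.sha256(data).hexdigest()

def server_review(upload: bytes, known_bad: bytes) -> str:
    """Two-stage pipeline: a forged collision must survive *both* checks."""
    if perceptual_hash(upload) != perceptual_hash(known_bad):
        return "no match"
    if exact_hash(upload) != exact_hash(known_bad):
        return "false positive: discarded"
    return "escalate to human review"

bad = b"\x01\x02"
benign_collision = b"\x02\x01"   # same byte sum, so same toy perceptual hash
assert perceptual_hash(benign_collision) == perceptual_hash(bad)
assert server_review(benign_collision, bad) == "false positive: discarded"
```

In this model a NeuralHash-only collision is filtered out at the second stage, which is the scenario the comment describes.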
WesolyKubeczek over 3 years ago
Each image on the left has a blob vaguely similar to the highlights in the dog image on the right. Likely the "perceptual" algorithm isn't "perceiving" contrast the same way human eyes and brains do.
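That hypothesis is easy to demonstrate on a toy average hash (a stand-in far simpler than NeuralHash, every name below is illustrative): thresholding each pixel against the image mean keeps the spatial pattern but throws away contrast magnitude, so a faint blob and a bold one hash identically.

```python
def ahash(img):
    """Toy average hash: bit i is set iff pixel i is above the image mean."""
    mean = sum(img) / len(img)
    return tuple(int(p > mean) for p in img)

# Same spatial pattern at very different contrast levels (16 "pixels").
faint = [120, 120, 130, 130] * 4   # blob barely brighter than background
bold  = [0, 0, 255, 255] * 4       # the same blob at maximum contrast

assert ahash(faint) == ahash(bold)  # the hash cannot tell them apart
```

A human sees two very different images; the threshold-based hash sees only "which pixels are above average," which is consistent with the blobs the comment describes.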
aliabd over 3 years ago
Here's a web demo[0] where you can try out any two images and see the resulting hashes, and whether there's a collision. You can also try your own transformations (rotation, adding a filter, etc.) on the image. The demo was built using Gradio[1].

[0]: https://huggingface.co/spaces/akhaliq/AppleNeuralHash2ONNX
[1]: https://gradio.dev
DavideNL over 3 years ago
> For example, it's possible to detect political campaign posters or similar images on users' devices by extending the database.

So who controls the database?
seanbarry over 3 years ago
Can somebody please explain to me how one can go about finding images that have colliding hashes? Or how you can create an image to have a specific hash?
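For the real network, published attacks perturb a source image by gradient descent on NeuralHash's output until it matches the target's hash. The mechanics are easier to see on a toy average hash, where the threshold structure can be steered directly; everything below (the names, the 16-pixel "images") is illustrative, not Apple's algorithm:

```python
def ahash(img):
    """Toy 16-bit average hash: bit i is set iff pixel i is above the mean."""
    mean = sum(img) / len(img)
    return tuple(int(p > mean) for p in img)

def forge(target_hash, base=96, spread=80):
    """Construct an image with a chosen hash: push each pixel well above
    or well below the eventual mean according to the wanted bit."""
    return [base + spread if bit else base - spread for bit in target_hash]

dog = [200] * 8 + [30] * 8   # stand-in for the "dog" target image
target = ahash(dog)          # the hash we want to hit: (1,)*8 + (0,)*8
forged = forge(target)       # a different image with the same hash

assert forged != dog
assert ahash(forged) == ahash(dog)
```

The toy shows why a low-dimensional, threshold-based hash is easy to invert; against NeuralHash the same idea is carried out iteratively, nudging pixels along the network's gradient while keeping the image visually close to a benign source.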
spuz over 3 years ago
Apple have stated that they will make the database of hashes that their system uses auditable by researchers. Does anyone know if that has happened yet? Is it possible to view the database and if so, in what form? Can the actual hashes be extracted? If so then that would obviously open up the kind of attack described in the article. Otherwise, it would be interesting to know how Apple expects the database to be auditable without revealing the hashes themselves.
nonbirithm over 3 years ago
Irrespective of whether or not NeuralHash is flawed, should Apple scan user data or should they not?

If not, what is going to convince them to stop at this point?

I believe that they should scan user data in *some* capacity, because this is about data that causes harm to children.

However, I believe that they should *not* run the scan on the device, because that carries significant drawbacks for personal privacy.
1vuio0pswjnm7 over 3 years ago
[deleted]
slg over 3 years ago
Now let's create one for the hash matching that Google, Microsoft, and other cloud providers use.

If your problem with Apple's proposal is the fact that they do hash matching (rather than that the system is run on your device), why is the criticism reserved for Apple instead of being directed at everyone who does hash matching to find CSAM? It seems like a lot of the backlash is because Apple is being open and honest about this process. I worry that this will teach companies that they need to hide this type of functionality in the future.