
Show HN: 59a34eabe31910abfb06f308 – NeuralHash Collision Demo

339 points by mono-bob · over 3 years ago

17 comments

smt88 · over 3 years ago
This simple site is a far better demo and explanation of the extreme danger of Apple's proposal than any of the long articles written about it.

Thank you for caring enough to put this together and publish it.
umvi · over 3 years ago
Seems like this CSAM tech could be super useful in China for detecting Winnie the Pooh or other evidence of thoughtcrime against the regime. Even if Apple doesn't end up rolling it out, I'm sure Huawei is taking careful notes.
firebaze · over 3 years ago
This page links directly to the EFF: https://act.eff.org/action/tell-apple-don-t-scan-our-phones

Please spend a few bucks on supporting them.

A bit of background on *why* Apple did this (this was flagged, but I don't know why): https://news.ycombinator.com/item?id=28259622
nullc · over 3 years ago
I think Apple may have figured out that the best way to get people to accept backdoored encryption is simply to not call it backdoored, and to claim that it's a privacy feature...

...as if having a trillion-dollar corporation playing Batman and going on a vigilante crusade to scan your private files is a situation we should already be comfortable with.
drexlspivey · over 3 years ago
Cool! Now do one where the user uploads an image and it tweaks it to find a collision on the fly.
CodesInChaos · over 3 years ago
Since the target image is chosen, this is a (second) preimage, not merely a collision.
spoonjim · over 3 years ago
Imagine hiring a young-looking 18-year-old model to duplicate the photos in the database and create a hash collision. Now you have a photo that is perfectly legal for you to possess but that can rain down terror on anyone you distribute the file to.
advisedwang · over 3 years ago
The argument against this tech is a slippery-slope argument: that this technology will eventually be expanded to prevent copyright infringement, censor obscenity, limit political speech, or other areas.

I know this is a controversial take (in HN circles), but I no longer believe this will happen. This kind of tech has existed for a while, and it simply hasn't been misapplied. I now think this technology has proved to be an overall net good.
spullara · over 3 years ago
So 30+ images get flagged, they run them against the real CSAM database, and they don't match? Or say someone somehow makes an image that gets flagged by both, and a reviewer looks at it and it isn't CSAM. Nothing happens.
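For concreteness, here is a minimal sketch of the two-stage flow this comment alludes to. All names are hypothetical, not Apple's actual API; per Apple's public design, roughly 30 on-device NeuralHash matches are needed before anything unseals, after which each match is re-checked server-side with a second, independent perceptual hash and then by a human reviewer:

```python
# Hypothetical sketch of the two-stage check described above (illustrative
# names, not Apple's real pipeline, which also uses threshold secret sharing).
import hashlib

REVIEW_THRESHOLD = 30  # Apple's publicly stated match threshold

def secondary_hash(image_bytes: bytes) -> str:
    # Stand-in for the independent, undisclosed server-side perceptual hash;
    # a NeuralHash-only forgery would also have to collide here.
    return hashlib.sha256(image_bytes).hexdigest()

def should_escalate(flagged: list[bytes], known_secondary: set[str]) -> bool:
    if len(flagged) < REVIEW_THRESHOLD:
        return False  # below the threshold, the safety vouchers stay sealed
    confirmed = [img for img in flagged
                 if secondary_hash(img) in known_secondary]
    # Even past this point, human review of visual derivatives is still
    # required before any report is filed.
    return len(confirmed) >= REVIEW_THRESHOLD
```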
WesolyKubeczek · over 3 years ago
Each image on the left has a blob vaguely similar to the highlights in the dog image on the right. Likely the "perceptual" algorithm isn't "perceiving" contrast the same way human eyes and brains do.
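NeuralHash is a neural network rather than a classical perceptual hash, but a minimal average-hash (aHash) sketch illustrates the general intuition behind this observation: the hash only sees coarse luminance structure, so two images with similar bright/dark blobs can land close together even when human eyes see entirely different pictures.

```python
# Illustrative average-hash sketch (not NeuralHash itself): only the rough
# arrangement of light and dark regions survives the downscaling.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    # Downscale to an 8x8 grayscale thumbnail: all fine detail is gone.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: brighter than the mean or not.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Usage: hamming(average_hash("cat.png"), average_hash("dog.png"))
# A low distance means the two images share coarse luminance blobs.
```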
aliabd · over 3 years ago
Here's a web demo[0] where you can try out any two images and see the resulting hashes, and whether there's a collision. You can also try your own transformations (rotation, adding a filter, etc.) on the image. The demo was built using Gradio[1].

[0]: https://huggingface.co/spaces/akhaliq/AppleNeuralHash2ONNX
[1]: https://gradio.dev
DavideNL · over 3 years ago
&gt; For example, it's possible to detect political campaign posters or similar images on users' devices by extending the database.

So who controls the database?
seanbarry · over 3 years ago
Can somebody please explain how one goes about finding images whose hashes collide? Or how you can craft an image to have a specific hash?
spuz · over 3 years ago
Apple has stated that they will make the database of hashes their system uses auditable by researchers. Does anyone know if that has happened yet? Is it possible to view the database, and if so, in what form? Can the actual hashes be extracted? If so, that would obviously open up the kind of attack described in the article. Otherwise, it would be interesting to know how Apple expects the database to be auditable without revealing the hashes themselves.
nonbirithm · over 3 years ago
Irrespective of whether or not NeuralHash is flawed, should Apple scan user data or should they not?

If not, what is going to convince them to stop at this point?

I believe that they should scan user data in *some* capacity, because this is about data that causes harm to children.

However, I believe that they should *not* run the scan on the device, because that carries significant drawbacks for personal privacy.
1vuio0pswjnm7 · over 3 years ago
[deleted]
slg · over 3 years ago
Now let's create one for the hash matching that Google, Microsoft, and other cloud providers use.

If your problem with Apple's proposal is the fact that they do hash matching (rather than that the system runs on your device), why is the criticism reserved for Apple instead of being directed at everyone who does hash matching to find CSAM? It seems like a lot of the backlash is because Apple is being open and honest about this process. I worry that this will teach companies that they need to hide this type of functionality in the future.