
Expanded Protections for Children – FAQ [pdf]

45 points by almostdigital, almost 4 years ago

11 comments

YokoSix, almost 4 years ago
> Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?

> Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes.

The problem is not that Apple can't add images to the database, but that outside organizations can inject arbitrary hashes into the new, always-scanning system at the heart of iOS, iPadOS and macOS. Apple has no way to verify those or any other hashes before they are injected into the database.

If the system detects any matches, only some overworked and underpaid content checker from Bangladesh is there to stop your life from being destroyed by a SWAT team crashing through your front door at 3am and killing your barking dog. And who knows whether those foreign sweatshops are even trustworthy.
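To make the concern concrete, here is a minimal sketch in Python of client-side matching against an opaque hash blocklist. It is not Apple's implementation, all names are hypothetical, and a cryptographic hash stands in for a real perceptual hash such as NeuralHash (which, unlike this placeholder, is robust to resizing and re-encoding). The point is that the matching code cannot tell why a hash is in the set; whoever supplies the set decides what gets flagged.

    import hashlib

    def perceptual_hash(image_bytes: bytes) -> str:
        # Placeholder for a perceptual hash; SHA-256 is used here only to
        # keep the sketch self-contained and runnable.
        return hashlib.sha256(image_bytes).hexdigest()

    def should_flag(image_bytes: bytes, blocklist: set) -> bool:
        # The device only sees opaque hashes: it has no way to verify that an
        # entry corresponds to CSAM rather than, say, a political image.
        return perceptual_hash(image_bytes) in blocklist

    # The blocklist is supplied to the device as-is.
    supplied_blocklist = {perceptual_hash(b"known flagged image bytes")}
    print(should_flag(b"known flagged image bytes", supplied_blocklist))  # True
    print(should_flag(b"some other photo", supplied_blocklist))           # False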
gregoriol, almost 4 years ago
There is one question missing: if China asks Apple to flag users who have Winnie-the-Pooh images on their devices, or else leave the Chinese market, what will Apple choose?
avnigo, almost 4 years ago
There are apps, like WhatsApp, that allow you to save photos you receive to your camera roll instantly.

If somebody, or another compromised device, sends a large collection of CSAM to your device, the images will be uploaded to iCloud, probably before you get a chance to remove them -- the equivalent of "swatting".

Besides the apps that you give permission to store photos in your Photos app, what about malware such as Pegasus, which we've seen again and again?

I wonder if a year from now we'll start hearing about journalists, political dissidents, or even candidates running for office going to jail for being in possession of CSAM. It would be much easier to take out your opponents when you know Apple will report it for you.

I guess all this does is disincentivize anyone who cares about their privacy from using iCloud Photos, which is sadly ironic, since privacy is what Apple was going for.
tomduncalf, almost 4 years ago
Feels like they messed up the comms on this in a quite un-Apple-like way.

My understanding (at a high level) is that their system is designed to improve user privacy: it means they don't need to be able to decrypt photos on iCloud (which, if I understand correctly, is how other cloud providers do this scanning, which they are required to do by law?), and can instead do it on the device. Without going into the upsides and downsides of either approach, I'm surprised they didn't manage to communicate more clearly in the initial messaging that this is a "privacy" feature and why they are taking this approach, and instead are left dealing with some quite negative press.
deadalus, almost 4 years ago
Expectation: political rivals and enemies of powerful people will be taken out because c*ild pornography will be found on their phones. Pegasus can already monitor and exfiltrate every ounce of data right now; it won't be that hard to insert compromising images on an infected device.
jstx1, almost 4 years ago
> Why is Apple doing this now?

I find the answer to this question unconvincing.

If we think very selfishly from the company's perspective: Apple already had one of the most secure, private, and trusted platforms, and they must have anticipated the backlash against the new feature. So I still don't get why a company like Apple would consider the marginal benefit from this to be worth the cost.
Hamuko, almost 4 years ago
> CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations.

How is Apple validating the datasets for non-US child safety organisations?
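One conceivable safeguard (a hypothetical sketch only, not a statement about Apple's actual pipeline) is to ship only hashes that appear in the databases of at least two independent child-safety organizations, so that no single organization can unilaterally add an entry. In Python, with made-up data:

    def build_matching_set(org_databases, min_sources=2):
        # Keep only hashes vouched for by at least `min_sources` organizations.
        counts = {}
        for db in org_databases:
            for h in db:
                counts[h] = counts.get(h, 0) + 1
        return {h for h, n in counts.items() if n >= min_sources}

    # A hash supplied by only one organization never reaches devices.
    org_a = {"hash_aaa", "hash_bbb", "hash_ccc"}
    org_b = {"hash_bbb", "hash_ccc", "hash_ddd"}
    print(build_matching_set([org_a, org_b]))  # {'hash_bbb', 'hash_ccc'}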
AndrewDucker, almost 4 years ago
Something I'd missed before: "By design, this feature only applies to photos that the user chooses to upload to iCloud Photos."

This is not about what people have on their own phones. This is about what people are uploading to iCloud, because Apple does not want CSAM on their servers!
rasguanabana, almost 4 years ago
Everyone has already said everything that is wrong about it. Nevertheless, Apple can sugarcoat it as much as they like: there is no *technical* control (neither an existing one nor a possible one) that makes this exclusively about targeting CSAM.
epaga, almost 4 years ago
It's frustrating (though not at all surprising) to see Apple continue to be so tone-deaf. They clearly think, "If only we could make people understand how it works, they wouldn't be so upset; in fact they'd thank us."

This is not the case - we do understand how it works, and we think it's a bad idea.
DavideNL, almost 4 years ago
The question and answer I'm missing is:

Will Apple notify a user once an image has been (possibly erroneously) flagged and will be inspected by Apple employees?