Expanded Protections for Children – FAQ [pdf]

45 points by almostdigital · almost 4 years ago

11 comments

YokoSix · almost 4 years ago
> Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM? Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes.

The problem is not that Apple can't add images to the database, but that outside organizations can inject any hashes they like into the new, constantly scanning system at the heart of iOS, iPadOS, and macOS. Apple has no way to verify those or any other hashes before they end up in the database.

If the system detects any matches, only some overworked and underpaid content reviewer is there to stop your life from being destroyed by a SWAT team crashing through your front door at 3 a.m. and killing your barking dog. And who knows whether those outsourced review operations are even trustworthy.
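For readers who want the mechanism made concrete, here is a minimal, hypothetical Python sketch of the kind of on-device hash-set matching being discussed. It is not Apple's code: the names are invented, and a cryptographic hash stands in for the perceptual hash (NeuralHash) described in Apple's technical summary. What it illustrates is that the client only ever compares opaque hash values, so nothing on the device can audit where an entry in the supplied set came from.

    # Hypothetical sketch, not Apple's implementation. A cryptographic hash
    # stands in for the perceptual hash used in the real system, purely to
    # keep the example runnable.
    import hashlib
    from typing import Set

    def image_hash(image_bytes: bytes) -> bytes:
        """Stand-in for a perceptual hash of the image."""
        return hashlib.sha256(image_bytes).digest()

    def flag_for_review(image_bytes: bytes, provided_hashes: Set[bytes]) -> bool:
        """Return True if the image's hash appears in the supplied set.

        The device sees only opaque hash values, so it cannot tell whether
        a given entry is validated CSAM or something added upstream.
        """
        return image_hash(image_bytes) in provided_hashes

In the published design, matches are reported only past a threshold and via encrypted safety vouchers, but the limitation the comment points to, namely that the hash list is opaque to the device and its owner, is unchanged by those details.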
gregoriol · almost 4 years ago
There is one question missing: if China asks Apple to flag users who have Winnie-the-Pooh images on their devices, or else leave the Chinese market, what will Apple choose?
avnigo · almost 4 years ago
There are apps, like WhatsApp, that save photos you receive straight to your camera roll.

If somebody, or another compromised device, sends a large collection of CSAM to your device, it will be uploaded to iCloud, probably before you get a chance to remove it -- the equivalent of "swatting".

Beyond the apps you explicitly allow to store photos in your Photos library, what about malware such as Pegasus, which we've seen again and again?

I wonder if we'll start hearing a year from now about journalists, political dissidents, or even candidates running for office going to jail for possession of CSAM. It becomes much easier to take out your opponents when you know Apple will report it for you.

I guess all this does is disincentivize anyone who cares about their privacy from using iCloud Photos, which is sadly ironic, since privacy is what Apple was going for.
tomduncalf · almost 4 years ago
Feels like they messed up the comms on this in a quite un-Apple-like way.

My understanding, at a high level, is that their system is designed to improve user privacy: they no longer need to be able to decrypt photos on iCloud (which is, if I understand correctly, how other cloud providers do this scanning, and which they are required to do by law?), and instead do the matching on the device. Without going into the upsides and downsides of either approach, I'm surprised they didn't communicate more clearly in the initial messaging that this is a "privacy" feature and why they are taking this approach; instead they are left dealing with some quite negative press.
deadalus · almost 4 years ago
Expectation: political rivals and enemies of powerful people will be taken out because c*ild pornography will be found on their phones. Pegasus can already monitor and exfiltrate every ounce of data; it won't be that hard to insert compromising images on an infected device.
jstx1 · almost 4 years ago
> Why is Apple doing this now?

I find the answer to this question unconvincing.

If we think very selfishly from the company's perspective: Apple already had one of the most secure, private, and trusted platforms, and they must have anticipated the backlash against the new feature. So I still don't get why a company like Apple would consider the marginal benefit from this to be worth the cost.
Hamuko · almost 4 years ago
> CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations.

How is Apple validating the datasets from non-US child safety organisations?
AndrewDucker · almost 4 years ago
Something I'd missed before: "By design, this feature only applies to photos that the user chooses to upload to iCloud Photos."

This is not about what people have on their own phones. This is about what people are uploading to iCloud, because Apple does not want CSAM on their servers!
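To illustrate that gating, here is a hypothetical sketch (invented names, not Apple's API) in which the match only runs on the iCloud Photos upload path, so photos that never sync are never checked.

    # Hypothetical sketch of the gating described above; all names are invented.
    import hashlib
    from typing import Set

    def matches_known_hashes(image_bytes: bytes, provided_hashes: Set[bytes]) -> bool:
        """Stand-in for the on-device match against the supplied hash set."""
        return hashlib.sha256(image_bytes).digest() in provided_hashes

    def sync_photo(image_bytes: bytes,
                   provided_hashes: Set[bytes],
                   icloud_photos_enabled: bool) -> bool:
        """Run the check only for photos that are about to sync to iCloud Photos."""
        if not icloud_photos_enabled:
            return False  # photo never leaves the device and is never scanned
        # In the described design a match attaches an encrypted safety voucher
        # to the upload; here we simply report whether a match occurred.
        return matches_known_hashes(image_bytes, provided_hashes)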
rasguanabana · almost 4 years ago
Everything that's wrong with this has already been said by everyone else. Nevertheless, Apple can sugarcoat it as much as they like: there is no *technical* control, neither an actual one nor a possible one, that makes this exclusively about targeting CSAM.
epaga · almost 4 years ago
It's frustrating (though not at all surprising) to see Apple continue to be so tone-deaf. They clearly think, "If only we could make people understand how it works, they wouldn't be so upset; in fact, they'd thank us."

This is not the case: we do understand how it works, and we think it's a bad idea.
DavideNL · almost 4 years ago
The question and answer I'm missing:

Will Apple notify users when an image of theirs has been flagged (possibly erroneously) and is going to be inspected by Apple employees?