
Expanded Protections for Children

230 points by robbiet480 almost 4 years ago

68 comments

Syonyk almost 4 years ago
Don't shit in my hand and call it chocolate ice cream.

"We're going to scan your photos, on your encrypted device, to look for badness. Right now, we're going to claim that's only for the really icky people that nobody is going to defend, but, hey, once the tech is in place, who's to say we can't scan for dank memes and stuff?"

I think I'm done with Apple. Sad, really. I was hoping that their bowing to China with iCloud wasn't a sign of what's to come, but apparently it was. They had done such nice stuff with privacy too.

Demote my phone to a house phone and go without, I suppose.

Jaxkr almost 4 years ago
This is incredibly disappointing. The sick criminals that run child pornography rings are not storing their material on iCloud.

The "This could be sensitive to view" screen is downright Orwellian. This technology could be used to scan for ANYTHING, completely undermining user privacy. It might just be CP today, but tomorrow it could be screenshots of protest material, whistleblower content, or anti-government memes.

I cannot express how sad I am that Apple has decided to do this. It doesn't protect children, it won't catch any pedophiles, but it certainly WILL be misused in the future and create a chilling effect on what (politically dissident) content people are willing to store on their phones.

userbinator almost 4 years ago
"The road to hell is paved with good intentions."

The gradual but steadily accelerating rise of authoritarianism scares me far more than terrorists, drugs, child abuse, and the pandemic.

Unless we push back mightily, it will be a question of when, not if, owning a general-purpose computer that's not controlled by the government or a company becomes discouraged, suspicious, and eventually illegal.

ComputerGuru almost 4 years ago
Remember, Apple doesn't actually control the CSAM database (NCMEC). They almost certainly don't (and wouldn't want to) even have access to a reverse mapping between hashes and original images. A/The government(s) could (yes, theoretically - this is cryptography we are purportedly talking about; the onus is on them to prove they can't) easily slip in pictures/screenshots of political materials to have your account flagged. Even if it requires manual human review on Apple's end, the government could still (and has in the past) serve them with a warrant + gag order for "false" matches.

Yes, it's all "they could", but with current technical solutions already providing some measure of protection against a corrupt or malicious government cracking down on its citizens, anything that erodes that freedom deserves to be held to such a high standard.

brightball almost 4 years ago
After reading the post, and as a parent of two kids who are in middle school now... I'm pretty happy with what I see. I didn't expect to be, based on the comments I read here before reading the article.

I know a local family whose daughter has been in therapy for the last 3 years because she fell victim to the type of thing Apple is discussing in this post. They are firmly advocates for better parent education and oversight, sharing their experience so that other people can hopefully never have to deal with the same thing. They told us about an app called Bark[1] that's supposed to really help with a lot of this stuff and seems in line with what Apple is talking about here. I'm pretty happy to see it will be built in.

> The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

All the parental controls in the world don't change the fact that getting your kids a phone in this day and age is a pretty terrifying experience if you know what type of things are out there.

> When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

[1] https://www.bark.us/

tptacek almost 4 years ago
They got Mihir Bellare to review (and write a proof for) the private set intersection cryptography pieces of this.

https://www.apple.com/child-safety/pdf/Technical_Assessment_of_CSAM_Detection_Mihir_Bellare.pdf

Apple's official proof was done in part by Dan Boneh:

https://www.apple.com/child-safety/pdf/Apple_PSI_System_Security_Protocol_and_Analysis.pdf

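For intuition, the core trick in DH-style PSI is commutative blinding: each party exponentiates hashed items with its own secret, and items common to both sets end up with identical doubly-blinded values. A toy Python sketch of just that math (integers mod a prime stand in for the elliptic-curve group; this is nowhere near the full protocol in the papers above, which also adds the voucher and threshold machinery):

    import hashlib, secrets

    # Toy DH-style PSI: (H(x)^a)^b == (H(x)^b)^a mod p, so doubly-blinded
    # values are equal exactly when the underlying items are equal.
    P = 2**127 - 1  # Mersenne prime; fine for a demo, NOT for real security

    def h(item: str) -> int:
        return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

    a = secrets.randbelow(P - 3) + 2  # server's secret exponent
    b = secrets.randbelow(P - 3) + 2  # client's secret exponent

    server_set = {"known_hash_1", "known_hash_2", "known_hash_3"}
    client_set = {"known_hash_2", "vacation_photo"}

    # In the real protocol the singly-blinded values are exchanged over the
    # wire and re-blinded by the other side; here we just compute both.
    server_blinded = {pow(pow(h(x), a, P), b, P) for x in server_set}
    client_blinded = {pow(pow(h(x), b, P), a, P) for x in client_set}

    print(len(server_blinded & client_blinded))  # -> 1 (the common item)
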
TravisHusky almost 4 years ago
Wow, I am surprised Apple is taking this route. It's not like iCloud was a haven for sharing illegal content. I love the security features of iOS but honestly this may have pushed me to move to an Android device running GrapheneOS.

Honestly, any time there is a new policy to "protect children" it is almost always incredibly invasive and it always feels like there is some other motive and "protecting children" is used to scare anyone who tries to question it.

PragmaticPulp almost 4 years ago
I really don’t understand why Apple is doing this. The vast majority of their customers aren’t involved in any of these illegal activities, so it only provides potential downside through false positives.

I’m also struggling to imagine scenarios where a child predator is clever enough to acquire illegal photos without triggering any number of internet monitoring mechanisms (e.g. honeypots, server logs with their IP address) who would then turn around and upload those photos to their iCloud account. Doesn’t make sense.

This is a really strange move.

alpb almost 4 years ago
Alternative title: Security researchers voice privacy concerns over Apple's new plan to scan users' data for child abuse images, saying governments will likely increase its scope. https://www.ft.com/content/14440f81-d405-452f-97e2-a81458f5411f

diebeforei485 almost 4 years ago
The first item (on-device nudity detection in conversations) is good and deserves some applause. It's also easy to implement, they could just take an off-the-shelf API and optimize the parameters to their needs, and has minimal privacy concerns because nothing is being reported to Apple. I'm honestly surprised this isn't common already.

It's the part about scanning people's photo libraries that folks are (rightly) concerned about.

netr0ute almost 4 years ago
Sad, more spyware masquerading behind "think of the children."

diebeforei485 almost 4 years ago
Apple's software is quite buggy. There have been multiple instances of Unicode bugs, for example, where receiving a notification or text message containing certain Unicode characters would cause a kernel panic, boot loop, or other fun stuff.

People often posted messages containing these Unicode characters in Discord groups (tagging @everyone) for the lulz, because every iPhone user in that group would get a notification containing those Unicode characters and then kernel panic.

It's only a matter of time until someone finds a bunch of false positives, spams them around for the lulz, and boom, people's iCloud accounts are disabled.

Even if you believe the theory is mathematically sound, the implementation need not be.

cultofmetatron almost 4 years ago
Damn, I was thinking of getting an iPhone. Now I'm thinking of ditching the entire Apple ecosystem altogether.

The idea that Apple can scan files on my system without my consent is pretty sickening. I don't care what it's purportedly used for. This is a slippery slope to all sorts of privacy violations.

lph almost 4 years ago
This is weird. Apple's own announcement only talks about hash matching, but other reporting (e.g., [0]) talks about a system called 'neuralMatch' that's doing AI on user photos. To me, the privacy implications (and chance of false positives) seem quite different. Quite a discrepancy.

[0] https://www.zerohedge.com/technology/apple-plans-monitor-all-us-iphones-evidence-child-porn

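For what it's worth, the two reports may describe the same system: "neuralMatch"/NeuralHash is reportedly a perceptual hash, i.e. one designed so that visually similar images hash to similar values, unlike a cryptographic hash where one changed pixel flips the whole digest. A toy average-hash sketch of the general idea (not Apple's algorithm; assumes the Pillow library is installed):

    from PIL import Image  # pip install Pillow

    def average_hash(path: str, size: int = 8) -> int:
        """Perceptual hash: shrink, grayscale, threshold on the mean.
        Similar images yield a small Hamming distance between hashes."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for px in pixels:
            bits = (bits << 1) | (px > mean)
        return bits

    def hamming(h1: int, h2: int) -> int:
        return bin(h1 ^ h2).count("1")

    # A re-encoded or lightly cropped copy typically stays within a few
    # bits, which is what lets a database of known-image hashes match
    # variants (and is also where false positives come from):
    # d = hamming(average_hash("original.jpg"), average_hash("resaved.jpg"))
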
ncw96 almost 4 years ago
A summary of the photo scanning system:

- Only applies to photos uploaded to iCloud

- Matching against a known set of CSAM (Child Sexual Abuse Material) hashes occurs on-device (as opposed to the on-server matching done by many other providers)

- Multiple matches (unspecified threshold) are required to trigger a manual review of matched photos and potential account suspension

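A hedged sketch of that flow (the helper names and the threshold value here are made up; the real system uses NeuralHash rather than SHA-256, and hides the per-image match bit cryptographically rather than storing it in the clear):

    import hashlib
    from typing import NamedTuple

    THRESHOLD = 30          # illustrative only; Apple has not published the value
    KNOWN_HASHES = {"..."}  # stands in for the blinded NCMEC hash database

    class Voucher(NamedTuple):
        payload: bytes  # encrypted; readable by the server only past THRESHOLD
        matched: bool   # in the real system this bit is cryptographically hidden

    def make_voucher(photo: bytes) -> Voucher:
        # Stand-in hash; the real system uses NeuralHash, a perceptual hash,
        # so near-duplicates of known images also match.
        digest = hashlib.sha256(photo).hexdigest()
        return Voucher(payload=photo, matched=digest in KNOWN_HASHES)

    def server_can_review(vouchers: list[Voucher]) -> bool:
        # Threshold secret sharing means the server can decrypt matched
        # vouchers only once enough accumulate; modeled here as a count.
        return sum(v.matched for v in vouchers) >= THRESHOLD
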
Zelphyr almost 4 years ago
To everyone angry at this change, please take a moment to directly let Apple know how you feel. https://www.apple.com/feedback/

bob1029 almost 4 years ago
> These efforts will evolve and expand over time.

I have no words.

robbiet480 almost 4 years ago
Previous discussion at https://news.ycombinator.com/item?id=28068741

jwiley almost 4 years ago
Who needs the NSO Group?

Repressive regime TODO list:

1: create a child safety organization, or require an existing one to accept your images

2: add images of the children of dissidents (or journalists, or leaders of other political parties), photoshopped to be sexually explicit

3: dissidents' iPhones inform on them. Apple turns the information over to the authorities in the host country

4: if Apple pushes back, threaten iPhone sales. Or just improve your doctoring.

5: if Apple plays along or doesn't complain, insist on the ability to detect terrorists, criminals, etc. Again, threaten iPhone sales, and allow Apple to keep the agreement secret.

This may only work once or twice, but it's worth a shot! If you make it to step 5, you have a really bespoke, beautifully designed, Apple-managed intelligence apparatus. Made with love in Cupertino.

Maybe this is an overly cynical take, and in addition to the cryptography they have rock-solid, audited governance and internal controls that would prevent it and/or insider abuse.

Maybe localities with real data privacy laws (EU) will be able to offer protections to their citizens, with fines big enough that Apple will begrudgingly agree, so that a repressive regime can't target their citizens as well as citizens in the host country.

Maybe this isn't a slippery slope to more exotic forms of surveillance, like scanning your contact list for pedophiles.

ComputerGuru almost 4 years ago
> The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image.

"Additional encrypted data" makes it sound like it's encrypted to keep your information safe, when it actually means "encrypted so you can't see what's being sent" and that "only Apple (and people/governments Apple shares decryption keys with) can decrypt it." And what does "additional encrypted data" even mean? I'm guessing it includes a thumbnail. It could include a list of people you shared the image with and their home addresses. The point is, you don't (and can't) know.

hughrr almost 4 years ago
While I believe there is genuine good intent behind this, I can’t support apparatus that can be repurposed and used to cut me off from my entire digital life due to an arbitrary review process.

Not only that, I do not consent to using the device I purchased for such measures.

I need a grand reset to 2005 levels of technology at this point.

malwarebytess almost 4 years ago
Woah. I definitely don't want this. I don't want Apple trawling through my data.

Terrifying. This will be used in other ways. Whenever you hear "protect the children" as an excuse to increase surveillance, you know they're up to something horrible. The future is bleak AF.

breck almost 4 years ago
> "developed in collaboration with child safety experts"

Prove that. Release all git commits and emails and communication and who was involved in these features. I don't see any reason they would not do that, if this was about "child safety".

My guess is a portion of these "child safety experts" will have emails ending in "nsa.gov".

celeritascelery almost 4 years ago
> Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

Are they saying that they physically can’t access your iCloud until the threshold is reached, or just that they “promise not to”?

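If the quoted description is accurate, it is closer to "physically can't": threshold secret sharing (Shamir's scheme is the classic construction) splits a decryption key so that any t shares reconstruct it while fewer reveal nothing, with each matching voucher carrying one share. A minimal sketch of the scheme itself, not Apple's implementation:

    import secrets

    P = 2**127 - 1  # prime field for the polynomial arithmetic

    def make_shares(secret: int, t: int, n: int) -> list[tuple[int, int]]:
        """Random degree-(t-1) polynomial with f(0) = secret; shares are f(1..n)."""
        coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
        return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
                for x in range(1, n + 1)]

    def reconstruct(shares: list[tuple[int, int]]) -> int:
        """Lagrange interpolation at x = 0."""
        total = 0
        for j, (xj, yj) in enumerate(shares):
            num = den = 1
            for m, (xm, _) in enumerate(shares):
                if m != j:
                    num = num * (-xm) % P
                    den = den * (xj - xm) % P
            total = (total + yj * num * pow(den, -1, P)) % P
        return total

    key = secrets.randbelow(P)
    shares = make_shares(key, t=5, n=30)   # e.g. one share per matching voucher
    assert reconstruct(shares[:5]) == key  # 5 shares suffice
    # 4 shares give a wrong value (with overwhelming probability) and leak
    # nothing about the key:
    assert reconstruct(shares[:4]) != key
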
yellow_lead almost 4 years ago
It's kind of funny they announce this on the first day of Defcon. By next year's conference, hackers will have had a field day with this.

tobyjsullivan almost 4 years ago
Apple has been pro-encryption and pro-privacy lately and is using that as their key differentiator, especially against Google.

Various governments, on the other hand, have been very anti-encryption over the past couple decades. Australia's anti-encryption law[0] is just one example but I'm sure there are many others.

This presents an ongoing threat to Apple's current strategy.

The most common arguments used by governments to justify anti-encryption legislation are child protection and anti-terrorism.

I wonder if this is a tactic by Apple to undermine that common argument and pro-actively protect their rights to employ end-to-end encryption and other privacy features.

[0] https://fee.org/articles/australia-s-unprecedented-encryption-law-is-a-threat-to-global-privacy/

sascha_sl almost 4 years ago
This is brilliant misdirection. The crypto is great, but where and how your content is scanned was never the problem to begin with, or at least only a small part of it.

The main concerns are the accuracy of the detection itself (there are some outlandish one-in-a-billion claims here), the accuracy of the NCMEC database, and whether Apple keeps this system limited to this specific scope.

Interesting aside: I once attempted to get access to PhotoDNA, essentially the only insurance against malicious actors abusing upload fields on your website to "digitally swat" you (as has happened to a Twitch streamer with an open Dropbox folder), and there is no way you'll get access without a department of lawyers. Why is NCMEC so protective of an API with rate limits and automated reporting features, yet would let Apple ship a bloom filter?

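For reference, a Bloom filter is a compact, probabilistic membership test: the device can ask "is this hash in the set?" without the set being enumerable from the filter, and some false-positive rate is inherent by design. A minimal sketch:

    import hashlib

    class BloomFilter:
        def __init__(self, num_bits: int = 1 << 20, num_hashes: int = 7):
            self.m, self.k = num_bits, num_hashes
            self.bits = bytearray(num_bits // 8)

        def _positions(self, item: bytes):
            # Derive k independent positions by salting the hash with a counter.
            for i in range(self.k):
                d = hashlib.sha256(i.to_bytes(4, "big") + item).digest()
                yield int.from_bytes(d, "big") % self.m

        def add(self, item: bytes) -> None:
            for p in self._positions(item):
                self.bits[p // 8] |= 1 << (p % 8)

        def __contains__(self, item: bytes) -> bool:
            # May return True for items never added (a false positive),
            # which is exactly the "digital swatting" worry above.
            return all(self.bits[p // 8] & (1 << (p % 8))
                       for p in self._positions(item))

    bf = BloomFilter()
    bf.add(b"known-image-hash")
    assert b"known-image-hash" in bf
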
option_greek almost 4 years ago
I'm going to bet 100:1 that Android and Windows, with Mac in tow, are going to follow soon with their own implementations (though most likely technology supplied by three-letter agencies) in the next three years.

All that's needed is a government law prohibiting unsecured computing devices, to plug any 'gaps' (cue: devices without TPMs or locked-down bootloaders).

teekert almost 4 years ago
If I understand correctly, pictures of my naked children never match any known hashes, so I should be ok (1-in-a-trillion ok, that is? What is the expectation value for this?). Not that I make such pictures, but perhaps by accident. It’s not that I find it weird, but god, I’m afraid they end up beyond my control.

I don’t use iCloud (only for MS MFA backups); is this affecting me?

I self-host everything; what happens when my account is suspended, does my iPhone still work?

I sure hope we get to learn how effective this has been in, say, a year? It’s quite something that my device is going to scan and check my pictures. I’d really like there to be a large, confirmed benefit for the children.

RegnisGnaw almost 4 years ago
Note that:

1) it's only scanned on upload to iCloud, so if you don't upload then it's not scanned

2) (per another article, https://techcrunch.com/2021/08/05/apple-icloud-photos-scanning/): Most cloud services — Dropbox, Google, and Microsoft to name a few — already scan user files for content that might violate their terms of service or be potentially illegal, like CSAM.

So you really can't opt out unless you avoid all cloud photos.

jszymborski almost 4 years ago
This will probably result in me ditching the iPhone next upgrade.

aparks517 almost 4 years ago
Does Apple have a legal or regulatory requirement to scan iCloud photos for CSAM? I poked around but didn't find a quick answer.

Reading through the two linked PDFs, I got the impression that they're aiming to use cryptographic techniques to meet a (possibly self-imposed) requirement to scan for CSAM while revoking their current ability to decrypt photos stored in iCloud. I guess they may want to revoke this ability so that they can no longer be compelled to hand over customers' photos except those for which the requirements of this new system are met.

I guess if it results in an increase of the privacy of iCloud content, that's nice as far as it goes. But it does skeeve me out to be reminded that my phone can (and does) paw through my stuff.

MonadIsPronad almost 4 years ago
Please, can we just be allowed to control what happens on the hardware we buy? Is that so great a demand?

zug_zug almost 4 years ago
This raises some questions. Suppose my hypothetical 17-year-old daughter does send nudes to her boyfriend, and suppose I as a parent am totally okay with this. If this technology isn't using exact-file matches but heuristics on the file, is there some world where this consensual naked photo gets sent as "evidence" to Apple/NCMEC?

"Apple then manually reviews each report to confirm there is a match, disables the user's account, and sends a report to NCMEC" -- So doesn't "reviewing reports" in this hypothetical mean looking at a child's nudes without consent? The documentation left out the part where this file, stored entirely locally, gets uploaded to Apple. Or does this only happen for iCloud?

resfirestar almost 4 years ago
My main question is whether the system makes it highly impractical to look for one to a small number (let's say 5) of newly introduced images. We have the secret sharing threshold; how easy is it for the server to change this on the fly? Putting aside the principle for a moment (personally I don't like this because it's a cop in your phone, full stop), if we want to practically prevent government abuse to look for leakers (or whatever category you prefer) then it's helpful if Apple can't reasonably comply with an order to find someone who has just 1-5 specific images.

ashleyn almost 4 years ago
Upon reviewing TFA, it appears this is intended for iCloud users as a privacy-focused alternative to CSAM-scanning the images you upload to iCloud on-server. Rather than leaving images unencrypted on-server so that they can be checked, the images are checked on your own device prior to upload (to dramatically simplify it). The sell here is that CSAM checking is then done without revealing the image to Apple... unless it's CSAM, and there's enough of it.

iCloud was already scanning for CSAM; this time they do it without seeing what you upload to the server.

This is essentially encryption with a backdoor, and the chief objection is basically the same as with the Clipper chip/Skipjack. There are privacy issues with the idea of backdoored encryption, as the EFF pointed out. To counter misinformation that is being spread, it's *not* a little AI cop doing blanket scans on your image library and app caches - it is expressly intended as a part of iCloud CSAM detection.

But at the same time it seems trivial for Apple to expand it to that someday, since they just deployed a method capable of doing so. Conceivably they could make the API do a scan whenever an image is saved to disk. This is *not* currently what will be done, but whether or not they will is anyone's guess.

bob1029 almost 4 years ago
I posted in this thread a little bit ago. I couldn't find anything coherent to say at the time. Now I have something to say.

I will be pushing for our current & future customers to use alternative device vendors as part of our product stack. We have developed a cross-platform implementation of our iOS application recently, and this is a great opportunity to start putting it to use.

I don't know if Apple is paying attention, but there are enterprise customers out there paying them $300/yr for the permission to use their presumably-secure devices for purposes of running custom LOB applications. Our organization and product are part of the reason you have financial institutions making bulk purchases of iPads and other Apple accessories.

If Apple makes it impossible for us to provide assurances to our customers that no data is leaving the devices (aside from agreed-upon application protocols), then we have a serious problem. I am not putting myself in a position where I push a customer to purchase 1000+ iPads for their business if I will then have to answer to their auditors 6 months later when sensitive customer document hashes or whatever other horrible things start getting sent to the mothership as part of v2.0 of this shitty idea. Apple even said it themselves: "These efforts will evolve and expand over time."

Is there a special order form if I need to purchase a non-bugged iPad for a classified government setting or very pissed-off enterprise customer? If not, our default recommendation is going to be Microsoft Surface, and we will totally deprecate usage of Apple hardware over the course of the next 18-24 months. This is setting a *completely* unacceptable precedent, and most of our customers I have spoken with today agree.

summerlight almost 4 years ago
I'm not strictly against this if it can effectively help protect children (which needs a completely separate discussion, of course). But Apple's usual privacy/security PR masquerade doesn't seem very compatible with this, especially with the sentence "These efforts will evolve and expand over time".

erhk almost 4 years ago
Sexual predators and sex trafficking are trojan-horse reasons for deeper monitoring and privacy violations. No one would argue against child pornography being bad, and by the transitive property there is an attempt to make arguing against violations of your own privacy equally morally abhorrent.

olliej almost 4 years ago
Can't wait for this to be extended to all other "criminal" content. In many countries that includes LGBT, pro-democracy, etc. materials.

In the US weed is illegal at a federal level, so we can get that as well.

Woo!

And does anyone think this will actually catch any fucking pedos?

wyager almost 4 years ago
> If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.

Oh, joy, if a robotic bureaucrat nukes my personal data, I have the option to file an appeal that will probably never be looked at.

akmarinov almost 4 years ago
If only iCloud backups were encrypted…

Also, this is US only? What about other children?

arriu almost 4 years ago
I'm happy to see people held accountable for being scummy human beings, but this does raise the question of where the line will be drawn and who gets to make that decision.

aluminum96 almost 4 years ago
I'm going to contradict the rest of the commenters here: I strongly support this change. The ease of producing and profiting from child pornography has risen tremendously in recent years due to the advent of technology, and it's perfectly ethical for tech companies to account for their impact and attempt to mitigate it. There is no slippery slope from child porn prohibition to political censorship, as so many other commenters fear.

obblekk almost 4 years ago
I'll take an unpopular stance. This is the least bad way a backdoor scan could be implemented.

At the very least, this makes retroactive scans for newly banned content difficult without explicit deployment by Apple and an explicit software update from a user. In Western countries, this friction is enough to ensure people will be able to file suit in courts to block the government.

In authoritarian governments, it's not, but then, nothing is.

YeBanKo almost 4 years ago
Until this, I was typically defaulting to Apple products for privacy reasons. This seems like a major issue to me. This is an ideal spy tool, disguised as a privacy-preserving feature.

Any government of a country with a large enough market for Apple can force Apple to include hashes in localized versions of the OS.

ryukafalz almost 4 years ago
So my concern about this has less to do with the technology itself, and more to do with...

What happens when lawmakers make this mandatory? In such a way where the user can't be allowed to remove it?

If that ever happens, you've just outlawed phones/computers that allow their users to have ultimate control over the software running on them. No more alternative desktop/mobile OSes.

Oops.

wellthisisgreat almost 4 years ago
Can someone who understands cryptography well enough please comment: can these hash comparisons easily be extended to other areas, such as contextual analysis of photos or texts?

For example, would it be easy now to get to the hypothetical scenario where a text containing certain phrases will be flagged if some partner / regulator demands that?

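On the second question: the matching machinery itself is content-agnostic; anything that can be canonicalized can be hashed against a shipped blocklist of opaque digests. A purely hypothetical sketch of the same pattern applied to text (nothing in Apple's announcement does this; all names here are made up):

    import hashlib, re

    def shingles(text: str, n: int = 5):
        """Overlapping n-word windows over normalized text."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        for i in range(max(len(words) - n + 1, 1)):
            yield " ".join(words[i:i + n])

    def fingerprint(phrase: str) -> str:
        return hashlib.sha256(phrase.encode()).hexdigest()

    # A regulator could ship opaque hashes; the device cannot tell what
    # they represent, only whether local text matches one of them.
    BLOCKLIST = {fingerprint("some banned five word phrase")}

    def flags(text: str) -> bool:
        return any(fingerprint(s) in BLOCKLIST for s in shingles(text))

    print(flags("this contains some banned five word phrase inside it"))  # True
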
kleene_op almost 4 years ago
It's almost comical, the tendency companies/societies/countries have to invoke absolutely abhorrent concepts in order to justify their future unethical behaviors and shield themselves from any valid criticism.

The fact that they played the children card speaks volumes about the level of fuckery they're gonna deploy on this one.

thih9 almost 4 years ago
I like that these features are being introduced with something similar to a press release and are being discussed in the open, as well as from the point of view of privacy.

It all may still end up being a slippery slope; but with this perhaps the chance is slightly lower (than if Apple introduced these changes without as much of a comment).

7373737373 almost 4 years ago
Similar things happening in Europe: https://www.patrick-breyer.de/en/chatcontrol-european-parliament-approves-mass-surveillance-of-private-communications/

threatofrain almost 4 years ago
> The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.

> Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.

ogurechny almost 4 years ago
It's always been "think of the children" for three decades in IT, whether it was encryption, anonymity, or access to information. https://en.wikipedia.org/wiki/Four_Horsemen_of_the_Infocalypse

Almost every government blocking system started for the sake of the children, then expanded for this or that reason. In some cases, it expanded before starting: notoriously, the first sites blocked when Russian government censorship was introduced (to protect our kids from the dangers of the internet!) were well-known media entities that had criticized Putin for a long time. The zest was that they were hurriedly blocked even *before* the relevant acts came into effect. Like, why keep the act after the deed is done? So now there is a government agency that has been blocking oh-so-edgy pictures of nooses on DeviantArt and similar content for years to pretend that they care about the children, and not execute censorship (and sell their extortion services on the side to people who want to remove some information from the Web). Whether that helped any kid is an open question.

angch almost 4 years ago
One new way for swatting.

Send or otherwise inject problematic images to the target, but make sure the target can't see them (image too small, part of a larger collage) while their device does. The target gets flagged by Apple for having CSAM.

The target gets FBI'd.

protoman3000 almost 4 years ago
Can somebody please explain what the big issue here is? I thought this only applies to iMessage, and there are many other possibilities available to communicate securely.

arthurcolle almost 4 years ago
So can I just like disable updates right now and not get this malware when it comes out in the future, or will it be some evil OTA update?

matrix almost 4 years ago
This feature is a big plus for parents. It's hard to appreciate how difficult it is to protect kids online if you don't have kids yourself. I get that those without kids will find it intrusive, but it sounds like these features are opt-in.

I consider iOS to be the best platform for kids today. That said, Apple, if you're listening: please tighten up parental controls around time limits and re-loading apps!

karmicthreat almost 4 years ago
Fantastic, so is Apple going to notify me if someone is looking at the pics on my phone? No?

How about when Apple gets an NSL or other non-disclosable warrant and searches my phone with this tool? Still no?

Probably going to be a lot of leaked nudes by Apple soon.

Copernicron almost 4 years ago
This is completely unacceptable. Apple is giving themselves the ability to monitor and/or censor anything you do on your phone, no matter if it's end-to-end encrypted or not. It's not at all a stretch to go from monitoring iMessage for CSAM to monitoring it for gay pornography, or any pornography at all. Or monitoring for anti-government sentiments.

Sure, they're only scanning for images right now. But there's nothing stopping them from scanning for other things, like Mein Kampf or the writings of Karl Marx. Or some other author who suddenly finds themselves unpopular with the government of the day.

aunty_helen almost 4 years ago
Done with Apple now. I will never own a device that is a revolving door for law enforcement to look into my doings.

This constant yo-yo of privacy marketing and bad privacy decisions is too time-consuming and mentally stressful.

I was waiting to update my 2017 MBP to an M1 16" when they come out, and my iPhone 11 is coming up on 2 years. These products will no longer be Apple products, and idgaf if I have to deal with the Linux desktop and jailbreaking an Android to get it done.

DavideNL almost 4 years ago
I think Apple has good intentions. However, history shows the next step will be detecting less serious types of abuse, crime, etc. This system will expand slowly, step by step, until it is normalized again.

A sad day for privacy and freedom.

Obviously any child porn owners will stop using Apple devices, while the rest of us suffer the consequences.

I always wonder what would happen if Tim Cook retires one day and is replaced by someone like Donald Trump. Once all the tools are in place...

breadsticks3726 almost 4 years ago
As a young adult I have to ask this: you guys really fucked up this hard on defending the privacy landscape? How long do I have until there's a mini space hitler in my pocket that censors and snitches on my activities?

Your actions, or lack thereof, will have grave consequences for billions of people across generations to come.

DaniloDias almost 4 years ago
If you have young children, better not take pics of them, lest Apple accuse you of being a predator.

RIMR almost 4 years ago
It's absolutely a noble cause, but if Apple is able to scan your iCloud content to figure this stuff out, then your iCloud content isn't secure.

jpxw almost 4 years ago
First off, I don’t think this is some evil plan to kill our privacy. I think this project is done with good intentions, if nothing else.

However, I think this is an interesting question: how does Apple know that the hashes they’re supplied match CSAM, and not, say, anti-government material? How would they know if the people they got hashes from started supplying anti-government hashes? Apple will only be receiving the hashes here - by design, even they won’t have access to the underlying content to verify what the hashes are for.

throwawaysea almost 4 years ago
Safetyism, particularly with children, is the vehicle that is most often used to enable bad precedent-setting policies, because it can seem virtuous and morally acceptable in that single application. Here, Apple is using a "think of the children" argument to open the door on their intrusion into customers' private data and policing/moderation of the same. Google started doing this themselves recently (https://news.ycombinator.com/item?id=23275308) and it is just as unacceptable here. I guess that's the end of me using or recommending Apple products.

bdibs almost 4 years ago
Doesn’t this ease the worries from yesterday’s thread? They’ve taken steps to minimize false positives (one in a trillion accounts per year), and it’s not happening on private photos like rumored, only photos that are synced to iCloud (which is what they were doing server side already (?)). I don’t buy the slippery slope argument personally, if you want to get up in arms when they scan for politically motivated content, do so when they actually do that.

aerovistae almost 4 years ago
It never occurred to me before reading this post that someday child porn will inevitably be completely vanquished. Open to contradictory views, but my thinking is that someday, eventually, AI will be advanced enough to identify an image's content unambiguously and without error, just like a human user.

No human could possibly be unable to correctly identify an image of child porn, and someday algorithms will reach that point too. And once they do... it's probably not a big leap from there for browsers and operating systems to start denying the images altogether, just blacking them out and preventing their transmission over networks.

The flipside is that it is impossible for even a human to tell the difference between a 17yo and an 18yo, and moreover impossible for a computer (and arguably for a human) to know whether the user of a device is sending images of themself (i.e. a 17yo sexting with their bf/gf) or whether it's exploitation. That's harder.

So with this new post from Apple, this is going to be pretty shitty for high schoolers trying to sext their boyfriends and girlfriends.