The deceptive PR behind Apple’s “expanded protections for children”

753 points, by arespredator, almost 4 years ago

44 comments

querez, almost 4 years ago

I have a newborn at home, and like every other parent, we take thousands of pictures and videos of our newest family member. We took pictures of the very first baby bath. So now I have pictures of a naked baby on my phone. Does that mean that pictures of my newborn baby will be uploaded to Apple for further analysis, potentially stored indefinitely, and shared with law enforcement?

ekianjo, almost 4 years ago

> The worst part is: how do I put my money where my mouth is? Am I going back to using Linux on the desktop (2022 will be the year of Linux on the desktop, remember), debugging wifi drivers and tirelessly trying to make resume-from-suspend work?

Oh come on. Don't make it sound like it's that bad. Wifi has been a solved problem for a long time now, and you can buy Lenovo, System76, or Tuxedo if you want to make sure things work 100% as expected. Don't be that guy.

farmerstan, almost 4 years ago

Whoever controls the hash list controls your phone from now on. Period. End of sentence.

Apple has not disclosed who gets to add new hashes to the list of CSAM hashes or what the process is for adding them. Do different countries have different hash lists?

Because if the FBI or CIA or CCP or KSA wants to arrest you, all they need to do is inject the hash of one of your photos into the "list" and you will be flagged.

Given the nature of the hash, they can't even tell you which photo triggered the match. Instead, they get to arrest you, make an entire copy of your phone, etc.

It's insidious. And it's stupid. That Apple is agreeing to do this is disgusting.

And it doesn't make sense. If I were a pedophile and I took a new CSAM photo, how long would it take for that specific photo to get on the list? Months? Years? As long as pedophiles know that their phones are being scanned, they won't use iPhones for their photos. And then it will be only innocent people like me getting scanned for CSAM, with the potential for that to be used against me in the future.

If they really cared about CSAM, this feature is useless and stupid. All it does is make regular people vulnerable to Big Brother tactics, which we already know exist.

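[Editor's note] The comment's core claim can be made concrete with a toy sketch. This is a deliberate simplification with hypothetical names: Apple's actual system uses a perceptual NeuralHash inside a private set intersection protocol, not a plain digest lookup. The point illustrated is only that whoever can insert entries into the blocklist decides what gets flagged.

```python
import hashlib

def digest(image_bytes: bytes) -> str:
    """Stand-in for the matching function; the real system uses a
    perceptual hash, not SHA-256."""
    return hashlib.sha256(image_bytes).hexdigest()

# Whoever controls this set controls what gets flagged.
blocklist = {digest(b"known-csam-image-bytes")}

def is_flagged(image_bytes: bytes) -> bool:
    return digest(image_bytes) in blocklist

# Injecting the digest of a dissident's photo flags that photo too,
# with no change to the scanning code on the device.
blocklist.add(digest(b"dissident-pamphlet-photo"))
```

Under this model, auditing the scanner's code tells you nothing; only auditing the list would.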
Barrin92, almost 4 years ago

> The worst part is: how do I put my money where my mouth is? Am I going back to using Linux on the desktop (2022 will be the year of Linux on the desktop, remember)

People really need to retire this meme. On the desktop, particularly as a dev environment, Linux is completely fine at this point. I can understand people not wanting to run a custom phone OS, because that really is a ton of work, but for working software developers Fedora, Ubuntu, or any mainstream distro is at this point largely hassle-free.

Componica, almost 4 years ago

Imagine taking a photo, or having in your gallery a photo, that a dear leader doesn't want to spread. Ten minutes later you hear a knock at your door. That's what I'm most worried about: how is this not creating the infrastructure to ensnare political dissidents?

ekianjo, almost 4 years ago

> Am I getting a Pixel and putting GrapheneOS on it like a total nerd? FUCK.

Thanks for depicting people who care about privacy and act on their beliefs as "total nerds". That's an encouraging attitude.

tuatoru, almost 4 years ago

Unless Apple can demonstrate that the techniques they are using are *intrinsically* specific to CSAM and to CSAM only (that is, that the techniques cannot work for any other kind of photo or text), slippery-slope arguments are perfectly valid and cannot be denied.

Apple is a private company, and as such its actions amount to vigilantism.

arespredator, almost 4 years ago

Hi, post author here.

To anyone upset or offended by the Linux/nerd paragraph: please chill, and please forgive my tone.

I am indeed a nerd myself, and what I wanted to convey with that not-as-funny-as-expected paragraph was that "going full nerd" is not a solution. There are ways to protect your privacy that will not be available to less tech-savvy people, and that's a problem. The HN crowd will use Thinkpads with Arch on them, and phones with Graphene or whatever, but most people won't.

Yours, Absolute nerd and lover of desktop Linux since SuSE 6.0

atbpaca, almost 4 years ago

I doubt Apple has not thought about the PR & policy consequences of such an iPhone backdoor. To me, it's even sadder to see Apple using the fight against CSAM, a noble cause, as a shield and a way to convince the masses that breaking its promise to protect privacy is OK. "What happens on your iPhone stays on your iPhone [no longer]". There is no court oversight, no laws; it's automated mass surveillance.

dev_tty01, almost 4 years ago

There is a great deal of misinformation and confusion on this topic. Here is a good interview with Apple's head of privacy:

https://techcrunch.com/2021/08/10/interview-apples-head-of-privacy-details-child-abuse-detection-and-messages-safety-features/

wpdev_63, almost 4 years ago

I used to always get the latest and greatest iPhone, but with the politics and everything that's going on, why would I want to spend more than the absolute minimum on my cellphone? There are plenty of wholesome things to spend money on other than tech.

severak_cz, almost 4 years ago

> The hypothesis that I have is that Apple wishes to distance itself from checking users' data.

This is the best explanation of the whole situation I have read.

zug_zug, almost 4 years ago

Good thing this didn't exist in 1776, or I'd be living in Great Britain.

FabHK, almost 4 years ago

From *A Concrete-Security Analysis of the Apple PSI Protocol*:

> Taking action to limit CSAM is a laudable step. But its implementation needs some care. Naively done, it requires scanning the photos of all iCloud users. But our photos are personal, recording events, moments and people in our lives. Users expect and desire that these remain private from Apple. Reciprocally, the database of CSAM photos should not be made public or become known to the user. Apple has found a way to detect and report CSAM offenders while respecting these privacy constraints. When the number of user photos that are in the CSAM database exceeds the threshold, the system is able to detect and report this. Yet a user photo that is not in the CSAM database remains invisible to the system, and users do not learn the contents of the CSAM database.

https://www.apple.com/child-safety/pdf/Alternative_Security_Proof_of_Apple_PSI_System_Mihir_Bellare.pdf

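[Editor's note] The threshold property quoted above can be sketched in a few lines. This is a heavy simplification for illustration only: in the real protocol the counting happens cryptographically (threshold secret sharing inside a private set intersection), so Apple learns nothing at all until the threshold is crossed, whereas here the matches are counted in the clear.

```python
def should_report(photo_hashes: list, db_hashes: set, threshold: int) -> bool:
    """Report an account only once the number of matching photos
    reaches the threshold; below it, matches stay invisible."""
    matches = sum(1 for h in photo_hashes if h in db_hashes)
    return matches >= threshold

db = {"h1", "h2", "h3"}           # toy stand-in for the CSAM hash database
quiet = should_report(["h1", "x", "y"], db, threshold=2)   # 1 match: silent
loud = should_report(["h1", "h2", "x"], db, threshold=2)   # 2 matches: report
```

The security argument in the quoted paper is precisely that this threshold behavior can be achieved without the server ever seeing the per-photo match results.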
rambambram, almost 4 years ago

So Apple is going to take care of positive matches with highly reliable and trained personnel? Just like their highly trained personnel who kept the App Store clean of shitty apps? :')

id5j1ynz, almost 4 years ago

Apple implemented a backdoor that scans your photos on your device, then alerts Apple and the authorities if there is a match against an un-auditable list of reference photos.

Currently it's activated for CSAM only, and it only scans photos backed up to iCloud.

That's the framing I prefer, and the one that much better explains the issue with it.

roody15, almost 4 years ago

Apple is not the police. Apple is not an extension of the United States government. There is simply no reason for Apple to enable local scanning for any content whatsoever.

johnvaluk, almost 4 years ago

My common sense is tingling, telling me that Apple's eventual move will be one of malicious compliance: finally implementing E2EE in a way that provides them with plausible deniability and users with a much-desired privacy enhancement.

GiorgioG, almost 4 years ago

I turned off iCloud Photos tonight. F** Apple. If there is a collision, it gets manually reviewed by a human... so now my private pictures are on display for someone I've not given permission to see them. Just say no.

raxxorrax, almost 4 years ago

On the internet almost everyone wants to extract money from kids and their parents, and they try to hook them with different mechanisms. That is also true for Apple, although they appeal to the protective instincts of their guardians.

I get why a safe environment is appealing. Parents know that their kids get milked by virtual goods in games or on social media and don't know how to protect them from it. I think states are indeed responsible for setting sensible boundaries for the industry to protect minors.

But this cannot mean subjecting the whole net to it. Age verification is also not possible, so a protected environment is the way to go. The latter is difficult to advertise to developers, because they also know about corporate ambitions to grab market share.

Google isn't even the worst actor; more aggressive corps like Amazon are far more destructive in this field. But no single corp is guilty here, so legislation also needs to protect free spaces. While seemingly a contradiction, this is also extremely important for the digital education of future generations, even more so than questionable content, in my opinion. Most here might have been exposed to such content as kids. Was it as bad as generally assumed? This is a threat that should not be overblown. Parents feeling guilty about neglecting their kids are extremely vulnerable to this line of thinking, even if they don't neglect their kids at all.

Many countries have rules against cartels, but there is a conflict of interest here. No country likes to split up its most successful companies for nothing in an international market. So nobody does.

rootsudo, almost 4 years ago

https://www.missingkids.org/theissues/end-to-end-encryption

Geez.

jliptzin, almost 4 years ago

There is something you can do about it: don't use Apple products.

seph-reed, almost 4 years ago

I really don't see why the scanning would ever be done on the phone instead of in iCloud if it only affects iCloud images.

But I do have guesses why.

8bitsrule, almost 4 years ago

"I don't care what anything was designed to do. I care about what it can do!" (Gene Kranz in 'Apollo 13')

1vuio0pswjnm7, almost 4 years ago

"The hypothesis that I have is that Apple wishes to distance itself from checking users' data. *They've been fighting with the FBI and the federal government for years*, they've been struggling with not reporting CSAM content to the NCMEC, they don't want to be involved in any of this anymore."

However, there is close to zero evidence to support this idea. I was just reading something the other day that directly contradicted it; it suggested the relationship has been excellent save for a single, well-publicised dispute over unlocking an iPhone. In other words, the publicly aired dispute was an anomaly, not representative of the underlying relationship.

What's more, unless the pontificator works for Apple or the government, she is not in a good position to summarise the relationship. Plainly put, it is not public information.

What does such baseless speculation achieve? Is it like spreading a meme? I don't get it.

"The worst part is: how do I put my money where my mouth is? Am I going back to using Linux on the desktop (2022 will be the year of Linux on the desktop, remember), debugging wifi drivers and tirelessly trying to make resume-from-suspend work? Am I getting a Pixel and putting GrapheneOS on it like a total nerd? FUCK."

Is having a computer with closed-source wifi drivers and proper ACPI support more important than having a computer with an open OS that does not include an intentional backdoor?

Maybe the problem is not how to put your money where your mouth is; it's how to put your mouth where your money is. What does GrapheneOS cost? Maybe this is not about money.

Options like GrapheneOS, even the mere idea of GrapheneOS, i.e., that there can be alternatives to BigTech's offerings, get buried underneath Apple marketing. Much of that marketing Apple gets for free. It comes from people who do not work for Apple.

Bloggers and others who discuss computers can help change that. They can also help Apple sail through any criticism (and they do).

Accacin, almost 4 years ago

Eh, I completely agree that this is a step too far, but the solution is so simple: stop using Apple devices. Luckily I switched from iOS to CalyxOS when my iPhone 7 broke earlier this year. Honestly, it wasn't so bad.

spoonjim, almost 4 years ago

Any idea why Apple didn't just implement server-side scanning like everyone else?

citizenpaul, almost 4 years ago

This system has zero transparency. I cannot appeal. I cannot check my status. I cannot check this mysterious "counter". I cannot even check an image I have to see if it is flagged. I cannot know who is manually looking at my data, no matter how private it is.

I get a note in my fruit delivery with the name of the person and the time they were at work packing my food. Yet I'm told that I cannot know anything about what is happening with an at least partially automated system that can potentially put me in jail for the next 20 years?

FabHK, almost 4 years ago

Question:

Would Apple report CSAM matches worldwide to one specific US NGO? That's a bit weird, but OK. Presumably they know which national government agencies to contact.

Opinion:

If Apple can make it so that

a) the list of CSAM hashes is globally the same, independent of the region (ideally verifiably so!), and

b) all the reports go only to that specific US NGO (which presumably doesn't care about pictures of Winnie the Pooh or adult gay sex or dissident pamphlets),

then a lot of the potential for political abuse vanishes.

deeblering4, almost 4 years ago

What would prevent someone from, for instance, printing off an illegal photo, "borrowing" a disliked co-worker's iCloud-enabled phone, and snapping a picture of the illegal picture with their camera?

On iOS the camera can be accessed before unlocking the phone, and wouldn't this effectively put illegal images in the target's possession without their knowledge?

squarefoot, almost 4 years ago

This whole mess brought back a memory from when I was 4 or 5 years old (so probably 1971). During a summer vacation we were walking at a harbor in Tuscany with my parents when I suddenly told them I had to take a dump. The problem was that there was no bathroom nearby; well, there probably was, since the place was filled with restaurants, but we were like a hundred meters from the nearest one, which was incompatible with the sudden need of a child like I was. So my parents quickly found an area with vegetation behind a building, helped me remove my clothes, and sat me down, waiting for me to unload all that stuff. Then my father saw me making an expression they later described as priceless, so he quickly shouted at me to wait, grabbed his Nikon, and shot a photo of me. That photo won a prize later that year.

Now imagine the same thing happening today, with my dad shooting the photo on his iPhone, only to trigger a CSAM alert somewhere and probably be investigated for child abuse. Just no, thanks. Screw you, Apple, and all those who pull your strings into creating this farce.

ashneo76, almost 4 years ago

We need more people using Linux on the desktop and pouring money into Linux phones to make these alternatives viable.

Voting with your wallet matters and works. It is why Apple and Google still do so much marketing and hype around their phones and devices.

shmerl, almost 4 years ago

This ad seems fitting in the context: https://www.youtube.com/watch?v=tdVzboF2E2Q

xg15, almost 4 years ago

> *In the world of computer security this technology has a name, it's called "a backdoor." A well-documented and well-intended backdoor, but still a backdoor. Installed and enabled by default on millions of devices around the world.*

Sorry, but that backdoor has already existed for a long time. It exists in every IoT gadget, smart car, smart speaker, smart home, and other connected device that phones home to its vendor and can receive arbitrary firmware updates. It exists for every app and every piece of desktop software that automatically updates itself in the name of "evergreen software".

This is just the first time someone is publicly making use of the backdoor.

atbpaca, almost 4 years ago

#NotUpdating to iOS 15, and also #NotUpgrading this time, until further notice.

rvz, almost 4 years ago

Yep, that's deceptive advertising on privacy, and everyone bought into it and walked straight into the reality distortion field.

Another innovative 'gotcha' by Apple. A reminder that they are not your friends.

cebert, almost 4 years ago

As much as the tech and security community has concerns and objections to this policy change on Apple's part, I'm skeptical there will be any notable impact on Apple's revenue and future sales.

pers0n, almost 4 years ago

Now if the government hates you, they can claim they found this on your phone.

atbpaca, almost 4 years ago

When are they going to add this backdoor to macOS?

balozi, almost 4 years ago

Dear tech users,

Associating with some of you has become a liability. One may be smart enough to avoid the iPhone, Alexa, et al., but what is one to do when surrounded by people who willingly expose themselves to nefarious technology?

In short, I don't want pictures of me being hoovered up along with the baby pics on your iPhone.

anko, almost 4 years ago

From the article:

> You could of course say that it's "a slippery slope" sort of argument, and that we should trust Apple that it won't use the functionality for anything else. Setting aside the absurdity of trusting a giant, for-profit corporation over a democratically-elected government,

And then later it reads:

> and has previously cancelled their plans for iCloud backup encryption under pressure from the FBI.

Isn't the FBI in place because of the democratically elected government? It seems like the for-profit organisation is trying to do the right thing, and the government is stopping it.

This is the fundamental problem with arguments based on "trust": the government seems to be doing the wrong thing.

tehjoker, almost 4 years ago

I'd like to point out that the government (and by proxy Apple; companies care even less) doesn't give a shit about children. They are advocating a policy of mass infection, they didn't give a crap about children in Flint drinking toxic water, etc. If they cared about kids, they would care a lot about things that physically hurt and kill them. This means we don't have to take their stated reasons for this at all seriously.

Apple, if you cared about children, you'd pay more than your legally owed taxes and push for improved access to education, nutrition, and free child care. They're only interested in the avenue that coincidentally and dramatically increases their surveillance powers and the powers of the government.

Weird, can't figure that one out.

EugeneOZ, almost 4 years ago

In the "Photos" app, in the bottom right corner, there is a "search" icon. When I tap it and enter "beach", I can see photos I've taken on the beach (or in the sea, near the beach).

What does that mean? My (and your) photos are already scanned and analyzed. I've heard literally zero noise about this feature; nobody was complaining (at least not loudly enough for me to notice).

So why the hell is all of this fuss being raised now? Your (and my) photos will be scanned and analyzed AGAIN. Not by humans, by algorithms. In some really rare cases they might be checked by humans, but you 100% will not have trouble with the law if your photos don't contain CSAM.

I have 2 kids, and I'm not buying the argument "oh, my library of naked photos of my child - I'm in danger". If you are uploading naked photos of your child to iCloud, it's similar to publishing them. Everything that is uploaded to the Internet will belong to the Internet, and you don't have much control over it. If, for some awkward reason, you have sets of naked photos of your child and you want to keep them, never ever send them to the Internet.

If you think that not-so-experienced users may not know about this rule, I'm pretty sure they don't even know (or care) about this "scandal". All of this FUD wave is raised by journalists and echoes on forums like this one.

phkahler, almost 4 years ago

I really don't get all the hype. This is not a backdoor, as it's called in TFA. It's not Apple "reaching into your device". It is literally checking for specific images and reporting their presence to Apple if found. It's not using AI to analyze your photos or anything like that. It's looking for specific images, and only prior to uploading them to iCloud. It won't even flag your own nasty images, because the hash won't match.

Note: the above assumes we're talking about a typical hash of data and not an image-analysis "hash" of what it thinks the content is. This is supported by the language they use.

Yes, it's a bit big-brother. But I already assume the authorities can fairly easily get ALL your iCloud data if they ask Apple the right way.

You know what's creepy AF? Having a private conversation and getting Facebook ads the next day relating to the topic. Talk about an acquaintance acting schizophrenic and get ads about medications and treatment for that? Creepy as fuck. And that was on the wife's iPhone - I have Android and didn't get that stuff, but I seem to remember similar incidents where I got ads for stuff we talked about. That's serious voice analysis, not just checking a file hash, and it happens while your phone is in your pocket.

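[Editor's note] The "Note" in this comment hinges on which kind of hash is used. A sketch of why the distinction matters: an exact cryptographic hash matches only bit-identical files, so flipping a single bit defeats it; that brittleness is why on-device matching schemes of this sort are generally described as using perceptual hashes, which tolerate small edits. The image bytes below are placeholders, not real image data.

```python
import hashlib

original = b"fake-image-bytes-for-illustration"
# Flip a single bit in the last byte: the kind of change (or a re-save,
# or recompression) that leaves an image visually identical.
tweaked = original[:-1] + bytes([original[-1] ^ 0x01])

h_orig = hashlib.sha256(original).hexdigest()
h_tweak = hashlib.sha256(tweaked).hexdigest()

# The two digests differ completely, so a blocklist of exact hashes
# would be trivially evaded, and would never produce false matches
# on merely similar photos either.
exact_hash_matches = (h_orig == h_tweak)
```

Which behavior the deployed system has (exact or perceptual matching) changes both its evadability and its false-positive surface, which is what much of this thread is arguing about.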