
Princeton Researchers Who Built a CSAM Scanning System Urge Apple to Not Use It

314 points, by nathan_phoenix, almost 4 years ago

9 comments

atlgator, almost 4 years ago
> A foreign government could, for example, compel a service to out people sharing disfavored political speech. That's no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials.

This sums up the concern. CSAM is just the excuse, because who would come out against protecting children, right? But this will absolutely be used for political purposes.
the_snooze, almost 4 years ago
Here's the link to those researchers' words directly: https://www.washingtonpost.com/opinions/2021/08/19/apple-csam-abuse-encryption-security-privacy-dangerous/
ummonk, almost 4 years ago
Nothing about those concerns seems specific to the end-to-end-encryption-compatible CSAM system they or Apple built...

Honestly, if I were Apple I'd consider just scrapping the whole thing and doing server-side CSAM scanning on iCloud Photos without rolling out E2E encryption for iCloud Photos. It's just not worth the PR blowback.
arpstick, almost 4 years ago
you know, one surefire way to signal to apple that deployment of such technology is detrimental to their bottom line is to just not buy apple products anymore.
riedel, almost 4 years ago
The losses from having such a backdoor are big, but I am wondering about the gains. As far as I understand, the police and the judicial system are often overwhelmed with evidence already. Wouldn't it be much better to pour money into the police for just this cause? My understanding is that they are largely underfunded and understaffed. It would seem much better to me for human intelligence to break into sharing rings rather than to prosecute a few more pedophiles. I would expect that they have enough leads from past cases...

I think just paying more taxes would be the safest interface between big tech and governments. The problem seems to me that we are losing trust in our law enforcement and cannot find appropriate ways to empower it without involving big tech, which has no problem cooperating with foreign law enforcement agencies. The whole situation is so awkward: Apple needs to protect us from the very law enforcement it wants to support. Something fundamental seems broken, and research should actively look for solutions here.
zepto, almost 4 years ago
Here’s their paper: https://www.usenix.org/system/files/sec21-kulshrestha.pdf

Their system is vulnerable in all the ways they claim.

However, Apple’s system is not the same and does contain mitigations.

> Apple’s muted response about possible misuse is especially puzzling because it’s a high-profile flip-flop.

This is a dishonest statement. Apple has not been muted about the concerns these researchers are presenting.

They address them here: https://www.apple.com/child-safety/pdf/Security_Threat_Model_Review_of_Apple_Child_Safety_Features.pdf

There is nothing in this piece that relates to Apple’s actual technology.

These researchers obviously have the background to review what Apple has said and identify flaws, but they have not done so here.
shadilay, almost 4 years ago
There should be a prize for the person who comes up with a name for the CSAM version of swatting.
1cvmask, almost 4 years ago
I am reposting a comment on the original article, which appeared in the Washington Post, because of the slippery-slope dangers (https://en.wikipedia.org/wiki/Slippery_slope):

In a previous comment on this very same subject of Apple's attempt to flag CSAM, I wrote: this invasive capability at the device level is a massive intrusion on everyone's privacy, and there will be no limit on governments expanding its reach once it is implemented. The scope will always broaden.

Well, in the article they correctly point out how the scope of scanning by governments around the world is already broad, and is a violation of privacy through content matching of political speech and other forms of censorship and government tracking. We already have that now on the big tech platforms like Twitter, which censor or shadow-ban content that they, as the arbiters of truth (or "truthiness," as Colbert used to say on the old Colbert Report), egged on by the politicians and big corporate media, label as misinformation or disinformation. Do we now need to be prevented from communicating our thoughts and punished for spreading truths or non-truths, especially given the false positives, malware injections, and remote device takeovers and hijackings by the Orwellian Big Tech oligopolies? Power corrupts absolutely, and this is too much power in the hands of big corporations and governments.

From the article, in case you need the lowdown:

> Our system could be easily repurposed for surveillance and censorship. The design wasn't restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser. A foreign government could, for example, compel a service to out people sharing disfavored political speech. That's no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials.

> We spotted other shortcomings. The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny. We were so disturbed that we took a step we hadn't seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides. We'd planned to discuss paths forward at an academic conference this month.

> That dialogue never happened. The week before our presentation, Apple announced it would deploy its nearly identical system on iCloud Photos, which exists on more than 1.5 billion devices. Apple's motivation, like ours, was to protect children. And its system was technically more efficient and capable than ours. But we were baffled to see that Apple had few answers for the hard questions we'd surfaced.

> China is Apple's second-largest market, with probably hundreds of millions of devices. What stops the Chinese government from demanding Apple scan those devices for pro-democracy materials? Absolutely nothing, except Apple's solemn promise. This is the same Apple that blocked Chinese citizens from apps that allow access to censored material, that acceded to China's demand to store user data in state-owned data centers and whose chief executive infamously declared, "We follow the law wherever we do business."

> Apple's muted response about possible misuse is especially puzzling because it's a high-profile flip-flop. After the 2015 terrorist attack in San Bernardino, Calif., the Justice Department tried to compel Apple to facilitate access to a perpetrator's encrypted iPhone. Apple refused, swearing in court filings that if it were to build such a capability once, all bets were off about how that capability might be used in future.
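The op-ed's core claim — "a service could simply swap in any content-matching database" — follows from the shape of the pipeline itself. A minimal, hypothetical sketch (not the researchers' or Apple's actual design, which use perceptual hashes such as NeuralHash that survive re-encoding; an exact SHA-256 digest is used here only to keep the sketch self-contained):

```python
import hashlib

def build_database(known_items: list[bytes]) -> set[str]:
    """Build a match database from known content. Nothing in the code
    constrains what 'known content' means -- CSAM today, or any other
    disfavored material tomorrow."""
    return {hashlib.sha256(item).hexdigest() for item in known_items}

def scan(content: bytes, database: set[str]) -> bool:
    """Flag content whose digest appears in the database. The scanner is
    oblivious to the database's purpose; swapping the database swaps the
    policy, and the user is none the wiser."""
    return hashlib.sha256(content).hexdigest() in database

db = build_database([b"known-image-bytes"])
print(scan(b"known-image-bytes", db))  # True: flagged
print(scan(b"unrelated-photo", db))    # False: not flagged
```

The matching step never inspects the database's provenance, which is exactly why the researchers argue the design is repurposable for surveillance.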
zepto, almost 4 years ago
Without access to a technical description of what they built, we have no idea whether it is relevant.