
The Apple Psi System [pdf]

168 points · by lavp · almost 4 years ago

26 comments

drenvuk · almost 4 years ago

This is incredibly annoying. They're providing all of this information which says that this program that runs on our devices is incredibly safe if we're not bad people. That the people at Apple and then law enforcement see nothing about our photos until a human-set threshold is hit by a human-programmed algorithm. Technically, this should be sound, since the proof has checked out with multiple cryptography experts. But none of that matters.

The Apple PSI System is spyware.

They are providing all of this info to justify putting spyware on our devices. They are attempting to put spyware on our devices to see if we can be sent to jail. That's all that matters. That is the end effect.

Apple is justifying putting SPYWARE ON ALL OF THEIR PHONES. Any discussion of the technical merits of a SPYWARE system implemented against you is missing the point. It should not exist.

eknkc · almost 4 years ago

This is far worse spyware than I initially thought.

- They have a database of file hashes.
- They can't validate its contents, by design.
- They can't explain in detail who supplies it.
- The suppliers of this data work with / close to the US government.
- They match your own files, on your own device, against this database that contains who knows what.
- When it matches a couple of times, they alert the authorities.
- There is zero fucking visibility for both Apple and you, by design, and this is a very good thing for Apple.
- I assume the source of this data can update / add new hashes over time and your device will happily comply.

And their only concern is to say that the algorithm is so perfect that they cannot see what's happening and there won't be false positives (hopefully).

You know what, I trust your ability to do it properly. No need to explain more. The problem is that the very thing you are building is fucked up by design. And obviously Apple does not address that in any way.

But think about the children…

xucheng · almost 4 years ago

While it is good that Apple sought formal security analysis of its proposed system, there are some fundamental security assumptions baked into it:

- It assumes that the server will not tamper with its dataset (i.e., the list of CSAM), so that it is OK to disclose the information once the client has enough matches. But in reality, nothing prevents a malicious server from adding arbitrary content to the list.

- It fails to consider the vulnerabilities of the perceptual hash. This includes false positives and adversarial collision attacks (https://arxiv.org/abs/2011.09473).

Another potential long-term issue is that it is unclear how long Apple will store the safety vouchers. As a storage service, Apple may store them forever. The system is based on elliptic-curve cryptography. Although that is the current state-of-the-art encryption technique, it will be broken if quantum computers become a reality. So it is possible that every encrypted safety voucher could be decrypted within the next 50 years.
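
To make the perceptual-hash concern above concrete, here is a toy "average hash" in Python (a deliberately simplified stand-in, not NeuralHash or any algorithm Apple describes) showing how structurally different inputs can collide:

```python
# Minimal "average hash" (aHash) over an 8x8 grayscale grid: one bit per
# pixel, set when the pixel is brighter than the image mean. This is a
# simplified stand-in for a real perceptual hash, used only to show why
# such hashes admit collisions by construction.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

# Two grids with different pixel values but the same above/below-mean
# pattern collapse to an identical hash: a structural collision source.
img_a = [[10 * (r + c) for c in range(8)] for r in range(8)]
img_b = [[10 * (r + c) + 3 for c in range(8)] for r in range(8)]

assert average_hash(img_a) == average_hash(img_b)
```

Real perceptual hashes are far more sophisticated, but the adversarial-collision paper linked above attacks exactly this robustness-by-design property.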

josephcsible · almost 4 years ago

> Privacy for the server: A malicious client should learn nothing about the server's dataset X ⊆ U other than its size. In particular, it is important that the client learn nothing about the intersection size |id(Y̅ ∩ X)|. Otherwise, the client can use that to extract information about X by adding test items to its list Y̅, and checking if the intersection size changes.

If one of your pictures is a false-positive hash collision, you'll have no idea until your front door gets broken down.

> Privacy for the client: Let X be the server's input from which pdata is derived. A malicious server must learn nothing about the client's Y̅ beyond the output of ftPSI-AD with respect to this set X.

Apple can't check whether a hash match is a false positive, because they only get the matching hashes and not the pictures that triggered them. So if you have a bunch of false positives, your front door is getting broken down with no opportunity for a human to realize the problem and intervene.

> The protocol need not provide correct output against a malicious client. That is, the protocol need not prevent a malicious client from causing the server to obtain an incorrect ftPSI-AD output when the protocol terminates. The reason for this is that a malicious client can always choose to hide some of its data from the PSI system in order to cause an undercount of the intersection.

Their protocol isn't (and can't be) secure against the one attack that the people this system is supposed to catch would actually commit.

> Moreover, a malicious client that attempts to cause an overcount of the intersection will be detected by mechanisms outside of the cryptographic protocol.

This seems eerie to me, but I can't put my finger on why.
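
The server-privacy requirement quoted above can be illustrated with a deliberately broken protocol: if a client can observe the intersection size at all, it can test membership in the hidden set X one probe at a time. A toy sketch (not the paper's protocol; all names and data are illustrative):

```python
# A deliberately insecure "PSI" that reveals |X ∩ Y| to the client,
# illustrating the probing attack that the paper's server-privacy
# property is designed to prevent.

def insecure_intersection_size(server_set, client_items):
    return len(server_set & set(client_items))

server_X = {"h1", "h7", "h9"}      # server's hidden hash database
client_Y = ["h2", "h7"]            # client's own item hashes

baseline = insecure_intersection_size(server_X, client_Y)
probed = insecure_intersection_size(server_X, client_Y + ["h9"])

# The count went up, so the client has learned that "h9" is in X.
assert probed == baseline + 1
```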

stefan_ · almost 4 years ago

Where is the cryptographic proof that the only thing you are scanning for is CSAM?

This is just nice window dressing.

Renaud · almost 4 years ago

I'm sure the crypto part is well implemented.

However, I predict that within the next two years, the Chinese government will force Apple to use its own database of "objectionable" content and will require that the on-device photo roll be scanned, not just iCloud (they already have access to that).

And probably not just China, because any LEA and secret service would love to have the ability to use such a system for ad-hoc searches: dear Apple, for those users, please extend the database source to this new one we maintain and enable scanning of all on-device pictures.

gjsman-1000 · almost 4 years ago

Also of potential interest besides the OP link, there is the paper "A Concrete-Security Analysis of the Apple PSI Protocol", also known as the "Alternative Security Proof of the Apple PSI System":

https://www.apple.com/child-safety/pdf/Alternative_Security_Proof_of_Apple_PSI_System_Mihir_Bellare.pdf

Basically, it's a second opinion on the mathematics from a different perspective. The original post link is the formal proof by Apple employees and Stanford; the alternative proof is from the University of California.

doorknobs · almost 4 years ago

This document is a deflection from the main concern: while it's commendable that they took the effort to prove the cryptographic properties of their system, what good is that when your hash database is a government-controlled black box? Can't exactly publish a white paper on that, huh?

supernes · almost 4 years ago

The crypto seems sound, but a lot of the tricky questions are waved aside with the blanket statement of "mechanisms outside of the cryptographic protocol."

The part that worries me the most is that no one outside of Apple can verify that the hash set they're pushing hasn't been tampered with (Section 4, Remark 5). This allows them, for example, to add leaked product-image hashes to hunt down and prosecute people who share info about their products before release.

In fact, the system seems designed to be impossible to audit, with only a subset of the whole hash set being pushed to clients, so that researchers can't even tell when more hashes have been added. As a consequence of that design, they acknowledge that a "small number" of false negatives will occur, and justify that with an argument that it improves performance (Section 2, Remark 3).

False positives, on the other hand, will be common (as detailed in Section 5, "Duplicate images"): simply copying a file onto two client devices that don't share a cloud owner ID will count toward the threshold, and again fall back on "mechanisms outside of the cryptographic protocol."

And last but not least, let's spare a thought for the Apple employees who will be required to sift through potentially traumatizing imagery (assuming the company doesn't outsource that to a third party).
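
The duplicate-image false positive described here is easy to see in a toy threshold counter (the threshold value and all names below are illustrative, not Apple's):

```python
# Naive threshold check over uploaded image hashes: every matching
# voucher advances the counter, so the same image copied onto two
# devices counts twice. Threshold and names are illustrative only.

THRESHOLD = 3

def crosses_threshold(uploaded_hashes, database):
    matches = [h for h in uploaded_hashes if h in database]
    return len(matches) >= THRESHOLD

db = {"hashA", "hashB"}
# "hashA" appears twice (duplicated across uploads), plus one more
# match: a single duplicated collision already pushes the count to 3.
uploads = ["hashA", "hashA", "hashB", "hashC"]

assert crosses_threshold(uploads, db)
```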

atoav · almost 4 years ago

You don't own your Apple devices anyway. It is a happy walled garden for those who don't care too much, those who have lost all hope, and those who can lie to themselves very well.

Not that I am an open-source extremist, but the moment we could no longer control how our own machines run was the moment they stopped belonging to us. Supporting a true open-source phone OS might be a good idea even if you don't use it. Because one day you might have to.

kfprt · almost 4 years ago

Police departments are a business. They are incentivized to secure the most convictions for the least amount of work. Finding people who possess CSAM files is easy and gets convictions with relatively little work. It does not, however, do much to deter child abuse. Police departments and CPS routinely ignore calls for investigations into alleged acts of CSA. Take the Sophie Long case, for instance. A question for the reader: is it more important to spend resources stopping CSAM or CSA?

Is it really worth giving up our fundamental privacy rights when the police already routinely ignore CSA?

madmod · almost 4 years ago

Yeah, what happens when someone abuses this for easy "untraceable" swatting? What if the next iOS malware uploaded some CSAM hashes to iCloud unless you pay 5 BTC in the next 24 hours?

doe88 · almost 4 years ago

All other flaws aside, the fact that the hashes are not *auditable* by users on their own computers is deeply wrong and undermines trust in the algorithm, however sound it may be. Moreover, this *blackbox* gives Apple plausible deniability if something turns ugly; they will say: *we're just the middleman, we only relay the officially approved database, we can't vet it*.

unstatusthequo · almost 4 years ago

Time to notify the attorneys general. I can tell you personally that plaintiff firms are already gearing up to launch class actions against Apple over this. If you oppose it, notify your AG as well.

Link: https://www.naag.org/find-my-ag/

ahel · almost 4 years ago

Hashes can be of any file.

How long will it take before your hard drives are scanned for matching hashes of copyrighted material?

ljhsiung · almost 4 years ago

Thanks for the read. Pretty cool. The following is my layman's understanding, mostly because typing things out helps me understand them. Please criticize if I miss stuff.

Noteworthy is Dan Boneh's contribution to this. Given his reputation in the crypto community, it seems he really does believe in it, despite all the recent controversy.

They discuss and build upon several security properties, but the most contentious point (privacy/leakage/scanning of your photos) is addressed as follows.

First, some background: the private set intersection (PSI) technique [1] in general permits two sets to be compared with both parties learning *only* the intersection. In a nutshell, Apple uses this concept such that, if the number of intersecting elements exceeds a threshold, they're notified.

There are several modifications (Shamir's secret sharing, cuckoo tables, PKI-ish schemes) to create what Apple calls an ftPSI-AD protocol, optimizing for the desired properties: performance, integrity, and most notably to me, *zero* false passes. That is, false accusations of innocent people are minimized at the cost of some real child-pornographic images slipping by.

A couple of noteworthy things still raise red flags:

1) They prove that, for honest servers <--> malicious clients, and vice versa, privacy is not violated for either party, but this treats the phone as the client and Apple as the server. I'd argue that the phone is actually "Mallory", and *you* are the client. You might be honest, but how do you trust Apple + the phone short of reverse engineering it? This is the biggest hole to me, and so I don't fully understand this proof (or perhaps I have the parties mixed up).

2) Several things are handwaved and/or left "variable to implementation". E.g. Section 5, on "near-duplicate images" that may count twice toward this threshold:

>> Several solutions to this were considered, but ultimately, this issue is addressed by a mechanism outside of the cryptographic protocol

What the?? Hello?? Perhaps this is addressed in another whitepaper, given that this is a theory/protocol-heavy paper, but it does not instill confidence.

Or take this bit from Remark 3:

>> If needed, these false negatives can be eliminated with a tweak to the data structure used

Uh, I thought not sending innocent people to jail was a pretty critical property. You're telling me the server/Apple, who *controls* the cuckoo table, can just change this on a whim? How would I hold them responsible / be notified of this?

These "variations" are remarked on several times in the paper. Again, not exactly confidence-building.

Overall, while I really applaud this effort, and I'm not as outraged as I initially was, I'm only slightly less so, and I have a handful more questions than before.

Again, please correct me if my annoyance is misguided, given these technical details.

[1]: https://en.wikipedia.org/wiki/Private_set_intersection
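
The threshold property this comment describes rests on Shamir's secret sharing, which is easy to sketch: below the threshold t, the shares reveal essentially nothing about the secret; once t shares are available, it reconstructs exactly. The field size and the t/n parameters below are illustrative, not Apple's:

```python
# Sketch of Shamir t-of-n secret sharing over a prime field, the
# building block behind the ftPSI-AD threshold: the server can only
# reconstruct the decryption secret once t matching vouchers arrive.
import random

P = 2**61 - 1  # prime field modulus (illustrative choice)

def make_shares(secret, t, n):
    # Random degree-(t-1) polynomial with the secret as constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * -xm % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, P - 2, P)) % P
    return total

secret = 123456789
shares = make_shares(secret, t=3, n=5)

assert reconstruct(shares[:3]) == secret   # threshold met: recoverable
assert reconstruct(shares[:2]) != secret   # below threshold (w.h.p.)
```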

cgio · almost 4 years ago

What are the storage and processing requirements for this functionality, and how will they evolve over time? Can it be loaded onto my device only if I use the relevant services? Beyond its impact on privacy, it also has an impact on property, though I know that with Apple that ship sailed a long time ago. My last Apple purchase was a couple of days before I heard about this, and it will be my last. Not that they'll care, but I do, as I only accepted their walled garden in exchange for privacy, however naive that proves me to be…

renonce · almost 4 years ago

Even assuming the protocol is sound, Apple could choose a set of innocent photos that are frequently found on people's devices, in addition to whatever photos they deem unlawful. Since the thresholds are not shown to the user, they can be set arbitrarily low. The client will never know which photos were used to determine they were guilty.

It's still an interesting read from a cryptography point of view, though.

dmitrygr · almost 4 years ago
Assuming perfectly trustworthy governments and perfectly flawless programmers, it is all good. Now where do I get me some of those in this imperfect world?

nojito · almost 4 years ago

This is absolutely earth-shattering work in the world of differential privacy. Kudos to this research team for spending the time to work it out and publish.

Just wow.

naasking · almost 4 years ago

Others have done a good job addressing the political and social issues with this sort of endeavour, but I'm puzzled by something else. Isn't this just trivially bypassed? Changing a few bits in the image won't be noticed by the human eye, but the hash will be wildly different.
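
For a cryptographic hash the premise holds, as the quick check below shows: one flipped input bit changes about half of the output bits. NeuralHash, however, is a perceptual hash designed to survive exactly such small edits, so the real question is how robust that hash is against deliberate evasion:

```python
# Avalanche effect of a cryptographic hash: flipping a single input bit
# flips roughly half of SHA-256's 256 output bits. Perceptual hashes
# like NeuralHash deliberately trade this property for robustness to
# small edits, which is why bit-flipping alone does not defeat them.
import hashlib

data = bytearray(b"pretend this is image data" * 100)
h1 = hashlib.sha256(bytes(data)).digest()

data[0] ^= 0x01                      # flip one bit
h2 = hashlib.sha256(bytes(data)).digest()

differing = sum(bin(a ^ b).count("1") for a, b in zip(h1, h2))
assert 80 < differing < 176          # ~128 of 256 bits expected
```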

sydthrowaway · almost 4 years ago

How does one learn enough crypto to understand this paper?

gjsman-1000 · almost 4 years ago

Topic should be renamed "the Apple PSI System" or "Apple Private Set Intersection System" (not "Psi", or something related to tire pressure).

elisharobinson · almost 4 years ago
show me the code or STFU

markrobin · almost 4 years ago

Need enough crypto to understand this.

darkhorn · almost 4 years ago
Can they scan for face recognition? For specific faces?