Having read this sorry bag of words to the end, I now see that their definition of Apple winning is the most cynical one possible: they are talking only about the PR win Apple gets from attempting to defend what really matters, the privacy rights of its customers. They don't seem to think it matters whether Apple actually succeeds in that attempt, nor do they seem to think Apple cares about the broader privacy concern affecting users beyond this case.<p>Their conclusion is stated in this staggeringly wrong paragraph:<p>"As mobile device forensics experts, we believe that, since Apple can create the software with confidence that it can never ever be used on another iPhone, they should reluctantly comply."<p>I doubt Apple has any such confidence. Retargeting the tool to a new phone would just be a matter of changing some configuration data, such as keys and device IDs.<p>Then, later in the article, in a display of cognitive dissonance, they do a 180 and acknowledge that if Apple creates a PIN-cracking tool, it will likely be used beyond this one case.<p>And look at the chasm in their logic as they step through the implications of potential abuse of the tool:<p>"...you cannot make a backdoor for just law enforcement; the backdoor would possibly be accessible to anyone with enough skills and knowledge. So if these requests begin to multiply to the point where they are routine, or assumed in every case involving a phone, Apple and its brethren will then need to make a stand."<p>See what they did?<p>They acknowledged that the backdoor could be used by anyone, but then they pretend that all the people interested in using the tool would be submitting "requests," as if that is how it would work.<p>Yes, in a dream scenario where things do not leak, that could be the case. But these are supposedly security researchers writing this article.
They should know that we don't live in a dream world.<p>But the authors, who have ties to vaguely specified US Government military entities including the US Army (the branch of the military that hosts, at its Fort Meade installation, the NSA), want Apple to play along and create a tool that could very well leak. They think everything will be OK because Apple can always make a stand after the fact:<p>"...if these requests begin to multiply to the point where they are routine, or assumed in every case involving a phone, Apple and its brethren will then need to make a stand."<p>By then it is too late.<p>If such a tool were to leak, which is entirely possible, then requests, and responses to those requests, would be moot. Criminals and rogue actors with access inside governments would not submit requests; they would just use the tool.<p>The typical response to "what if this leaks" from people of this caliber is "then make sure the tool doesn't leak!" But Apple realizes the stakes are too high and the risk too great, so it has chosen the more effective path: not creating such a tool in the first place.
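To make the "confidence" point concrete, here is a purely hypothetical sketch (not Apple's actual code, and all the identifiers are made up): any "lock" that binds a forensic tool to a single device ultimately reduces to comparing hardware identifiers against values stored in configuration. The lock is data, not a structural property of the software, which is why retargeting it is an edit, not an engineering effort.

```python
# Hypothetical sketch only: how a forensic tool might be "locked" to one phone.
# The binding is just runtime data (IDs/keys) compared against the device.
# Retargeting the tool means editing TARGET_CONFIG, nothing more.

TARGET_CONFIG = {
    "chip_id": "0x00DEADBEEF",   # made-up unique chip ID of the one permitted phone
    "serial": "C39XXXXXXXXX",    # made-up serial number
}

def device_matches(device: dict, config: dict) -> bool:
    """Return True only if the connected device matches every configured field."""
    return all(device.get(key) == value for key, value in config.items())

def run_unlock(device: dict) -> str:
    """Refuse to run unless the device is the configured target."""
    if not device_matches(device, TARGET_CONFIG):
        return "refused: not the authorized device"
    return "unlock routine would run here"

# The claimed confidence that the tool "can never ever be used on another
# iPhone" rests entirely on nobody ever changing two strings in TARGET_CONFIG.
```

The sketch deliberately omits anything iOS-specific; the argument holds for any tool whose device binding lives in configuration rather than in what the tool can fundamentally do.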