TL;DR: "they're asking the public to grant them significant new powers that could put all of our communications infrastructure at risk, and to trust them to not misuse these powers. But they're deliberately misleading the public (and the judiciary) to try to gain these powers. This is not how a trustworthy agency operates. We should not be fooled."
I have to say, whoever at the FBI decided this was the right case to push their new doctrine could have done his/her homework a bit better. Technically speaking, this is the last iPhone you can actually crack *without* assistance from Apple. They are making it harder for themselves. They only have to wait for another major incident, retrieve (or plant, why not) an iPhone 6 from the scene, and do it again, this time *for real*.

Unless they are trying to pre-empt something else (like the recently touted shift to "devices even we can't access" from Tim Cook, which may or may not be simple advertising), they just picked the wrong time to stir this particular pot.
Heck, the FBI could also disable writes to the chip, or simply interpose some logic that pretends to write but actually doesn't (a non-write-through cache :-) ).

That is, if the secrets in question are on that NAND chip.
Interesting technique, but it doesn't remove the long interval between permitted passcode attempts - an equally important problem for brute-forcing.

So the FBI would most likely still require Apple's assistance in this.
Apple said they could sync the data if the AppleID password wasn't changed. Can Apple just revert the AppleID account on their servers to a backup with the old password hash (or however it is stored)? Why wouldn't this work? Has something on the phone changed because of the password change, or is Apple unwilling or unable to revert the AppleID account?
It seems like there are several articles and security experts out there explaining how to recover data from a locked iPhone as if it were a cakewalk, but where is one example of a complete soup-to-nuts case study on unlocking the same model phone as the San Bernardino shooter's?

If you want the American public to believe the FBI is making fraudulent claims, show demonstrable proof that it can actually be done instead of all the talk and theories.
The device is evidence, so all of you saying they can just start desoldering things and such need to think about that. What is the first thing a defense attorney would say if the data were to be used in a criminal trial? That's right: "the FBI replaced the memory chip on the phone with one they wrote their own copy of the data to." And that's only after they potentially do permanent damage to the device and the data.
I think this makes the FBI look dumb, but I don't think this really helps them either.

If the NSA did this for espionage it's one thing, but I'm curious as to whether substantially modifying the iPhone in this way would stand up in court... How would the police assert that they preserved evidence after doing this?

I was involved in a drawn-out case that challenged the validity of data recovered from backup at great length. That was easy to assert with normal IT people, and yet it took weeks to litigate. Couldn't imagine how this would go.
I wonder if the FBI has checked for any ways to circumvent the passcode screen using software bugs.

Edit: Not sure why I got downvoted. I can currently circumvent my keyboard passcode with a number of steps, and I'm on iOS 9. Steps to try for yourself:

Edit: OK, I've been tricked. The steps below are unnecessary, as the first step actually unlocks your iPhone in the background. ¯\_(ツ)_/¯ The fact remains, though, that these bugs have existed in the past and may exist on the device the FBI wants to unlock.

1. Invoke Siri: "What time is it?"
2. Press the time/clock that is shown.
3. Tap the + icon.
4. Type some arbitrarily long string into the search box. Highlight that text and copy it.
5. Tap on the search box. There should be a share option if your device is capable. Tap the share option.
6. Share to Messages.
7. Press the home button.

Congrats, you're more effective than the FBI.
This reminds me of the Republican congressman from California, Issa, telling the FBI in very technical terms (inserting in between that he could be completely wrong) the exact same thing mentioned in this article. I'm unsure if the author was inspired by congressman Issa or came to it of his own accord.

Moreover, what's more fascinating is that some people may say it's privacy vs. security and the fight against terror. But what has emerged over the last few weeks is multiple reasons why the FBI should not win in court, regardless of your perspective on terror. It's been very clear from day 1 that the intentions of the FBI are vicious and disingenuous, and with every passing day, more people are finding out.
Seems like a pretty articulate explanation of what is going on here. Of course I realize that my confirmation bias will cause me to see articles more in line with my way of thinking as 'right', but I've also worked with NAND flash devices and believe that the chip[1] they use in the phone does not have any sort of protections on the NAND flash itself; you should be able to just drop it into a test fixture and read it out.

[1] http://toshiba.semicon-storage.com/info/docget.jsp?did=15002&prodName=TH58NVG4S0FTA20
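If the chip really is unprotected, "reading it out" just produces a raw dump of data pages plus their spare/OOB areas. A minimal sketch of slicing such a dump, with an assumed page geometry (illustrative figures, not taken from the Toshiba datasheet):

```python
# Assumed geometry for illustration only; real parts vary.
DATA_SIZE = 4096      # assumed data bytes per page
SPARE_SIZE = 224      # assumed spare/OOB bytes per page (ECC, metadata)
PAGE_SIZE = DATA_SIZE + SPARE_SIZE

def split_pages(dump_path: str):
    """Yield (page_index, data, spare) tuples from a raw NAND dump file."""
    with open(dump_path, "rb") as f:
        index = 0
        while True:
            raw = f.read(PAGE_SIZE)
            if len(raw) < PAGE_SIZE:
                break
            yield index, raw[:DATA_SIZE], raw[DATA_SIZE:]
            index += 1

# Example: count pages still in the erased (all 0xFF) state.
# erased = sum(1 for _, data, _ in split_pages("nand_dump.bin")
#              if data == b"\xff" * DATA_SIZE)
```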
Anyone else a little surprised that Apple's security feature here is so easy to sidestep? I'd have thought, at the very least, that any such keys were stored in the main processor without external read/write capabilities.
From the sound of various blogs, articles, etc., it sounds like the FBI doesn't have anyone with technical expertise in this area (or if they do, those persons are being kept buried). While the court case is important to the FBI (and very wrong for the public), the technical details of breaking into an iPhone should not have been an issue for them.

I'm starting to think no one is driving the clown car in their technical division.
> If it turns out that the auto-erase feature is on, and the Effaceable Storage gets erased, they can remove the chip, copy the original information back in, and replace it.

Sounds like a better hack would be to interpose the flash memory interface with a RAM cache that simulates writes without modifying the original flash data. Then they can hammer away at brute forcing it without the delay of reburning the flash.
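A minimal sketch of that interposer idea, purely hypothetical: reads are served from the pristine dump, while writes and erases land in a RAM overlay that can be discarded at any time. The class name and the page/block geometry are assumptions for illustration, not the 5c's actual layout.

```python
PAGE_SIZE = 4096  # assumed page size

class WriteShieldedNAND:
    """Shim that sits between the device and its flash image."""

    def __init__(self, original_image: bytes):
        self.original = original_image   # pristine dump, never modified
        self.overlay = {}                # page_index -> shadow contents

    def read_page(self, page: int) -> bytes:
        # Serve the shadow copy if the device "wrote" this page, else the original.
        if page in self.overlay:
            return self.overlay[page]
        start = page * PAGE_SIZE
        return self.original[start:start + PAGE_SIZE]

    def program_page(self, page: int, data: bytes) -> None:
        # Acknowledge the write, but keep it only in RAM.
        self.overlay[page] = data

    def erase_block(self, first_page: int, pages_per_block: int = 64) -> None:
        # An "erase" (e.g. of the Effaceable Storage) also only hits the overlay.
        for p in range(first_page, first_page + pages_per_block):
            self.overlay[p] = b"\xff" * PAGE_SIZE

    def reset(self) -> None:
        # Discard all simulated writes, restoring the original state instantly.
        self.overlay.clear()
```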
The ACLU is not wrong; they are right in the *technical* sense.

But I very much doubt you would practically manage to remove that NAND chip and replace it very often on that umpteen-layer, ultra-thin board. Instead, remove it once and stick it in a test fixture, then try brute forcing it.
Sorry about the slight OT, but what truth is there in this statement I was presented with?

> Even if an iPhone is locked, all of that encrypted data can technically be read easily so long as the phone had at least been unlocked once since the time it was booted up.

Obviously I think it's nonsense, but I have no way of disproving it (even though the burden of proof is on the claimer, naturally).

Edit: OK, I found this (http://www.darthnull.org/2014/10/06/ios-encryption), so never mind, I guess...
This attack was already widely discussed here last week: https://news.ycombinator.com/item?id=11199093
Maybe there exist experts who can get this right every time, but there are significant risks of damaging a chip while desoldering and resoldering. It's not just removing a through-hole capacitor.
Never attribute to malice that which can be attributed to stupidity. Some engineer probably told upper management they couldn't decrypt the phone because the software would erase all the data. Maybe because they didn't know, or didn't want to, but still, this has been blown out of proportion.

To be clear, I don't think Apple should compromise the phone, just that this is not a long con by the FBI to compromise all phones.
The most frustrating part of this whole thing is the multi-headed response by various agency chieftains. The FBI says one thing. The NSA says another. Former generals say another.

Am I crazy to want the president to step up and say, "Our position as a government is X"? There's no way this has escaped his notice. Isn't that part of the job description of "leader of the free world"?
Relevant grant from the Department of Homeland Security from 2011: https://www.sbir.gov/sbirsearch/detail/361729

I'm surprised someone at a university hasn't made demonstrating this exact attack a class project.
What strikes me as odd in all these analyses is that they all assume the FBI is not expecting that weakened security will mean far more crime that is difficult to address, i.e. far more on their plate.
I don't know why this case is getting so much attention when it's readily apparent that the FBI could just get everything off the phone with a Cellebrite and call it a day.
> Why the FBI can easily work around “auto-erase”

If it's so easy, then the ACLU should have no problem demonstrating it with an actual iPhone 5c.
This is really annoying. I wrote a blog post last week making this exact same point, posted it here, and it promptly got flagged to death, most likely by the same people who were commenting that I was "absolutely, totally wrong".

https://news.ycombinator.com/item?id=11199093

Nice to be vindicated, though.
This article seems wrong to me. I don't know a ton about the iPhone's specific implementation. That said, I was under the impression that these systems all worked similarly to a PC's TPM. Essentially, the encryption key is stored in a chip that acts as a black box. That chip is manufactured in a way that makes it extremely difficult to extract data from. You can't simply copy it. You'd have to take it apart, inspect it with a microscope, and hope you don't destroy the data in the process.

The OS should set the security level initially. The TPM would enforce it. You can't modify the OS to make an attempt without it counting against the initially configured limit.

https://en.wikipedia.org/wiki/Trusted_Platform_Module
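A toy model of that assumption, for illustration only: if both the key and the retry counter live inside a sealed chip, replacing or reflashing external storage cannot reset the count. (The iPhone 5c has no such secure element for this, which is exactly why the ACLU argues the external-flash attack works; the sketch below just models the black-box behavior this comment describes, with made-up parameters.)

```python
import hashlib
import hmac
import os
from typing import Optional

class ToySecureElement:
    """Key and failure counter are internal state that never leaves the chip."""

    MAX_ATTEMPTS = 10  # assumed limit, configured once at setup

    def __init__(self, passcode: str):
        self._salt = os.urandom(16)
        self._digest = hashlib.pbkdf2_hmac(
            "sha256", passcode.encode(), self._salt, 100_000)
        self._wrapped_key = os.urandom(32)   # stands in for the data-protection key
        self._failures = 0

    def unseal(self, guess: str) -> Optional[bytes]:
        if self._failures >= self.MAX_ATTEMPTS:
            self._wrapped_key = b""           # key destroyed; nothing left to brute force
            return None
        candidate = hashlib.pbkdf2_hmac(
            "sha256", guess.encode(), self._salt, 100_000)
        if hmac.compare_digest(candidate, self._digest):
            self._failures = 0
            return self._wrapped_key
        self._failures += 1                   # external storage never sees this counter
        return None
```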
With roughly 14.8 million combinations (62^4) just in a 4-character alphanumeric (upper/lower/digits) password, I would think they would start to encounter flash reliability issues re-writing this "Effaceable Storage" long before the password could be broken.

This would also slow down their attack considerably.

I disagree that the claim is fraudulent.
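Rough numbers behind that worry, as a sketch; the 10-tries-per-erase trigger, the assumption that every restore hits the same physical blocks, and the endurance figure are generic assumptions, not specifics of this device:

```python
alphabet = 26 + 26 + 10          # upper + lower + digits
combos = alphabet ** 4           # 62**4 = 14,776,336 possible 4-char passcodes

tries_per_erase = 10             # assumed: auto-erase fires after 10 failed attempts
worst_case_rewrites = combos // tries_per_erase   # ~1.48 million restore cycles

typical_pe_cycles = 3_000        # rough, assumed NAND program/erase endurance

print(f"{combos:,} combinations")
print(f"~{worst_case_rewrites:,} worst-case rewrites of the same blocks")
print(f"vs. roughly {typical_pe_cycles:,} P/E cycles before wear-out")
```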
<i>"The FBI can simply remove this chip from the circuit board (“desolder” it), connect it to a device capable of reading and writing NAND flash, and copy all of its data. It can then replace the chip, and start testing passcodes. If it turns out that the auto-erase feature is on, and the Effaceable Storage gets erased, they can remove the chip, copy the original information back in, and replace it. If they plan to do this many times, they can attach a “test socket” to the circuit board that makes it easy and fast to do this kind of chip swapping."</i><p>Right. They <i>could</i> do this, and risk destroying the device, or they could ask Apple to do the easy, reliable thing, and just install a build on this phone that allows brute-force attacks.<p>Given that Apple has a long history of complying with these kinds of requests for valid search warrants, and that this situation is about as clear as it gets when it comes to justifiable uses of government investigatory powers, it's obvious why they're taking the latter approach, and not the former.<p>There's a legitimate privacy debate in this case, but this isn't it.<p>Edit: I'm just stating facts here, folks. Downvoting me won't change those facts, or make the government change its tactic.