What's really disappointing is that there seems to be an all-or-nothing security model here. If I pair my phone with a computer, then suddenly it has complete access to spy on me, install monitoring tools that can continue to run, and so on. Why can't there be a way for me to transfer music and photos to and from my phone without granting this full device access?

You'd be pretty annoyed if the front door to your house, when you opened it, also opened your document safe, emptied your wallet onto the floor, and invited visitors to leave bugging devices to spy on you later.

Also, the defence of "just don't agree to pair your phone with an unknown USB device" can actually be tricky to follow. On a flight, I plugged my phone into the USB port on the seatback to charge it. The phone repeatedly kept asking if I wanted to pair with something (who knows what it was? the entertainment system, maybe?). If I had accidentally hit the wrong button just once, on a prompt that appeared at random, my phone could have been owned, and there's no easy way to un-pair.
His work on iOS security is quite interesting, but he seems determined to spin everything for maximum publicity rather than, well, accuracy or truth, which is a shame. For example, in that blog post he writes about pcapd and developers:

> "Lets start with pcapd; I mentioned in my talk that pcapd has many legitimate uses such as these"

Yet in the slides for his talk[1], under theories, he writes:

> "Maybe for Developers for Debugging? No."

There are many examples of things like this in his writing, where actual facts are left unsaid in order to extract the maximum melodrama from a particular statement.

On top of that, he seems to continually avoid the point that to enable these services you need physical access to the device (for the pairing process to mark a machine as trusted). If you have physical access, enabling debug[2] features is probably the least of your worries.

Anyway, rant over. It just annoys me that genuinely interesting information so often seems to be spun by personalities to give it artificial gloss these days, making it all feel a bit slimy and self-serving.

[1] https://pentest.com/ios_backdoors_attack_points_surveillance_mechanisms.pdf

[2] Debug if you're Apple, back doors if you're Mr. Zdziarski
I'm a little conflicted about this. On one hand it's good to learn about the security of your device; on the other hand he's far too partial and sensationalist about these iOS features. Yes, features.

• It's good to know packet capture can be remotely enabled on your device using data collected from a computer the device has trusted.

• It's good to know Apple has the power to look through your encrypted files given physical access (file relay).

• It's good to know one can extract files from one's phone using a trusted computer (house arrest).

However, that's it. There's no "back door". There's no (implied or otherwise) NSA conspiracy. There's a reason the media "misunderstood" his talk: it was full of hyperbole. (For the curious, a rough sketch of how a trusted host starts one of these services follows.)
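To make the mechanics concrete: these are ordinary lockdownd services, and any paired (trusted) host can start them with the open-source libimobiledevice library. This is a minimal sketch, assuming libimobiledevice is installed and this host already holds a valid pairing record; the "pcap-demo" label is an arbitrary client name and most error handling is trimmed:

    #include <stdio.h>
    #include <libimobiledevice/libimobiledevice.h>
    #include <libimobiledevice/lockdown.h>

    int main(void)
    {
        idevice_t device = NULL;
        lockdownd_client_t client = NULL;
        lockdownd_service_descriptor_t service = NULL;

        /* Attach to the first USB-connected device. */
        if (idevice_new(&device, NULL) != IDEVICE_E_SUCCESS) {
            fprintf(stderr, "no device found\n");
            return 1;
        }

        /* The handshake only succeeds if this host holds a pairing
           record, i.e. the user tapped "Trust" at some point. */
        if (lockdownd_client_new_with_handshake(device, &client,
                "pcap-demo") != LOCKDOWN_E_SUCCESS) {
            fprintf(stderr, "not paired with this device\n");
            idevice_free(device);
            return 1;
        }

        /* Ask lockdownd to spin up the packet-capture service. */
        if (lockdownd_start_service(client, "com.apple.pcapd",
                &service) == LOCKDOWN_E_SUCCESS) {
            printf("pcapd started on port %d\n", service->port);
            lockdownd_service_descriptor_free(service);
        }

        lockdownd_client_free(client);
        idevice_free(device);
        return 0;
    }

Which is exactly the point of the parent comment: there's no hidden channel here, just services gated on the trust relationship the user already granted.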
This post appears to be gone. Here's Apple's (new) documentation on the matter: http://support.apple.com/kb/HT6331?viewlocale=en_US&locale=en_US

If Apple is being truthful and transparent, calling this a "backdoor" is a bit like calling sshd a "backdoor".
> As usual, the media has completely derailed the intention of my talk.

Lol. The connotations in his presentation, and his retweeting of all the press it got, were pretty clear. Seems to me like this guy is looking for his next gig.
I'm getting 404s for this link and the root.

Cache: http://webcache.googleusercontent.com/search?q=cache:www.zdziarski.com/blog/?p=3466
So in short: Apple has back doors that they claim aren't really back doors since only Apple apps can use them. If the NSA hasn't been using them already, it is only a matter of time.
This sounds a bit like the same sort of customer-experience hacks that MSFT used to put in all their software (and maybe still does), and that caused so many security holes. Poor attention to security won't just let in US government intrusion; it'll also let in other governments and hackers. Seriously, letting a 'trusted computer' enable that data syncing? They're playing with fire. (Feel free to let me know if I'm missing anything here.)
No, what's really disappointing is FUD directed at debugging tools. These sorts of "presentations" and "research" are pointless. Anyone who does iOS development knows about these tools; they're not secret. Apple has Tech Notes and documentation in Xcode on them going back years. Let's please try to focus our ire where it's needed.

And to whoever said that Google is "very open" about their malicious app problems, well, gosh, where to start...

Google's Android is the cause of the malicious app problem. By not allowing users fine-grained access control over the various entitlements in Android, Google forces users to adopt an all-or-nothing approach to every app they download. Don't like that this app wants access to your Contacts? Fine, then don't install it. The root problem is not allowing the user to determine, after the fact, what privileges an app should have. Apple gets this right; Google fails miserably.

Of course, there's also no one Android. You know that, right? There's a bunch of different Androids from a bunch of different carriers, all of which run different hacked-up versions littered with crap code from carriers that almost no one wants. Code which I imagine is also littered with security bugs, because it's written by carriers who barely give a damn whether this junk even works and wouldn't know "secure" if it hit them in the head.

And on top of all that, depending on your phone and your carrier, that brand new phone you just bought might even be running an Android that's years out of date and full of known vulnerabilities. There's no comparison between iOS and Android when it comes to timely security updates. The Android ecosystem is a complete fail on the security front at the moment. Period.

Google can play dumb if they want. Plausible deniability is oftentimes quite useful, after all...
Seems like iOS 8 should offer a settings screen that lets you revoke sync keys and/or see a list of computers you've trusted in the past. Perhaps it should default to deleting the keys if you haven't synced with a specific computer within some timeout period (30 days?). In the meantime, a host can at least delete its own trust relationship; a sketch follows below.

A few of the services should be locked down a bit further regardless of anything else.

I also don't see this as a valid bypass of encrypted files: you need the device to be on and to have had its passcode entered. That's a far cry from taking a cold device, booting it, and connecting with a stolen sync key. Besides, we've known for some time that you're unsafe if the device is unlocked; some police even carry Faraday bags and portable chargers to keep seized phones accessible, probably for this very reason.
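For the interim, libimobiledevice exposes lockdownd's Unpair request, so a paired host can remove its own sync keys from the device. A minimal sketch, assuming this host is currently paired; passing NULL as the pair record should use the record stored on this machine, and "unpair-demo" is just an arbitrary client label:

    #include <stdio.h>
    #include <libimobiledevice/libimobiledevice.h>
    #include <libimobiledevice/lockdown.h>

    int main(void)
    {
        idevice_t device = NULL;
        lockdownd_client_t client = NULL;

        if (idevice_new(&device, NULL) != IDEVICE_E_SUCCESS) {
            fprintf(stderr, "no device found\n");
            return 1;
        }
        if (lockdownd_client_new_with_handshake(device, &client,
                "unpair-demo") != LOCKDOWN_E_SUCCESS) {
            fprintf(stderr, "handshake failed (not paired?)\n");
            idevice_free(device);
            return 1;
        }

        /* Remove this host's trust relationship with the device.
           NULL means "use the pairing record stored on this machine". */
        if (lockdownd_unpair(client, NULL) == LOCKDOWN_E_SUCCESS)
            printf("unpaired: this host's sync keys are no longer valid\n");
        else
            fprintf(stderr, "unpair failed\n");

        lockdownd_client_free(client);
        idevice_free(device);
        return 0;
    }

The bundled idevicepair command-line tool does the same thing (idevicepair unpair), though of course neither helps with pairing records already copied off to a machine you don't control.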
I remember when it was trivial to examine artifacts from an iTunes backup, until backup encryption with a passphrase was implemented (v6, I think?).

Something that still has the capability to bypass backup encryption sounds incredibly dangerous from my perspective. (A sketch of how that service is driven appears below.)

There are plenty of legitimate concerns mentioned in his talk. I agree there's no cause for panic, but what about the fact that there are obviously services not disclosed to us: developers, users, enterprise executives relying on this as a trusted platform, etc.?

The potential risk this poses (or implies) makes the lack of initial disclosure criminally ignorant, at the least.
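For reference, the bypass in question is the com.apple.mobile.file_relay service, which streams back a compressed archive of requested data sources regardless of backup-encryption settings. A rough sketch using libimobiledevice's file_relay client, assuming a paired host; "AppleSupport" is one of the source names listed in the talk's slides, "relay-demo" and the output filename are arbitrary, and error checks on the setup calls are omitted for brevity:

    #include <stdio.h>
    #include <libimobiledevice/libimobiledevice.h>
    #include <libimobiledevice/lockdown.h>
    #include <libimobiledevice/file_relay.h>

    int main(void)
    {
        idevice_t device = NULL;
        lockdownd_client_t lockdown = NULL;
        lockdownd_service_descriptor_t service = NULL;
        file_relay_client_t relay = NULL;
        idevice_connection_t conn = NULL;

        /* NULL-terminated list of data sources to request;
           others named in the talk reportedly include Network,
           WiFi and UserDatabases. */
        const char *sources[] = { "AppleSupport", NULL };

        idevice_new(&device, NULL);
        lockdownd_client_new_with_handshake(device, &lockdown, "relay-demo");
        lockdownd_start_service(lockdown, "com.apple.mobile.file_relay",
                                &service);
        file_relay_client_new(device, service, &relay);

        /* The device streams back a compressed archive of the
           requested sources over this raw connection. */
        if (file_relay_request_sources(relay, sources,
                                       &conn) == FILE_RELAY_E_SUCCESS) {
            char buf[4096];
            uint32_t got = 0;
            FILE *out = fopen("dump.cpio.gz", "wb");
            while (idevice_connection_receive(conn, buf, sizeof(buf),
                    &got) == IDEVICE_E_SUCCESS && got > 0)
                fwrite(buf, 1, got, out);
            fclose(out);
        }

        file_relay_client_free(relay);
        lockdownd_service_descriptor_free(service);
        lockdownd_client_free(lockdown);
        idevice_free(device);
        return 0;
    }

Note that none of this goes through the backup path, which is exactly why the backup passphrase never enters into it.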
If Apple wants to balance the scales, they will need to do more than address and resolve these issues. They need to extend their transparency a smidgen. :)
Resolving the hyperbole debate: asking a user "May I connect to some device?", then installing permanent remote access to the device, and never prompting the user again nor giving them further information, is a plain and simple backdoor.

The difference between this and malware is that malware authors create web pages telling users to "just click OK and don't ask what this is" before they deliver you a backdoored application.

If the prompt said "May we install remote access tools that allow us to remotely control and remove data from your device forever?", then it wouldn't be a backdoor. It would be a front door.
Syncing an iPhone to a new computer: http://www.leawo.org/tutorial/how-to-sync-iphone-to-new-computer.html
Not Found

The requested URL /blog/ was not found on this server.

Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
Kind of glad Apple just confirmed the services are there and otherwise ignored him. I'm sure he has a nice career ahead of him of complaining that the cp command in the adb shell on Android isn't hard-coded to ignore any path containing DCIM (user pictures), and other nonsense. Honestly, he isn't helping anything; he's just making it harder for Apple to fix broken phones and provide better customer service in general.

Wonder what he thinks of Amazon Mayday showing your screen to customer support remotely. Users love it, since customer support can now guide you to exactly the right settings and so on, but I think privacy nuts like this will have seizures.
Most companies will downplay any negative aspect of their product; it's pretty normal, part of the survival instinct of an organization. Microsoft has done the same thing a few times as well:

http://www.zdnet.com/blog/security/microsoft-downplays-bitlocker-password-leakage/1841

http://www.computerworld.com/s/article/9133248/Microsoft_confirms_serious_IIS_bug_downplays_threat?intsrc=news_ts_head

I'm more surprised that Apple decided to actually confirm the existence of a back door in their product (even though they are "misleading", as stated in the article, about what is really at risk here).
The fact that Apple was downplaying this tells me they haven't realized how much a product, especially an operating system or computer, depends on its user base; if the user base is kept ignorant, then Apple will stay in its comfort zone, since it's not being pushed by users to improve.

Nonetheless, it's still pretty good that Apple has confirmed this. Baby steps, I guess.