Another sad indicator of the level Schneier is playing at today, in the same vein as "avoid elliptic curves, we don't trust the math".<p>Once again: the only reason this bug got so much attention and press is that it's easy for laypeople to get their heads around. All you have to understand is how "goto" works. The bug is vivid, and so (paradoxically) seems scarier.<p>Significantly worse bugs are found every week. Within a few days of the announcement of this TLS bug, a Flash bug was announced, after being detected in exploits in the wild, that enabled reliable drive-by hijackings of browsers --- multiple browsers. It was off the HN front page within an hour.<p>TLS bugs aren't even unusual. We get a new one every few years. Firefox managed a PKCS1v15 parsing bug that allowed anyone with a Python script and 30 milliseconds to generate a certificate for any domain. Other browsers have screwed up certificate chaining, so that any domain could sign any other domain. But nobody understands PKCS1v15 padding, nobody understands certificate chaining, and so nobody writes stories about these bugs. But their impact is identical to this one.
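For anyone who hasn't seen it, this is the bug, lightly condensed from Apple's published sslKeyExchange.c (the elisions and comments are mine). The duplicated goto always executes, so the final hash check is skipped and the function returns the success status left over from the last update() call:

```c
static OSStatus
SSLVerifySignedServerKeyExchange(SSLContext *ctx, bool isRsa,
                                 SSLBuffer signedParams,
                                 uint8_t *signature, UInt16 signatureLen)
{
    OSStatus err;
    ...

    if ((err = SSLHashSHA1.update(&hashCtx, &serverRandom)) != 0)
        goto fail;
    if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0)
        goto fail;
        goto fail;    /* the duplicated line: unconditional, always taken */
    if ((err = SSLHashSHA1.final(&hashCtx, &hashOut)) != 0)
        goto fail;    /* never reached: the signature is never verified */
    ...

fail:
    SSLFreeBuffer(&signedHashes);
    SSLFreeBuffer(&hashCtx);
    return err;       /* still 0 (success) from the last update() */
}
```

That really is the whole thing, which is exactly why laypeople can follow it.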
Often when things like this happen there's a large conspiracy at play in some people's minds: "Apple <i>deliberately</i> left a security vulnerability". But it falls apart pretty quickly. It's in an open-source package, so the assumption is <i>someone</i> is eventually going to see it; it's not going to remain a secret, and thus it's useless as a stealth backdoor.<p>The likeliest explanation is pretty simple: someone fucked up. On a potentially <i>huge</i> level, but a fuck-up nonetheless. These things do unfortunately happen, and no doubt it'll prompt an internal review of their change management process, their build chain, and what they can do to catch issues like this in the future.
As noted in the article, <i>plausible deniability</i> is a key criterion for thinking of this as a possible NSA insertion.<p>In other words, it's only possible that it's an NSA job if it's also possible that it isn't. Something tells me therefore that we're unlikely ever to know for sure.
If I were at the NSA and wanted to introduce a bug like this, I'd get access to Apple's build servers (either through an exploit or by just talking to an employee who has access) and arrange for a binary patch to be applied to the generated object files at build time. It would be basically undetectable, as no amount of source code auditing would reveal it. Could probably make it look like a compiler bug without too much difficulty.<p>Of course, this presumes that the NSA <i>needs</i> to introduce bugs like this. I imagine they do just fine for now merely taking advantage of naturally occurring bugs.
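To make that concrete: the patcher itself wouldn't need to be sophisticated. Here's a hypothetical sketch (the byte patterns and the whole scenario are invented for illustration; a real implant would target the exact code generation of a specific compiler and function):

```c
/* Hypothetical build-server implant sketch: scan an object file for
 * the byte pattern of a "test eax, eax; jne ..." sequence (the shape
 * a compiler typically emits for "if (err != 0) goto fail;") and
 * overwrite the conditional jump with NOPs, so the error check
 * silently passes. Patterns and offsets are illustrative only. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <object-file>\n", argv[0]);
        return 1;
    }

    const unsigned char pattern[] = { 0x85, 0xC0, 0x75 }; /* test eax,eax; jne */
    const unsigned char nops[]    = { 0x90, 0x90 };       /* NOP out jne rel8  */

    FILE *f = fopen(argv[1], "r+b");
    if (!f) { perror("fopen"); return 1; }

    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    rewind(f);

    unsigned char *buf = malloc((size_t)size);
    if (!buf || fread(buf, 1, (size_t)size, f) != (size_t)size) {
        fprintf(stderr, "read failed\n");
        return 1;
    }

    for (long i = 0; i + 3 < size; i++) {
        if (memcmp(buf + i, pattern, sizeof pattern) == 0) {
            fseek(f, i + 2, SEEK_SET);        /* seek to the jne opcode */
            fwrite(nops, 1, sizeof nops, f);  /* branch is never taken now */
            printf("patched branch at offset 0x%lx\n", (unsigned long)(i + 2));
            break;
        }
    }

    free(buf);
    fclose(f);
    return 0;
}
```

The point stands either way: against a build-chain implant, publishing the source proves nothing. Only reproducible builds, where independent parties compile the source and compare binaries bit for bit, would surface the tampering.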
Well, <a href="http://www.opensource.apple.com/release/os-x-109/" rel="nofollow">http://www.opensource.apple.com/release/os-x-109/</a> has the list of open-source code from Mavericks. Let the code analysis begin; it would help everyone out.
It's not just an "iOS" flaw. It's a "latest, greatest Apple OS" flaw.<p>And anyone who, given a choice between SSL and TLS, relies on TLS is not putting security first.<p>And why would anyone who cares even the slightest about security use Flash? Are you serious?<p>I guess this is why security consulting could be easy money... clients want to use Flash and "stay secure". Yeah, sure, we can handle that for you.<p>Well, now you cannot even use a Mac without the risk that HTTPS authentication silently fails. Better make sure the OS is updated. Sounds a lot like Microsoft. Maybe you could start a business updating Mac OS's.<p>"Does anyone know what's going on inside Apple?"<p>If they did they couldn't say. All employees are sworn to secrecy.<p>I blackhole all traffic from Apple devices to *.apple.com<p>You would not believe (or maybe you would, if you are a "security consultant" or some such) the amount of "phoning home" that these devices do.<p>I agree you can't trust "security consultants" who do their marketing via blogs and forums.<p>But you surely cannot trust Apple either.<p>The flaw was one line of code.<p>I'm curious: what is the size of the update?<p>Imagine if you could make the change yourself, recompile, and dd an image to your device.
I'd like to believe Apple simply has no code review, but it's odd they removed a specific check -- was it not supposed to be there? Does anyone know if the patch added the line back in or if it removed the duplicate goto?<p>The one thing that points to it not being a backdoor is that I doubt Apple would open-source the code. Surely they'd maintain a separate branch or something?
I really hope Apple will disclose its findings on this matter, in the name of the transparency they're advocating.
Imagine if Apple itself stated something along the lines of "we found out that the line of code responsible for the bug was planted on purpose".
Should they make such a statement? Better: COULD they make it?
Why is it so important to know whether this bug was left deliberately, or was even sneaked in by the NSA?<p>To the more important questions 'Did the NSA discover this bug almost immediately?' and 'Did they exploit it?', I answer with a resounding yes.
I think I'll apply Betteridge's law[1] to this one.<p>[1]: <a href="http://en.wikipedia.org/wiki/Betteridge%27s_law_of_headlines" rel="nofollow">http://en.wikipedia.org/wiki/Betteridge%27s_law_of_headlines</a>
I honestly think it was deliberate. What class of operating system developer ships their OS releases without 100% CODE COVERAGE? Apple do code coverage testing, surely? I mean, more than the -Wunreachable-code flags that get flung around.
I can't understand how this would have gotten released into the wild if they were doing industry-standard code-coverage testing. And... if they're not doing industrial-strength code coverage on their iOS/OSX release builds, that's the real news here. A negative test as simple as the sketch below would have caught it.
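To illustrate, here's a minimal, self-contained model of the flawed control flow (a toy stand-in, not Apple's actual code) together with the one negative test that any coverage-driven suite would force you to write:

```c
#include <stdio.h>

/* Toy model of the flawed control flow -- not Apple's actual code.
 * A correct verify() must return nonzero when the signature is bad. */
static int verify(int sig_matches)
{
    int err = 0;

    if ((err = 0) != 0)         /* stand-in for a hash-update step   */
        goto fail;
        goto fail;              /* the duplicated line: always taken */
    err = sig_matches ? 0 : 1;  /* the real check, now unreachable   */

fail:
    return err;
}

int main(void)
{
    /* The negative test: a bad signature MUST be rejected. */
    if (verify(0) == 0)
        printf("BUG: bad signature accepted\n");
    else
        printf("ok: bad signature rejected\n");
    return 0;
}
```

Run it and the BUG line prints. Branch-coverage tooling would make the same point a different way: the line after the second goto never executes under any test input, which is exactly the red flag coverage reports exist to raise.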
I don't think the bug was deliberate; it could have been just an honest mistake.<p>But on the other hand, why didn't the compiler generate an "unreachable code" warning during the build?<p>We explicitly set this warning to "Treat as Error" in our builds.
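For what it's worth, clang does have the warning, but at least in the versions I've tried (worth verifying on your own toolchain) -Wunreachable-code is not enabled by -Wall or -Wextra, which may be exactly how this slipped through a warnings-clean build:

```c
/* unreachable.c -- compile with:
 *
 *     clang -Wunreachable-code -Werror unreachable.c
 *
 * and clang flags the line after the second goto ("code will never
 * be executed"). A plain -Wall build stays silent. */
int check(int err)
{
    if (err != 0)
        goto fail;
        goto fail;   /* unconditional: everything below is dead */
    err = 1;         /* this is the line the warning points at  */
fail:
    return err;
}

int main(void)
{
    return check(0);
}
```

Treating the warning as an error, the way you describe, would have turned this bug into a build break.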
Am I mistaken in thinking that planting this bug deliberately is meaningful only if you control most of the network? In which case there are a lot scarier things to worry about.
I don't think the NSA is nearly as technologically capable as people think they are. If they want info from someone, all they have to do is detain them and bust open their kneecaps with a hammer. No one would ever find out. So there's really no reason to go through all the shadow games.<p>If this is an intentional bug, I think it's likely a hacker or just a disgruntled employee. But I'd be willing to wager that it was a fatigue error, rushed into a release by an overworked and likely underpaid engineer deep within Apple.
> The flaw is subtle, and hard to spot while scanning the code<p>The author can't be serious. This particular bug has made the rounds and is understood even by non-programmers.