I've run into the programmer vs normal person difference in thinking quite often with regard to customer support calls.<p>Occasionally I will be called by someone from some company or government department because they want to notify me of something. Let's say, for example, I forgot to pay my insurance bill.<p>At some point in the call they will say "I just need to verify your identity with some security questions" and ask me for something like my date of birth or my home address.<p>The only correct answer to this is "I can't give you that information. You called me. I have no idea who you are."<p>I'm always met with complete incredulity at this concept. About 50% of callers don't understand at all what I'm trying to get at. Most of the rest just don't have any idea how to continue.<p>What I tell them at this point is that the correct way to handle this is for them to give me their personal extension number; I will then find the external number of their company/dept myself on their website and call them back.<p>Unfortunately a lot of these callers either can't (because they don't have a personal extension number) or won't (it's off protocol, I guess?).<p>The problem is, I feel like an asshole for taking a stand on things like this ("Why is this guy trying to make my job difficult?"), but more people need to understand that it's all too easy to be socially engineered!
I think this is a fantastic article - and I thought it was genuinely funny, but my sense of humor is about 80% butt jokes, so I think that's just an unusual alignment of my taste with the author's ;).<p>Now, allow me to take this article about irreducible complexity and reduce its complexity: the question is not even about which shade of security gray to go with. It's an ongoing psychological battle between security and security theater, which is an unrelated set of activities that is almost, but not entirely, exactly unlike actual security.<p>Security theater operates on the level of what feels right, instead of what is logically right. That makes it powerful. It offers an appearance and feeling of safety, and there's value in that. Of course, if you ask someone "do you want a phone that feels safe or one that is actually safe," they'll pick the latter, but in practice they want and need both.<p>That's the problem with this issue. The general public doesn't feel the difference between these two domains clearly enough to know how dangerous the government's plan for the iPhone is - they don't understand that it shifts the balance wholly from security to security theater, when what you actually want is a blend of both. You need The Great Tagliatelle and the locked cockpit door. You need laminated paper and you need pilots with secret codes. Without security, an iPhone will still FEEL safe - it just won't be.<p>The problem is, feeling safe is good enough for most. That's why we mostly have metal locks, and not giant flaming Doberman-launching turrets, on our lawns. Until the public gets the need for a balance, this debate will go nowhere fast, and the government - which is very used to getting its way - will skillfully play on our desire to feel safe in order to get what it wants.
It's really surprising how much paranoia and thought goes into software security compared to pretty much everything else. A driver's license is mostly a laminated piece of paper with some holograms. Social security numbers are 9-digit passwords you share over and over again and can't really change.<p>I was recently asked to sign a receipt at a store when I'd paid with Apple Pay. My phone uses a fingerprint reader to authorize a one-time-use token for payment that's transmitted in a cryptographically secure way. But that signature - that's the real unfakeable proof.
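To make the contrast concrete, here's a minimal sketch of why a one-time payment token beats a signature. This is not Apple Pay's actual protocol (which follows the EMV tokenization spec inside a secure element); it's a toy model where a device-held key MACs a fresh nonce plus the transaction details, so a captured token is useless for any other transaction. All names here (`DEVICE_KEY`, `one_time_payment_token`) are invented for illustration:

```python
import hashlib
import hmac
import secrets

# Toy stand-in for a key provisioned into the device's secure element.
# In a real system this key never leaves tamper-resistant hardware.
DEVICE_KEY = secrets.token_bytes(32)

def one_time_payment_token(amount_cents: int, merchant_id: str) -> dict:
    """Produce a single-use payment cryptogram bound to one transaction."""
    nonce = secrets.token_bytes(16)  # fresh per transaction, so tokens never repeat
    msg = nonce + amount_cents.to_bytes(8, "big") + merchant_id.encode()
    cryptogram = hmac.new(DEVICE_KEY, msg, hashlib.sha256).hexdigest()
    return {
        "nonce": nonce.hex(),
        "amount_cents": amount_cents,
        "merchant_id": merchant_id,
        "cryptogram": cryptogram,
    }

# The issuer, holding the same key, recomputes the HMAC to verify.
# Two "identical" purchases still yield different cryptograms, so a
# skimmed token can't be replayed - unlike a signature, which is the
# same static scribble on every receipt.
token_a = one_time_payment_token(1999, "store-42")
token_b = one_time_payment_token(1999, "store-42")
assert token_a["cryptogram"] != token_b["cryptogram"]
```

The point of the sketch: the signature is a fixed, copyable secret, while the cryptogram is cheap to verify, bound to one transaction, and worthless to an eavesdropper.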
The anecdote about the airline industry in the US is half-correct. It's true that cockpit protocol didn't change after the German crash, but that's because US airlines already had a better version than the German one. When a pilot leaves the cockpit to drop a grenade, a flight attendant must enter the cockpit and sit with the remaining pilot until the bomber returns.<p>While this doesn't protect against a completely insane pilot (he/she could kill the flight attendant), it does eliminate scenarios where the cockpit has only one person in it.
<This is the moment you realize that some people just want to watch the world burn.><p>Or, maybe the user is "kicking the tires" to see how robustly it was coded, concerned that poor data verification practices reflect weaknesses elsewhere in the code as well.<p>EDIT: s/inadequacies/weaknesses for clarity
Good article, very click-baity title.<p>The article is about software security and how it compares (or doesn't compare) to real-world security, and what this means for the Apple case.<p>What drew me in is mostly that the beginning is written in a very light-hearted style, so it's a pretty easy read at first.
In “thinking like a technologist”, this post is missing the context/subtext in the airline security game.<p>The metal detector makes the airplane neither more nor less safe than the security theater porno scanner machine, and the precheck also doesn’t accomplish anything. The only reason most of the people need to be diverted through the porno scanner machine is that the federal government spent a few billion on them in a handout to some senator’s friends, and to scrap them now would make the tremendous waste of money obvious to everyone.<p>But at the same time, business travelers don’t want to go through the new machines, so we let them pay a nominal fee (easily amortized down to trivial if you fly a few dozen times per year) to go through the old metal detectors instead. Bonus: they now get to take a shortcut in the security line that they didn’t used to get. If someone without a real precheck manages to sneak through the metal detector line by counterfeiting some paper token, it isn’t a real security risk.
I was astonished by how unsafe the road/traffic system really is 8 years ago when I started learning to drive. Just think about it: driving on the road is extremely vulnerable. Any other driver on the road could make a small mistake that gets you both killed, accidentally or intentionally. Yet the road system is far more secure than its cyberspace counterpart. Why?<p>* Potential damage is roughly symmetric. A bad/evil driver might kill others but will very likely also kill himself.<p>* Threat is local. There is no way for a bad/evil driver to kill all the drivers.<p>* The road system as a whole does not have a single point of failure.<p>I think the claim in the article is dangerously wrong. We should never be given a binary choice on such a big issue.
The best security: be honest and place complete trust in those you employ. Hire people you trust. If nothing is blocking people, morale is higher and they stay focused on what's important. If there is nothing to break into, there is less temptation. Your employees won't be perfect, but if they are trusted, and you let things be, there's a good chance everything will be fine, at least as fine as it would have been otherwise.