This is awful!

Even setting aside the obvious moral problems, I don't think such a system could be as accurate as hoped. The publicly available face training datasets skew heavily toward US demographics (read: white people), and it's unclear how well a system trained on them will perform on a data distribution dissimilar from the training distribution (read: nonwhite faces). I'm not aware of much published research on this question.

Even if such a system could be built with 99% accuracy, hundreds of thousands of people pass through international flights every day, so even a 1% false-positive rate means thousands of false alarms daily (see the back-of-envelope sketch below). For every false positive, security staff have to run through the entire secondary-screening process. How many innocent people will trip the scanner and be detained over false positives?

This is just a tool for oppression. Nothing more.

(See Part 1 of Scheirer and Boult's IJCB 2011 tutorial slides, "Biometrics: Practical Issues in Privacy and Security," for a good high-level overview of these issues: http://web.archive.org/web/20130412032945/http://www.securics.com/~walter/IJCB2011/IJCB11-tutorial-part1.pdf. The slides starting on page 19, in particular, cover this kind of analysis.)
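To make the false-positive point concrete, here's a minimal back-of-envelope sketch. All the numbers are assumptions I'm making up for illustration: 300,000 daily travelers, a watchlist base rate of 1 in 100,000, and a reading of "99% accuracy" as a 99% true-positive rate with a 1% false-positive rate. None of these figures come from the article.

```python
# Back-of-envelope base-rate check. Every number here is an
# assumption chosen for illustration, not a figure from the article.

daily_travelers = 300_000   # assumed international passengers per day
watchlist_rate  = 1e-5      # assumed fraction of travelers actually on a watchlist
true_pos_rate   = 0.99      # assumed: "99% accuracy" read as 99% TPR...
false_pos_rate  = 0.01      # ...and a 1% FPR

# Expected daily counts of flagged travelers, split by ground truth.
false_positives = daily_travelers * (1 - watchlist_rate) * false_pos_rate
true_positives  = daily_travelers * watchlist_rate * true_pos_rate

flagged = false_positives + true_positives
precision = true_positives / flagged  # P(real match | system flags you)

print(f"Flagged per day:         {flagged:,.0f}")
print(f"False alarms per day:    {false_positives:,.0f}")
print(f"P(real match | flagged): {precision:.4%}")
```

Under these toy numbers, roughly 3,000 innocent travelers get flagged every day, and fewer than 0.1% of the people the system stops are genuine matches. That's the base-rate problem in a nutshell: when the thing you're screening for is rare, even a "99% accurate" detector produces overwhelmingly false alarms.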