This is a classic example of the false positive rate fallacy.

Let's say there are a million people, and the police have photos of 100,000 of them. A crime is committed, they pull the surveillance footage, and they match it against their database. Their image matching system has a false positive rate of 1 in 100,000, which is *way* more accurate than I think facial recognition systems are right now, but let's just roll with it. On average, this system will produce one positive hit per search. So the police roll up to that person's home and arrest them.

Then, in court, they get to argue that their system has a 1 in 100,000 false positive rate, so there is only a 1 in 100,000 chance that this person is innocent.

Wrong!

There are about ten people in the population of 1 million for whom the software would comfortably produce a positive hit. They can't all be the culprit. The chance that this person is innocent isn't 1 in 100,000 - it is in fact at least 9 out of 10. They just happen to be the one of those ten who had the bad luck to be stored in the police database. Nothing more.
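To make the numbers concrete, here's a quick Bayes' rule sketch of that argument. It uses the made-up figures above, plus one extra assumption I'm adding for illustration: the matcher always flags the true culprit if their photo is in the database.

```python
# Minimal sketch of the base-rate argument above.
# All numbers are the hypothetical ones from this comment, not real system figures.

population = 1_000_000      # people who could plausibly have committed the crime
database   = 100_000        # people whose photos the police hold
fpr        = 1 / 100_000    # chance the matcher flags a given innocent person
tpr        = 1.0            # assumed: matcher always flags the real culprit

# Prior: before any match, each person in the population is equally likely
# to be the culprit.
prior_guilty = 1 / population

# Bayes' rule: P(guilty | flagged)
posterior_guilty = (tpr * prior_guilty) / (
    tpr * prior_guilty + fpr * (1 - prior_guilty)
)

print(f"P(guilty | flagged)   ~ {posterior_guilty:.3f}")        # ~0.091
print(f"P(innocent | flagged) ~ {1 - posterior_guilty:.3f}")    # ~0.909

# Sanity check: across the whole population the matcher would flag about
# population * fpr = 10 innocent lookalikes, so any single hit is far more
# likely to be one of them than the actual culprit.
print(f"Expected innocent matches in the population: {population * fpr:.0f}")
```

The posterior comes out to roughly a 9% chance of guilt for a flagged person, which is where the "at least 9 out of 10 innocent" figure comes from.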