As the article says:<p>"Before the advent of these new technologies, time and effort created effective barriers to surveillance abuse. But those barriers are now being removed. They must be rebuilt in the law."<p>In the past, time and effort preserved privacy by default. As technology improves, privacy is no longer the default, so we have to ask for it explicitly.<p>I hear again and again that "no one cares about privacy", but I'm starting to think this is a relic of the past. Perhaps 10 years from now we will say "no one <i>used to</i> care about privacy, but that was before technology became sufficiently advanced."
Given the rate of false positives when "cold hits" are used with DNA databases, facial recognition (which is presumably lower entropy than DNA) has the potential to send a lot of innocent people to jail.<p>I also have some concerns about inadvertent "software racism". See, for example, this video: <a href="https://www.youtube.com/watch?v=t4DT3tQqgRM" rel="nofollow">https://www.youtube.com/watch?v=t4DT3tQqgRM</a>. Obviously facial recognition isn't <i>inherently</i> incapable of recognizing people of different races, but if it's designed and calibrated on subjects of one race, it won't necessarily work well on subjects of other races. The last thing we want is a facial recognition database that thinks, e.g., all Arabs look alike.<p>And of course, there are plenty of ways it can be thwarted easily and inconspicuously, even if it works very well. Are all surveillance cameras going to be fitted with IR filters to ensure that a bright IR LED necklace doesn't blind them? And how hard is it for a criminal to hire a makeup artist?
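To make the cold-hit arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python; the false positive rate and database size are made-up illustrative numbers, not measurements of any real matcher.<p>
<pre><code>
# Back-of-the-envelope "cold hit" arithmetic with made-up numbers: even a
# matcher with a tiny per-comparison false positive rate flags many innocent
# people once every query is run against a large database.

def expected_false_hits(false_positive_rate: float, database_size: int) -> float:
    """Expected number of innocent people flagged by a single cold-hit search."""
    return false_positive_rate * database_size

def prob_any_false_hit(false_positive_rate: float, database_size: int) -> float:
    """Probability that a search for someone NOT in the database still 'matches'."""
    return 1 - (1 - false_positive_rate) ** database_size

fpr = 1e-4             # hypothetical: one false match per 10,000 comparisons
db_size = 10_000_000   # hypothetical: 10 million ID photos

print(expected_false_hits(fpr, db_size))   # ~1,000 spurious hits per search
print(prob_any_false_hit(fpr, db_size))    # effectively 1.0
</code></pre>
<p>The point is the base rate: a matcher that sounds accurate per comparison still buries the occasional true hit under spurious ones once the database is large enough.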
If VR headsets like the Oculus Rift take off, this problem can be mitigated: people will walk around with headsets covering their entire faces, defeating the cameras.<p>Surveillance is going to happen; if you ban government surveillance, the public will get this tech and drones and do it anyway. So the robust way to fight it is to plan your life knowing that you can always be monitored and that someone will always know where you are, even at home.<p>This is a great opportunity for startups: web apps that let you plan your security, mini-fiefdoms with your family, friends, neighbors, and associations that plan against attacks from rivals, electronically activated weapon systems that shoot when a threat is detected. The Mad Max era is coming, and hackers will get many opportunities to earn a fortune.
Given the 15.8% success rate of the cat detector on large data sets (<a href="http://arxiv.org/pdf/1112.6209.pdf" rel="nofollow">http://arxiv.org/pdf/1112.6209.pdf</a>), my chief worry is overconfidence from hopelessly technologically ignorant bureaucrats, combined with a slew of false negatives and false positives when the technique is applied to human faces in general surveillance feeds.<p>In contrast, driver's license, ID, and arrest photos are at least severely constrained inputs: a single individual against a relatively uncomplicated background, with the face in roughly the same orientation every time.
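To illustrate the constrained-input point, here is a short sketch using OpenCV's stock Haar cascade, which is a <i>frontal</i> face detector and therefore well matched to ID/mugshot-style photos; the image file names are placeholders, and the contrast with unconstrained footage is the assumption being illustrated, not a benchmark.<p>
<pre><code>
# Sketch: the stock OpenCV cascade detects roughly frontal, well-lit faces,
# which is what ID and arrest photos provide; off-angle, occluded, or poorly
# lit surveillance frames violate those assumptions.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def count_faces(path: str) -> int:
    """Return the number of face-like regions detected in an image file."""
    image = cv2.imread(path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces)

# Placeholder file names, not real data.
print(count_faces("id_photo.jpg"))            # constrained input: usually one clean detection
print(count_faces("surveillance_frame.jpg"))  # unconstrained input: misses and false detections
</code></pre>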
> "While this sort of technology may have benefits for law enforcement (recall that the suspects in the Boston Marathon bombings were identified with help from camera footage), it also invites abuse."<p>1. Yes /without/ this technology, and<p>2. How would that have prevented the Boston tragedy if it were in use?<p>What exactly does NYT mean here.
> <i>"But as surveillance technology improves, the distinction between public spaces and private spaces becomes less meaningful."</i><p>Or more to the point: as the NSA mandates access to all private data, the distinction of recordings from public spaces and recordings from private spaces is made moot.