The ACLU is doing a lot of great work to hold government accountable when it comes to facial recognition tech.

https://www.aclu.org/press-releases/aclu-challenges-fbi-face-recognition-secrecy

https://www.aclu.org/press-releases/aclu-challenges-dhs-face-recognition-secrecy

Would be great to see Amazon's support.

The ACLU ran an experiment with Rekognition and these are their findings:

"Using Rekognition, we built a face database and search tool using 25,000 publicly available arrest photos. Then we searched that database against public photos of every current member of the House and Senate. We used the default match settings that Amazon sets for Rekognition.

... the software incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime.

... Academic research [0] has also already shown that face recognition is less accurate for darker-skinned faces and women. Our results validate this concern: Nearly 40 percent of Rekognition’s false matches in our test were of people of color, even though they make up only 20 percent of Congress."

https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28

[0]: http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf
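For anyone curious what that experiment looks like mechanically, here is a minimal sketch of the collection-and-search workflow with boto3. The collection name, bucket, and file names are hypothetical, and the search call omits FaceMatchThreshold so Rekognition applies its own default match setting, as the ACLU did.

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

# Hypothetical names for illustration only.
COLLECTION = "arrest-photos"
BUCKET = "example-mugshot-bucket"

# 1. Create a face collection and index the arrest photos into it.
rekognition.create_collection(CollectionId=COLLECTION)
for key in ["mugshot-0001.jpg", "mugshot-0002.jpg"]:  # 25,000 of these in the ACLU test
    rekognition.index_faces(
        CollectionId=COLLECTION,
        Image={"S3Object": {"Bucket": BUCKET, "Name": key}},
        ExternalImageId=key.replace(".jpg", ""),
    )

# 2. Search the collection with a probe photo. Omitting FaceMatchThreshold
#    leaves Rekognition's default match setting in place.
with open("member-of-congress.jpg", "rb") as f:
    response = rekognition.search_faces_by_image(
        CollectionId=COLLECTION,
        Image={"Bytes": f.read()},
        MaxFaces=5,
    )

for match in response["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], round(match["Similarity"], 1))
```

Every result printed at the end is only a similarity score, which is exactly why the choice of threshold mattered so much in their test.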
Step 2: End Ring contracts with police departments.

https://www.eff.org/deeplinks/2019/08/five-concerns-about-amazon-rings-deals-police
AWS is running a nice screen here. I recall reading about Rekognition being documented as having accuracy problems when individuals in question had darker skin [2, 3].

>> "The latest cause for concern is a study published this week by the MIT Media Lab, which found that Rekognition performed worse when identifying an individual’s gender if they were female or darker-skinned." [1]

I can't really comment. Just recalled this in the memory banks and thought they might address this directly [they may have].

1 - https://www.theverge.com/2019/1/25/18197137/amazon-rekognition-facial-recognition-bias-race-gender

2 - https://www.media.mit.edu/articles/amazon-is-pushing-facial-technology-that-a-study-says-could-be-biased/

3 - https://www.marketwatch.com/story/ai-experts-take-on-amazon-after-researchers-findings-of-racial-bias-in-facial-recognition-2019-04-03
> We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge. We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.

That's a little weak. If they were serious, the moratorium would extend indefinitely, or until such rules were in place.

One year might just be long enough for the fervor to die down, so they don't take such a PR hit when they resume sales.
What about in the UK? At the BLM protests, the police run around with camcorders on sticks to justify kettling people for 4+ hours and squash people's will to protest. They require you to show your face or they arrest you. All because they want to use Rekognition to cross-reference everyone's faces.
Wonder if that extends to the bodycam analysis services running under a different brand, which allow searching for people based on various criteria and matching against a watch list?

https://www.ibm.com/support/knowledgecenter/SS88XH_2.0.0/iva/ref_analyticpbwc.html
Has anyone here ever actually demoed Rekognition? I did, maybe two years ago.

From that, I felt like it doesn't work and shouldn't be used in production, never mind police production.
I see this as an extension of the Facebook <moderating/censoring> discussion, which is really a broader question: what moral obligations do corporations have beyond following the law and trying to provide the optimal product to their consumers?

Also, there seemed to be no substantive discussion prior to this about the police using Rekognition, until it became a hot-button issue. What will the widespread effects be if corporations start allowing their decisions to be governed by <outrage of the mob/principled consumer pressure>?

Finally, I wonder how they will implement this. After all, I can sign up and start using any AWS service with just a credit card; what's to stop police departments from simply using a corporate card and signing up for a different account? Also, does this apply only to local PDs, or does it extend to the FBI, NSA, CIA, or other three-letter government agencies?

Disclaimer: These comments are intended to be observational, not advocational.
It's probably the legal and PR teams' fault, but this surely could have been worded to sound less like potential corporate doublespeak:

"We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge. We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested."

Dead giveaway is that Legal and PR teams relentlessly edit out self-agency.
I remember at the AWS summit maybe two years ago, they were casually showcasing how some police depts were using Rekognition. Oh my, what a culture shock. How can you basically foreshadow 1984 on stage without blinking an eye?
"We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology"<p>Or you know... you could do it yourself. Ethics don't have to come from regulations.
What prevents a private company (ex. Clearview) from using Rekognition to accomplish the same thing for the police as a government contractor?

Without any kind of laws, wouldn’t things like this incentivize new niches to pop up to milk money from the government?
Isn't it open to anyone with an AWS account? So how are they even trying to implement this and what would be stopping any third party from using this to submit reports to law enforcement?
<a href="https://techcrunch.com/2020/06/10/amazon-rekognition-moratorium/" rel="nofollow">https://techcrunch.com/2020/06/10/amazon-rekognition-morator...</a><p>> Amazon is known to have pitched its facial recognition technology, Rekognition, to federal agencies, like Immigration and Customs Enforcement. Last year, Amazon’s cloud chief Andy Jassy said in an interview the company would provide Rekognition to “any” government department.<p>> Amazon spokesperson Kristin Brown declined to comment further or say if the moratorium applies to federal law enforcement.
What is the difference between Rekognition and Clearview AI? I'm assuming that Rekognition is just using government photo databases rather than social media?

It seems that Amazon has a far better reputation on HN compared to Clearview AI. Is that deserved?
Anyone with a modicum of skill and a few GPUs can do what Rekognition does using code freely available on GitHub and public datasets. This cat is _way_ out of the bag.
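As a rough illustration of that point, here is a minimal sketch using the open-source face_recognition library (a dlib wrapper available on GitHub); the image file names are hypothetical and the tolerance value is just the library's commonly cited default.

```python
# pip install face_recognition  (wraps dlib's face detector and 128-d face embeddings)
import face_recognition

# Hypothetical files: one known face and one probe photo containing unknown faces.
known_image = face_recognition.load_image_file("known_person.jpg")
probe_image = face_recognition.load_image_file("crowd_photo.jpg")

known_encoding = face_recognition.face_encodings(known_image)[0]
probe_encodings = face_recognition.face_encodings(probe_image)  # one per detected face

for i, encoding in enumerate(probe_encodings):
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    is_match = face_recognition.compare_faces([known_encoding], encoding, tolerance=0.6)[0]
    print(f"face {i}: distance={distance:.3f} match={is_match}")
```

A dozen lines against a public dataset gets you the same basic capability, which is why restricting one vendor's hosted service only goes so far.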
It's always a slippery slope when companies morally compel themselves to block use cases from their service.

I welcome this move from Amazon, but I hope it doesn't foreshadow more moral bans in future, e.g. spurred on by the next angry mobs who will try to limit free speech in society.
So I have a schizo view on subjects like this:

1. I am against places that say “photography prohibited - private property”; if I can see it, I should be free to photograph it.

2. I am against ANY use of facial recognition, ever, anywhere. I own my face and I am allowed to keep it private if I choose to.

So, yes, completely schizo, and I realize this.

But it's not an evenly distributed spectrum of a problem. It's a weighted web of nuanced issues.

I just don't know how to balance it.

I'd love to discuss this if anyone is open.
This reminds me of an FBI agent who liked to keep his gun in his back pocket, bulging out, just to impress people who knew what he did. I can't imagine how a few bad apples within the police force would sit behind a computer playing with Facebook profile pictures and matching them against Rekognition.
Does anyone know how exactly the police integration with Rekognition is done? I mean, they must have it integrated into their IT systems, right? Who did that integration? AWS itself? Or some consulting companies? Or the PD's tech department?
There's a lot of talk about the technology - but this aside, why are police even using this?

We don't need this kind of hyper-surveillance for common crime or for people with warrants; it's just too much of an intrusion.

I can see this being used in certain places for 'high value individuals' such as those marked by the FBI (major crimes, multiple murders) or literally 'terrorists' - but for regular crime, I think it's way too much.

We can't be under constant surveillance by the police computers; that's just no way to live.
Arguing in the other direction, it turned out to be very important that the Floyd killing was captured on video and the killer identified.

Since that happened, there have been dozens (at least) of murders and vicious, life-changing assaults, most captured on video. I'd be very happy to see every one of the bad guys identified, and this seems like it would be effective toward that end.
“Police use” == contractors providing subsequent services to said police and bilking taxpayer money for said service, with companies likely founded by public servants’ significant others to do said billing (remember, this actually happened in 2009, when wives of bankers set up companies to get bailout funds).
"We are implementing a one-year moratorium on police use of Rekognition" ... until this whole thing blows over. /s<p>They're leaving money on the table, but it will still be there in a year, and they'll only miss whatever Amazon's functional analog of "compound interest" is.
I wonder if this means that existing customers/products had to stop using the service? If so, this might be the first time I recall seeing a cloud vendor flex like this.

CTOs of city and law enforcement orgs are probably seriously questioning the vulnerability of relying on cloud SaaS.
Can someone explain what has kicked off this retreat from face recognition by IBM and now Amazon? I mean, it’s always had dubious uses. What has made this happen right now?
I can see how this decision would stop immediate problems following recent events. However, in the long term, wouldn't machine learning help regulate and fix human bias?
Translation: We already made a bundle on this, but we're seeing too much pushback, so we're getting out before the downside eats into our profits.
So... cloud service providers now have the right to determine what services they want to allow and what they want to shut off? HAL 9000: "I'm sorry Dave, I'm afraid I can't do that"
Pretty soon you will see telephone companies refusing service, car companies refusing to service cars, etc., based on a moral judgement of the customer. Then they will finish the process of getting anybody accused of wrong-think purged from employment, etc. Hmmmm, where in history was that tried before? What could possibly go wrong?
Personally, I am pro facial recognition being used, and believe it is very necessary if police budgets are cut and patrols are reduced. We need a way to hold criminals accountable and bring them to justice. Cameras with facial recognition let us identify their location and send police officers to apprehend criminals, instead of relying on the random chance that an officer spots someone while driving around and matches their face with a list of perps they've seen before.

I haven't heard of any police departments using facial recognition as a definite match. All of them use human confirmation. So basically, the number of false positives does not matter - it's more that facial recognition reduces the total amount of data to a more manageable number that is scrutinized by human eyes.

I don't know why people would be against this. Steps like this moratorium just seem like posturing or an overreaction. The recent policing incidents that have been in the news do not involve facial recognition and there is no reason to tie them in.
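To make that "human confirmation" step concrete, here is a minimal sketch of how a candidate shortlist might be triaged before anyone looks at it. The match data, threshold, and shortlist size are hypothetical, not any department's actual workflow.

```python
# Hypothetical candidate matches as they might come back from a face search:
# (candidate_id, similarity score in percent)
matches = [
    ("cand-417", 99.1),
    ("cand-088", 87.4),
    ("cand-233", 81.0),
    ("cand-902", 63.5),
]

REVIEW_THRESHOLD = 80.0  # hypothetical cut-off; below this, don't bother a reviewer
MAX_FOR_REVIEW = 3       # keep the human workload small

shortlist = sorted(
    (m for m in matches if m[1] >= REVIEW_THRESHOLD),
    key=lambda m: m[1],
    reverse=True,
)[:MAX_FOR_REVIEW]

# Nothing here is a "definite match": the shortlist only narrows what a
# human reviewer looks at, and an empty shortlist means no referral at all.
for candidate_id, similarity in shortlist:
    print(f"queue for human review: {candidate_id} ({similarity:.1f}%)")
```

Whether that narrowing step is a safeguard or a liability depends entirely on where the threshold is set and how seriously the human review is taken, which is exactly what the debate is about.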