> His tiny company, Clearview AI, devised a groundbreaking facial recognition app. You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared. The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants.<p>This doesn’t sound groundbreaking at all, and I’d be very surprised if the FBI/NSA/DHS/Palantir didn’t already have a system like that. Maybe it was just reserved for higher-value targets. Of course, the NSA isn’t gonna tell you what it has constructed until decades later, so claiming that this goes far beyond NSA capabilities is reckless at best and clueless at worst.
I’ve been waiting to see this story and this company show up. It has felt inevitable and it’s sad that it’s being allowed to happen. How long will it be before the Ring cameras on every block feed into a real-time database where anyone can figure out where anyone else is, track their movements, and monitor their every moment?<p>Facial recognition is a bigger threat to our current way of life than just about anything else I can fathom other than climate change or nuclear war. The scariest part is that few people seem to recognize or care about the risk.
"One of the odder pitches, in late 2017, was to Paul Nehlen — an anti-Semite and self-described “pro-white” Republican running for Congress in Wisconsin — to use “unconventional databases” for “extreme opposition research,” according to a document provided to Mr. Nehlen and later posted online."<p>Giving hate groups a greater ability to stalk and harass their victims is also pretty scary.
This illustrates the hypocrisy of Silicon Valley... "don't be evil" indeed.<p>Peter Thiel is funding this, even though he demolished Gawker Media through litigation over an invasion of his own privacy.<p>Forget Silicon Valley — we urgently need federal regulation to limit this assault on our privacy (at the very least it can slow down our country’s inevitable decline into a Black Mirror episode).
I'm not okay with this - and you shouldn't be either. It's only a matter of time before it gets misused and abused - to identify protestors, for law enforcement officers' personal uses, to identify those doing legal things deemed "immoral" (see: China), etc.<p>We really need regulation here. Urgently.<p>The US appears to have been the leader in such regulation in the past. The problem is, they don't do that anymore. They haven't passed any laws related to user rights or privacy in a long time, and are actively trying to make encryption illegal.<p>The same is true for the Australian government, and those of several developing nations. We can hope that the EU does something, but... the impact will be limited.<p>It's especially bad for people living in non-first-world countries like India where the citizens aren't educated on the consequences of law enforcement agencies using tech like this. Laws taking away the right to privacy are being pushed through regularly. Recently they've started using facial recognition to identify protestors: <a href="https://www.fastcompany.com/90448241/indian-police-are-using-facial-recognition-to-identify-protestors-in-delhi" rel="nofollow">https://www.fastcompany.com/90448241/indian-police-are-using...</a><p>I really wish that some leading tech companies would try and push regulation through, but that will never happen since apparently privacy erosion and constant user tracking are critical for revenue for seemingly all of them (except Apple, I suppose).<p>Also, even if somehow regulations were put in place that made it necessary for websites to try and protect user data and made it illegal to scrape PII, there's nothing stopping government agencies from developing tools like these for themselves. Aaaand we go back to the first paragraph of this comment. This is a sad state of affairs.
I can imagine the EU having a field day with this. Given the scraping of data from other sites, this is clearly processing of PII without explicit consent - it's effectively being used as a biometric identifier too so the rules around sensitive data also apply...
Is this unique? I feel like there are many companies providing this sort of surveillance/data-scraping aggregation and indexing functionality to law enforcement and TLAs. We just don’t hear about them too often because we are not their customers.<p>Our government has abandoned us, and total surveillance is the future unless something radical changes.<p>Fun tip: get an old analog radio, like in an old non-connected car or a boombox or walkman or clock radio or something, go somewhere quiet and private, and listen to whatever you want. And realize that nobody knows what you’re listening to—it’s your secret. I find this to be a strangely powerful experience.
Holy shit, this quote is crazy:<p>> While the company was dodging me, it was also monitoring me. At my request, a number of police officers had run my photo through the Clearview app. They soon received phone calls from company representatives asking if they were talking to the media — a sign that Clearview has the ability and, in this case, the appetite to monitor whom law enforcement is searching for.
> That’s because Facebook and other social media sites prohibit people from scraping users’ images — Clearview is violating the sites’ terms of service. “A lot of people are doing it,” Mr. Ton-That shrugged. “Facebook knows.”<p>The last kind of person on Earth I want making an app like this is someone that doesn't care about terms of service, morality, contracts, or upholding the law. It seems like he just got into it for the money, and has no compunction about unethical behaviour. "Everybody's doing it" is a cliche, and idiotic, response. Don't take any wooden nickels when you sell your soul...
Wasn't there a Russian app 4-5 years ago that did something similar? Its premise, if I recall correctly, was to trawl VK/dating app profiles based on photos of people taken on the street.<p>If we are to believe the tests journalists did, it was pretty good, considering the app authors just pulled some VK/Russian dating app photos.<p>Edit: Thanks for the downvotes. Here's the article: <a href="https://www.theguardian.com/world/2016/apr/14/russian-photographer-yegor-tsvetkov-identifies-strangers-facial-recognition-app" rel="nofollow">https://www.theguardian.com/world/2016/apr/14/russian-photog...</a><p>The app is called FindFace
More background, including access to the public records requests that led to this investigation:<p><a href="https://www.muckrock.com/news/archives/2020/jan/18/clearview-ai-facial-recogniton-records/" rel="nofollow">https://www.muckrock.com/news/archives/2020/jan/18/clearview...</a>
How is this not a massive copyright violation? When I upload a photo to, say, Instagram, I know I’m granting <i>them</i> a perpetual license to do basically anything with my photo, but scrapers don’t inherit those rights.
For an article seemingly about the dangers of such a product, it sounds an awful lot like an advertisement. They mention several influential people involved, have law enforcement agencies rave about how well it worked, include the price point, and free trials are available!
This is/was inevitable.<p>We all know that once technology allows a beneficial behavior you can easily get away with, nothing can stop it. See torrents, ad blocking, reverse-engineering, cracking, etc.<p>But even from a legal/moral perspective, it's not clear where the line is. The data is publicly available, uploaded voluntarily by the people themselves. The algorithms are freely available. People are allowed to take photos...
Sure, the end product is creepy, but where along the way did we go too far?
> While the company was dodging me, it was also monitoring me. At my request, a number of police officers had run my photo through the Clearview app. They soon received phone calls from company representatives asking if they were talking to the media — a sign that Clearview has the ability and, in this case, the appetite to monitor whom law enforcement is searching for.<p>Like in the movies.
We as a society are still experimenting with newfound toys like this in lawless territory, similar to how car traffic was largely unregulated until we realized that traffic rules are really necessary.<p>It is not clear how this will play out, though. Is it even possible to hope for a state that doesn't spy on its citizens? I'm not so sure anymore (thanks, all you f-g terrorists). Maybe our struggle has to be to regulate and enforce <i>how</i> the spying is done, and used, and live with the fact that it can be abused before it is corrected.<p>If anyone has a clear view of how such pessimism might be wrong I'll be happy to hear it.
We have already reached the point of no return unfortunately. The only option at this point is to open source and normalize this tech somehow so that everybody has access to it.
This company has too much power to fake and manipulate things in different ways. It’s just a matter of time for false positives that crush the already vulnerable people. The level of abuse people can (and will) be subjected to is going to be horrendous.<p>Without regulations, laws and audits, everyone is screwed. Oh yeah, even law enforcement is screwed if/when it blindly believes in these systems and considers them fool proof and beyond suspicion.<p><i>Creepview — now, that’s my name for this company. It also makes sense that Peter Thiel put money into it.</i>
As this has been and will be inevitable given the tech, it's not a question of regulation, as so many call for. Regulation will just concentrate this tech in a small group of hands, tilting the balance of power — probably toward government over citizen.<p>The only way to deal with this is to recognize that privacy in public spaces was a temporary concept in society, available for a limited period. Allow everyone this information and let the cards fall in a more balanced way. Anything else would be oppression.
Lots of discussion on ethics (important!), but little on actual metrics.<p>What actually are the precision and recall of systems that search over faces of the entire national population? I would have thought enough people look similar that precision would be inherently low (even as a human it is occasionally hard to tell people apart in photos), but the claims here (and in similar articles the NYT has run on Chinese companies doing similar things) imply near-perfect numbers.
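A back-of-the-envelope calculation shows why precision at this scale is suspect. The numbers below are my own illustrative assumptions (a hypothetical one-in-a-million per-comparison false match rate), not figures from the article: even a tiny error rate multiplies into thousands of false hits when the gallery holds billions of faces.

```python
def expected_false_positives(false_match_rate: float, gallery_size: int) -> float:
    """Expected number of non-matching faces that still clear the
    similarity threshold in a single 1:N search of the gallery."""
    return false_match_rate * gallery_size

# Hypothetical numbers for illustration only:
fmr = 1e-6                # assumed 1-in-a-million false match rate per comparison
gallery = 3_000_000_000   # Clearview's claimed ~3 billion scraped images

print(expected_false_positives(fmr, gallery))  # 3000.0 false hits per search
```

So unless the per-comparison error rate is far below one in a million, a single query would bury the one true match under thousands of look-alikes — which is why near-perfect precision claims for population-scale search deserve scrutiny.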
I would hazard a guess that this will lead to more anti-mask laws since politicians, who love appearing tough on crime, will view anyone wearing a mask as a likely criminal. Which will probably decrease crime but also effectively eliminate any checks on state power.<p>How has this guy not been targeted by organized crime groups already?
This loss of privacy is just another consequence of our perpetual focus on and devotion to competition against each other -- everything gets weaponized. Privacy hinders the real game-changer, the weaponization of you.
Even if we ban it here in the US, there's nothing stopping adversarial foreign governments from building the same databases, and probably with a lot more computing power and technical expertise.
I have often wondered: if you form an LLC and contractually sign over the rights to your DNA / likeness / et al. to said LLC, are you more protected in today's society?
The social media companies might be able to buy some good press by suing this company, its founders, its investors, and its employees into oblivion for terms of service violations.<p>On the other hand, it would push copycats into developing alternative business models to capture the demand while still hiding from civil legal action.<p>And there are wider societal costs from potentially chilling innovation like this through the social media companies acting like a trust.
Ton-That was a scammer, reported on by Thiel's nemesis Gawker, and an HN poster (hi!)<p><a href="https://news.ycombinator.com/threads?id=hoan" rel="nofollow">https://news.ycombinator.com/threads?id=hoan</a>
<a href="https://gawker.com/tag/hoan-ton-that" rel="nofollow">https://gawker.com/tag/hoan-ton-that</a>
I am waiting for a government that puts together all the tech we are developing right now and creates the perfect surveillance state. Even under the worst governments, like Russia under Stalin, the Nazis, or North Korea, citizens still have/had the possibility to move around and do things without anyone knowing. That may soon be over.
the look that clearview has cultivated is at best clandestine and might even be criminal<p>it is telling about the attitudes of law enforcement that they skirt prohibitions on first-party use of facial recognition by consorting with an entity such as clearview
Oh I know Hoan. At least he isn't alt-right any more. I think he has matured and is a decent guy. It's just commodity technology, but it is going to make him rich. <i>shrug</i> Cool.
How did Clearview scrape so much of Facebook? Is this fundamentally hard for Facebook, Instagram, Linkedin, etc. to stop if someone determined wants to suck in photos for a tool like this?
The counterpart to this dystopian future is other people taking photos/video without your consent. As a social norm, I wish that were illegal first, and if it already is, enforced more.<p>Apple and Android should blur faces by default until other people explicitly give consent to be photographed. That's the future I want to see.
The article summary pretty much highlights that anybody can aggregate public data for personal or commercial use, so Clearview AI is not the only player.<p>Data aggregation and transparency were supposed to be the foundation of open government, but it looks like the citizens (by way of consumerism/capitalism/communism) are the victims of legislated privacy violations by lawmakers who want nothing to do with transparency. It seems like a conflict of interest if business is pulling the strings of politics.<p>If the citizens pushed harder for a transparent government, other than encryption, what else could they legislate to turn the tables on that debate (i.e. no privacy for government is a no-go)?
My personal rule: Ignore any news story that uses the word "might".<p>Sure, something might happen. Anything <i>might</i> happen. The news is supposed to tell us things that <i>did</i> happen.