The interesting part is not the story itself, but that they chose to demo it to the BBC. It's marketing at its best: they want everyone to be fully aware of their capabilities. I wouldn't be surprised if those capabilities were overstated, so as to keep citizens in check. The BBC willingly plays along and doesn't question the capabilities either way.
> For ordinary people we will only extract their data when they need our help.

> When they don't need help, we won't gather their data and it remains only in our big database.

In the same sentence:

"We won't gather their data"

and

"The data that we gather remains in a database".
It's not just about this, but given everything happening in the space of facial recognition, I think it would be really valuable for researchers to start treating it the way they would weapons research. The applications are almost entirely nefarious.

After WW2 the University of Tokyo banned weapons research from its labs. I think we could draw more lines around research like this and take a stand against a lot of it.
This is entirely unsurprising. Facial recognition for personalized (in public!) advertising is already a big market, and various US localities are already using facial recognition. This is, of course, on top of other tracking methods like ALPR and vehicle recognition, credit cards, cell phones, and so on.

It is no longer possible to be "off the grid". And that's quite scary when you realize that every single adult has committed many crimes (however minor) in their life.
This article is two years old, but quite relevant to the current situation.

Deep learning and computer vision have progressed quite a bit in the past two years, mainly due to incremental algorithmic advances and improved engineering. Unlike fundamental advances, these can be scaled up with increased investment in a fairly predictable way.

Face++, the unicorn company mentioned in this article, is actually the product division of Megvii, a machine learning startup based in mainland China. They are planning to IPO soon (see: https://reut.rs/2L7zzGn) to raise more funds from overseas investors (hence IPO'ing in Hong Kong).

There's some irony in a China-based computer vision startup, whose facial recognition technology is used in Xinjiang, filing for an IPO in Hong Kong rather than in mainland China.
This tech works, and the only reason we don't have it in the US is our cultural and political aversion to mass surveillance.

I'm genuinely curious about where this leads in China, because the tech is easy enough to deploy if you're willing and politically able. What I'm cynical about is that I don't think China is competent enough to safeguard the data it collects, or to forever keep it out of the "wrong hands" like the last guy in the video claims.

It's hard enough for US government agencies to protect data that should be privileged (police often run illegitimate checks on license plates); I can't imagine China can prevent its various factions from abusing this power.

It would be ironic, but the incompetence of their bureaucracy could actually lead to their own downfall by way of this technology (their mass surveillance apparatus) somehow getting co-opted by dissidents. There's a misconception that China somehow has an iron grip over its people because it's authoritarian, but if that were true they'd have weeded out corruption.
I believe the problem is not so much finding and flagging criminals as addressing the reasons people become criminals in the first place.

These cameras do not explain how they got there.
Who profits from it? Whoever benefits from the situation that put those criminals where they are.

Then there's no need to address poverty anymore, because people will be dying in a corner as intended instead of stealing food.
Not defending surveillance at all, but I want to mention that China's CCTV network has been an important part of the legal improvement around self-defense.

Since ancient times, the law in most dynasties strongly discouraged self-defense, because in most cases it was unjudgeable and it could also be used as a cover for murder. Modern China pretty much inherited this tradition: although self-defense is defined in the law, just as in most countries, in practice the standard was so strict that for decades it was nearly non-existent.

Then last year a famous case [0] caused massive debate across the country and was finally ruled justified self-defense. The main reason was that the massive surveillance network made the legal departments confident enough to make that judgment.

[0]: http://www.chinadaily.com.cn/a/201809/03/WS5b8c877ea310add14f38926c.html
> China has the largest monitoring system in the world. There are some 170 million CCTV cameras across the country

I wonder how this compares to the UK per capita, considering how massive China is.
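A quick back-of-the-envelope sketch of that per-capita comparison. The 170 million figure is from the article; the UK camera count and both population figures are rough assumptions on my part (UK estimates in particular vary a lot), so treat the output as illustrative only:

```python
# Rough per-capita CCTV comparison. All inputs are approximate,
# illustrative assumptions, not authoritative statistics.
estimates = {
    # country: (cameras, population)
    "China": (170e6, 1.4e9),  # 170M cameras per the article, ~1.4B people
    "UK":    (5e6,   66e6),   # often-cited ballpark of ~5M cameras, ~66M people
}

for country, (cameras, population) in estimates.items():
    per_person = cameras / population
    print(f"{country}: {per_person:.3f} cameras per person "
          f"(~1 camera per {population / cameras:.0f} people)")
```

By these rough numbers China would already be ahead of the UK per capita (roughly one camera per 8 people versus one per 13), though the size of the gap depends heavily on which UK estimate you pick.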
Is this kind of facial recognition only bad if wielded by a nefarious government?

Part of me thinks that huge DNA databases, biometric databases, and millions and millions of cameras would be wonderful for catching criminals. On the other hand, the opportunities for abuse are too many to number. Has anyone written anything interesting on where and how to draw the line?
Jesus... This is a very representative example of what happens when technology is not used in a democratic way, and of what happens when it is. It can be abused as much as it can be used for the public benefit.
You need to see this in the proper social context. Yes, it's technology and it seems like a new phenomenon ("dystopia"!), but this kind of monitoring already existed and not much has changed. What if I told you that the local security bureau there has always had all the information about people's whereabouts? If you were burglarized, you called one of them up and the thieves were caught. China never worked the way most people assume.