I remember the outrage when people discovered that Google's AI wasn't properly trained on black faces. It makes sense that they would try hard to avoid that happening again by paying black people to let Google scan their faces. It is not unethical to try to diversify your training data.<p><a href="https://www.telegraph.co.uk/technology/google/11710136/Google-Photos-assigns-gorilla-tag-to-photos-of-black-people.html" rel="nofollow">https://www.telegraph.co.uk/technology/google/11710136/Googl...</a><p>This part, however, sounds outright illegal. It seems like it was just Randstad being greedy; if anyone from Google knew about it, that's bad, but I doubt they couldn't budget enough money to get the scans legally:<p>> They said Randstad project leaders specifically told the TVCs to (...) conceal the fact that people’s faces were being recorded and even lie to maximize their data collections.<p><a href="https://www.nydailynews.com/news/national/ny-google-darker-skin-tones-facial-recognition-pixel-20191002-5vxpgowknffnvbmy5eg7epsf34-story.html" rel="nofollow">https://www.nydailynews.com/news/national/ny-google-darker-s...</a>
Kudos to Google's contractor for offering this opportunity to the people who need it most.<p>I would happily sell anyone a picture or scan of my face for $5. But I would even more happily have that chance go to someone who needs it more than I do.<p>This article also mentions that the contractor may have lied to or misled the homeless, which is deplorable. But the behavior described in the title itself is not objectionable. The fact that many will object anyway is a phenomenon I've seen called "Copenhagen Ethics": <a href="https://blog.jaibot.com/the-copenhagen-interpretation-of-ethics/" rel="nofollow">https://blog.jaibot.com/the-copenhagen-interpretation-of-eth...</a>
It sounds like there were three issues:<p>1. The contractor targeted homeless people<p>2. They targeted people with darker skin<p>3. They may not have been forthright or truthful about what they were doing.<p>Number 3 is clearly wrong. But so long as the contractors were upfront and truthful about what they were doing, I don't see that 1 or 2 is problematic.<p>The only argument I can see against paying homeless people money for an easy job is that the prospect of money might be so enticing that they're willing to give up personal rights or freedoms (the same argument for why we don't allow the sale of organs). But $5 seems neither high enough, nor the process invasive enough, for this argument to hold water.<p>As for ensuring that a wide enough range of samples is in the database to avoid data bias, that should be a no-brainer good thing.
At least the Verge has the less clickbait headline, mentioning that it was contractors. The original source mentions Google in the headline, but the rest of the article only refers to Randstad.<p>One part that is a bit confusing to me: the original source makes no reference whatsoever to any consent form. Usually you can't collect this sort of data without signed consent, and previous reports [0] do mention such a form. I know most people don't read the form, but I'm curious how you can get away with telling someone you're just playing a game and lying so much when the form should clearly state what you're collecting.<p>Still, there should definitely be better vetting of contractors, and stories like this look very bad, even if the intention was actually to help reduce ML bias.<p>[0] <a href="https://www.engadget.com/2019/07/29/google-paid-for-face-scans-to-improve-pixel-4/" rel="nofollow">https://www.engadget.com/2019/07/29/google-paid-for-face-sca...</a><p>EDIT: The original article does indeed mention and show a picture of the "agreement".
A while ago Google got bad press because some image-tagging service identified some people as "gorillas", and IIRC it was blamed on not having enough diversity of skin color in the training data. So... it sounds like at least the "instructed them to target people of color" part of this is them trying to correct that. But in isolation that sounds even worse than the first instance. I guess you're damned if you do, damned if you don't.
Next news: Google engineer sneezes in subway, Google trying to infect people.<p>I mean, there's no need to have Google's name in there, other than to click-bait people into viewing their subpar journalism with ads.<p>Still, it's kinda shitty to cheat people no matter what. Obviously you can't say "hey, the Pixel 4 is going to have face unlock and we want your face scanned for it," but the contractor should have done a better job.
There's a long, long history of leaving women, people of color, poor people and other groups out of data sets. For example, I've read articles that indicate we can't create good photos of people of color because film standards were normalized to white skin.<p>So, try to fix that and... there's hell to pay?<p>File under: "No good deed goes unpunished."
This seems like fake outrage to me. If I were homeless, I'd be more than happy for someone to take a photo of me for $5. In fact, I'd find it pretty hypocritical if someone spent their energy fighting a possible infringement of my rights in such a scenario instead of actually providing me with money or food.
> <i>Google and Randstad didn’t immediately reply to requests for comment.</i><p>The content of the article is interesting enough, but this line at the end caught my attention.<p>Is it reasonable to expect someone to "immediately reply" before you publish the article? Because that doesn't sound like ethical journalism to me, unless I'm misunderstanding the meaning of "immediately" in this context.
This is too desperate. It seems AI will help in the criminalizing and biased profiling of people of color; as a person of color, it is really hard to imagine a future where justice is done with due diligence.<p>Joy at the Media Lab has been looking at this issue for a while and advocating for balance. <a href="https://www.technologyreview.com/s/612775/algorithms-criminal-justice-ai/" rel="nofollow">https://www.technologyreview.com/s/612775/algorithms-crimina...</a><p>Also, I find it weird that Nvidia was able to simulate realistic-looking people last year while Google is struggling to find humans; can't they use that as ground truth?
The first thing they teach you about research is not to do it on vulnerable populations. It's not like word getting out to the public would have mattered anyway; the Pixel 4's features and hardware all leaked well before the announcement.
> “They [Google contractor] said to target homeless people because they’re the least likely to say anything to the media,” the ex-staffer said. “The homeless people didn’t know what was going on at all.”
Life imitates video games: <a href="https://www.youtube.com/watch?v=BZ6TuxmgJN0" rel="nofollow">https://www.youtube.com/watch?v=BZ6TuxmgJN0</a>
This strikes me as the most ethical possible way to collect this data. Google is paying people who need the money for something simple and completely harmless.<p>The main counterargument appears to be that those who sold their data "didn't understand what was going on". It's hard to imagine moral convictions under which someone could consistently argue that the homeless don't understand money in exchange for photos, yet find it acceptable to leave them to fend for themselves on the street.<p>Google is, at worst, helping people who need help.