> While the study did not use the exact software companies like Tesla use to power self-driving cars because they are confidential, the software systems used for the study are based on the same open-source AI those companies use, according to Zhang.

I was under the impression that commercial self-driving software was deeply proprietary and confidential, and there is no way to know that this study will generalize if run on state-of-the-art detectors. Tesla and Cruise are name-checked in the article - how do we know this isn't a problem they have worked extensively on and made great improvements to, relative to the open source components?

Feels like a case of outrage-for-clicks.
Human driver eyes (and I suspect any other optical systems working in the visible color range) are also less likely to detect people of color. Five years ago I avoided running over a pedestrian at night only by luck: he was black, wearing a black jacket, black pants, walking across a badly-lit suburban street; I think that either my visual system did not perceive him at all until the last fraction of a second, or perhaps perceived him as a shadow. I managed to swerve. But a fraction of a second later? I am afraid to think about it...

I am a big fan of Scandinavian-style pedestrian safety reflectors. Attach one to your bag or jacket if you are walking late at night; it might save your life. But if you don't have a reflector, wear at least one piece of bright, light-colored clothing; this is particularly important if your skin color is dark!
they're testing 8 different detection algorithms

> The detection systems were 19.67% more likely to detect adults than children, and 7.52% more likely to detect people with lighter skin tones than people with darker skin tones, according to the study.

While they all had a harder time with children vs adults, that 7.52% figure comes from averaging 2 algorithms that performed abysmally with 6 that had no statistically significant differences.

https://arxiv.org/pdf/2308.02935.pdf table 6
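Back-of-the-envelope illustration of why the pooled number is misleading. The gap values below are made up for the sake of the example, not the paper's actual table 6 figures, but they have the same shape: six detectors near zero, two way off.

```python
# Toy illustration (made-up gaps, not the paper's data): how a pooled average
# can be dominated by a couple of outlier detectors.
# Per-detector gap in detection rate, lighter minus darker skin, in
# percentage points: six detectors near zero, two performing abysmally.
gaps = [0.3, -0.5, 0.8, 0.1, -0.2, 0.6, 29.0, 30.0]

pooled = sum(gaps) / len(gaps)
print(f"pooled gap: {pooled:.2f} pp")  # ~7.51 pp, driven almost entirely by two detectors

gaps_sorted = sorted(gaps)
median = (gaps_sorted[3] + gaps_sorted[4]) / 2  # median of 8 values
print(f"median gap: {median:.2f} pp")  # ~0.45 pp for the "typical" detector
```

The median (or just reading the per-detector rows in the table) tells a very different story than the pooled mean does.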
How do they work in winter then? You can't see much skin if someone is wearing a winter coat.
Right - self-driving cars are a solution for Silicon Valley only, so they don't even bother testing those cars elsewhere.
It's strange that the paper doesn't seem to include any of the actual data, but it is available on their github page: https://github.com/FairnessResearch/Fairness-Testing-of-Autonomous-Driving-Systems

From what I can see, a couple of the detectors used really seem shit overall, making the combined data of questionable value.
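If anyone wants to sanity-check it themselves, here's roughly what I'd run. The file name and column names below are guesses on my part, not necessarily the repo's actual layout, so adjust to whatever the CSVs really contain.

```python
# Hypothetical sketch of the per-detector sanity check; file and column names
# ("detections.csv", detector, skin_tone, detected) are assumptions, not the
# repo's confirmed schema.
import pandas as pd

df = pd.read_csv("detections.csv")  # assumed columns: detector, skin_tone, detected (0/1)

# Overall miss rate per detector: flags the ones that are just bad across the board.
miss_rate = 1 - df.groupby("detector")["detected"].mean()
print(miss_rate.sort_values(ascending=False))

# Per-detector gap between light- and dark-skin detection rates.
rates = df.pivot_table(index="detector", columns="skin_tone",
                       values="detected", aggfunc="mean")
print((rates["light"] - rates["dark"]).sort_values(ascending=False))
```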
"A new technology reduces mortality risk for all people, but has slightly better outcomes for white adults."<p>Conclusion - we call on lawmakers to make this technology illegal. We prefer more people die at equal rates more than we prefer less people to die at unequal rates.<p>I am not sure I agree with the ethics that underlies this way of seeing the world.
In other news, it turns out that detecting smaller and lower-contrast objects is harder with optical sensors. Almost, you know, like how it is with real people.
It won't be long till they won't just be detecting people, but identifying specifically which people they are. Think of the data! All those cars logging time/location of all those people. (which, of course, will only be used for good and the occasional targeted ad)
This seems like a pretty poor article on a pretty poor research subject.

The way I read it is something like this...

Some researchers got their hands on software that purports to do similar stuff to what self driving cars might also do, but crucially isn't the same as what the cars actually use, and then extrapolated the results into the headline-like title of the research paper: "Dark-Skin Individuals Are at More Risk on the Street: Unmasking Fairness Issues of Autonomous Driving Systems". That's justified, isn't it? After all, all software in a category is more or less the same program, and the car company software and their research subject software all run on computers? Right? Must be valid... clearly you can make factual assertions about computer systems and software from that kind of extrapolation.

Then some bright-eyed, bushy-tailed reporter comes along and applies the criticality of the typical college-educated/professional journalist, which is to say they carefully considered the headline they could write, but otherwise just took the word of the researchers that something resembling knowledge was actually gained by the study. News is delivered! Job done!

Look, sarcasm aside, could I have read/understood things incorrectly? Sure... I'm not an expert in this field. Could this be a problem in production, used-in-the-real-world pedestrian detection systems? Sure. But insofar as I can tell, the best the paper could be telling us is that racial bias in pedestrian detection systems is a viable possibility: not the assertion that "Dark-Skin Individuals Are at More Risk on the Street". It might be true, but I don't think these researchers know that any better than I do. Of course, "Dark-Skin Individuals Could Be at More Risk on the Street" isn't nearly so catchy or attention-grabbing, is it?

And who knows... maybe this research team should pick up the search for low-temperature/low-pressure superconductivity... sounds like they have the right temperament.
The abstract sounds a bit different than the (rage-baiting) article:

> bias towards dark-skin pedestrians increases significantly under scenarios of low contrast and low brightness

https://arxiv.org/pdf/2308.02935.pdf
In other words: "Objects with less surface area and lower albedo are less reliably picked up by visual deep neural nets. We haven't benchmarked this against human drivers."
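Rough toy numbers for the albedo point (illustrative values, not calibrated measurements): for a roughly diffuse surface, luminance scales with albedo times illumination, so on a dim street a black jacket ends up barely distinguishable from the asphalt behind it.

```python
# Toy numbers only. For an approximately Lambertian (diffuse) surface,
# luminance ~ albedo * illuminance / pi; on a dim street the contrast
# between dark clothing and dark asphalt nearly vanishes.
import math

def luminance(albedo, illuminance_lux):
    return albedo * illuminance_lux / math.pi  # cd/m^2, rough approximation

E_night = 5.0  # lux, badly lit suburban street (assumed)
asphalt        = luminance(0.10, E_night)  # dark road surface
light_clothing = luminance(0.60, E_night)  # light-colored jacket
dark_clothing  = luminance(0.05, E_night)  # black jacket

def weber_contrast(obj, background):
    return (obj - background) / background

print(weber_contrast(light_clothing, asphalt))  # ~5.0: stands out from the road
print(weber_contrast(dark_clothing, asphalt))   # ~-0.5: close to the background
```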
The researchers supposedly used a similar but not quite the same approach as Tesla, and claimed it worked worse for people of color. That makes sense, since optical recognition is harder for dark-skinned people.

However, the article leads with a picture of a Cruise car, which uses lidar technology. Lidar should afaik recognize people with the same accuracy regardless of skin color.
Anyone know how likely it is that this is the result of imbalanced training data? You have fewer dark-skinned people and children in the training data, you end up with a model less-skilled at detecting those people.
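If imbalance is the cause, one standard (partial) mitigation is to reweight training samples by inverse group frequency. A minimal sketch, assuming each training sample can be tagged with a group label; the labels and counts below are made up purely for illustration.

```python
# Inverse-frequency sample weighting: rare groups contribute more to the loss,
# so the model can't coast on the majority group. Group names/counts are fake.
from collections import Counter

labels = (["adult_light"] * 800 + ["adult_dark"] * 120
          + ["child_light"] * 60 + ["child_dark"] * 20)

counts = Counter(labels)
n, k = len(labels), len(counts)

weights = {group: n / (k * c) for group, c in counts.items()}
sample_weights = [weights[g] for g in labels]  # pass to the detector's training loss

print(weights)
```

Reweighting (or resampling) helps only if the minority-group examples that do exist are representative; it can't conjure missing diversity out of thin air.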
One good bit of news is that I can confidently predict that at least they don't follow the sizeism of our society, and are likely able to detect larger people more easily than skinnier ones.
As much as I dislike "AI" and its adjacent topics on HN, I think this can be solved by the companies that have a stake in it getting data from Asian and African nations. I don't know, pay some group of people to drive around Cape Town, Bengaluru, Shanghai, Shaanxi, Jakarta, and whichever places have a lot of "PoC" and kids.