Can someone who is knowledgeable on this subject please comment? I am extremely skeptical of the claim that this problem is as easy to remedy as adding more pictures of people with darker skin tones to the training dataset. Are companies really so careless that they haven't tried that, given that news stories of this kind have been appearing for almost as long as facial recognition has existed?

I find it more plausible that something about the photographs themselves makes these faces harder to identify. Similarly, dark-furred dogs are rarely used in movies because lighter-furred dogs photograph better, owing to how light reflects off them. Perhaps it is as simple as photos of darker-skinned people capturing less contrast, making the contours of the face harder to extract; maybe that is why Apple resorted to adding infrared sensors for its Face ID system. Another possibility is that the racial backgrounds of the people being misidentified look more androgynous, and that the AI needs to incorporate other attributes, like hairstyle, makeup, and clothing, as additional indicators.
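To make the contrast point concrete, here is a toy sketch (the `gradient_energy` helper is hypothetical, not any real recognition pipeline): scaling down an image's dynamic range scales down the gradient signal that edge and contour detectors depend on by the same factor.

```python
def gradient_energy(img):
    # Mean |difference| between horizontally adjacent pixels: a crude
    # stand-in for the edge signal a contour detector relies on.
    diffs = [abs(row[i + 1] - row[i])
             for row in img
             for i in range(len(row) - 1)]
    return sum(diffs) / len(diffs)

# A toy one-channel "image": a bright region against a darker background,
# versus the same scene captured at one fifth the dynamic range.
bright = [[0.9, 0.9, 0.2, 0.2],
          [0.9, 0.9, 0.2, 0.2]]
dim = [[0.2 * v for v in row] for row in bright]

# Gradient energy scales linearly with contrast, so the dim capture
# offers roughly one fifth of the edge signal to work with.
print(round(gradient_energy(bright) / gradient_energy(dim), 3))  # → 5.0
```

This is only a cartoon of the claim, of course; real systems use learned features, not raw pixel differences, but those features are still ultimately computed from the contrast that made it into the sensor.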