> This was based on a few publicly cited examples of Twitter's photo framing using images of both light-skinned and dark-skinned people that skewed toward the light-skinned person. [..] And Zehan Wang, engineering lead at Twitter<p>For all their talk about the importance of diversity, why did the complaining Twitter users only check for White-Black bias, and not Asian-Black bias as well? Especially since the engineering lead is Asian, and Twitter's diversity report claims Asians are over-represented (27% vs. 5.4% in the US) and whites under-represented (42% vs. 61% in the US) at the company: <a href="https://blog.twitter.com/en_us/topics/company/2019/Board-Update-Inclusion-Diversity-Report-May2019.html" rel="nofollow">https://blog.twitter.com/en_us/topics/company/2019/Board-Upd...</a><p>> Vinay Prabhu, chief scientist at UnifyID and a Carnegie Mellon PhD, ran a cropping bias test on a set of 92 images of white and black faces and found a 40:52 white-to-black ratio, which argues against bias for that particular set.<p>So how many test images did the people who found bias use? Because this could just be cherry-picking: only the cases where bias was found get attention, while unremarkable results never get posted.
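As a back-of-the-envelope check on that 40:52 figure, one can run an exact two-sided binomial test against a fair 50:50 crop rate. This is a sketch in stdlib Python (the 40-of-92 numbers come from the quote above; the test choice is my assumption, not Prabhu's stated method), and it shows the split is well within what chance alone would produce at this sample size:

```python
# Exact two-sided binomial test: is a 40:52 split in 92 trials
# surprising under a fair 50:50 cropping rate? (stdlib only)
from math import comb

def binom_two_sided(k, n, p=0.5):
    """Sum the probabilities of all outcomes no more likely than
    the observed count k under Binomial(n, p)."""
    pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    observed = pmf[k]
    # small tolerance so symmetric ties aren't lost to float error
    return sum(q for q in pmf if q <= observed + 1e-12)

p_value = binom_two_sided(40, 92)
print(f"p = {p_value:.3f}")  # well above 0.05
```

So 92 images can't establish bias in either direction here, which cuts both ways: it also means small cherry-picked galleries of "biased" crops prove very little on their own.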