I think beyond not feeding race in as a <i>feature</i> to any model, this stuff is mostly nonsense. If you include race as a feature, the model will likely become racist: race is so highly correlated with behaviors and patterns which are in large part the consequence of all sorts of things, including historical racism, that a model could easily mistake race for a causal factor. If you don't feed race in as a feature, however, the outputs are hardly racist. My impression is that by and large the argument <i>actually</i> being made is: "we have been trying to correct for historical injustices by actively using race and gender as mechanisms for advantaging minorities and women, and an unbiased model is not properly accounting for these particular objectives."<p>Take something like a bank loan. If a bank had a model which took credit score, income, wealth, and collateral into account, black Americans would have loans rejected at a higher rate than white Americans. Is this model racist? No: the model doesn't even know what race is; all it knows is credit scores, income, wealth, and collateral. Does the fact that black Americans used to be slaves in the US, or were kept out of certain housing markets, contribute to black Americans having, on average, lower credit scores, income, wealth, and collateral? Of course. But is the model racist? Not at all. It is completely unbiased, and exactly what a model <i>should</i> be. If the case you're making is that there should be a national effort to correct for historical injustices done by the state, by actively discriminating by race, that is a <i>completely</i> different discussion.<p>Having all of our decision-making apparatuses factor in the infinite pile of historical injustices that may have contributed to an individual's particular circumstances is not the way to go.
Keep models simple and limited to what is relevant for the particular decision at hand. Fix injustices further upstream, or you make the whole system a convoluted nightmare.
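To make the bank-loan example concrete, here is a minimal sketch of what "limited to what is relevant" means in code. Everything here is hypothetical: the weights, thresholds, and normalization constants are made up for illustration, not taken from any real lending model. The point is structural: the decision function's inputs are exactly the four features named above, so race is not even representable in the model.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    credit_score: float  # e.g. a 300-850 FICO-style scale
    income: float        # annual income in dollars
    wealth: float        # net assets in dollars
    collateral: float    # value of pledged collateral in dollars

def loan_score(a: Applicant) -> float:
    # Arbitrary linear weighting, capped per feature; a real model
    # would be fit to repayment data. Illustrative only.
    return (0.5 * (a.credit_score - 300) / 550
            + 0.2 * min(a.income / 100_000, 1.0)
            + 0.15 * min(a.wealth / 250_000, 1.0)
            + 0.15 * min(a.collateral / 100_000, 1.0))

def approve(a: Applicant, threshold: float = 0.5) -> bool:
    # The decision depends only on the four inputs above; there is
    # no race field anywhere in the pipeline.
    return loan_score(a) >= threshold
```

The "fix injustices upstream" position amounts to keeping `approve` this simple, rather than adding correction terms for every historical factor that may have shaped the inputs.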