TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

GitHubTwitter

Home

HomeNewestBestAskShowJobs

Resources

HackerNews APIOriginal HackerNewsNext.js

© 2025 TechEcho. All rights reserved.

Machine Bias: Man Is to Computer Programmer as Woman Is to Homemaker?

15 points by acoravos, about 8 years ago

4 comments

backpropaganda, about 8 years ago
If I were training a classifier to predict whether a sentence is talking about household activities vs. not, wouldn't the occurrence of man/woman in the sentence be a *good* feature? Today, women do perform household activities more (whether we like it or not), and wouldn't it make sense to *use* that piece of information when performing some predictive analysis?

The technical sense of "bias" arises when the train and test distributions differ. Obviously, if you train on a dataset of text from a foreign country's news and then apply it in an American context, the difference in the data distributions will introduce bias, but why do we need a social twist to this already well-functioning term? If the same classifier is trained and evaluated in India (with its sexist roles, say), then there's no (technical) *bias* and I don't see why it's a bad application.
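The comment's point about man/woman being a predictive feature can be shown with a minimal sketch: a toy Naive Bayes classifier trained on an invented, deliberately skewed four-sentence corpus. Every sentence, label, and count here is made up purely for illustration; nothing is from a real dataset.

```python
import math
from collections import Counter

# Toy corpus, invented for illustration: in this skewed sample the
# word "woman" co-occurs only with household topics, mirroring the
# statistical regularity the comment describes.
train = [
    ("woman cooked dinner at home", "household"),
    ("woman cleaned the kitchen", "household"),
    ("man fixed the server outage", "other"),
    ("man reviewed the pull request", "other"),
]

# Per-class word counts.
counts = {"household": Counter(), "other": Counter()}
for sentence, label in train:
    counts[label].update(sentence.split())
vocab = {w for c in counts.values() for w in c}

def predict(sentence):
    """Multinomial Naive Bayes with add-one smoothing, uniform priors."""
    def log_likelihood(label):
        c, total = counts[label], sum(counts[label].values())
        return sum(math.log((c[w] + 1) / (total + len(vocab)))
                   for w in sentence.split())
    return max(counts, key=log_likelihood)

# Two otherwise identical sentences: the gendered word alone
# flips the prediction, exactly as the comment argues.
print(predict("the woman was busy"))  # -> household
print(predict("the man was busy"))    # -> other
```

Whether exploiting that correlation is acceptable is, of course, the social question the thread is debating; the sketch only shows that the model will pick it up.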
vtange, about 8 years ago
This is the tug-of-war of influencer vs. influencee. A machine that just tells it as it is might hold an advantage over one that willingly ignores some data to promote a different view of the world.

Personally, I see more danger in people trying to make machines that evangelize their own biases to the world than in machines being molded by the existing social assumptions of society, given that we expect machines to perform most of the work/control most of the resources in the future.
mkrum, about 8 years ago
If you are going to "debias" your model, what is the point of even training the model to handle these issues in the first place? Not surprisingly, human language can be biased. If you train a model on human language, it will not magically transcend those biases. The problem is that people have this expectation that ML is going to lead to these perfect decision makers.

Machine learning creates models that reflect the data, not the truth.
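The "models reflect the data" point is the mechanism behind the headline analogy. A minimal sketch with hand-built toy vectors (not real embeddings; both the word list and every coordinate are invented, with dimension 0 standing in for a data-derived gender association and dimension 1 for "occupation") shows how analogy arithmetic surfaces whatever correlations the vectors happen to encode:

```python
import math

# Toy 2-D "embeddings", invented for illustration. In real word2vec
# these directions would be learned from corpus co-occurrences; here
# they are hard-coded to mimic a biased corpus.
vecs = {
    "man":        ( 1.0, 0.0),
    "woman":      (-1.0, 0.0),
    "programmer": ( 1.0, 1.0),
    "homemaker":  (-1.0, 1.0),
    "doctor":     ( 0.8, 1.0),
}

def analogy(a, b, c):
    """Solve a : b :: c : ? by nearest neighbor to vec(b) - vec(a) + vec(c)."""
    q = [vb - va + vc for va, vb, vc in zip(vecs[a], vecs[b], vecs[c])]
    candidates = {w: v for w, v in vecs.items() if w not in (a, b, c)}
    return min(candidates, key=lambda w: math.dist(q, vecs[w]))

print(analogy("man", "programmer", "woman"))  # -> homemaker
```

The arithmetic is doing nothing wrong; it faithfully reports the gender correlation baked into the vectors, which is the comment's point about data versus truth.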
reader5000, about 8 years ago
In the sjw religion, why is "homemaker" considered inferior to "computer programmer"? One of the oldest and most important human occupations, versus being hunched over a desk slaving for a salary until being outsourced to a bot in five years? I've never understood the default sjw/"feminism" assumption that anything feminine is "bad".