Intro to Bias in AI

28 points by beluis3d, about 4 years ago

2 comments

Oras, about 4 years ago
In jobdescription.ai [0] I have the challenge of making job descriptions gender-neutral. I tested ten job descriptions with the Jobvite tool, and the results showed zero bias, but then I started researching gender bias tools more broadly and found a study and an article on gender bias [1, 2].

Study [1] suggests that men and women decode wording differently. For instance, women felt that job adverts with masculine-coded language were less appealing and that they belonged less in those occupations. Some masculine-coded words are "challenging" and "lead", while some feminine-coded words are "support" and "commitment".

That is not to imply that men lack the ability to be supportive or collaborative, or that women lack leadership skills. "But, based on data analytics on the kinds of jobs men and women apply for, research shows that the adjectives matter."

Article [2] supports the study and adds that "many women won't apply for a job unless they meet almost all of the listed requirements," so the list of requirements matters as well.

I plan to research gender bias in wording further before implementing tools to create a feedback loop that improves the algorithm.

[0] https://www.jobdescription.ai

[1] http://gender-decoder.katmatfield.com/static/documents/Gaucher-Friesen-Kay-JPSP-Gendered-Wording-in-Job-ads.pdf

[2] https://www.forbes.com/sites/hbsworkingknowledge/2016/12/14/how-to-take-gender-bias-out-of-your-job-ads/?sh=7b6d0c761024

edit: to provide more information instead of links with no context
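To give a sense of how such a wording check can work, here is a minimal sketch of a word-list-based checker in the spirit of the gender decoder from study [1]. The word stems below are illustrative examples only, not the published lists or the Jobvite tool's actual logic:

    import re

    # Illustrative stems only; real tools use much larger, validated lists.
    MASCULINE_STEMS = ["challeng", "lead", "compet", "decisi", "dominan"]
    FEMININE_STEMS = ["support", "commit", "collaborat", "interperson", "nurtur"]

    def coded_words(text, stems):
        # Return every word in the ad that starts with one of the stems.
        words = re.findall(r"[a-z]+", text.lower())
        return [w for w in words if any(w.startswith(s) for s in stems)]

    def gender_coding(job_ad):
        masc = coded_words(job_ad, MASCULINE_STEMS)
        fem = coded_words(job_ad, FEMININE_STEMS)
        if len(masc) > len(fem):
            label = "masculine-coded"
        elif len(fem) > len(masc):
            label = "feminine-coded"
        else:
            label = "neutral"
        return {"masculine": masc, "feminine": fem, "label": label}

    print(gender_coding("We need a competitive, decisive leader for a challenging role."))
    # {'masculine': ['competitive', 'decisive', 'leader', 'challenging'],
    #  'feminine': [], 'label': 'masculine-coded'}

A checker like this only flags wording; deciding which words to replace, and whether the requirements list itself is too long, still needs human judgment.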
armoredkitten, about 4 years ago
Bias of various forms in the datasets we use can absolutely be a big issue, and this is a pretty good summary of some of those areas. However, I think it's important to look beyond just the data and also look into the assumptions and choices we make regarding models, performance metrics, etc.

I came across a good Twitter thread [1] explaining some of these other types of bias -- a lot of them come down to various ways in which model decisions end up impacting performance on the "long tail" of data (i.e., the less frequent categories and groups) long before they impact the bulk of the distribution. This means overall performance may be minimally impacted (or even improved), but performance for subgroups can be drastically reduced.

Anyway, the thread is definitely worth a read, and it links to many sources for further reading.

[1] https://twitter.com/sarahookr/status/1361373527861915648
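As a concrete illustration of that point (a minimal sketch with made-up data, not code from the linked thread), computing per-group metrics alongside the aggregate shows how a model can score well overall while failing the long tail:

    from collections import defaultdict

    def accuracy(pairs):
        return sum(y == p for y, p in pairs) / len(pairs)

    def per_group_accuracy(y_true, y_pred, groups):
        # Bucket (label, prediction) pairs by subgroup, then score each bucket.
        by_group = defaultdict(list)
        for y, p, g in zip(y_true, y_pred, groups):
            by_group[g].append((y, p))
        overall = accuracy(list(zip(y_true, y_pred)))
        return overall, {g: accuracy(pairs) for g, pairs in by_group.items()}

    # Hypothetical data: group "B" is the long tail (only 3 of 13 examples).
    y_true = [1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 1, 1, 0]
    y_pred = [1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1]
    groups = ["A"] * 10 + ["B"] * 3

    overall, by_group = per_group_accuracy(y_true, y_pred, groups)
    print(overall)   # ~0.77 overall, which looks fine in aggregate
    print(by_group)  # group "A" at 1.0, group "B" at 0.0

If you only track the aggregate number, the complete failure on group "B" is invisible, which is exactly the long-tail effect described above.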