TechEcho

AI expert calls for end to UK use of ‘racially biased’ algorithms

22 points, by datashrimp, over 5 years ago

4 comments

casion, over 5 years ago

I assume this might be an unpopular opinion, but shouldn't these programs be evaluated on some metric of success rather than their racial selection?

If the software is accurate at picking choices that match the set goals, then it would seem that the software is doing its job and is 'blameless'; the racism is on the side of those who set the goals, and/or the people claiming 'racism' because a person of a given skin colour was selected.

If it could be shown that there's low success in the software, yet it's relied on despite racial bias, then I would think few rational people could argue against its removal. (edit: s/for/against)

Edit: I suppose that I'm asking for better auditing of the results vs the desired outcomes. Race seems like a red herring until it's shown that the metrics support poor performance of these systems.
Comment #21796344 not loaded
Comment #21796395 not loaded
Comment #21796362 not loaded
Comment #21796384 not loaded
Comment #21796328 not loaded
Comment #21796273 not loaded
Comment #21797113 not loaded
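The kind of outcome audit the comment above asks for can be sketched as a per-group success-rate check. This is a minimal illustration, not any real evaluation pipeline; the function name, group labels, and data are all hypothetical:

```python
from collections import defaultdict

def success_rate_by_group(records):
    """Compute a decision system's success rate per demographic group.

    `records` is a list of (group, predicted, actual) tuples; a prediction
    "succeeds" when it matches the actual outcome. All names and data here
    are hypothetical -- this only illustrates the shape of an outcome audit.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted == actual:
            hits[group] += 1
    return {g: hits[g] / totals[g] for g in totals}

# Toy audit data: a system whose error rate differs between two groups.
records = [
    ("A", 1, 1), ("A", 0, 0), ("A", 1, 0), ("A", 1, 1),
    ("B", 1, 1), ("B", 0, 1), ("B", 0, 1), ("B", 0, 0),
]
rates = success_rate_by_group(records)
print(rates)  # {'A': 0.75, 'B': 0.5} -- a gap worth investigating
```

A large gap between groups is exactly the kind of metric-backed evidence the comment says should decide the question, rather than the selection outcomes alone.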
bsenftner, over 5 years ago

The claim that facial recognition is racially biased has only ever been an issue for non-industry leaders in facial recognition. And for the record, Amazon, Microsoft, and Google have NEVER been industry leaders in FR; they just have enormous marketing budgets. Any article you see where a police force is using Amazon's FR is an article about that police force being duped by marketing, expecting "best in class" from an organization that does not even rank in the industry as a serious player. The industry leaders in FR can easily be identified by going to the NIST Facial Recognition Vendor Test website and viewing the annually ranked test results from FR vendors who want to work on Federal government contracts. Also, the media treatment of FR is beyond pathetic: pretty much every article I encounter is pulp-crime-level fiction. FWIW, I'm lead developer of one of the industry-leading FR applications, typically within the first 4 on the NIST vendor rankings.
throwGuardian, over 5 years ago

Notice how the article is very light on direct examples of actual bias. That's not a bug; it's a feature, intentionally adopted to create fear, uncertainty and doubt (FUD). If anyone wants to FUD anything, just start with this article as a template, find-and-replace the subject from AI to insert_here, change the name and quotes of the subject-matter expert, and voilà, you've got yourself a hit piece.

We should be upvoting more substantive writing. Serious accusations of racism should be backed up by real evidence, not the quotes and opinions of someone parroting the author's narrative.
stolenmerch, over 5 years ago

Every time this topic comes up, I'm always confused about how the algorithms are biased. Which algorithms? Isn't it actually the labeled data used for training that is biased, by not including enough samples of non-white faces? What mechanisms are in place that prevent literally everyone from just re-training on better data? Why does my toy facial recognition software written in JavaScript detect every face I throw at it?

Many of these pieces smell phony. I'm certainly not saying there isn't institutional racism at work here, but I think we need far more detail to evaluate these claims.
Comment #21802297 not loaded
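The data-imbalance possibility raised in the comment above is straightforward to check before retraining. A minimal sketch, with hypothetical label names and made-up counts, of measuring how skewed a training set's group composition is:

```python
from collections import Counter

def label_balance(labels):
    """Report each label's share of a training set.

    A heavily skewed distribution is one plausible source of the training-data
    bias described above. Labels and proportions here are hypothetical.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: count / total for label, count in counts.items()}

# Toy example: a face dataset where one group dominates the samples.
labels = ["group_a"] * 90 + ["group_b"] * 10
print(label_balance(labels))  # {'group_a': 0.9, 'group_b': 0.1}
```

If the audit shows a 90/10 split like this one, rebalancing or augmenting the underrepresented group and retraining is exactly the remedy the commenter is asking about.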