Biased Algorithms Are Everywhere, and No One Seems to Care

14 points by smb06 over 7 years ago

2 comments

gopalv over 7 years ago
> even those who know their algorithms are at a risk of bias are more interested in the bottom line

There's more to it than the bottom line: the trouble is that nobody wants to take responsibility for trying to change the world they think they've distilled into a model - there's a lot more fitting of decisions to past patterns than actually changing society.

The algorithms are inheriting biases from the past and hiding them - your zipcode affecting your repayment potential is good old redlining in action all over again. Change can happen, but the algorithms will hide the bias where it can't be challenged.

And then there's carried-over sample bias - if most people who sign up for cancer clinical trials are those of a certain level of wealth, with adequate help to drive them to the chemo, then the genetic samples that come out of the research program will reflect the economic advantages that were available to a geriatric patient during their earning years.

Algorithms that rely on past information to be incrementally better are in fact conservative towards their underlying model. Anything else would be self-destructive to the algorithm and, in effect, a problem for the user of the algorithms.
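To make the zipcode-as-proxy point concrete, here is a minimal sketch of how a model trained on historically biased approvals reproduces that bias even when no protected attribute is ever handed to it. Everything in it (the feature names, the numbers, the synthetic data) is an invented assumption for illustration, not anything from the thread or a real lending system.

```python
# Minimal sketch of inherited bias via a proxy feature (illustrative only).
# Historical approvals were suppressed in "redlined" zipcodes regardless of
# the applicant's actual repayment ability; a model fit to those decisions
# learns the same pattern through the zipcode feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

redlined = rng.integers(0, 2, n)       # 1 = applicant lives in a redlined zipcode
ability = rng.normal(0, 1, n)          # true (unobserved) repayment ability

# Historical decisions: penalize redlined zipcodes independently of ability.
hist_approved = (ability - 1.5 * redlined + rng.normal(0, 0.5, n)) > 0

# Train only on an income-like signal plus zipcode; no protected attribute.
income = ability + rng.normal(0, 0.5, n)
X = np.column_stack([income, redlined])
model = LogisticRegression().fit(X, hist_approved)

# Two applicants with identical income, differing only in zipcode:
applicants = np.array([[0.0, 0], [0.0, 1]])
print(model.predict_proba(applicants)[:, 1])
# The redlined applicant gets a markedly lower approval probability:
# the past bias is reproduced, now buried inside the model's weights.
```

Two applicants with identical incomes end up with different approval probabilities purely because of zipcode, and nothing in the model's inputs or outputs flags that as bias - which is the "hidden where it can't be challenged" effect described above.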
jonnydubowsky over 7 years ago
This problem will haunt us forever if we don't take it seriously today. What does taking it seriously look like? Is there an effective model for "lobbying" the industry and specific companies in a meaningful way? As I am just descending down this particular rabbit hole, I don't have any answers yet. Perhaps by exposing the worst examples of these biases, like criminal sentencing algorithms and other public verticals, we might begin drawing more attention to the urgent matter at hand.