
Weapons of Math Destruction (The Dark Side of Data Science)

14 points by oldbuzzard, over 8 years ago

2 comments

trendia, over 8 years ago
The problem with most of these is not adjusting for cohort changes. For instance, in the SAT example, the author writes:

> In the 1980s, the Reagan administration seized on a report called A Nation at Risk, which claimed that the US was on the verge of collapse due to its falling SAT scores.

Suppose that low-income individuals start to take the SAT in 1980 whereas they didn't in 1970. The *wrong* way to analyze SAT scores is to evaluate:

sum over cohorts P(SAT Score | cohort, Y)

where Y is the year. For instance, you might compare the total average score in 1980 vs. 1970. Doing so will show a decrease in SAT score because of the increase in low-income individuals taking the SAT, *not* because the high-income individuals are doing worse. (This assumes that low-income people have less access to SAT training materials, and those training materials affect the score.)

The correct way is to only compare scores *within a cohort*:

P(SAT Score | cohort, 1980) > P(SAT Score | cohort, 1970)

That is, did the same cohort do better in 1980 vs. 1970?

(There might *still* be some differences between the cohorts in 1980 vs. 1970. Maybe the low-income individuals who took it in 1970 had high confidence in school, whereas the 1980s kids were from a broader background.)
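A minimal numerical sketch of the cohort-mix effect described in the comment above. All numbers are hypothetical and chosen purely for illustration: within each income cohort the mean score rises from 1970 to 1980, yet the pooled average falls because many more low-income students take the test in 1980.

```python
# Hypothetical illustration of the cohort-mix (Simpson's paradox) effect.
# All numbers are invented for this sketch; none come from the article.

# (mean SAT score, number of test-takers) per year and cohort
data = {
    1970: {"high_income": (1050, 900), "low_income": (900, 100)},
    1980: {"high_income": (1060, 900), "low_income": (910, 600)},
}

def pooled_average(year):
    """The 'wrong' comparison: average over everyone, ignoring cohort mix."""
    total_points = sum(mean * n for mean, n in data[year].values())
    total_takers = sum(n for _, n in data[year].values())
    return total_points / total_takers

for year in (1970, 1980):
    print(f"{year} pooled average: {pooled_average(year):.1f}")

# The 'right' comparison: each cohort against itself, 1980 vs. 1970.
for cohort in ("high_income", "low_income"):
    m70, _ = data[1970][cohort]
    m80, _ = data[1980][cohort]
    print(f"{cohort}: 1970 = {m70}, 1980 = {m80} (within-cohort change: {m80 - m70:+d})")
```

With these made-up numbers, each cohort improves by 10 points, yet the pooled average drops from 1035 to 1000, purely because the 1980 pool contains many more low-income test-takers.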
ccvannorman, over 8 years ago
> These brokers are training their model on the corrupted data of the past. They look at the racialized sentencing outcomes of the past -- the outcomes that sent young black men to prison for years for minor crack possession, while letting rich white men walk away from cocaine possession charges -- and conclude that people from poor neighborhoods, whose family members and friends have had run-ins with the law, and "predict" that this person will reoffend, and recommend long sentences to keep them away from society

This is an extremely important point for our times. Be aware that this sort of algorithm is harming society when it comes to prison sentences, and you're paying for it at multiple levels.

> Amazon carefully tracks those customers who abandon their shopping carts .. interested in knowing everything they can about "recidivism" among shoppers .. [and they seek out and talk] to their subjects -- to improve their system.

*If the prison system was run like Amazon ... [it would be] oriented toward rehabilitation ...*

(emphasis mine) (edit: formatting)