This is totally fucked. Morally wrong, deeply unethical, and probably illegal – if you're adding punishment without basing that additional punishment on new evidence, isn't that like being treated as guilty without proof? Obviously I'm not a lawyer, but how could anyone, let alone the whole huge set of people who shaped these policies, think that applying group statistics to individuals to determine the severity of their punishment is okay?<p>On the other hand, these biases (most notably the racial ones) exist in the process anyway, and now they're simply being codified and exposed. If these algorithms were published we could see exactly how much more punishment you get for being black in America versus being white.<p>Thanks again to ProPublica for an important piece of reporting; hopefully changes get made for the better.
One of the most mind-boggling sentences in that article was:<p>"On Sunday, Northpointe gave ProPublica the basics of its future-crime formula — which includes factors such as education levels, and whether a defendant has a job. It did not share the specific calculations, which it said are proprietary."<p>How on earth can you lock people up based on secret information? That is Kafka meets Minority Report.
Weapons of Math Destruction<p><a href="http://boingboing.net/2016/01/06/weapons-of-math-destruction-h.html" rel="nofollow">http://boingboing.net/2016/01/06/weapons-of-math-destruction...</a><p>It's easy to hide an agenda behind an algorithm, especially when the details of the algorithm are not publicly visible.
According to ProPublica's own analysis, the claim of bias cannot be shown to be statistically significant. <a href="https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm" rel="nofollow">https://www.propublica.org/article/how-we-analyzed-the-compa...</a><p>This article is terrible data journalism and probably deliberately misleading.<p>Step 1: write down the conclusion.<p>Step 2: do the analysis.<p>Step 3: if the analysis doesn't support the conclusion, write down a bunch of anecdotes.<p>Really, here's her analysis notebook: <a href="https://github.com/propublica/compas-analysis/blob/master/Compas%20Analysis.ipynb" rel="nofollow">https://github.com/propublica/compas-analysis/blob/master/Co...</a><p>Just read it. It's vastly better than this nonsensical article.
Thanks for posting this. I encourage this crowd to take a look at the methodology too: <a href="https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm" rel="nofollow">https://www.propublica.org/article/how-we-analyzed-the-compa...</a>
I don't have an issue with using statistical analysis to direct crime prevention efforts. I think it's unconscionable to use statistical analysis for sentencing. We don't want Minority Report in real life.
Consistent with the theme of this story is the content from discussions held at a conference at NYU School of Law, featuring human rights and legal scholars. Coincidentally, I submitted a link on this yesterday.<p>See <a href="https://news.ycombinator.com/item?id=11753089" rel="nofollow">https://news.ycombinator.com/item?id=11753089</a>
In block [37] of the IPython notebook, are the racial main effects missing? I only see interaction terms.<p><a href="https://github.com/propublica/compas-analysis/blob/master/Compas%20Analysis.ipynb" rel="nofollow">https://github.com/propublica/compas-analysis/blob/master/Co...</a>
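For readers unfamiliar with why that specification detail matters, here is a hypothetical sketch (plain Python, not the notebook's actual R code) of the design-matrix columns a model produces with and without a racial main effect. The variable names and the toy `design_row` helper are my own illustration, not anything from the ProPublica repository:

```python
def design_row(black, score, spec):
    """Build one defendant's regression design-matrix row.

    spec="interaction_only": [intercept, score, black*score]
    spec="full":             [intercept, score, black, black*score]
    """
    if spec == "interaction_only":
        return [1.0, score, black * score]
    if spec == "full":
        return [1.0, score, black, black * score]
    raise ValueError("unknown spec: %r" % spec)

# Without a main effect, a Black defendant with score 0 produces the
# same row as a white defendant with score 0 -- the race term only
# acts through the interaction, so any constant racial offset is
# forced to zero by the model's structure:
print(design_row(black=1, score=0, spec="interaction_only"))  # [1.0, 0, 0]
print(design_row(black=1, score=0, spec="full"))              # [1.0, 0, 1, 0]
```

So if block [37] really fits interactions without main effects, the model cannot represent a baseline racial disparity at all, which would make the interaction coefficients hard to interpret.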