
Early evidence on “predictive policing” and civil rights

78 points by miraj · over 8 years ago

9 comments

danso · over 8 years ago
I thought this was an excellent treatment of the garbage-in-garbage-out aspect of this debate. Too often it seems people get wrapped up in the "Is it ethical to use a computer to judge someone?" implications, which are non-trivial, but also seem to be borne out of a general ignorance about the many ways we *do* allow computers (or rather, computation) to make judgements about people.

Also, it's an excellent example of the power of research and tabulation, though it seems the link to the list of departments they researched is broken.

I think the use of crime modeling doesn't have to be controversial, but the first step is demanding transparency. It's scary enough when the police are the primary collectors and judgers of data, but even worse when they leave it to a vendor and assume it's hunky-dory out of the circular reasoning that proprietary algorithms are good because they are proprietary.

> *When the Fresno police briefed the city council about the Beware system, the council president asked a simple question about those color-coded threat levels: "How does a person get to red?" The officer didn't know, because the vendor wasn't saying. As the officer delivering the briefing explained: "We don't know what their algorithm is exactly… We don't have any kind of a list that will tell us, this is their criteria, this is what would make a person red, this is what would make a person yellow. It's their own system"*
h4nkoslo · over 8 years ago
The problem is that the police are being asked to do what amounts to population control, of a historic problem population in the US, that has no real desire to be controlled, on behalf of people with no real desire to be seen as controlling them. The most the algorithm can do is provide a justification for well-known solutions to crime (or the perception of crime), but laundered through math and obfuscated reasoning so as not to be seen as too racist.

Somehow, up until the 1960s-1970s, this was not an issue. What changed, and what reason do we have for believing that any modernized algorithmic solution would be better than now-abandoned social technologies with actual empirical track records?

Hell, the vast majority of the time, you can't even accurately describe the problem in, e.g., Chicago without being shouted down or deflected into someone's pet policy proposal.
carapace · over 8 years ago
"The target of the Jihad was a machine-attitude as much as the machines," Leto said. "Humans had set those machines to usurp our sense of beauty, our necessary selfdom out of which we make living judgements. Naturally, the machines were destroyed."

https://en.wikipedia.org/wiki/Butlerian_Jihad

Seriously though, how is this not the crudest sort of cargo-cult machine worship? I'm not paying taxes to be judged by a goddamned machine. It's crazy. If the people *we're paying* to administer the system are so degenerate that they would delegate their most essential function to a machine, then we need to vote them out in favor of sober, responsible, educated adults. It's like they're admitting that they can be replaced by a script!

Knowing what I know about computers, and computer modelling, I'm appalled that people would make, buy, and employ these sorts of "voodoo" machines.

(A thin sans-serif body font means you hate your readers, and making it grey means you *really* hate them.)
Houshalter · over 8 years ago
Remove the scary word "algorithm" and this whole issue seems blown up around something fairly mundane. If the police hand-calculated how many crime reports came from each area, no one would care.

People are so scared of algorithms. There is now scientific evidence that severe distrust of algorithms is a pervasive human bias; search for "algorithm aversion". People trust algorithms when they believe them to be infallible, but as soon as they learn the algorithms can make mistakes, they vastly underrate them. And this is a damn shame, because in almost every case even very simple statistical procedures outperform human "experts".

Worse is this nonsense that algorithms can be "racist", as if a computer can have prejudice or care about anything other than maximizing accuracy. The media loves to fuel this narrative, e.g. the big hoopla when Google's image tagger tagged a black person as a gorilla. They had to remove it and apologize.
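For illustration only (the area names and report records here are invented, not from any real system), the "hand calculation" Houshalter describes reduces to a frequency count per area:

```python
from collections import Counter

# Hypothetical crime reports, each tagged with the area it came from.
reports = [
    {"id": 1, "area": "Downtown"},
    {"id": 2, "area": "Downtown"},
    {"id": 3, "area": "Riverside"},
    {"id": 4, "area": "Downtown"},
]

# The entire "algorithm": tally reports per area, then rank areas by count.
counts = Counter(r["area"] for r in reports)
for area, n in counts.most_common():
    print(f"{area}: {n}")
```

The point of the sketch is that the ranking depends entirely on which reports go into `reports`: the tallying step itself has no opinions, which is exactly the garbage-in-garbage-out concern raised elsewhere in the thread.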
omarforgotpwd · over 8 years ago
Wow, cool! I am the technical founder of PredPol, and the article uses a screenshot of the original version, which I wrote in my college dorm room. The fact that somebody took so much time to criticize it somehow makes me feel special. (I don't work there anymore.)
patcheudor · over 8 years ago
"The fact that we even call these systems “predictive” is itself a telling sign of excessive confidence in the systems."

If you are worried about how this language plays out in police interventions and court cases, then "speculative policing" would seem a more appropriate term for the technology, and would carry less weight. This might be an area suited for the FTC to step in and ask the industry to fix its marketing to be more in line with what it actually offers.
gumby · over 8 years ago
I like that they begin by addressing the use of the word "predictive". How we frame an issue deeply affects how it is received, perceived, and accepted.
shmerl · over 8 years ago
Also see "Watchbird" by Robert Sheckley: https://www.gutenberg.org/ebooks/29579

By the way, DRM is an example of such an approach, where people are judged guilty by default by digital systems.
tomjen3 · over 8 years ago
Possible heretical thought: could it be that predictive policing flags the same high-crime neighborhoods that are normally responsible for most crime, because they are responsible for most crime (at least the kind that the systems track)?

That would make the systems useless, true, but not racist (most of these neighborhoods would probably be predominantly African-American, but they would also be predominantly poor, which we know correlates with crime).