Courts and police are turning to AI to reduce bias, but some argue it’ll make the problem worse

FAN Editor

We all know humans are imperfect. We’re subject to biases and stereotypes, and when these come into play in the criminal justice system, the most disadvantaged communities end up suffering. It’s easy to imagine that there’s a better way, that one day we’ll find a tool that can make neutral, dispassionate decisions about policing and punishment.

Some think that day has already arrived.

Around the country, police departments and courtrooms are turning to artificial intelligence algorithms to help them decide everything from where to deploy police officers to whether to release defendants on bail.

Supporters believe the technology will lead to greater objectivity, ultimately creating safer communities. Others, however, say the data fed into these algorithms is encoded with human bias, meaning the tech will simply reinforce historical disparities.
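To see how that reinforcement can happen, consider a minimal sketch. Everything here is hypothetical, the data, the neighborhoods, and the scoring rule, and real risk-assessment tools are far more complex, but the mechanism critics describe is the same: a model scored on past arrest records inherits whatever patterns past policing left in those records.

```python
# Hypothetical illustration of how historical bias propagates.
# Synthetic "historical arrest" records: (neighborhood, was_arrested).
# Suppose underlying offense rates are identical, but Neighborhood A
# was patrolled twice as heavily, so more of its offenses were recorded.
history = [("A", True)] * 40 + [("A", False)] * 60 \
        + [("B", True)] * 20 + [("B", False)] * 80

def risk_score(neighborhood: str) -> float:
    """Score = fraction of past records from this neighborhood with an arrest."""
    records = [arrested for hood, arrested in history if hood == neighborhood]
    return sum(records) / len(records)

# Two otherwise-identical defendants receive different scores purely
# because of where past policing was concentrated.
print(risk_score("A"))  # 0.4 -> flagged "higher risk"
print(risk_score("B"))  # 0.2 -> flagged "lower risk"
```

The scores look objective because they come from data, yet they reflect where officers were deployed, not how people behaved. That is the core of the critics' argument.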

Learn more about the ways communities, police officers and judges across the U.S. are using these algorithms to make decisions about public safety and people's lives.
