e.Woke #78: Dismantling Digital Prisons

Generation Justice (GJ) is a multiracial, multicultural project that trains youth to harness the power of community. GJ youth are training YOU to harness the power of your cybersecurity.
Privacy only works when we work together. Subscribe to e.Woke & donate.

A quick fix does more harm than good. Pretrial risk assessment algorithms across the US are being exposed for built-in racial disparities. Robots and AI are supposed to be free of bias, but someone is behind that screen writing the code. Not to fear, community organizing is here!

“Mapping Pretrial Injustice”

Check out this new database, from our friends at MediaJustice and Media Mobilizing Project! In their own words, “Many jurisdictions across the country are adopting risk assessments to end the use of money bail without fully understanding or measuring the impact risk assessments have on the pretrial process in their communities, or on their jail populations. Therefore, this website was designed with the specific intention to support organizers and advocates fighting to end pretrial incarceration who must contend with how these instruments are actually placing additional barriers between people of color and our freedom.” (Photo by MediaJustice and Media Mobilizing Project)

“Can algorithms help judges make fair decisions?”

We’re only human! In an effort to address human mistakes and biases, the judicial system overshot with risk assessment technology. Alan Yu writes for WHYY, “It’s common to think of data as a kind of technological and social mirror that reflects human biases–garbage in, garbage out. [Annette Zimmermann, a political and moral philosopher at Princeton University] said data is actually more like a magnifying glass that could amplify inequality if left unchecked.” (Photo by Dake Kang/AP Photo)
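To make the “magnifying glass” point concrete, here is a minimal sketch of our own (a hypothetical simulation, not Zimmermann’s work or any real risk assessment tool; the groups, rates, and scoring rule are all invented for illustration). Two groups behave identically, but one is policed more heavily, so its arrest records are fuller, and a naive score built on those records rates it as riskier:

```python
# Hypothetical simulation of "garbage in, garbage out": two groups
# with the SAME underlying behavior get different risk scores because
# one group's offenses are more likely to end up on the record.
import random

random.seed(42)

TRUE_OFFENSE_RATE = 0.30               # identical for both groups
ARREST_RATE = {"A": 0.3, "B": 0.7}     # heavier policing of Group B

def recorded_arrests(group):
    """Number of arrests on record over 5 years for one person."""
    arrests = 0
    for _ in range(5):
        offended = random.random() < TRUE_OFFENSE_RATE
        # The data captures arrests, not behavior:
        if offended and random.random() < ARREST_RATE[group]:
            arrests += 1
    return arrests

def risk_score(arrests):
    """A naive 'risk assessment': score rises with prior arrests."""
    return min(10, 2 * arrests)

for group in ("A", "B"):
    people = [recorded_arrests(group) for _ in range(10_000)]
    avg = sum(risk_score(a) for a in people) / len(people)
    print(f"Group {group}: average risk score = {avg:.2f}")
```

Running this, Group B scores roughly twice as “risky” as Group A even though both groups offend at exactly the same rate; the score doesn’t reflect the disparity in behavior (there is none), it magnifies the disparity in policing.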

“Algorithms Were Supposed to Fix the Bail System. They Haven’t.”

The biggest fan turned harshest critic! The Pretrial Justice Institute, once an advocate for algorithmic risk assessment, recently released a statement on how technology perpetuates racial inequities. From WIRED, Tom Simonite says there’s concern “about the bias baked into the statistics underlying risk scoring algorithms stemming from the realities of American policing and justice.” (Photo by Guy Cali/Getty Images)

“Reverb: Racial Profiling 2.0”

We can read about it, but it’s something else to see these issues make their way into mainstream media. Adam Yamaguchi of CBSN Originals examines predictive policing programs in Season 5, Episode 1 of the documentary series. This episode asks, “Are predictive policing programs actually super-charging racial bias?”

Overall Mood Meme for this week is…

Police departments like, “You can’t have human biases if there are no humans involved”

Resources and Guides


“P.S. e.Woke is a project of Generation Justice”
Support our work with a donation today. GJ is an independent youth media org that relies on individual donors like you. We don’t receive any corporate, government, or university funding for this project.
