
Go Read This: Data Analysis Reveals Flawed Predictive Policing Algorithm

Gizmodo has published an in-depth review of the data collection process behind its investigation, co-reported with The Markup, into PredPol, a software company that specializes in predictive policing through machine learning (hence the name, which it has since changed to Geolitica).

PredPol’s algorithm is supposed to make predictions based on existing crime reports. However, since crimes are not reported at the same rate everywhere, the predictions it provides to law enforcement may simply reproduce the reporting biases of each area. If police use them to decide where to patrol, they could end up over-policing areas that don’t need a greater presence.
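That dynamic is a feedback loop, and it can be sketched in a few lines of code. The Python simulation below is illustrative only, not PredPol’s actual model: the rates and incident counts are invented, the two neighborhoods have identical true crime rates, and the only difference is that patrols follow the historical record while new incidents are recorded only where a patrol is present.

```python
import random

random.seed(0)

# Two neighborhoods with the SAME underlying crime rate; "A" merely starts
# with a few more recorded incidents (hypothetical numbers).
TRUE_RATE = 0.3                  # daily chance a patrol witnesses an incident
recorded = {"A": 12, "B": 10}    # historical incident counts per neighborhood

for day in range(365):
    # "Prediction": send the patrol wherever past data shows the most crime.
    target = max(recorded, key=recorded.get)
    # Crime occurs at the same rate in both places, but it only enters the
    # dataset where an officer is present to record it.
    if random.random() < TRUE_RATE:
        recorded[target] += 1

print(recorded)  # e.g. {'A': 121, 'B': 10}: the small initial gap snowballs
```

Because neighborhood B never receives a patrol, its side of the record never gets a chance to catch up, which is the self-fulfilling quality critics point to.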

When Gizmodo and The Markup assessed those areas, they found that the locations PredPol’s software targeted for increased patrols “were more likely to be home to Black and Latino residents and families who would qualify for the federal free and reduced-price meals program.”

Even though police tactics have evolved to incorporate data on crime and arrests, there has been a historical disparity in how these tactics affect communities of color. As Gizmodo points out in its analysis, researchers studying New York City in the 1990s found that the methods reduced crime without simply shifting it to other areas. However, the approach included tactics such as stop-and-frisk, which were criticized as civil rights violations.

PredPol’s algorithm has been reviewed and criticized by academics more than once. As Vice quoted Suresh Venkatasubramanian, an ACLU Utah board member, in 2019:

“Because this data is collected as a by-product of police activity, predictions made on the basis of patterns learned from this data do not pertain to future instances of crime on the whole,” notes the study Venkatasubramanian co-authored. “In this sense, predictive policing is aptly named: it is predicting future policing, not future crime.”

Yet there had never been an investigation as thorough as this one, which drew on figures from publicly available data on the web. According to Gizmodo and The Markup, they found an unsecured cloud database linked on the Los Angeles Police Department’s website, containing millions of predictions going back several years.

In addition to supposedly predicting individual crimes, a 2018 report by The Verge reviewed Pentagon-funded research by PredPol founder Jeff Brantingham on using software to predict gang-related crime. The UCLA anthropology professor adapted earlier research on predicting battlefield casualties in Iraq to create the platform, and the resulting paper, “Partially Generative Neural Networks for Gang Crime Classification with Partial Information,” raised concerns about its ethical implications.

Critics have said this approach could do more harm than good. “You create algorithms out of a fake narrative that was created for people – gang documentation is the state defining people based on what they believe… When you plug that into the computer, every crime will be gang-related,” activist Aaron Harvey told The Verge.

Relying on algorithms can work wonders for some industries, but their impact can come at a real human cost. With the wrong data or the wrong parameters, things can go wrong quickly, even in less fraught circumstances than policing. Look no further than Zillow, which recently had to shut down its house-flipping operation after losing hundreds of millions of dollars despite the “pricing models and automation” it said would give it an edge.

Overall, the Gizmodo and The Markup reports are a sobering look at how significantly predictive algorithms can affect the people they target, often without those people’s knowledge. Gizmodo’s accompanying analysis provides real insight into the data while giving readers a behind-the-scenes look at the metrics. The report says 23 of the 38 law enforcement agencies tracked are no longer PredPol clients, even though they initially signed up to help distribute anti-crime resources. Perhaps by using methods that build transparency and trust on both sides, law enforcement could spend less time on technology like this, which accomplishes exactly the opposite.