In recent years, researchers have taken notice of the treatment of racial minority groups across the U.S. Now, a social justice group called the Stop LAPD Spying Coalition has uncovered something troubling.
The Los Angeles Police Department was forced to turn over documents about a predictive policing and surveillance algorithm it uses, and they reveal some notable problems.
Futurism describes the documents as revealing “evidence that policing algorithms, which require officers to keep a checklist of (and keep an eye on) 12 people deemed most likely to commit a crime, are continuing to propagate a vicious cycle of disproportionately high arrests of black Angelinos, as well as other racial minorities.”
Algorithms Are Only as Useful as the Information Entered Into Them
The problem with using a predictive algorithm is that it relies on information provided by police officers – imperfect officers who carry bias and other flaws with them.
Los Angeles has an established record of overpolicing neighborhoods with high percentages of black residents. It only makes sense, then, that the algorithm would reflect that data.
The algorithm, powered by Palantir, uses a point system to reflect the need for surveillance. Points are assigned for past offenses but can also be assigned for merely coming into contact with an officer, whether or not any wrongdoing occurred.
According to the LAPD, these points are an indicator of someone’s likelihood of committing a crime.
Since officers are required to surveil the 12 people with the highest point totals, people are subject to increased police attention simply for “potentially” being associated with or knowing about criminal activity.
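Based on the reporting, the scoring logic can be sketched roughly as follows. To be clear, this is a minimal illustration of a point-based watchlist, not the actual LAPD/Palantir system: the event categories, point values, and function names are all assumptions.

```python
# Illustrative sketch of a point-based surveillance scoring system.
# Categories, point values, and names are assumptions for illustration,
# not the actual LAPD/Palantir implementation.

POINT_VALUES = {
    "violent_crime_arrest": 5,
    "prior_arrest": 2,
    "police_contact": 1,  # points accrue even when no wrongdoing occurred
}

def score(history):
    """Sum the points for every recorded event in a person's history."""
    return sum(POINT_VALUES.get(event, 0) for event in history)

def top_surveillance_list(people, n=12):
    """Return the n highest-scoring people -- the 'checklist' officers keep."""
    return sorted(people, key=lambda p: score(p["history"]), reverse=True)[:n]

people = [
    {"name": "A", "history": ["police_contact", "police_contact"]},
    {"name": "B", "history": ["prior_arrest", "police_contact"]},
    {"name": "C", "history": []},
]
watchlist = top_surveillance_list(people, n=2)
print([p["name"] for p in watchlist])  # → ['B', 'A']
```

Note the structural issue the coalition points to: because mere police contact earns points, the people most surveilled accumulate more contacts and therefore more points, regardless of wrongdoing.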
Sociological studies, like the Stanford prison experiment, suggest that when individuals are treated according to others’ expectations, they are more likely to conform to the behavior expected of them. It’s possible that overpolicing actually makes people more likely to commit a crime.
The algorithm keeps someone flagged for two years after their last interaction with police. This means that former offenders who are trying to move on with their lives are subject to continued surveillance that can keep them in a cycle of police interaction for years to come.
In one report, a man targeted by the algorithm was stopped by the LAPD twice per day on four separate days over the course of six weeks. And every police stop resets the clock on escaping the cycle.
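The feedback dynamic described above can be modeled in a few lines. The two-year window comes from the reporting; the dates and function name here are illustrative assumptions:

```python
from datetime import date, timedelta

# Illustrative model of the two-year retention window described above.
# A person stays flagged until two years pass without a police interaction;
# every stop resets the clock, so stops prompted by the surveillance itself
# can extend it indefinitely. Names are assumptions, not the actual system.

RETENTION = timedelta(days=365 * 2)

def still_flagged(last_interaction, today):
    """A person remains on the list within two years of their last stop."""
    return today - last_interaction < RETENTION

last_stop = date(2016, 1, 1)
print(still_flagged(last_stop, date(2017, 6, 1)))  # True: within two years
print(still_flagged(last_stop, date(2018, 6, 1)))  # False: window expired

# But a stop in mid-2017 -- itself triggered by the surveillance -- resets it:
last_stop = date(2017, 6, 1)
print(still_flagged(last_stop, date(2018, 6, 1)))  # True again
```

This is the “racist feedback loop” in miniature: the surveillance generates the very interactions that justify continuing it.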
The Stop LAPD Spying Coalition calls it a “racist feedback loop.”
Did the LAPD Intentionally Create a Racist Predictive Algorithm?
We can’t know the LAPD’s intent, but it is unlikely the algorithm was deliberately designed to be a racist program.
The LAPD reports that surveillance programs targeting “probable offenders” reduce crime rates and even calls the technique “smart policing.” The department says the surveillance is “laser-like” and “noninvasive” and stresses that it still follows “constitutional policing” protocol.
Whatever the department’s intentions, there is evidence to suggest that predictive policing systems are harmful regardless of the setting. Scholars at the University of Chicago made this case in a 2005 paper titled “Against Prediction: Sentencing, Policing, and Punishing in an Actuarial Age.”
In the paper, Bernard E. Harcourt, a professor at the University of Chicago Law School, argues that we should push back against what might be considered “common sense” when it comes to predictive policing:
“In criminal law and enforcement, the presumption should be against prediction. Actuarial methods should only be employed when it can be demonstrated to our satisfaction that they will promote the primary interest of law enforcement without imposing undue burden or distorting our conceptions of just punishment. Barring that, criminal law enforcement and correctional institutions should be blind to prediction.”
On June 5th, Los Angeles will hold a hearing to discuss the impact of predictive surveillance and policing.
The LAPD has claimed success with the program, and homicide rates in the city have indeed declined; one area where the program was implemented saw a 56 percent decrease in homicides between 2011 and 2012. But it is not possible to directly attribute the drop to predictive surveillance and policing: homicides in Los Angeles have been falling steadily since the 1990s, leveling off around 2015.
Technology Can Be Useful, But We Must Be Careful
Many technological advancements bring incredible benefits to society at large, but in some cases, they do more harm than good.
That may be the case with the LAPD’s policing algorithm. Using technology to keep our cities safer is a worthy goal, but when it comes at the expense of underrepresented communities, it must be rethought.