If you haven’t seen the movie Minority Report, try to track down a copy this weekend. Without giving away too much of the plot, the movie centers around a futuristic way to prevent crime. With the help of three precognitive humans, known as “precogs,” the Washington DC police department can essentially look into the future and prevent crime before it occurs. The movie takes a twist when one of the precogs has a vision that the main character, played by Tom Cruise, will commit murder, and thus a warrant is issued for his arrest. A chase ensues, all while the viewer contemplates the moral ramifications of the idea, “Can you be guilty of a crime without attempting it?”
While we don’t yet have precogs on the police force, the Chicago PD is using a similar technology in hopes of deterring future crimes. The technology attempts to look into the future using a mathematical algorithm to formulate a list of individuals deemed likely to be involved in a crime. The names the formula spits out land on what the CPD calls its “heat list.”
Unlike Minority Report, a person on the list isn’t arrested or tried for a future crime, but they are informed that the police are keeping a close eye on their actions. Police believe the extra attention can help deter crime.
“This program will become a national best practice,” said CPD Commander Jonathan Lewin. “This will inform police departments around the country and around the world on how best to utilize predictive policing to solve problems. This is about saving lives.”
At What Cost?
On its face, the technology sounds harmless enough: a person doesn’t end up on the list unless they meet several criteria – another CPD Commander put it bluntly, saying “if you end up on this list, there’s a reason you’re there” – and even then, they only receive a warning from the police. But those who oppose the technology say it’s invasive, and at times, racist.
The exact science behind the algorithm remains hidden, and a Freedom of Information request for the list was denied on safety grounds; the department said its release could “endanger the life or physical safety of law enforcement personnel or any other person.” Without knowing how someone ends up on the list, some wonder if it relies too heavily on demographic information, like age, ethnicity and place of residence. Some fear this could unfairly target minorities.
“First of all, how are we deciding who gets on the list and who decides who gets on the list?” said attorney Hanni Fakhoury. “Are people ending up on this list simply because they live in a crappy part of town and know people who have been troublemakers? We are living in a time when information is easily shareable and easily accessible, so, let’s say we know that someone is connected to another person who was arrested. Or, let’s say we know that someone’s been arrested in the past. Is it fair to take advantage of that information? Are we just perpetuating the problem?”
He added, “How many people of color are on this heat list? Is the list all black kids? Is this list all kids from Chicago’s South Side? If so, are we just closing ourselves off to this small subset of people?”
Without going into any details, the National Institute of Justice, which provides grants to police departments interested in using predictive technology, simply stated that the algorithm only finds people “who the model has determined are those most likely to be involved in a shooting or homicide, with probabilities that are hundreds of times that of an ordinary citizen.”
So where do you land on the spectrum? Do you believe police departments should be able to profile and compile lists of potential criminals, or do you believe the technology is crossing into murky legal waters?
Related source: The Verge