You’ve already had a glimpse of predictive policing if you’ve ever seen the 2002 movie Minority Report.
The movie, starring Tom Cruise, depicts a future where police use technology to locate and arrest murderers before the crime ever happens.
Believe it or not, this fiction is fast becoming a reality today.
How Predictive Policing Works
One of the pioneering companies in this field is PredPol. On its website it describes its software as follows:
“PredPol aims to reduce victimization and keep communities safer. Our day-to-day operations tool identifies where and when crimes are most likely to occur so that you can effectively allocate your resources and prevent crime. Don’t just map crime, prevent it.”
This description sounds innocuous, but it raises the question of how exactly the tool “identifies where and when crimes are most likely to occur.”
Where are the data points coming from, in order for the system to conduct that analysis? And exactly how accurate is the analysis?
Predicting Gang-Related Violence
The reality is that the system is only as accurate as the data it’s fed. And as it turns out, PredPol’s dataset may already contain built-in biases.
To understand this, it’s important to take a closer look at PredPol’s management and the path that led them here.
Dr. Jeff Brantingham is a PredPol co-founder and a Professor of Anthropology at UCLA, according to the company’s management page. He was also the focus of an article on PredPol published by The Verge.
The technology Brantingham patented now forms the basis of what PredPol is offering police agencies around the world.
As The Verge revealed, Dr. Brantingham’s work with the Los Angeles Police Department on classifying crimes as “gang-related” shows where PredPol’s inputs come from.
The system uses machine learning, fed with criminal data and a gang-territory map, to automate crime classification. Criminals whose crimes are classified as gang-related can receive “additional criminal charges, heavier prison sentences, or inclusion in a civil gang injunction that restricts a person’s movements and ability to associate with other people.”
The problem with this approach, as The Verge article reveals, is that the existing data in the criminal data system are already “highly subjective”; in many cases they are biased and based on racial profiling.
By using that existing data and outdated gang-territory maps, that same racial bias gets “baked” into an automated predictive policing system like PredPol’s.
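To see how that can happen, here is a minimal, hypothetical Python sketch. None of the names, rates, or logic here come from PredPol; this is only an illustration of the general mechanism by which a model trained on biased labels reproduces that bias.

```python
import random

random.seed(0)

# Hypothetical simulation: true gang involvement is equally likely
# everywhere (30%), but in the historical records, incidents inside the
# old "gang territory" map were over-labeled as gang-related and
# incidents outside it were under-labeled.
def make_history(n=10_000):
    history = []
    for _ in range(n):
        in_territory = random.random() < 0.5
        truly_gang = random.random() < 0.30               # same base rate everywhere
        if in_territory:
            labeled_gang = truly_gang or random.random() < 0.40   # over-labeling
        else:
            labeled_gang = truly_gang and random.random() < 0.50  # under-labeling
        history.append((in_territory, labeled_gang))
    return history

# A naive "model" that, like any classifier, learns the label rates in
# its training data: P(labeled gang-related | location).
def train(history):
    rates = {}
    for loc in (True, False):
        labels = [label for (l, label) in history if l == loc]
        rates[loc] = sum(labels) / len(labels)
    return rates

rates = train(make_history())
print(f"Predicted gang-related rate inside map:  {rates[True]:.0%}")
print(f"Predicted gang-related rate outside map: {rates[False]:.0%}")
```

Even though gang involvement is identical in both areas by construction, the model predicts a far higher gang-related rate inside the mapped territory, because that is what the biased labels taught it.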
Is Facebook’s Approach to Privacy an Influence?
It’s interesting to note that the Chairman of the Board at PredPol is Chris Kelly, who was Chief Privacy Officer and Head of Global Public Policy for Facebook.
Facebook doesn’t exactly have the best track record when it comes to user privacy. This doesn’t instill much faith that PredPol will be much better.
Before this year, Facebook was allowing numerous companies like Cambridge Analytica to exploit users’ private data without their consent. This was on Chris Kelly’s watch.
PredPol is a system that similarly will collect data about people from many different sources to try and predict and classify crime. Think about who might get access to that data and what they could do with it. If you live in or around a high-crime area, what will that do to your own “profile” in such a system?
Another player is Brian MacDonald, the CEO of PredPol.
Brian’s past is as revealing as that of other members of the PredPol management team. He was previously VP of Sales at Vidcie, an In-Q-Tel company.
We’ve previously reported on CIA connections to In-Q-Tel and their funding of technologies that allow for the collection of information about private American citizens. Again, this is an association that’s a bit unnerving when you’re talking about a company that’s developing “predictive policing” technology to be used on the general public.
Predictive Policing is Premature
Research from the RAND Corporation revealed that one trial of predictive policing in Chicago has been a dismal failure. The “heat list” used there to identify individuals most likely to commit a violent crime has been grossly inaccurate.
Of all the violent crimes committed since the system was launched, barely 1% of the people who committed them were ever on the “heat list”. That hit rate is low enough to fall within a margin of error.
One could say that existing technology being used to predict crime simply doesn’t work. There are too many factors, and many of those factors just can’t be collected as a dataset.

Originally published on TopSecretWriters.com