If you like science fiction, you might know Steven Spielberg’s Minority Report, a hit 2002 sci-fi film in which an experimental police unit called “Pre-Crime” is able to predict crimes. The alleged perpetrators are arrested before the crime occurs and put into an artificial permanent sleep without a trial. The film asks whether the “Pre-Crime” unit should be introduced nationwide. But then an eager pre-crime investigator finds himself in the crosshairs of a prediction…
It is now 2021, and the filmmakers Monika Hielscher and Matthias Heeder have examined the possibilities of modern police work in their documentary “Pre Crime”. They describe how investigative authorities worldwide are already testing so-called “predictive policing” software in field trials.
As in Minority Report, the investigative authorities want to predict crimes. The “predictive policing” software draws on early applications of artificial intelligence: based on historical data from investigative authorities, financial transactions, social media, and public video surveillance, it optimizes police operations and names potential perpetrators.
Since personal data is also used for the predictions, the documentary repeatedly asks whether we are willing to sacrifice privacy and freedom for the promise of more security.
Another important point the documentary makes clear is that decisions that artificial intelligence or algorithms will make about us in the future must be comprehensible and transparent.
In their documentary, Hielscher and Heeder essentially show two approaches to “pre-crime” policing:
(1) AI-based operational planning for the police (the soft variant)
Here the software predicts “what, when, where” something is likely to happen more often (e.g. in the case of a series of burglaries or violent crimes). Based on this, the deployment and routes of police patrols are optimized. In the film, for example, a man is stopped at a Munich underground station because his sports bag (he was coming from the gym) could also contain tools for stealing a bicycle. The operational plan for this police patrol, and thus the identity check, came from an AI algorithm.
(2) AI-based identification of potential perpetrators (the extreme variant)
The filmmakers accompany Chicago police officers on a home visit. A man living in a deprived area of Chicago is told, in the officers’ own words, that an algorithm has flagged him as a potential criminal. The man has a job and is trying to stay out of trouble. Because of where he grew up, he probably had contact with people who later became criminals. He is now under surveillance by the Chicago Police and is being told to change his life.
I think the film is an important documentary. It sometimes seems a bit unwieldy, since many “undesirable side effects” of digitalization are lumped together. The filmmakers show that the new “pre-crime” police systems (e.g. PRECOBS, PredPol, HunchLab) can of course only produce results and predictions with enriched and linked data. To obtain this data, investigative authorities and “pre-crime” vendors operate in an unregulated gray area, drawing:
on the large platforms of the data economy that collect and sell our private data (e.g. Facebook, Google, WhatsApp, Instagram, etc.),
on data brokers who aggregate this data from various sources into personal profiles,
on video surveillance (facial recognition).
And there we have it again!
Learn how to better protect your own data and privacy, and start with my digital self-defense course.
Commit yourself to digital self-determination and to the right that incriminating algorithms must be transparent, verifiable, and fair. Major decisions must be made by people.
The Every Human Action proposes an extension of fundamental rights across Europe that would do just that. More than 240,000 people across Europe have already signed it. Will you join them?