The Police Plan to Roll Out AI in ‘Predictive Analytics’ Should Worry Us All

In light of recent revelations regarding West Midlands Police’s use of artificial intelligence (AI) to fabricate information about Israeli football fans, you would think that the police would be a little hesitant about the wider use of such technology. But you would be wrong.

In a recent interview with the Telegraph, Sir Andy Marsh, the head of the College of Policing, said that police were evaluating up to 100 projects where officers could use AI to help tackle crime. This includes utilising such things as “predictive analytics” to target criminals before they strike, redolent of the 2002 film Minority Report. The aim, according to Home Secretary Shabana Mahmood, is to put the “eyes of the state” on criminals “at all times”. This is to be outlined further in an upcoming white paper on police reform.

The expansion of AI use in British policing is continually being sold as innovation, efficiency and protection. But in reality it marks a decisive step towards a society in which liberty is treated as a risk to be managed. Wrapped in the language of safety and reform, AI represents a quiet but profound transformation of the state’s relationship with its citizens: from upholder of the law to permanent overseer of behaviour.

Every policing area already has an intelligence unit responsible for ‘predictive analytics’. Crimes logged on police indices are scrutinised by analysts, who produce reports and briefings on crime hotspots and the like. Police resources can then be directed to a particular location at a particular time in order to tackle or prevent the crime. AI can never adequately replace a team of trained professionals going through the data. It can, however, probably do the job at a fraction of the cost, which matters more to most senior officers than civil liberties. Not so much Minority Report as Heath Robinson.

The core injustice is clear. Policing in a supposedly free society responds to crimes that have already occurred, or prevents them through highly visible uniformed patrols. So-called predictive policing reverses that logic by directing the power of the state at everybody, nearly all of whom will have done nothing illegal, on the basis of statistical guesses about what they might do. This is not a mere technical adjustment to policing, as some would have us believe; it is a complete change of emphasis, under which everyone is potentially guilty until proven innocent. Mass surveillance (for that is what it is) will be imposed without charge, without trial and without verdict, because there will be no formal accusation.

Defenders of this approach pretend that there is no threat to individual liberty. That is patently false. Liberty is eroded wherever the state inserts itself permanently into a person’s life. Persistent scrutiny is a form of soft coercion: knowing that your movements, associations and behaviour are being logged and evaluated by the state changes how you act. A society in which citizens must behave as if they are always being watched is not free; it is merely orderly.

Worse still, this system destroys any real degree of accountability. Decisions that once belonged to identifiable officers will be attributed to the system or the programme. When mistakes occur, as they inevitably will, there will be no discerning human judgement to interrogate the system, as operators will almost certainly defer to the machine in the first instance. Power will diffuse upward into institutions and outward into private sector software developers, while the citizen will be left in some form of legal limbo facing an unchallengeable process. An algorithm cannot be cross-examined or shamed.

The claim that these systems are objective is also dangerous. AI will not discover truth; it will go through past policing data, solidify past errors and enforce them with mathematical certainty. Historical mistakes will become future risk indicators.

Nobody in Government is stating that the rollout of AI is an experiment. Surveillance infrastructure never retreats. Every database, camera and algorithm built for the worst offenders will inexorably become, over time, available for broader use. Today the target is violent or prolific criminals; tomorrow it could be protest organisers or those deemed by the political class to be a problem. We have already seen this with the policing of social media and the use of Non-Crime Hate Incidents. How can the police be trusted with transformational technology such as this?

Efficiency is the final lie. Reduced paperwork, better targeting and smoother processes do not justify expanding state surveillance. And in any case, during my time in the police, the introduction of new technology never reduced the amount of bureaucracy – it merely transferred it from the page to the screen, and often increased it. Swift injustice is not progress.

Enshrining the use of artificial intelligence across UK law enforcement will abolish any anonymity in the public space and replace it with permanent identifiability. Every journey will become traceable, every gathering recordable, every deviation from the norm potentially suspicious. Yes, this already happens during the course of a police investigation, but that is to establish the movements and behaviours of identifiable suspects, not to generally monitor the entire populace.

This is not policing by consent, as per the original Peelian Principles; it is policing by omnipresence and, unlike watching a Hollywood movie, we won’t be able to walk away if we don’t like it.

Paul Birch is a former police officer and counter-terrorism specialist. You can read his Substack here.

16 Comments
happycake78
2 months ago

Two Tier AI??

transmissionofflame
2 months ago

Thanks for this.

Another variant on trading freedom for the illusion of safety.

See also: “covid” (among others).

Some people just want to be safe at all costs. Others realise life is full of risk and tradeoffs, but that’s what makes it LIFE. I see little hope of reconciling those two worldviews.

Jack the dog
2 months ago

And many people are simply complacent.

Despite the last 5 years.

transmissionofflame
2 months ago
Reply to  Jack the dog

I tend to think that if the last 5 years didn’t wake you up then there is no hope, but I guess hope should spring eternal.

EppingBlogger
2 months ago

Just in time for allegations against Reform ahead of a General Election. Then, afterwards, it turns out they got the wrong team.

Seems just like the Le Pen prosecution.

what is wrong with the DS website?!!

Arum
2 months ago

Presumably the AI will be warning the cops that people who leave comments under Daily Sceptic articles carry a high risk of invading Poland.

Jack the dog
2 months ago

You’re quite right and 2TK doesn’t care.

He’s determined to ensure his position as the worst, most destructive and unpopular PM in history is unassailable.

Tonka Fairy
2 months ago
Reply to  Jack the dog

Of that, he is doing a mighty fine job. It is difficult to imagine how anyone could possibly be worse.

mike r
2 months ago

The sell will be that this will prevent crimes like robbery, gangland killings, serial rapists and their ilk being committed. The reality will be that anyone who questions government policy on climate change, gender, their anti-semitism, destruction of capitalism, critical race theory will be branded a far right activist and preemptively found guilty and punished.

John Kitchen
2 months ago

Our “Justice”system is already two tier; this will make it worse.

For the favoured tier – the rape gangs, those from important communities etc the police will continue to ignore evidence of actual wrongdoing and certainly won’t look further.

For the unfavoured tier – law-abiding people with the wrong views, the police will have AI to help fabricate “evidence” of imaginary risks or imaginary future offences. I’m sure AI will produce lots of useful stuff, especially if it is fed with the “right” data, background etc etc

And then they will hope that the bogus evidence produced by AI is sufficient to gain a conviction. Will it convince a jury? Possibly.

And when juries are abolished will it convince a regime-appointed judge? Absolutely.

varmint
2 months ago

Why don’t they just monitor every second of our Internet use, open all our envelopes and listen to every phone call? Maybe then government would be happy, and the cops can just sit in front of a computer all day or work from home. But actually government will never be happy till they can have an official sitting in our living rooms to see if we swore at the telly or said something not so nice.

RTSC
2 months ago

Isn’t it strange... when it comes to the likelihood of committing a murderous terrorist attack, such as blowing up a plane, it’s “wrong” to focus on people from one particular ethnic and religious minority.

So elderly, white, British grannies have to be checked very carefully and randomly by the airport security systems, and under no circumstances must the people most likely to do this be identified by predictive analysis and targeted for security checks.

But the police are now using predictive analysis to identify people who are most likely to commit crimes...

Ummmm. Two-Tier Britain, again.

sskinner
2 months ago

Look up UN global governance and there is something called AI governance.

shred
2 months ago

It would be nice if the plods spent their time solving crimes that have happened instead of predicting crimes that haven’t.

JXB
2 months ago

Remember the TV detector vans, which could supposedly tell not only that you had a TV, but in which room and what you were watching.

Technically impossible but meant to frighten.

Arthur C Clarke: “Any sufficiently advanced technology is indistinguishable from magic.”

So we have a new advanced technology to which people are attributing magical properties.

kev
2 months ago

Will AI do this?

Police Oath

I, … of … do solemnly and sincerely declare and affirm that I will well and truly serve the King in the office of constable, with fairness, integrity, diligence and impartiality, upholding fundamental human rights and according equal respect to all people; and that I will, to the best of my power, cause the peace to be kept and preserved and prevent all offences against people and property; and that while I continue to hold the said office I will to the best of my skill and knowledge discharge all the duties thereof faithfully according to law.

Sure, there will be areas where it can be useful, such as scanning cameras to track an identified “criminal” or “suspect”, or to spot an ongoing crime and identify the perpetrator.

The problem is going to be what limits exist.