UK police forces are “supercharging racism” through their use of automated “predictive policing” systems, which are based on profiling people or groups before they have committed a crime, says Amnesty International.
Predictive policing systems use artificial intelligence (AI) and algorithms to predict, profile or assess the likelihood of criminal behaviour, either in specific individuals or geographic locations.
In a 120-page report published on 20 February 2025, titled Automated racism – How police data and algorithms code discrimination into policing, Amnesty said predictive policing tools are repeatedly used to target poor and racialised communities, as these groups have historically been “over-policed” and are therefore massively over-represented in police data sets.
This then creates a negative feedback loop, where these “so-called predictions” lead to further over-policing of certain groups and areas, reinforcing and exacerbating the pre-existing discrimination as increasing amounts of data are collected.
“Given that stop-and-search and intelligence data will contain bias against these communities and areas, it is highly likely that the predictive output will replicate and repeat that same discrimination. Predictive outputs lead to further stop and search and criminal consequences, which will contribute to future predictions,” it said. “This is the feedback loop of discrimination.”
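The mechanics of this loop lend themselves to a simple illustration. The following is a minimal simulation sketch, not anything drawn from the Amnesty report itself: the patrol numbers, detection rate and allocation rule are all illustrative assumptions. Two areas have identical underlying offending, but the area that starts with more patrols records more crime, and an allocation rule driven by recorded crime keeps sending the patrols back there.

```python
# Illustrative sketch of the "feedback loop of discrimination" described above.
# All figures are hypothetical assumptions, not data from the Amnesty report.
import random

random.seed(42)

TRUE_OFFENCES = 100          # identical underlying offending in both areas
DETECTION_PER_PATROL = 0.02  # assume each patrol unit detects 2% of offences
TOTAL_PATROLS = 40

patrols = {"area_a": 30, "area_b": 10}  # area A starts out over-policed

for year in range(5):
    recorded = {}
    for area, units in patrols.items():
        # Recorded crime scales with patrol presence, not with actual offending
        rate = min(1.0, units * DETECTION_PER_PATROL)
        recorded[area] = sum(1 for _ in range(TRUE_OFFENCES)
                             if random.random() < rate)
    # "Predictive" step: next year's patrols are allocated in proportion
    # to this year's recorded crime, feeding the disparity forward
    total = sum(recorded.values())
    patrols = {a: round(TOTAL_PATROLS * r / total) for a, r in recorded.items()}
    print(f"year {year}: recorded={recorded} -> next patrols={patrols}")
```

Despite equal offending in both areas, area A’s initial head start yields roughly three times the recorded crime every year, and the recorded-crime allocation rule duly reproduces the original imbalance – the same “vicious circle” the Lords Home Affairs and Justice Committee describes later in this article.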
Amnesty found that across the UK, at least 33 police forces have deployed predictive policing tools, with 32 of these using geographic crime prediction systems, compared with 11 using individual crime prediction tools.
It said these tools are “in flagrant breach” of the UK’s national and international human rights obligations, as they are being used to racially profile people, undermine the presumption of innocence by targeting people before they have even engaged in a crime, and fuel indiscriminate mass surveillance of entire areas and communities.
The human rights group added that the increasing use of these tools also creates a chilling effect, as people tend to avoid areas or people they know are being targeted by predictive policing, further undermining people’s right to association.
Examples of predictive policing tools cited in the report include the Metropolitan Police’s “Gangs Violence Matrix”, which was used to assign “risk scores” to individuals before it was gutted by the force over its racist impacts; and Greater Manchester Police’s XCalibre database, which has similarly been used to profile people based on the “perception” that they are involved in gang activity, without any evidence of actual offending on their part.
Amnesty also highlighted Essex Police’s knife crime and violence models, which use data on “associates” to criminalise people by association with others, and use mental health and drug use as markers for criminality; and West Midlands Police’s “hotspot” policing tools, which the force itself has admitted are used for error-prone predictive crime mapping that is wrong 80% of the time.
“The use of predictive policing tools violates human rights. The evidence that this technology keeps us safe just isn’t there; the evidence that it violates our fundamental rights is clear as day. We are all much more than computer-generated risk scores,” said Sacha Deshmukh, chief executive at Amnesty International UK, adding that these systems are deciding who is a criminal based “purely” on the colour of their skin or their socio-economic background.
“These tools to ‘predict crime’ harm us all by treating entire communities as potential criminals, making society more racist and unfair. The UK government must prohibit the use of these technologies across England and Wales, as should the devolved governments in Scotland and Northern Ireland.”
He added that the people and communities subject to this automated profiling have a right to know about how the tools are being used, and must have meaningful avenues of redress to challenge any policing decisions made using them.
On top of a prohibition on such systems, Amnesty is also calling for greater transparency around the data-driven systems in use by police, including a publicly accessible register with details of the tools, as well as accountability obligations that include a right and clear forum to challenge police profiling and automated decision-making.
In an interview with Amnesty, Daragh Murray – a senior lecturer at Queen Mary University of London School of Law, who co-wrote the first independent report on the Met Police’s use of live facial-recognition (LFR) technology in 2019 – said that because these systems are based on correlation rather than causation, they are particularly harmful and inaccurate when used to target individuals.
“Essentially you’re stereotyping people, and you’re mainstreaming stereotyping, you’re giving a scientific objectivity to stereotyping,” he said.
NPCC responds to Amnesty
Computer Weekly contacted the Home Office about the Amnesty report, but received no on-the-record response. Computer Weekly also contacted the National Police Chiefs’ Council (NPCC), which leads on the use of AI and algorithms by UK police.
“Policing uses a wide range of data to help inform its response to tackling and preventing crime, maximising the use of finite resources. As the public would expect, this can include concentrating resources in areas with the most reported crime,” said an NPCC spokesperson.
“Hotspot policing and visible targeted patrols are the bedrock of community policing, and effective deterrents in detecting and preventing anti-social behaviour and serious violent crime, as well as improving feelings of safety.”
They added that the NPCC is working to improve the quality and consistency of its data to better inform its response, ensuring that all information and new technology is held and developed lawfully and ethically, in line with the data ethics Authorised Professional Practice (APP).
“It is our responsibility as leaders to ensure that we balance tackling crime with building trust and confidence in our communities, whilst recognising the detrimental societal impact that tools such as stop and search can have, particularly on black people,” they said.
“The Police Race Action Plan is the most significant commitment ever made by policing in England and Wales to tackle racial bias in its policies and practices, involving an ‘explain or reform’ approach to any disproportionality in police powers.
“The national plan is working with local forces and driving improvements in a broad range of police powers, from stop and search and the use of Taser through to officer deployments and road traffic stops. The plan also contains a specific action around data ethics, which has directly informed the consultation and equality impact assessment for the new APP.”
Ongoing concerns
Problems with predictive policing have been highlighted to UK and European authorities using the tools for a number of years.
In July 2024, for example, a coalition of civil society groups called on the then-incoming Labour government to place an outright ban on both predictive policing and biometric surveillance in the UK, on the basis that they are disproportionately used to target racialised, working class and migrant communities.
In the European Union (EU), the bloc’s AI Act has banned the use of predictive policing systems that can be used to target individuals for profiling or risk assessments, but the ban is only partial, as it does not extend to place-based predictive policing tools.
According to a 161-page report published in April 2022 by the two MEPs jointly in charge of overseeing and amending the AI Act, “predictive policing violates human dignity and the presumption of innocence, and it holds a particular risk of discrimination. It is therefore inserted among the prohibited practices.”
According to Griff Ferris, then-legal and policy officer at non-governmental organisation Fair Trials: “Time and time again, we’ve seen how the use of these systems exacerbates and reinforces discriminatory police and criminal justice action, feeds systemic inequality in society, and ultimately destroys people’s lives. However, the ban must also extend to include predictive policing systems that target areas or locations, which have the same effect.”
A month before, in March 2022, Fair Trials, European Digital Rights (EDRi) and 43 other civil society organisations collectively called on European lawmakers to ban AI-powered predictive policing systems, arguing that they disproportionately target the most marginalised people, infringe fundamental rights and reinforce structural discrimination.
That same month, following its formal inquiry into the use of algorithmic tools by UK police – including facial recognition and various crime “prediction” tools – the Lords Home Affairs and Justice Committee (HAJC) described the situation as “a new Wild West”, characterised by a lack of strategy, accountability and transparency from the top down. It said an overhaul of how police deploy AI and algorithmic technologies is required to prevent further abuse.
In the case of “predictive policing” technologies, the HAJC noted their tendency to produce a “vicious circle” and “entrench pre-existing patterns of discrimination”, because they direct police patrols to low-income, already over-policed areas based on historic data.
“Due to increased police presence, it is likely that a higher proportion of the crimes committed in those areas are detected than in areas that are not over-policed. The data will reflect this increased detection rate as an increased crime rate, which will be fed into the tool and embed itself into the next set of predictions,” it said.
However, in July 2022, the UK government largely rejected the findings and recommendations of the Lords inquiry, claiming there was already “a comprehensive network of checks and balances”.
The government said at the time that police use of predictive modelling helps to protect the public.