Data-based profiling tools are being used by the UK Ministry of Justice (MoJ) to algorithmically "predict" people's risk of committing criminal offences, but pressure group Statewatch says the use of historically biased data will further entrench structural discrimination.

Documents obtained by Statewatch via a freedom of information (FoI) campaign reveal the MoJ is already using one flawed algorithm to "predict" people's risk of reoffending, and is actively developing another system to "predict" who will commit murder.

While authorities deploying predictive policing tools say they can be used to more efficiently direct resources, critics argue that, in practice, they are repeatedly used to target poor and racialised communities, as these groups have historically been "over-policed" and are therefore over-represented in police datasets.

This then creates a negative feedback loop, where these so-called predictions lead to further over-policing of certain groups and areas, thereby reinforcing and exacerbating the pre-existing discrimination.
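That feedback dynamic is simple enough to demonstrate with a toy model. The sketch below (in Python, with entirely hypothetical numbers – it models no real force, dataset or product) allocates patrols to whichever area has the most recorded arrests, and lets patrol presence drive new recorded arrests:

```python
# Toy illustration of the feedback loop critics describe: patrols follow
# past recorded arrests, and recorded arrests follow patrols.
# All figures are hypothetical; this models no real system or dataset.

# Two areas with identical actual offending, but area B starts with more
# recorded arrests due to historical over-policing.
recorded_arrests = {"A": 100, "B": 120}

for year in range(1, 6):
    # The "prediction": send patrols to the area that looks riskiest on
    # paper, i.e. the one with the most past recorded arrests.
    target = max(recorded_arrests, key=recorded_arrests.get)
    # More patrol presence means more recorded arrests there, regardless
    # of any real difference in behaviour between the areas.
    recorded_arrests[target] += 20
    share_b = recorded_arrests["B"] / sum(recorded_arrests.values())
    print(f"year {year}: area B's share of recorded arrests = {share_b:.1%}")
```

Both areas behave identically, yet area B's share of recorded arrests climbs from about 55% to nearly 69% in five years, and each round of new arrest data appears to "confirm" the original prediction.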

Tracking the history of predictive policing systems in their 2018 book Police: a field guide, authors David Correia and Tyler Wall argue that such tools provide "seemingly objective data" for law enforcement authorities to continue engaging in discriminatory policing practices, "but in a way that appears free from racial profiling".

They added it therefore "shouldn't be a surprise that predictive policing locates the violence of the future in the poor of the present".

Computer Weekly contacted the MoJ about how it is dealing with the propensity of predictive policing systems to further entrench structural discrimination, but received no response on this point.

MoJ systems

Known as the Offender Assessment System (OASys), the first crime prediction tool was initially developed by the Home Office over three pilot studies in England and Wales between 2001 and 2005.

According to His Majesty's Prison and Probation Service (HMPPS), OASys "identifies and classifies offending-related needs" and assesses "the risk of harm offenders pose to themselves and others", using machine learning techniques so the system "learns" from the data inputs to adapt the way it functions.

Structural racism and other forms of systemic bias may be coded into OASys risk scores – both directly and indirectly

Sobanan Narenthiran, Breakthrough Social Enterprise

The risk scores generated by the algorithms are then used to make a wide range of decisions that can severely affect people's lives. These include decisions about their bail and sentencing, the type of prison they'll be sent to, and whether they'll be able to access education or rehabilitation programmes.

The documents obtained by Statewatch show the OASys tool is being used to profile thousands of prisoners in England and Wales every week. In just one week, between 6 and 12 January 2025, for example, the tool was used to complete a total of 9,420 reoffending risk assessments – a rate of more than 1,300 per day.

As of January this year, the system's database holds over seven million risk scores setting out people's alleged risk of reoffending, which includes completed assessments and those still in progress.

Commenting on OASys, Sobanan Narenthiran – a former prisoner and now co-CEO of Breakthrough Social Enterprise, an organisation that supports "people at risk or with experience of the criminal justice system to enter the world of technology" – said "structural racism and other forms of systemic bias may be coded into OASys risk scores – both directly and indirectly".

He further argued that information entered into OASys is likely to be "heavily influenced by systemic issues like biased policing and over-surveillance of certain communities". For example, "Black and other racialised individuals may be more frequently stopped, searched, arrested and charged due to structural inequalities in law enforcement.

"As a result, they may appear 'higher risk' in the system, not because of any greater actual risk, but because the data reflects these inequalities."

Computer Weekly contacted the MoJ about how the department is ensuring accuracy in its decision-making, given the sheer volume of algorithmic assessments it is making every day, but received no response on this point.

A spokesperson said that practitioners verify information and follow scoring guidance to ensure consistency.

While the second crime prediction tool is currently in development, the intention is to algorithmically identify those most at risk of committing murder using data held by different sources, such as the Probation Service and specific police forces involved in the project.

Statewatch says the types of information processed include names, dates of birth, gender and ethnicity, and a number that identifies people on the Police National Computer (PNC).

Originally called the "homicide prediction project", the initiative has since been renamed "sharing data to improve risk assessment", and could be used to profile convicted and non-convicted people alike.

According to a data-sharing agreement between the MoJ and Greater Manchester Police (GMP) obtained by Statewatch, for example, the types of data being shared can include the age a person had their first contact with the police, and the age at which the person was first a victim of violence.

Listed under "special categories of personal data", the agreement also envisages the sharing of "health markers which are expected to have significant predictive power".

This can include data related to mental health, addiction, suicide, vulnerability, self-harm and disability. Statewatch highlighted how data from people not convicted of any criminal offence will be used as part of the project.

In both cases, Statewatch says using data from "institutionally racist" organisations like police forces and the MoJ will only work to "reinforce and magnify" the structural discrimination that underpins the criminal justice system.

Time and again, research shows that algorithmic systems for 'predicting' crime are inherently flawed

Sofia Lyall, Statewatch

"The Ministry of Justice's attempt to build this murder prediction system is the latest chilling and dystopian example of the government's intent to develop so-called crime 'prediction' systems," said Statewatch researcher Sofia Lyall.

"Like other systems of its kind, it will code in bias towards racialised and low-income communities. Sensitive data on mental health, addiction and disability is highly intrusive and alarming."

Lyall added: "Time and again, research shows that algorithmic systems for 'predicting' crime are inherently flawed."

Statewatch also noted that Black people in particular are significantly over-represented in the data held by the MoJ, as are people of all ethnicities from more deprived areas.

Challenging inaccuracies

According to an official evaluation of the risk scores produced by OASys from 2015, the system has discrepancies in accuracy based on gender, age and ethnicity, with the risk scores generated being disproportionately less accurate for racialised people than for white people, and especially so for Black and mixed-race people.

"Relative predictive validity was greater for female than male offenders, for white offenders than offenders of Asian, black and mixed ethnicity, and for older than younger offenders," the evaluation found. "After controlling for differences in risk profiles, lower validity for all black, Asian and minority ethnic (BME) groups (non-violent reoffending) and black and mixed ethnicity offenders (violent reoffending) was the greatest concern."
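The article does not spell out how the 2015 evaluation measured "predictive validity", but a common approach is to compare the area under the ROC curve (AUC) of the risk scores within each demographic group. A minimal sketch of that comparison, using invented data and hypothetical column names:

```python
# Minimal sketch: comparing a risk score's discrimination across groups
# via per-group AUC. Data and column names are invented for illustration.
import pandas as pd
from sklearn.metrics import roc_auc_score

# Hypothetical assessment records: a risk score, the observed outcome
# (1 = proven reoffending during follow-up, 0 = not) and a group label.
df = pd.DataFrame({
    "risk_score": [0.2, 0.7, 0.4, 0.9, 0.3, 0.6, 0.8, 0.1],
    "reoffended": [0, 1, 0, 1, 1, 0, 1, 0],
    "group":      ["A", "A", "A", "A", "B", "B", "B", "B"],
})

# AUC of 0.5 means the score ranks people no better than chance;
# 1.0 means it separates reoffenders from non-reoffenders perfectly.
for group, sub in df.groupby("group"):
    auc = roc_auc_score(sub["reoffended"], sub["risk_score"])
    print(f"group {group}: AUC = {auc:.2f}")
```

A persistent AUC gap between groups – here 1.00 for group A against 0.75 for group B – is the kind of disparity in "relative predictive validity" the evaluation flagged.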

A number of prisoners affected by the OASys algorithm have challenged the information it holds about them. Several minoritised ethnic prisoners, for example, said their assessors entered a discriminatory and false "gangs" label in their OASys reports without evidence, a decision they say was based on racist assumptions.

Speaking with a researcher from the University of Birmingham about the impact of inaccurate data in OASys, another man serving a life sentence likened it to "a small snowball rolling down a hill".

The prisoner said: "Each turn it picks up more and more snow (inaccurate entries) until you are left with this massive snowball which bears no semblance to the original snowball. In other words, I no longer exist. I have become a construct of their imagination."

Narenthiran also described how, despite known issues with the system's accuracy, it is difficult to challenge any incorrect data contained in OASys reports: "To do this, I needed to formally dispute what was recorded in an OASys assessment, and it's a frustrating and often opaque process.

"In many cases, individuals are either unaware of what's been written about them or are not given meaningful opportunities to review and respond before it's finalised. Even when concerns are raised, they're frequently dismissed or ignored unless there is strong legal advocacy involved."

MoJ responds

While the murder prediction tool is still in development, Computer Weekly contacted the MoJ for further information about both systems – including what means of redress the department envisages people being able to use to challenge decisions made about them when, for example, information is inaccurate.

A spokesperson for the department said that continuous improvement, research and validation ensure the integrity and quality of these tools, and that ethical implications such as fairness and potential data biases are considered whenever new tools or research projects are developed.

They added that complaints about assessments can be escalated to the Prisons and Probation Ombudsman.

Regarding OASys, they added there are five risk predictor tools that make up the system, which are regularly revalidated to ensure they predict reoffending effectively.

Commenting on the murder prediction tool specifically, the MoJ said: "This project is being conducted for research purposes only. It has been designed using existing data held by HM Prison and Probation Service and police forces on convicted offenders to help us better understand the risk of people on probation going on to commit serious violence."

It added that the project aims to improve risk assessment of serious crime and keep the public safe through better analysis of existing crime and risk assessment data, and that while a specific prediction tool will not be developed for operational use, the findings of the project may inform future work on other tools.

The MoJ also insisted that only data about people with at least one criminal conviction has been used so far.

New digital tools

Despite serious concerns around the system, the MoJ continues to use OASys assessments across the prison and probation services. In response to Statewatch's FoI campaign, the MoJ confirmed that "the HMPPS Assess Risks, Needs and Strengths (ARNS) project is developing a new digital tool to replace the OASys tool".

An early prototype of the new system has been in the pilot phase since late December 2024, "with a view to a national roll-out in 2026". ARNS is "being built in-house by a team from [Ministry of] Justice Digital who are liaising with Capita, who currently provide technical support for OASys".

The government has also launched an "independent sentencing review" looking at how to "harness new technology to manage offenders outside prison", including the use of "predictive" and profiling risk assessment tools, as well as electronic tagging.

Statewatch has also called for a halt to the development of the crime prediction tool.

"Instead of throwing money towards developing dodgy and racist AI and algorithms, the government must invest in genuinely supportive welfare services. Making welfare cuts while investing in techno-solutionist 'quick fixes' will only further undermine people's safety and well-being," said Lyall.
