The UK Information Commissioner's Office (ICO) has launched an artificial intelligence (AI) and biometrics strategy, which the regulator says will promote public trust in these technologies while protecting people's rights.

Published on 5 June 2025, the strategy highlights how the ICO will focus its efforts on technology use cases where most of the risk is concentrated, but where there is also "significant potential" for public benefit.

This includes the use of automated decision-making (ADM) systems in recruitment and public services, the use of facial recognition by police forces, and the development of AI foundation models.

Specific actions outlined for these areas include conducting audits and producing guidance on the "lawful, fair and proportionate" use of facial recognition by police; setting "clear expectations" for how people's personal data can be used to train generative AI models; and developing a statutory code of practice for organisations' use of AI.

The regulator added it would also consult on updating its guidance on ADM and profiling, working specifically with early adopters such as the Department for Work and Pensions (DWP), and produce a horizon-scanning report on the implications of agentic AI that is increasingly capable of acting autonomously.

"The same data protection principles apply," said information commissioner John Edwards at the launch of the strategy. "Public trust is not threatened by new technologies themselves, but by reckless applications of these technologies outside of the necessary guardrails."

The strategy also outlined how – because "we consistently see public concern" around transparency and explainability, bias and discrimination, and rights and redress – these are the areas where the regulator will focus its efforts.

On AI models, for example, the ICO said it will "secure assurances" from developers around how they are using people's personal information so people are aware, while on police facial recognition, it said it will publish guidance clarifying how the technology can be deployed lawfully.

Police facial recognition systems will also be audited, with the findings published to assure people that the systems are being well governed and their rights are being protected.

Artificial intelligence is more than just a technology change – it is a societal change. But AI must work for everyone … and that involves putting fairness, openness and inclusion into the underpinnings

Dawn Butler, AI All Party Parliamentary Group

"Artificial intelligence is more than just a technology change – it is a societal change," said Dawn Butler, vice-chair of the AI All Party Parliamentary Group (APPG), at the strategy's launch. "But AI must work for everyone, not just a few people. And that involves putting fairness, openness and inclusion into the underpinnings."

Lord Clement-Jones, co-chair of the AI APPG, added: "The AI revolution must be founded on trust. Privacy, transparency and accountability are not obstacles to innovation – they are its foundation. AI is advancing rapidly, transitioning from generative models to autonomous systems, and this advance must not compromise public trust, individual rights or democratic principles."

According to the ICO, negative perceptions of how AI and biometrics are deployed risk hampering their uptake, noting that trust in particular is needed for people to support or engage with these technologies, or the services powered by them.

It noted that public concerns are particularly high when it comes to police use of biometrics, the use of automated algorithms by recruiters, and the use of AI to determine people's eligibility for welfare benefits.

"In 2024, just 8% of UK organisations reported using AI decision-making tools when processing personal information, and 7% reported using facial or biometric recognition – both up from the previous year," said the regulator.

"Our objective is to empower organisations to use these complex and evolving AI and biometric technologies in line with data protection law. This means people are protected and can have confidence in how organisations are using these technologies.

"However, we will not hesitate to use our formal powers to safeguard people if organisations are using personal information recklessly or seeking to avoid their responsibilities. By intervening proportionately, we will create a fairer playing field for compliant organisations and ensure robust protections for people."

In late May 2025, an analysis by the Ada Lovelace Institute found that "significant gaps and fragmentation" in the existing "patchwork" of governance frameworks for biometric surveillance technologies mean people's rights are not being adequately protected.

While the Ada Lovelace Institute's analysis focused primarily on deficiencies in UK policing's use of live facial recognition (LFR) technology – which it identified as the most prominent and highly governed biometric surveillance use case – it noted there is a need for legal clarity and effective governance for these technologies "across the board".

This includes other forms of biometrics, such as fingerprints for cashless payments in schools, or systems that claim to infer people's emotions or truthfulness, as well as other deployment scenarios, such as when supermarkets use LFR to identify shoplifters, or age verification systems to assure people's ages for alcohol purchases.

Both Parliament and civil society have made repeated calls for new legal frameworks to govern UK law enforcement's use of biometrics.

This includes three separate inquiries by the Lords Justice and Home Affairs Committee (JHAC) into shoplifting, police algorithms and police facial recognition; two of the UK's former biometrics commissioners, Paul Wiles and Fraser Sampson; an independent legal review by Matthew Ryder QC; the UK's Equality and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

However, while most of these focused purely on police biometrics, the Ryder Review in particular also took into account private sector uses of biometric data and technologies, including in public-private partnerships and for workplace monitoring.
