UK project aims to predict deadly violence using personal data and algorithms, drawing fierce criticism from privacy advocates

The British government is developing a controversial research initiative intended to forecast the likelihood of individuals committing homicide — using algorithmic analysis of data belonging to those already known to the criminal justice system.

The scheme, initially labelled the “homicide prediction project” but now called “sharing data to improve risk assessment”, is intended to enhance public safety through new data science techniques. However, critics have branded it “dystopian” and “deeply intrusive”.

The project was launched under the administration of former Prime Minister Rishi Sunak and draws on official records including information held by Greater Manchester Police before 2015 and data from the Probation Service.

Although the Ministry of Justice insists only individuals with at least one criminal conviction are being analysed, campaigners argue that the scope of the data suggests otherwise. Documents obtained through Freedom of Information requests and highlighted by civil liberties group Statewatch reveal that data from people without convictions — including victims of domestic abuse, those with mental health conditions, or people who have self-harmed — could be used in the model.

Personal details processed as part of the programme include age, ethnicity, gender, and police identification numbers. In addition, “special category” data involving mental illness, suicide risk, addiction, and disability is expected to be included due to its “significant predictive power”.

A key concern is that such sensitive data, when analysed through automated systems, could replicate and deepen existing social and racial inequalities. Sofia Lyall of Statewatch warned that the model risks entrenching systemic bias, especially against low-income and racially minoritised communities.

“The government’s attempt to create a tool for labelling future murderers is another dangerous step towards automated criminal profiling,” she said. “It’s not just flawed — it’s morally wrong.”

The Ministry of Justice has reiterated that the programme is still in the research phase, designed to evaluate whether data-driven tools could help identify risks posed by individuals already under probation supervision. A final report will be made public.

A spokesperson explained: “This is an analytical project using historical data about convicted offenders, aimed at improving how we assess the risk of future violence. It is not operational and exists solely to test whether new data sources can enhance current systems.”

At present, probation officers already use risk assessment tools. The new research seeks to determine whether integrating broader data inputs — such as police records and custody history — would lead to more effective violence prevention.

Nonetheless, the project has ignited debate about the balance between public protection and individual rights, particularly as artificial intelligence continues to influence public policy and law enforcement strategies.

Despite official assurances, concerns remain that the government’s reliance on opaque algorithms and sensitive personal records could open the door to discriminatory surveillance — targeting vulnerable individuals before any crime has occurred.
