The Challenge of AI and Algorithms in Policing

In April 2025, the BBC published a story about Lina, who was murdered on 9 February 2025 by her ex-partner. Only a week before her death, Lina had reported her ex-partner’s threats of violence to the police.

As part of her report, Lina was assessed by VioGen, an algorithmic risk-assessment system used by the Spanish government to prevent, monitor, and manage gender-based violence. VioGen categorised Lina as being at ‘medium’ risk. As a result, she was sent home, with an officer scheduled to follow up with her in 30 days.

In this instance, the algorithm failed to assess the danger correctly, and the result was fatal.

Algorithmic Law Enforcement

Lina’s story raises questions over the role that algorithms and artificial intelligence (AI) play in investigating crime. On one hand, replicating the experience and decision-making skills of a trained police officer is a significant challenge. On the other hand, experienced officers are becoming increasingly difficult to find.

With stretched budgets and workforce shortages, AI is becoming more prominent within law enforcement as a way to alleviate pressure on teams and drive efficiency. These systems are only as reliable as the data they are fed: the information must be accurate, complete, and drawn from enough similar past cases for the underlying patterns to be learned.
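
To make that data dependence concrete, the sketch below shows the kind of sanity check a team might run before trusting a trained risk model. It is a minimal, hypothetical Python example; the field names, categories, and threshold are assumptions made for illustration, not the workings of VioGen or any specific tool.

```python
from collections import Counter

MIN_EXAMPLES_PER_CATEGORY = 500  # assumed threshold, purely illustrative

def training_data_is_adequate(cases):
    """cases: list of dicts like {"category": "domestic", "features": {...}, "outcome": 0 or 1}."""
    # A case with missing features or no recorded outcome cannot teach the model anything.
    complete = [c for c in cases if c.get("features") and c.get("outcome") is not None]
    # Each category also needs enough similar past examples to generalise from.
    counts = Counter(c["category"] for c in complete)
    sparse = {cat: n for cat, n in counts.items() if n < MIN_EXAMPLES_PER_CATEGORY}
    return len(complete) == len(cases) and not sparse, sparse
```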

The Challenge of AI in Policing

Spain is not the only nation utilising artificial intelligence to streamline its processes. The UK uses the Domestic Abuse, Stalking, and Honour-Based Violence risk assessment tool (DASH), and Canada has the Ontario Domestic Assault Risk Assessment (ODARA). Meanwhile, Non-Governmental Organisations (NGOs) are also using algorithms to address gender-based violence. UNDP is using AI to tackle the issue in Central America, and UN Women is utilising the technology to support survivors of sexual violence in Thailand.

These systems play a range of roles, including data integration, risk assessment, case management, and real-time alerts. In many cases, they are proving valuable. However, they cannot be 100% effective from the outset, and agencies should not expect them to be. Because these technologies use machine learning, they will continue to become more reliable over time. Until then, police must keep combining human judgment with the support the technology provides.
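
As a rough illustration of what combining human judgment with an algorithmic score can look like, here is a small, hypothetical Python sketch in which the tool never lowers an officer’s assessment and a recorded officer concern escalates the result. The levels and the rule are assumptions for this example only; they do not describe VioGen, DASH, or ODARA.

```python
RISK_LEVELS = ["low", "medium", "high", "extreme"]

def combined_risk(algorithm_level, officer_level, officer_concern):
    # Take the higher of the two assessments; the tool never silently overrules the officer.
    level = max(algorithm_level, officer_level, key=RISK_LEVELS.index)
    # A specific officer concern escalates anything below "high".
    if officer_concern and RISK_LEVELS.index(level) < RISK_LEVELS.index("high"):
        level = "high"
    return level

# e.g. combined_risk("medium", "low", officer_concern=True) -> "high"
```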

A Complicated Task

At Focus Data, we have been testing AI to detect murder suspects from victims’ call records. It is not as simple as typing a question into a large language model (LLM) such as ChatGPT; we are dealing with real, complex cases. Using Call Data Records overcomes one of the basic challenges: collecting sufficient data in a consistent and accurate format. However, there is still the challenge of obtaining enough similar training examples, e.g. domestic, gang, drug, or contract murders. Just as in the rest of policing, asking the right question is critical. It is a difficult but promising task, and we’re making careful progress.
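
By way of illustration only, the sketch below shows how simple per-contact features might be derived from a victim’s Call Data Records ahead of any ranking or modelling. The field names and features are assumptions made for this example; they do not describe Focus Data’s actual pipeline.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def contact_features(cdrs, incident_time: datetime, window_days: int = 30):
    """cdrs: iterable of dicts like {"other_party": "...", "start": datetime, "duration_s": int}."""
    window_start = incident_time - timedelta(days=window_days)
    stats = defaultdict(lambda: {"calls": 0, "total_s": 0, "last_contact": None})
    for record in cdrs:
        if window_start <= record["start"] <= incident_time:
            s = stats[record["other_party"]]
            s["calls"] += 1
            s["total_s"] += record["duration_s"]
            if s["last_contact"] is None or record["start"] > s["last_contact"]:
                s["last_contact"] = record["start"]
    # Per-contact features: call frequency, total talk time, and recency before the incident.
    return {
        party: {
            "calls": s["calls"],
            "total_s": s["total_s"],
            "hours_before_incident": (incident_time - s["last_contact"]).total_seconds() / 3600,
        }
        for party, s in stats.items()
    }
```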

Interested in understanding how artificial intelligence can help catch criminals and save lives? Get in touch today.
