My Lords, it is a great pleasure to move Amendment 23. I will speak also to Amendments 28 and 62, in the name of the noble Baroness, Lady Grey-Thompson. I am glad to say that she will speak later in our debate.
The amendments are based on research by the LSE, which found that during lockdown, abuse by current partners, as well as by family members, increased on average by 8.1% and 17.1% respectively, whereas abuse by ex-partners declined by 11.4%. This increase in domestic abuse calls is driven by third-party reporting, which suggests that there is significant underreporting by actual victims, particularly in households where the abuse cannot be reported by an outsider.
An analysis of more than 16,000 cases of domestic violence perpetrated by one individual against another showed that the current predictive system failed to classify as high risk more than 1,700 situations that subsequently saw a repeat attack: a negative prediction rate of 11.5%.
The LSE research found that by utilising technology, through machine-learning methods, or AI, this negative prediction rate could be cut to between 7.3% and 8.7%. In England, domestic violence accounts for one-third of all assaults involving injury. A crucial part of tackling this abuse is risk assessment—determining what level of danger someone may be in so that they can receive help as quickly as possible. This means prioritising police resources in responding to domestic abuse calls accordingly.
This risk assessment is currently done through a standardised list of questions put to the victim by the responding officer, combined with the officer’s own professional assessment of the case. The DASH—domestic abuse, stalking, harassment and honour-based violence—form consists of around 28 questions used to categorise the case as standard, medium or high risk. If a case is assessed as high risk, this suggests that an incident of serious harm could occur at any time, and this triggers resources aimed at keeping the victim safe. However, the DASH data is available only after an officer has appeared on the scene.
The research shows striking inconsistencies in DASH across the country. In 2014, HMIC found that 10 police forces classified fewer than 10% of domestic abuse cases as high risk, while three forces designated over 80% as high risk. This vast deviation casts serious doubt on the accuracy of current predictive methods.
A recent report from Her Majesty’s Inspectorate of Constabulary and Fire & Rescue Services reveals concern that the police are sometimes too slow in getting to domestic abuse incidents and that there were delays in responding to cases in over a quarter of forces. The inspectorate also found that, in a small number of cases, the delays are because the forces do not have enough officers available to attend.
LSE data analysis compared the predictive power of conventional DASH risk assessments with risk assessments using a machine-learning approach. It applied the different prediction models to calls to Greater Manchester Police between 2014 and 2018, and compared predictions made, case by case, to actual violent recidivism over a period of 12 months from the initial call. When tested against the sample data, the predictive power of risk assessments from the conventional DASH method is low; a machine-learning prediction based on the underlying data from the DASH questionnaire performs better; while a machine-learning prediction based on two-year criminal histories of victim and perpetrator performs much better still.
The researchers—Professor Tom Kirchmaier, Professor Jeffrey Grogger and Dr Ria Ivandic—therefore suggest that police forces should use machine-learning predictions based on two-year criminal histories, rather than DASH, to make risk assessments and prioritise responses to domestic violence calls.
Vitally, the research also found that by improving the data compiled during the investigation of domestic violence cases, to include details such as previous criminal convictions, incidents of violence, and the number of previous reports of domestic abuse, the negative prediction rate could be cut further to 6.1%. Up to 1,200 repeat attacks missed under the current system would have been identified.
We all know that there is a real problem with the use of data by the police. The Royal United Services Institute, in a report last year, identified some of the issues facing police forces in the use of data. It reported that in recent years, police use of algorithms has expanded significantly in scale and complexity. It argued that this was driven by three closely related factors. First, a significant increase in the volume and complexity of digital data has necessitated the use of more sophisticated analysis tools. Secondly, ongoing austerity measures have resulted in a perceived need to allocate limited resources more efficiently, based on a data-driven assessment of risk and demand. Thirdly, the police service is increasingly expected to adopt a preventive rather than reactive posture, with greater emphasis on anticipating potential harm before it occurs.
3 pm
But—and here is the “but”—interviewees highlighted the lack of an evidence base, poor data quality and insufficient skills and expertise as the three major barriers to successful implementation. In particular, the development of policing algorithms is often not underpinned by a robust empirical evidence base regarding their claimed benefits, scientific validity or cost-effectiveness. In this case, we do have evidence. The Minister will obviously know the Greater Manchester police force well. I hope she might be able to look at this to see how far it could be extended to other forces and encourage best practice.
The failure to use data effectively is also at the heart of Amendment 62, to which I have also put my name. At a briefing last week for noble Lords, LSE researchers noted that domestic abuse prevention notices will be an important and much-appreciated new tool in the fight against domestic abuse. However, when an officer is considering handing one out, having access to the criminal history of the alleged perpetrator should be a crucial aspect of their decision-making. As mentioned, more than one in 10 people who report domestic abuse will call again within a year to report a repeat violent attack, and that is only one aspect of the kaleidoscope of past violence and abuse that may be known to the police but not necessarily utilised, or even known, by an officer attending a case of reported abuse.
One key challenge that we have to overcome is that police forces do not currently have a systematic way of identifying the same person, whether victim or perpetrator, across records. This means that repeat victims or perpetrators are often not spotted, and no action is taken to protect against and prevent abuse. Forces rely on the correct spelling of the full name and date of birth to access records, but data entry can be incorrect or incomplete. Thames Valley Police has taken positive steps to address this issue, and it could serve as a case study example that others could follow.
However, we know that police forces do not share data systematically, apart from via the Police National Computer, which records only charges. This calls into question the full effectiveness of Clare’s law and police forces’ ability to give full information to potential victims about known abusers and, in the process, to prevent future abuse. The system can clearly be improved. Enshrining in legislation the ability for the police to use previous criminal records to determine whether to hand out a notice could be an important prompt to improve data sharing and, in doing so, save lives.
I appreciate that this is rather technical but, given the current failures in the system, we need to use all the ammunition we can. I hope the Minister might be able to respond sympathetically to the amendment. I beg to move.