My Lords, I added my name to speak to this group, primarily in support of Amendment 23. I, too, declare my vice-presidency of the Local Government Association. This matter has been magisterially covered by the noble Lord, Lord Hunt of Kings Heath, so anything I say will be a mere shadow of what he and the other speakers have put down.
I, too, received the briefings, both before Second Reading and more recently, from the London School of Economics. I pay great tribute to it for having brought that matter to the attention of Members of this House. At Second Reading, I and other noble Lords—in particular the noble Lord, Lord Dholakia, who has just spoken—commented on the failure of crime recording to pick up many cases, particularly cases of domestic abuse. In defence of those who are charged with the recording of suspected crimes, such cases, especially domestic abuse, are often difficult to identify in the snowstorm of all the other issues that may be involved. Indeed, domestic abuse may not be the primary purpose of the initiating call to the police or some other agency.
Professor Gadd of the University of Manchester, to whom I had the privilege of speaking last week, suggested to me that we need to be much more curious in our responses to crime, and in particular possible abuse. Complex patterns of behaviour and the way in which they manifest themselves are meat and drink to data analysts. It seems to me that if big tech companies can build up accurate pictures of all our various spending preferences and other things, so too can algorithms help us spot and codify trends of abuse.
I do not claim expertise in artificial intelligence, but I know about the need for accurate input data and, of course, we have had problems with police recorded crime. This obviously has not been helped by failures to record offences in, I would say, several police forces over quite a number of years and, of course, the recent loss of data from the police national computer. Even so, the negative prediction rate of 11.5%, which the noble Lord, Lord Hunt, referred to and which the LSE comments on, must be a matter for some significant concern, given the proportion of all crime that domestic violence, and repeated instances of it, represents. Any machine-learned means of reducing this, and with it the tragic outcomes that cost this country so much in torment and treasure, must have a place. That is why I support this group of amendments, and Amendment 23 in particular.
However, collecting all the data in the world, as has been pointed out, is not going to be a great deal of use if it is not consistently collated, made available at the right time and shared with people who have a need to see it at the appropriate moment. The sort of checklists that have been referred to under the DASH system—a number of standard questions, consistently recorded, collated and available at the earliest possible stages of a proposed intervention—would, I am certain, be invaluable. There, I am satisfied that technology can help. I do not think that this requires reinvention but better management, oversight and adoption of appropriate IT systems. This would help reduce human errors and omissions. Above all, it is about avoiding unnecessary risk and optimising resources, as has been pointed out. This necessitates good training of call handlers and, as I say, being altogether more inquisitive and interrogative of data and callers to see what is actually lying behind the call. Otherwise, I do not think that we will make the best use of what IT offers. That apart, I believe that these amendments are extremely important in pointing a way forward.