Police, Crime, Sentencing and Courts Bill

I hope Hansard does not repeat that.

Artificial intelligence has huge potential benefits and raises huge concerns, and it does not pre-empt the work of the committee to refer to them this afternoon. For instance, collaboration between authorities—Part 2 of the Bill—requires the sharing of information. Will this contribute to profiling and predictive policing? Predictive policing algorithms identify likely crime hot spots; officers are deployed there, so more stop and search takes place and more crime is recorded. It is a feedback loop, a self-fulfilling prophecy, which can teach the algorithm to alert the user to particular geographical areas, communities and ethnicities. It has been put to the committee that it is important to involve, at a very early stage of the process and in a meaningful way, members of the communities likely to be at the sharp end of these algorithms, and not to leave it to people such as the witness or me—a white, middle-class, university-educated person, who is unlikely, one hopes, to be profiled as a future risk—because even with the best will in the world, we might not spot some of these problems and risks. A tick-box exercise is not enough.

Trust in systems translates to trust in authorities and in government itself—or, of course, the converse. The Bill permits the disclosure of data, but who owns it? What consents are required? Who knows about the disclosure? We all expect some information—for instance, that shared between us and our medical professionals—to remain confidential. Transparency is important at an individual level, as well as more broadly. A defendant, or indeed someone merely questioned, will find it difficult to establish what technology—what combination of facial recognition, number plate recognition and predictive techniques—has led to his being identified as a suspect. If he cannot identify it, he cannot challenge it. How are we to ensure governance, regulation, accountability and scrutiny on an ongoing basis in the case of machine learning?

The technology has to be procured, and it will be procured from the private sector, whose interests are not the same as the public sector’s, and it is differently regulated, if at all. How can we be sure that purchasing authorities in the public sector understand what they are procuring? In the US, some police departments accepted a free trial of body-worn cameras, but they came with an obligation to be part of the manufacturer’s data ecosystem, including an obligation to use that company’s software and store data on its servers.

It is said that we need “human override”, but humans can get it wrong too. Human operators need to understand the limitations of a particular technology to avoid over-reliance on it or misinterpretation of its output; they need to retain their critical faculties.

These issues apply to identification, the extraction of information from electronic devices, monitoring and more that is in the Bill. They are the context for the development of policing and sentencing, such as the new cautions; for scrutiny, both general and in particular cases; and for our assessment of ethical considerations. We should be clear about the principles to be applied. The National Audit Office has just reported on the national law enforcement data programme—from a value-for-money point of view, of course, but there are other costs. The NAO mentioned, as I have, trust and the cost of damaging it. AI affects society, communities, democracy and individual rights. We must be clear about what we are doing and why.

5.38 pm

About this proceeding contribution

Reference: 814 cc1310-1
Session: 2021-22
Chamber / Committee: House of Lords chamber