Surveillance Camera Code of Practice

Proceeding contribution from Lord Rosser (Labour) in the House of Lords on Wednesday, 2 February 2022. It occurred during Debate on Surveillance Camera Code of Practice.

My Lords, I first congratulate the noble Lord, Lord Clement-Jones, on securing this debate. Obviously, all who have spoken deserve a response to the points they have raised, but I am particularly interested in what the reply will be to the noble Baroness, Lady Falkner of Margravine, who asked who was and who was not consulted and why. The point she made there most certainly deserves a response from the Government.

The Surveillance Camera Code of Practice was first published in June 2013 under provisions in the Protection of Freedoms Act 2012. It provides guidance on the appropriate use of surveillance camera systems by local authorities and the police. Under the 2012 Act these bodies

“must have regard to the code when exercising any functions to which the code relates”.

As has been said, the Government laid an updated code before both Houses on 16 November last year and, as I understand it, the code came into effect on 12 January this year. The Explanatory Memorandum indicates that changes were made mainly to reflect developments since the code was first published, including changes introduced by legislation such as the Data Protection Act 2018 and those arising from a Court of Appeal judgment on police use of live facial recognition issued in August 2020, which was the Bridges v South Wales Police case.

Reporting the month before last, our Secondary Legislation Scrutiny Committee commented that the revised code reflects the Court of Appeal judgment

“by restricting the use of live facial recognition to places where the police have reasonable grounds to expect someone on a watchlist to be”

and added that the technology

“cannot be used for ‘fishing expeditions’”.

The committee continued:

“The Code now requires that if there are no suggested facial matches with the watchlist, the biometric data of members of the public filmed incidentally in the process should be deleted immediately. Because the technology is new, the revised Code also emphasises the need to monitor its compliance with the public sector equality duty to ensure that the software does not contain unacceptable bias. We note that a variety of regulators are mentioned in the Code and urge the authorities always to make clear to whom a person who objects to the surveillance can complain.”

As the regret Motion suggests, there is disagreement on the extent to which the code forms part of a sufficient legal and ethical framework to regulate police use of facial recognition technology, whether it is compatible with human rights—including the right to respect for private life—and whether it can discriminate against people with certain protected characteristics. Interpretations of the Court of Appeal judgment’s implications for the continued use of facial recognition technology differ too.

As has been said, the use of facial recognition is a growing part of our everyday lives—within our personal lives, by the private sector and now by the state. It can be a significant tool in tackling crime but comes with clear risks, which is why equally clear safeguards are needed. It appears that our safeguards and understanding of and frameworks for this spreading and developing technology are largely being built in a piecemeal way in response to court cases, legislation and different initiatives over its use, rather than strategic planning from the Government. Parliament—in particular MPs but also Members of this House—has been calling for an updated framework for facial technology for some years, but it appears that what will now apply has finally come about because of the ruling on the Bridges v South Wales Police case, rather than from a government initiative.

The police have history on the use of data, with a High Court ruling in 2012 saying that the police were unlawfully processing facial images of innocent people. I hope the Government can give an assurance in reply that all those photos and data have now been removed.

While a regularly updated framework of principles is required, as legislation alone will struggle to keep up with technology, can the Government in their response nevertheless give details of what legislation currently governs the use and trials of facial recognition technology, and the extent to which the legislation was passed before the technology really existed?

On the updates made to the code, it is imperative that the technology is used proportionately and as a necessity. What will be accepted as “reasonable grounds” for the police to expect a person to be at an event or location in order to prevent fishing expeditions? As the Explanatory Memorandum states:

“The Court of Appeal found that there is a legal framework for its use, but that South Wales Police did not provide enough detail on the categories of people who could be on the watchlist, or the criteria for determining when to use it, and did not do enough to satisfy its public sector equality duty.”

Can the Government give some detail on how these issues have now been addressed?

A further area of concern is the apparent bias in this technology, including its failure properly to recognise people from black and minority-ethnic backgrounds and women. That is surely a significant flaw in technology that is meant to recognise members of our population. We are told that the guidance now covers:

“The need to comply with the public sector equality duty on an ongoing basis through equality impact assessments, doing as much as they can to ensure the software does not contain unacceptable bias, and ensuring that there is rigorous oversight of the algorithm’s statistical accuracy and demographic performance.”

What does that look like in practice? What is being done to take account of these issues in the design of the software and in the datasets used for training for its use? What does ongoing monitoring of its use and outcomes look like? The Secondary Legislation Scrutiny Committee raised the question of who a person should direct a complaint to if they object to the use of the technology, and how that will be communicated.

We have previously called for a detailed review of the use of this technology, including the process that police forces should follow to put facial recognition tools in place; the operational use of the technology at force level, taking into account specific considerations around how data is retained and stored, regulated, monitored and overseen in practice, how it is deleted and its effectiveness in achieving operational objectives; the proportionality of the technology’s use to the problems it seeks to solve; the level and rank required for sign-off; the engagement with the public and an explanation of the technology’s use; and the use of technology by authorities and operators other than the police.

What plans do the Government have to look at this issue in the round, as the code provides only general principles and little operational information? The Government previously said that the College of Policing has completed consultation on national guidance, which it intends to publish early this year, and that the national guidance is “to address the gaps”. Presumably these are the gaps in forces’ current published policies. What issues will the national guidance cover, and will it cover, with great clarity and in detail, the issues which we think a detailed review of the use of this technology should include and which I have just set out? Unfortunately, the Explanatory Memorandum suggests that neither the College of Policing national guidelines nor the updated code will do so or indeed are intended to do so.

About this proceeding contribution

Reference: 818 cc992-5
Session: 2021-22
Chamber / Committee: House of Lords chamber