UK Parliament / Open data

Police, Crime, Sentencing and Courts Bill

My Lords, changing the subject, the Data Protection Act 2018, reflecting the GDPR, in Section 14 provides that “decisions based solely”—solely—“on automated processing” are “subject to safeguards.” Such a decision

“produces legal effects concerning the data subject, or … similarly significantly affects the data subject.”

The decisions are subject to safeguards under the Act: notification of the data subject and the right of the data subject to request reconsideration or, importantly, a new decision not based on automated processing. Noble Lords will appreciate the potential importance of decisions affecting liberty, and that the use of artificial intelligence may well involve profiling, which does not have an unblemished record.

This amendment would alter the term “solely”, because “solely” could mean one click on a programme. The term “significantly”, proposed in the amendment, is not the best, but I think it will serve the purpose for this evening. I do not claim that this is the best way to achieve my objective, but I did not want to let the moment pass. The Justice and Home Affairs Committee—I am not speaking as its chair—has had this issue raised a number of times. The Information Commissioner is one who has raised the issue. Elizabeth Denham, before she left the office, said it should not just be a matter of box-ticking. The guidance of the Information Commissioner’s Office provides that there should be the following three considerations:

“Human reviewers must be involved in checking the system’s recommendation and should not just apply the automated recommendation to an individual in a routine fashion; reviewers’ involvement must be active and not just a token gesture. They should have actual ‘meaningful’ influence on the decision, including the ‘authority and competence’ to go against the recommendation; and reviewers must ‘weigh-up’ and ‘interpret’ the recommendation, consider all available input data, and also take into account other additional factors.”

The Minister will, I am sure, refer to the current government consultation on data, Data: A New Direction, published in September. We dealt with this issue by putting the amendment down before then but, even so, the consultation questions the operation and efficacy of Article 22 of the GDPR, which, as I said, is the basis for Section 14. I appreciate that the consultation will have to run its course but, looking at it, the Government seem very focused on the economic benefits of the use of data and supportive of innovation.

Of course, I do not take issue with either of those things, but it is important not to lose sight of how the use of data may disadvantage or damage an individual. Its use in policing and criminal justice can affect an individual who may well not understand how it is being used, or even that it has been used. I was going to say that whether those who use it understand it is another matter but, actually, it is fundamental. Training is a big issue in this, as is, in the case of the police, the seniority and experience of the officer who needs to be able to interpret and challenge what comes out of an algorithm. There is a human tendency to think that a machine must be right. It may be, but meaningful decisions require human thought more than an automatic, routine confirmation of what a machine tells us.

The government consultation makes it clear that the Government are seeking evidence on the potential need for legislative reform. I think that reform of Section 14 is needed. AI is so often black-box and impenetrable; even if it can be interrogated on how a decision has been arrived at, the practicalities and costs of that are substantial. For instance, it should be straightforward for someone accused of something to understand how the accusation came to be made. It is a matter of both the individual’s rights and trust and confidence in policing and criminal justice on the part of the public. The amendment would extend the information to be provided to the data subject to include

“information … regarding the operation of the automated processing and the basis of the decision”.

It also states that this should not be “limited by commercial confidentiality”; I think noble Lords will be familiar with how openness can run up against this.

Recently, the Home Secretary told the Justice and Home Affairs Committee twice that

“decisions about people will always be made by people.”

The legislation should reflect and require the spirit of that. A click of a button on a screen may technically mean that the decision has a human element, but it is not what most people would understand or expect. I beg to move.

About this proceeding contribution

Reference: 816 cc693-5
Session: 2021-22
Chamber / Committee: House of Lords chamber