
Data Protection Bill [HL]

My Lords, I rather hope that the Minister has not been able to persuade noble Lords opposite. Certainly, I have not felt myself persuaded. First, on the point about “solely”, in recruiting these days, when big companies need to reduce a couple of thousand applications to 100, the general practice is that you put everything into an automated process—you do not really know how it works—get a set of scores at the end and decide where the boundary lies according to how much time you have to interview people. Therefore, there is human intervention—of course there is. You are looking at the output and making the decision about who gets interviewed and who does not. That is a human decision, but it is based on the data coming out of the algorithm, without any understanding of the algorithm. It is easy for an algorithm to be racist. I just googled “pictures of Europeans”. You get a page of black faces. Somewhere in the Google algorithm, a bit of compensation is going on. With a big algorithm like that, they have not checked what the result of that search would be, but it comes out that way. At various times in the past, it has been equally possible to carry out searches that were similarly off-beam with regard to other groups in society.
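To make the shape of that process concrete, here is a minimal Python sketch of the shortlisting just described, in which an opaque model supplies the whole ordering and the only human input is where to draw the boundary. All of the names here (shortlist, score, interview_slots, the toy scoring function) are invented for illustration.

```python
def shortlist(applications, score, interview_slots):
    """Rank applications by an opaque model's score and keep the top N.

    The ordering comes entirely from the model; the human 'decision'
    is only where the boundary lies, i.e. how many people to interview.
    """
    ranked = sorted(applications, key=score, reverse=True)
    return ranked[:interview_slots]

# Toy demonstration with a stand-in scoring function.
applicants = ["alice", "bob", "carol", "dave"]
print(shortlist(applicants, score=len, interview_slots=2))  # ['alice', 'carol']
```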

When you build an algorithm to work with applications, you start off, perhaps, by asking, “Who succeeds in my company now? What are their characteristics?”. Then you go through and say, “You are not allowed to look at whether the person is a man or a woman, or black or white”, but perhaps you are measuring other things that vary with those characteristics and which you have not noticed, or some combination of them. An AI algorithm can be entirely unmappable. It is just a learning algorithm; there is no mental process that a human can track. It just learns from what is there. It says, “Give me a lot of data about your employees and how successful they are and I will find you people like that”.
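A minimal sketch, on synthetic data, of the proxy effect described above: the protected attribute is never shown to the model, yet a correlated feature carries the same signal and the historical disparity is reproduced. The feature names are hypothetical and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Protected attribute (never shown to the model).
group = rng.integers(0, 2, n)

# A feature that correlates strongly with group membership - a proxy.
postcode_band = group + rng.normal(0, 0.3, n)

# Historical 'success' labels that reflect past bias against group 1.
success = (rng.random(n) < np.where(group == 0, 0.7, 0.3)).astype(int)

# Train only on the proxy feature; the protected attribute is absent.
X = postcode_band.reshape(-1, 1)
model = LogisticRegression().fit(X, success)

pred = model.predict(X)
for g in (0, 1):
    rate = pred[group == g].mean()
    print(f"group {g}: predicted-success rate {rate:.2f}")
# The model reproduces the historical disparity without ever seeing 'group'.
```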

At the end of the day, you need to be able to test these algorithms. The Minister may remember that I posed that challenge in a previous amendment to a previous Bill. I was told then that a report was coming out from the Royal Society that would look at how we should set about testing algorithms. I have not seen that report, but has the Minister seen it? Does he know when it is coming out or what lines of thinking the Royal Society is developing? We absolutely need something practical so that when I apply for a job and I think I have been hard done by, I have some way to do something about it. Somebody has to be able to test the algorithm. As a private individual, how do you get that done? How do you test a recruitment algorithm? Are you allowed to invent 100 fictitious characters to put through the system, or should the state take an interest in this and audit it?
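One practical form such a test could take is paired audit testing: submit otherwise-identical fictitious applications that differ only in one characteristic that ought to be irrelevant, and compare the outcomes. In this sketch, score_cv is a hypothetical stand-in for the recruiter's opaque system; every name and threshold here is illustrative, not a real interface.

```python
def paired_audit(score_cv, base_cv, attribute, values, threshold=0.05):
    """Score otherwise-identical applications that differ in one attribute."""
    scores = {}
    for value in values:
        cv = dict(base_cv, **{attribute: value})  # copy with one field changed
        scores[value] = score_cv(cv)
    spread = max(scores.values()) - min(scores.values())
    return scores, spread > threshold  # True suggests disparate treatment

def biased_system(cv):
    # Deliberately biased stand-in for the real, opaque recruitment system.
    return 0.8 if cv["name"] == "James" else 0.6

scores, flagged = paired_audit(biased_system,
                               base_cv={"degree": "BSc", "name": None},
                               attribute="name",
                               values=["James", "Jamal"])
print(scores, flagged)  # {'James': 0.8, 'Jamal': 0.6} True
```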

We have made so much effort in my lifetime, and we have got so much better at being equal—of course, we still have a fair way to go—continually doing our best to make things better with regard to discrimination. It is therefore important that we do not allow ourselves to go backwards because we do not understand what is going on inside a computer. So absolutely, there has to be significant human involvement for it to be regarded as a human decision. Generally, where there is not, there has to be a way to get a human challenge—a proper human review—not just the response, “We are sure that the system worked right”. There has to be a route of challenge which is not itself discriminatory, in which something is actually examined to see whether it is working and whether it has gone right. We should not allow unexamined automation into bits of the system that affect the way we interact with each other in society. Therefore, it is important that we pursue this, and I very much hope that noble Lords opposite will give us another chance to look at this area when we come to Report.

About this proceeding contribution

Reference: 785 cc1873-4
Session: 2017-19
Chamber / Committee: House of Lords chamber