
Online Safety Bill

My Lords, I rise to speak primarily to the amendments in the name of my noble friend Lord Clement-Jones, but I will also touch on Amendment 268AA. The amendments I am particularly interested in are Amendments 200 and 201, on regulatory co-operation. I strongly support the need for this, and I will illustrate why with some concrete examples, to bring to life the kinds of challenges that need to be dealt with.

The first example relates to tackling the sexual grooming of children online, where platforms are able to develop detection techniques. They can do this by analysing the behaviour of users, trying to detect whether older users are consistently approaching younger users and, where it is visible, examining the content of the messages being sent. These are clearly highly intrusive techniques. If a platform is subject to the general data protection regulation, or the UK version of it, it needs to be very mindful of privacy rights. We clearly have several potentially interested bodies in the UK environment: the child protection agencies; in future, Ofcom, seeking to ensure that the platform has met its duty of care; and the Information Commissioner’s Office.

A platform, in a sense, can be neutral as to what it is instructed to do by the regulator. Certainly, my experience was that the platforms wanted to carry out these kinds of activities, but they are neutral in the sense that they will do what they are told is legal. There, you need the regulators together to provide clarity, to say: “Yes, we have looked at this, and you are not going to do something on the instruction of the child safety agency and then get criticised, and potentially fined, by the Information Commissioner’s Office for doing the thing you have been instructed to do”. So we need those agencies to work together.

The second example is in the area of co-operation around anti-terrorism, another key issue. The platforms have created something called the Global Internet Forum to Counter Terrorism. Within that forum, they share tools and techniques—things such as databases of information about terrorist content and systems that can be used to detect it—and members are encouraged to share those tools and techniques with smaller platforms and competitors. Clearly, again, there is a very significant set of questions there, and if you are in a discussion around that, the lawyers will say, “Have the competition lawyers cleared this?” Again, therefore, something that is in the public interest—that all the platforms should be using similar kinds of technology to detect terrorist content—is something where you need a view not just from the counterterrorism people but also, in our case, from the Competition and Markets Authority. So, again, you need those regulators to work together.

The final example is one which I know is dear to the heart of the noble Baroness, Lady Morgan of Cotes: fraudsters, whom we have dealt with already. Here you might have patterns of behaviour where relevant information comes from the telecoms companies, regulated by Ofcom; the internet service providers, also regulated by Ofcom; and the financial institutions, regulated by their own family of regulators. They may want to share data with each other, which is again subject to the Information Commissioner’s Office. So if we are going to give platforms instructions, as we rightly do in this legislation, and say, “Look, we want you to get tougher on online fraudsters; we want you to demonstrate a duty of care there”, the platforms will need those regulators—the financial regulators, Ofcom and the Information Commissioner’s Office—to sort those things out between them.

Having a forum such as the one proposed in Amendment 201, where these really difficult issues can be thrashed out and clear guidance can be given to online services, will be much more efficient than what sometimes happened in the past, where the left hand and the right hand of the regulatory world pulled you in different directions. I know that we have the Digital Regulation Cooperation Forum, and if we can build on such institutions, it is essential that they have their input before guidance is issued, rather than have a platform comply with guidance from regulator A and then get dinged by regulator B for doing the thing it has been instructed to do.

That leads to the very sensible Amendment 200 on skilled persons. Again, Ofcom is going to be able to call in skilled persons. In an area such as data protection, that might be a data protection lawyer but, equally, it might be that somebody who works at the Information Commissioner’s Office is actually best placed to give advice. Amendment 200—the first of the two—in providing for skilled persons to come from the regulators, makes sense.

Finally, I will touch on the issues raised in Amendment 268AA. I listened carefully and understand that it is a probing amendment. It raises some quite fundamental questions of principle—I suspect that the noble Baroness, Lady Fox, might want to come in on these—and it has been dealt with in the context of Germany and its Network Enforcement Act: I know the noble Lord, Lord Parkinson of Whitley Bay, can say that in the original German. That Act went in the same direction, motivated by similar concerns around hate speech.

5.30 pm

This raises some fundamental questions about what we want from privacy law and what we want in terms of criminal prosecutions. There is a spectrum of offences, and for some I think we have accepted that platforms should report: on child sexual abuse material, platforms have a duty to report every instance to the regulator. When it comes to threats to life, the expectation would also be clear: if you have knowledge—this happens—of an imminent terrorist attack, or even of somebody who is about to commit suicide, it is clear that you should go to the police or the relevant authorities with that information. Then you have this broad spectrum of other criminal offences, which may be problematic. I do not want to minimise the effect on people of hate speech crimes, but they are of a different order, shall we say, from threat-to-life cases, where I think reporting is broadly supported. We have to make a decision there.

My starting point is to be nervous about platforms acting in that policing capacity for offences that are not at the most extreme end of the spectrum. Individuals who are worried about that activity can go to the police directly themselves and can generally take the content with them—literally; they can print it off—and the police can make a judgment about whether to go to the Crown Prosecution Service. I worry about the platforms doing it, partly from a constitutional point of view, because I am not sure that I want them acting in that quasi-legal capacity, but also, frankly, from a volume point of view. The risk is that, if you put this duty on a platform, because it is really hard to distinguish criminal hate speech from speech that is merely hateful, the temptation will be to send everything over. If you do that, first, you have a greater violation of privacy and, secondly, you probably have not helped the police, because they get swamped with reports that they cannot manage.

I hope that is a helpful counterargument to the idea that platforms should automatically report material. However, I recognise that it leaves an open question. When people engage in that kind of behaviour online and it has serious real-world consequences, how do we make sure that they do not feel it is consequence-free—that they understand there are consequences? If they have broken the law, they should be prosecuted. There may be something in streamlining the process by which a complainant goes to the police and the police, having first assessed that the content is illegal and worth prosecuting, are able to access the information they need. We should make that loop work before we head in the direction of having platforms report content en masse because they believe it may have violated laws, where we are not at the most serious end of the spectrum.

About this proceeding contribution

Reference: 830 cc1079-1081
Session: 2022-23
Chamber/Committee: House of Lords chamber