
Online Safety Bill

My Lords, this is a very large and wide-ranging group of amendments. Within it, I have a number of amendments that, on their own, span three separate subjects. I propose to address these one after the other in my opening remarks, but other subjects will be brought in as the debate continues and other noble Lords speak to their own amendments.

If I split the amendments that I am speaking to into three groups, the first is Amendments 17 and 18. These relate to Clause 9, on page 7, where safety duties about illegal content are set out. The first of those amendments addresses the obligation to prevent individuals from encountering priority illegal content by means of the service.

Earlier this week in Committee, I asked the Minister whether the Government understood “prevent” and “protect”, both of which they use in the legislation, to have different weight. I did not expect my noble friend to give an answer at that point, but I know that he will have reflected on it. We need clarity about this at some point, because courts will be looking at, listening to and reading what the Government say at the Dispatch Box about the weight to be given to these words. To my mind, to prevent something happening requires active measures taken in advance to ensure, as far as is reasonably and humanly possible, that it does not happen; to protect someone from something happening could be a more reactive matter.

This distinction is of great importance to internet companies—I am not talking about the big platforms—which will be placed, as I say repeatedly, under very heavy burdens by the Bill. It is possible that they simply will not be able to discharge them and will have to go out of business.

Let us take Wikipedia, which was mentioned earlier in Committee. It operates in 300 languages but employs 700 moderators globally to check what is happening. If it is required by Clause 9 to

“prevent individuals from encountering priority illegal content by means of the service”,

it will have to scrutinise what is put up on this community-driven website as, or before, it appears. Quite clearly, something such as Welsh Wikipedia—there is Wikipedia in Welsh—simply would not get off the ground if it had to meet that standard, because the number of people who would have to be employed to do that would be far more than the service could sustain. However, if we had something closer to the wording I suggest in my amendment, where services have to take steps to “protect” people—so that they could react to something and take it down when they become aware of it—it would all become a great deal more tolerable.

Similarly, Amendment 18 addresses subsection (3) of the same clause, where there is a

“duty to operate a service using proportionate systems and processes … to … minimise the length of time”

for which content is present. How do you know whether you are minimising the length of time? How is that to be judged? What is the standard by which that is to be measured? Would it not be a great deal better and more achievable if the wording I propose, which is simply that you are under an obligation to take it down, were inserted? That is my first group of amendments. I put that to my noble friend and say that all these amendments are probing to some extent at this stage. I would like to hear how he thinks that this can actually be operated.

My second group is quite small, because it contains only Amendment 135. Here I am grateful to the charity JUSTICE for its help in drawing attention to this issue. This amendment deals with Schedule 7, on page 202, where the priority offences are set out. Paragraph 4 of the schedule says that a priority offence includes:

“An offence under any of the following provisions of the Public Order Act 1986”.

One of those is Section 5 of that Act, “Harassment, alarm or distress”. Here I make a very different point and return to territory I have been familiar with in the past. We debated this only yesterday in Grand Committee, although I personally was unable to be there: the whole territory of hate crimes, harmful and upsetting words, and how they are to be judged and dealt with. In this case, my amendment would remove Section 5 of the Public Order Act from the list of priority offences.

If society has enough problems tolerating the police going round and telling us when we have done or said harmful and hurtful things and upbraiding us for it, is it really possible to consider—without the widest form of censorship—that it is appropriate for internet platforms to judge us, shut us down and shut down our communications on the basis of their judgment of what we should be allowed to say? We already know that there is widespread suspicion that some internet platforms are too quick to close down, for example, gender critical speech. We seem to be giving them something close to a legislative mandate to be very trigger-happy when it comes to closing down speech by saying that it engages, or could engage, Section 5 of the Public Order Act. I will come to the question of how they judge it in my third group, in a moment—but the noble Lord might be able to help me.

About this proceeding contribution

Reference: 829 cc1338-1340
Session: 2022-23
Chamber / Committee: House of Lords chamber