Online Safety Bill

My Lords, it is a great pleasure to follow the noble Lord, Lord Kamall, who explained well why I put my name to the amendments. I extend my regards to the noble Baroness, Lady Featherstone; I was looking forward to hearing her remarks, and I hope that she is well.

I am interested in free speech; it is sort of my thing. I am interested in how we can achieve a balance and enhance the free speech rights of the citizens of this country through the Bill—it is what I have tried to do with the amendments I have supported—which I fear might be undermined by it.

I have a number of amendments in this group. Amendment 49 and the consequential Amendments 50 and 156 would require providers to include in their terms of service

“by what method content present on the service is to be identified as content of democratic importance”,

and bring Clause 13 in line with Clauses 14 and 15 by ensuring an enhanced focus on the democratic issue.

Amendment 53A would provide that notification is given

“to any user whose content has been removed or restricted”.

It is especially important that the nature of the restriction in place be made clear, evidenced and justified in the name of transparency and—a key point—that the user be informed of how to appeal such decisions.

Amendment 61 in my name calls for services to have

“proportionate systems, processes and policies designed to ensure that as great a weight is given to users’ right to freedom of expression ... as to safety when making decisions”

about whether to take down content or restrict users' access to the online world, and

“whether to take action against a user generating, uploading or sharing content”.

In other words, it is all about applying a more robust duty to category 1 service providers and emphasising the importance of protecting

“a wide diversity of political, social, religious and philosophical opinion”

online.

I give credit to the Government, in that Clause 18 constitutes an attempt by them in some way to offset the damage done by the Bill to individuals' rights to freedom of expression and privacy, but I worry that it is a weak duty. Unlike the operational safety duties, which compel companies proactively to prevent or minimise so-called harm in the way we have discussed, there is no such attempt to insist that freedom of speech be given the same regard or importance. In fact, there are worries that the text of the Bill has downgraded speech and privacy rights, which the Open Rights Group says

“are considered little more than a contractual matter”.

There has certainly been a lot of mention of free speech in the debates we have had so far in Committee, yet I am not convinced that the Bill gives it enough credit, which is why I support the explicit reference to it by the noble Lord, Lord Kamall.

I have a lot of sympathy with the amendments of the noble Lord, Lord Stevenson, seeking to replace Clauses 13, 14, 15 and 18 with a single comprehensive duty, because in some ways we are scratching around. That made some sense to me and I would be very interested to hear more about how that might work. Clauses 13, 14, 15 and 18 state that service providers must have regard to the importance of protecting users’ rights to freedom of expression in relation to

“content of democratic importance ... publisher content ... journalistic content”.

The very existence of those clauses, and the fact that we even need those amendments, is an admission by the Government that elsewhere, free speech is a downgraded virtue. We need these carve-outs to protect these things, because the rest of the Bill threatens free speech, which has been my worry from the start.

My Amendment 49 is a response to the Bill’s focus on protecting “content of democratic importance”. I was delighted that this was included, and the noble Lord, Lord Stevenson of Balmacara, has raised a lot of the questions I was asking. I am concerned that it is rather vaguely drawn, and too narrow and technocratic—politics with a big “P”, rather than in the broader sense. There is a lot that I would consider democratically important that other people might see, especially given today’s discussion, as harmful or dangerous. Certainly, the definition should be as broad as possible, so my amendment seeks to write that down, saying that it should include

“political, social, religious and philosophical opinion”.

That is my attempt to broaden it out. It is not perfect, I am sure, but that is the intention.

I am also keen to understand why Clauses 14 and 15, which give special protection to news publisher and journalistic content, have enhanced provisions, including an expedited appeals process for the reinstatement of removed materials, but those duties are much weaker—they do not exist—in Clause 13, which deals with content of democratic importance. In my amendment, I have suggested that they are levelled up.

9.15 pm

My Amendment 61 attempts to tackle the safety duties that will be imposed on companies, which are the focus of the Bill. It stresses that equal weight should be given to free speech and to safety.

This relates to the content of democratic importance that I have just been talking about, because I argue that democracy is not safe if we do not proactively promote freedom. Both those amendments try to ensure that companies act to remove philosophical, religious, democratic and social material only in extremis—as an exception, not the rule—and that they always have free speech at the forefront.

On the issue of how we view content of democratic importance, one thing has not been stressed in our discussions so far. We should note that the right to freedom of expression is not just about defending the ability of individuals to speak or impart information; it is also the right of the public to receive information and the freedom to decide what they find useful or second-rate and what they want to watch or listen to. It is not just the right to post opinions but the right of others to have access to diverse opinions and postings; that kind of free flow of information is the very basis of our democracy. In my view, despite its talk of user controls and user empowerment, the Bill does not allow for that or take it into account enough.

It is very important, therefore, that users are told if their posts are restricted, how they are restricted and how they can appeal. That is the focus of Amendment 53A. The EHRC says that the Bill overall lacks a robust framework for individuals to appeal platforms’ decisions or to seek redress for unjustified censorship. I think that needs to be tackled. Clause 19 has a basic complaints procedure, but my amendment to Clause 17 tries to tackle what is a very low bar by stressing the need for “evidenced justification” and details on how to appeal. Users need to know exactly why there has been a decision to restrict or remove. That is absolutely crucial.

Ofcom is the enforcer in all this, with the Secretary of State of the day being given a plethora of new delegated powers, which I think we need to be concerned about. As the coalition group Legal to Say, Legal to Type notes, the Bill in its current form gives extensive powers to the Secretary of State and Ofcom:

“This would be the first time since the 1600s that written speech will be overseen by the state in the UK”.

The truth is that we probably need a new Milton, but in 2023 what we have instead is a Moylan. I have put my name to a range of the excellent series of amendments from the noble Lord, Lord Moylan, including Amendments 102, 191 and 220, all dealing with Ofcom and the Secretary of State. As he will explain, it is really crucial that we take that on.

I did not put my name to the noble Lord’s Amendment 294, although I rather wish I had. In some ways this is a key amendment, as it would leave out the word “psychological” from the definition of harm. As we have gone through all these discussions so far in Committee and at Second Reading and so on, the definition of harm is something that, it seems to me, is very slippery and difficult. People just say, “We have to remove harmful content” or, “It is okay to remove harmful content”, but it is not so simple.

I know that any philosophical rumination is frowned upon at this stage—I was told off for it the other day—but, as this is the 150th anniversary of JS Mill's death, let me note that his important harm principle has been somewhat bastardised by an ever-elastic concept of harm.

Psychological harm, once added into the mix—I spoke about this before—is going to lead to the over-removal of lawful content, because what counts as harm is not settled online or offline. There is no objective way of ascertaining whether emotional or psychological harm has occurred. Therefore, it will be impossible to determine whether service providers have discharged their duties. Controversies of interpretation about what is harmful have already left the door open to activist capture, and this concept is regularly weaponised to close down legitimate debate.

The concept of harm, once expanded to include psychological harm, is subject to concept creep and subjectivity. The lack of definition was challenged by the Lords Communications and Digital Committee when it wrote to the Secretary of State asking whether psychological harm had any objective clinical basis. DCMS simply confirmed that it did not, yet psychological harm is going to be used as a basis for removing lawful speech from the online world. That can lead only to a censorious and, ironically, more toxic online environment, with users posting in good faith finding their access to services—access that is part of the democratic public square—being shut down temporarily or permanently, even reported to the law or what have you, just because they have been accused of causing psychological harm. The free speech elements of the Bill need to be strengthened enormously.

About this proceeding contribution

Reference: 829 cc1762-5
Session: 2022-23
Chamber / Committee: House of Lords chamber