
Online Safety Bill

I thank the noble Lord.

I was pleased to hear about Wicipedia Cymraeg—there being no “k” in Welsh. As the noble Lord, Lord Stevenson, said, there has been a good, conversational discussion in this debate, as befits Committee and a self-regulating House. My noble friend Lady Stowell is right to point out matters of procedure, although we were grateful to hear why the noble Viscount, Lord Colville, supports the amendments in question.

4.45 pm

My noble friend Lord Moylan’s first group within a group—Amendments 17 and 18—alters the duties in Clause 9 of the Bill. These amendments would weaken the illegal content duties by removing any obligation on services to take upstream measures to remove illegal content, including child sexual abuse material. They would therefore seriously undermine the Bill’s focus on proactive risk management. Similarly, Amendments 272 to 283 seek to alter how services should judge what is illegal. I understand that noble Lords are concerned, rightly, about the over-removal of content.

The amendments tabled by the noble Lord, Lord Clement-Jones, would require providers to have sufficient evidence that content is illegal before taking action against it, replacing the current test of “reasonable grounds to infer”. Sufficient evidence is a subjective measure. We have discussed the difficulties for those who must make these decisions and we think that this formulation would set an unclear threshold for providers to determine how they should judge illegality, which could result in the under-removal of illegal content, putting users at risk, or the over-removal of it, with adverse consequences for freedom of expression.

The amendments tabled by my noble friend Lord Moylan would narrow the test to require the removal only of content which, based on all reasonably available contextual evidence, is manifestly illegal, and we think that that threshold is too high. Context and analysis can give a provider good reasons to infer that content is illegal even though the illegality is not immediately obvious. This is the case with, for example, some terrorist content which is illegal only if shared with terrorist purposes in mind, and intimate image abuse, where additional information or context is needed to know whether content has been posted against the subject’s wishes.

Amendment 276 would remove the detail in Clause 170 that specifies the point at which providers must treat content as illegal or fraudulent. That would enable providers to interpret their safety duties in broader ways; rather than giving providers greater discretion, however, it would give Ofcom less certainty about whether it could successfully take enforcement action. I take the point raised by noble Lords about the challenges of how platforms will identify illegal content, and I agree with my noble friend Lady Stowell that the contributions of noble and learned Lords would be helpful in these debates as well. However, Clause 170 sets out how companies should determine whether or not content is illegal or an advertisement is fraudulent. I will say a little more about the context behind that, as the noble Lord, Lord Allan, raised questions about it.

The Bill recognises that it will often be difficult for providers to make judgments about content without considering the context. Clause 170 therefore clarifies that providers must ascertain whether, on the basis of all reasonably available information, there are reasonable grounds to infer that all the relevant elements of the offence—including the mental elements—are present and that no defence is available. The amount of information that would be reasonably available to a particular service provider will depend on the size and capacity of the provider, among other factors.

Companies will need to ensure that they have effective systems to enable them to check the broader context relating to content when deciding whether or not to remove it. This will provide greater certainty about the standard to be applied by providers when assessing content, including judgments about whether or not content is illegal. We think that this protects against over-removal by making it clear that platforms are not required to remove content merely on the suspicion of its being illegal. Beyond that, the framework also contains provisions about how companies’ systems and processes should approach questions of mental states and defences when considering whether or not content amounts to an offence within the scope of the Bill.

About this proceeding contribution

Reference: 829 cc1357-8

Session: 2022-23

Chamber / Committee: House of Lords chamber