My Lords, I start by saying that accurate systems and processes for content moderation are crucial to the workability of this Bill and keeping users safe from harm. Amendment 228 from the noble Lord, Lord Allan of Hallam, seeks to remove the requirement for platforms to treat content as illegal or fraudulent if reasonable grounds for that inference exist. The noble Lord set out his concerns about platforms over-removing content when assessing illegality.
Under Clause 173(5), platforms will need to have reasonable grounds to determine whether content is illegal or a fraudulent advertisement. Only when a provider has reasonable grounds to infer that said content is illegal or a fraudulent advertisement must it then comply with the relevant requirements set out in the Bill. This would mean removing the content or preventing people from encountering it through risk-based and proportionate systems and processes.
Clause 173(6) further clarifies what “reasonable grounds to infer” means in relation to judgments about illegal content and fraudulent adverts. It sets out the tests that a provider must apply to the assessment of whether all the elements of an offence—including the mental elements—are present, and whether a defence might be relied on.
The noble Lord’s amendment removes this standard for judging the illegality of content but does not replace it with another standard. That would mean that the Bill provided less detail about when providers are required to treat content as illegal or a fraudulent advert. The result would be that the Bill did not set out a consistent approach to identifying and removing such content, which would enable providers to interpret their duties in a broad range of ways while still complying with the framework. This could result in services both over-removing and under-removing content.
I know that the noble Lord is concerned that this provision could encourage overzealous removal of content, but the Government are clear that the approach that I have just outlined provides the necessary safeguards against platforms over-removing content when complying with their duties under the Bill. The noble Lord asked for a different standard to be associated with different types of criminal offence. That is, in effect, what we have done through the distinction that we have made between priority and non-priority offences.
To assist services further, Ofcom will be required to provide guidance on how it judges the illegality of content. In addition, the Government consider that it would not be right to weaken the test for illegal content by diluting the content moderation provisions in the way that this amendment would. Content moderation is critical to protecting users from illegal content and fraudulent advertisements.
The noble Viscount, Lord Colville, set out the importance of freedom of expression, as other noble Lords—principally the noble Baroness, Lady Fox, and the noble Lord, Lord Moylan, but others too—have throughout our scrutiny of the Bill. Our approach regarding freedom of expression recognises that the European Convention on Human Rights imposes obligations in this respect on states, not private entities. As we have discussed previously, private actors, including service providers in scope, have their own freedom of expression rights. This means that platforms are free to decide what content should be allowed on their sites within the bounds of the law. As such, it is more appropriate to ask them to have particular regard to these concepts rather than to be compliant or consistent with them.
In-scope companies will have to consider and implement safeguards for freedom of expression when fulfilling their duties. For example, platforms could safeguard freedom of expression by ensuring that human moderators are adequately trained to assess contextual and linguistic nuance—such as the examples that the noble Lord gave—to prevent the over-removal of content. The larger services will also have additional duties to assess their impact on freedom of expression and privacy when adopting safety policies, to keep this assessment up to date and to demonstrate that they have taken positive steps in relation to the impact assessment.
Further, platforms will not be penalised for making the wrong calls on pieces of illegal content. Ofcom will instead make its judgments on the systems and processes that platforms have in place when making these decisions. The focus on transparency through the Bill’s framework and on user reporting and redress mechanisms will enable users to appeal the removal of content more effectively than they can at present.
Amendment 229 in the name of the noble Baroness, Lady Fox, would require providers of category 1 services to apply the user empowerment features required under Clause 12 only to content that they have “reasonable grounds to infer” is user empowerment content. The Bill’s cross-cutting freedom of expression duties already prevent providers overapplying user empowerment features or adopting an inconsistent or capricious approach; Ofcom can take enforcement action if they do this. Clause 173(2) and (3) already specify how providers must make judgments about the status of content, including judgments about whether content is in scope of the user empowerment duties. That includes making this judgment based on
“all relevant information that is reasonably available to a provider”.
It is unclear whether the intention of the noble Baroness’s amendment is to go further. If so, it would be inappropriate to apply the “reasonable grounds to infer” test in Clause 173(5) and (6) to user empowerment content. This is because, as I have just outlined in relation to the amendment in the name of the noble Lord, Lord Allan, the test sets out the approach that providers must take when assessing whether content amounts to a criminal offence. The test cannot sensibly be applied to content covered by the user empowerment duties because such content is not illegal. It is not workable to suggest that providers need to apply criminal law concepts such as intent or defences to non-criminal material. Under Clause 48, Ofcom will be required to produce and publish guidance that sets out examples of the kinds of content that Ofcom considers to be relevant to the user empowerment duties. This will assist providers in determining what content is of relevance to the user empowerment duties.
I hope that this allays the concerns raised by the noble Baroness and the noble Lord, and that the noble Lord will be content to withdraw his amendment.