My Lords, I will speak to Amendment 192A. There can be nothing more comfortable within the terms of parliamentary debate than to find oneself cosseted by the noble Baroness, Lady Morgan, on one side and my noble friend Lord Stevenson on the other. I make no apology for repeating the thrust of the argument of the noble Baroness, but I will narrow the focus to matters that she hinted at which we need to think about in a particular way.
We have already debated suicide, self-harm and eating disorder content hosted by category 1 providers. There is a need for the Bill to do more here, particularly through strengthening the user empowerment duties in Clause 12 so that the safest option is the default. We have covered that ground. This amendment seeks to address the availability of this content on smaller services that will fall outside category 1, as the noble Baroness has said. The threshold conditions under which services will be determined to fall within category 1 are still to be set. We await further progress on that. However, there are medium-sized and small providers whose activities we need to look at. It is worth repeating—and I am aware that I am repeating—that these include suicide and eating disorder forums, whose main business is the sharing and discussion of methods and encouragement to engage in these practices. In other words, they are set up precisely to do that.
We know that there are smaller platforms where users share detailed information about methods of suicide. One of these in particular has been highlighted by families and coroners as playing a role in the suicides of individuals in the UK. Regulation 28 reports—that is, an official request for action—have been issued to DCMS and DHSC by coroners to prevent future comparable deaths.
A recent systematic review, looking at the impact of suicide and self-harm-related videos and photographs, showed that potentially harmful content was concentrated specifically on sites with low levels of moderation. Much of the material which promotes and glorifies this behaviour is unlikely to be criminalised through the Government’s proposed new offence of encouragement to serious self-harm. For example, we would not expect all material which provides explicit instructional information on how to take one’s life using novel and effective methods to be covered by it.
The content has real-world implications. There is clear evidence that when a particular suicide method becomes better known, the effect is not simply that suicidal people switch from one intended method to the novel one, but that suicides occur in people who would not otherwise have taken their own lives. There are, therefore, important public health reasons to minimise the discussion of dangerous and effective suicide methods.
The Bill’s pre-legislative scrutiny committee recommended that the legislation
“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.
This amendment is in line with that recommendation, seeking to extend category 1 regulation to services that carry a high level of risk.
The previous Secretary of State appeared to accept this argument—but we have had a lot of Secretaries of State since—and announced a deferred power that would have allowed for the most dangerous forums to be regulated; but the removal of the “legal but harmful” provisions from the legislation means that this power is no longer applicable, as its function related to the “adult risk assessment” duty, which is no longer in the Bill.
This amendment would not shut down dangerous services, but it would make them accountable to Ofcom. It would require them to warn their users of what they were about to see, and it would require them to give users control over the type of content that they see. That is, the Government’s proposed triple shield would apply to them. We would expect that this increased regulatory burden on small platforms would make them more challenging to operate and less appealing to potential users, and would diminish their size and reach over time.
This amendment is entirely in line with the Government’s own approach to dangerous content. It simply seeks to extend the regulatory position that they themselves have arrived at to the very places where much of the most dangerous content resides. Amendment 192A is supported by the Mental Health Foundation, the Samaritans and others that we have been able to consult. It is similar to Amendment 192, which we also support, but this one specifies that the harmful material that Ofcom must take account of relates to self-harm, suicide and eating disorders. I would now be more than happy to give way—eventually, when he chooses to do it—to my noble friend Lord Stevenson, who is not expected at this moment to use the true and full extent of his abilities at being cunning.