Online Safety Bill

My Lords, the government amendments in this group relate to the categories of primary priority and priority content that is harmful to children.

Children must be protected from the most harmful online content and activity. As I set out in Committee, the Government have listened to concerns about designating primary priority and priority categories of content in secondary legislation and the need to protect children from harm as swiftly as possible. We have therefore tabled amendments to set out these categories in the Bill. I am grateful for the input from across your Lordships’ House in finalising the scope of these categories.

While it is important to be clear about the kinds of content that pose a risk of harm to children, I acknowledge what many noble Lords raised during our debates in Committee, which is that protecting children from online harm is not just about content. That is why the legislation takes a systems and processes approach to tackling the risk of harm. User-to-user and search service providers will have to undertake comprehensive, mandatory risk assessments of their services and consider how factors such as the design and operation of a service and its features and functionalities may increase the risk of harm to children. Providers must then put in place measures to manage and mitigate these risks, as well as systems and processes to prevent and protect children from encountering the categories of harmful content.

We have also listened to concerns about cumulative harm. In response to this, the Government have tabled amendments to Clause 209 to make it explicit that cumulative harm is addressed. This includes cumulative harm that results from algorithms bombarding a user with content, or where combinations of functionality cumulatively drive up the risk of harm. These amendments will be considered in more detail under a later group of amendments, but they are important context for this discussion.

I turn to the government amendments, starting with Amendment 171, which designates four categories of primary priority content. First, pornographic content has been defined in the same way as in Part 5, to give consistent and comprehensive protection for children, regardless of the type of service on which the pornographic content appears. The other three categories capture content which encourages, promotes or provides instructions for suicide, self-harm or eating disorders. This will cover, for example, glamorising or detailing methods for carrying out these dangerous activities. Designating these as primary priority content will ensure that the most stringent child safety duties apply.

Government Amendment 172 designates six categories of priority content. Providers will be required to protect children from encountering a wide range of harmful violent content, which includes depictions of serious acts of violence or graphic injury against a person or animal, and the encouragement and promotion of serious violence, such as content glamorising violent acts. Providers will also be required to protect children from encountering abusive and hateful content, such as legal forms of racism and homophobia, and bullying content, which sadly many children experience online.

The Government have heard concerns from the noble Baronesses, Lady Kidron and Lady Finlay of Llandaff, about extremely dangerous activities being pushed to children as stunts, and content that can be harmful to the health of children, including inaccurate health advice and false narratives. As such, we are designating content that encourages dangerous stunts and challenges as a category of priority content, and content which encourages the ingestion or inhalation of, or exposure to, harmful substances, such as harmful abortion methods designed to be taken by a person without medical supervision.

Amendment 174, from the noble Baroness, Lady Kidron, seeks to add “mis- and disinformation” and “sexualised content” to the list of priority content. On the first of these, I reiterate what I said in Committee, which is that the Bill will protect children from harmful misinformation and disinformation where it intersects with named categories of primary priority or priority harmful content—for example, an online challenge which is promoted to children on the basis of misinformation or disinformation, or abusive content with a foundation in misinformation or disinformation. However, I did not commit to misinformation and disinformation forming its own stand-alone category of priority harmful content, which would be largely duplicative of the categories that we have already included in the Bill and would risk capturing a broad range of legitimate content.

We have already addressed key concerns related to misinformation and disinformation content which presents the greatest risk to children by adding content which encourages the ingestion or inhalation of, or exposure to, harmful substances to the list of priority categories. However, the term “mis- and disinformation”, as proposed by Amendment 174, in its breadth and subjectivity risks inadvertently capturing a wide range of content, resulting in disproportionate, excessive censorship of the content children see online, including in areas of legitimate debate. The harm arising from misinformation or disinformation usually arises from the context or purpose of the content, rather than the mere fact that it is untrue. Our balanced approach ensures that children are protected from the most prevalent and concerning harms associated with misinformation and disinformation.

4.15 pm

I turn to sexualised and adult content. Again, we must tread carefully here. What might constitute this is subjective and presents challenges for both providers and Ofcom to interpret. It is important that what constitutes priority content is sufficiently well defined so it is clear both to providers and to Ofcom what their obligations under the Bill are. Amendment 174 sets an extremely broad scope and gives rise to a risk of censorship if providers take an excessively broad interpretation of what is sexualised and adult content. It is important that we safeguard children’s freedom of expression through the Bill and do not inadvertently limit their access to innocuous and potentially helpful content.

The duties are, of course, not limited to the content in the Government’s amendments. The Bill requires providers to identify and act on any “non-designated content” which meets the Bill’s threshold of

“content that is harmful to children”,

even where it has not been designated as primary priority or priority content. Therefore, I hope the noble Baroness will understand why we cannot accept Amendment 174.

Amendment 237 in my name introduces a delegated power to update and amend these lists. This is essential for ensuring that the legislation remains flexible to change and that new and emerging risks of harm can be captured swiftly. Amendment 238, also in my name, ensures that the draft affirmative procedure will apply except in cases where there is an urgent need to update the lists, when the made affirmative procedure can be used. This ensures that Parliament will retain the appropriate degree of oversight over any changes. I beg to move.

About this proceeding contribution

Reference: 831 cc1384-6
Session: 2022-23
Chamber / Committee: House of Lords chamber