Online Safety Bill

Yes, that is what the measure does—for instance, where it intersects with the named categories of primary priority or priority content in the Bill, although that is not the only way the Bill does it. This will be covered by non-designated content that is harmful to children. As we have said, we will bring forward amendments on Report—which is perhaps why the noble Baroness has not seen them in the material in front of us—regarding material that is harmful to children, and they will provide further detail and clarity.

Returning to the advisory committee that the Bill sets up and the amendments from the noble Baroness, Lady Merron, and my noble friend Lord Moylan, all regulated service providers will be required to take action against illegal misinformation and disinformation in scope of the Bill. That includes the new false communication offences in the Bill, which will capture communications where the sender knows the information to be false but sends it intending to cause harm—for example, hoax cures for a virus such as Covid-19. The noble Baroness is right to say that that is a slightly different approach from the one taken in her amendment, but we think it an appropriate and proportionate response to tackling damaging and illegal misinformation and disinformation. If a platform is likely to be accessed by children, it will have to protect them from encountering misinformation and disinformation content that meets the Bill’s threshold for content that is harmful to children. Again, that is an appropriate and proportionate response.

Turning to the points made by my noble friend Lord Moylan and the noble Baroness, Lady Fox, services will also need to have particular regard to freedom of expression when complying with their safety duties. Ofcom will be required to set out steps that providers can take when complying with their safety duties in the codes of practice, including what is proportionate for different providers and how freedom of expression can be protected.

4.45 pm

My noble friend Lord Bethell and the noble Baroness, Lady Merron, are concerned that health misinformation and disinformation will not be adequately covered by this. Their amendment seeks to tackle that but, in doing so, mimics provisions on content harmful to adults previously included in the Bill which the Government consciously removed last year following debates in another place. The Government take concerns about health-related misinformation and disinformation very seriously. Our approach will ensure transparency and accountability by requiring platforms to be clear with their users about what they will and will not allow on their services.

Under the new terms of service duties for category 1 services, if certain types of misinformation and disinformation are prohibited in a platform’s terms of service, the platform will have to remove them. That will include anti-vaccination falsehoods and health-related misinformation and disinformation, where these are prohibited in its terms of service. This is an appropriate response which prevents services from arbitrarily removing or restricting legal content, however controversial it may be, or suspending or banning users where that is not in accordance with their express terms of service.

The Bill will protect people from the most egregious types of health-related misinformation and disinformation while still protecting freedom of expression and allowing users to ask genuine questions about health-related matters. There are many examples from recent history—Primodos, Thalidomide and others—which point to the need for legitimate debate about health-related matters, sometimes against companies which have deep pockets to defend the status quo.

My noble friend Lord Bethell also raised concerns about the role that algorithms play in pushing content. I reassure him that all companies will face enforcement action if illegal content in scope of the Bill is being promoted to users via algorithms. Ofcom will have a range of powers to assess whether companies are fulfilling their regulatory requirements in relation to the operation of their algorithms.

In circumstances where there is a significant threat to public health, the Bill already provides additional powers for the Secretary of State to require Ofcom to prioritise specified objectives when carrying out its media literacy activity and to require that companies report on the action they are taking to address the threat. The advisory committee on misinformation and disinformation will also be given the flexibility and expertise to consider providing advice to Ofcom on this issue, should it choose to.

Amendments 99 and 222 from the noble Baroness, Lady Merron, and Amendments 223 and 224 from the noble Lord, Lord Knight of Weymouth, relate to the advisory committee. Disinformation is a pervasive and evolving threat. The Government believe that responding to the issue effectively requires a multifaceted, whole-of-society approach. That is what the advisory committee seeks to do by bringing together technology companies, civil society organisations and sector experts to advise Ofcom in building cross-sector understanding and technical knowledge of the challenges and how best to tackle them. The Government see this as an essential part of the Bill’s response to this issue.

I understand the desire of noble Lords to ensure that the committee is conducting its important work as quickly as possible, but it is imperative that Ofcom has the appropriate time and space to appoint the best possible committee and that its independence as a regulator is respected. Ofcom is well versed in setting up statutory committees and ensuring that committees established under statute meet their obligations while maintaining impartiality and integrity. To prescribe timeframes or the committee’s composition would risk impeding Ofcom’s ability to run a transparent process that finds the most suitable candidates. Considering the evolving nature of disinformation and the online realm, the advisory committee will also need the flexibility to adapt and respond. It would therefore not be appropriate for the Bill to be overly prescriptive about the role of the advisory committee or to mandate the things on which it must report.

The noble Baroness, Lady Fox of Buckley, asked whether the committee could include civil liberties representatives. It is for Ofcom to decide who is on the committee, but Ofcom must have regard to the desirability of including, among others, people representing the interests of UK users of regulated services, which could include civil liberties groups.

The noble Baroness, Lady Kidron, raised the challenges of artificial intelligence. Anything created by artificial intelligence and shared on an in-scope service by a user will qualify as user-generated content. It would therefore be covered by the Bill’s safety duties, including to protect children from harmful misinformation and disinformation, and to ensure that platforms properly enforce their terms of service for adults.

I turn to the points raised in my noble friend Lord Moylan’s Amendment 264. Alongside this strong legislative response, the Government will continue their operational response to tackling misinformation and disinformation. As part of this work, the Government meet social media companies on a regular basis to discuss a range of issues. These meetings are conducted in the same way that the Government would engage with any other external party, and in accordance with the well-established transparency processes and requirements.

The Government’s operational work also seeks to understand misinformation and disinformation narratives that are harmful to the UK, to build an assessment of their risk and threat. We uphold the same commitment to freedom of expression in our operational response as we do in our legislative response. As I said, we are not in the business of telling companies what legal content they can and cannot allow. Indeed, under the Bill, category 1 services must set clear terms of service that are easy for users to understand and are consistently enforced, ensuring new levels of transparency and accountability.

Our operational response will accompany our legislative response. The measures have been designed to provide a strong response to tackle misinformation and disinformation, ensuring users’ safety while promoting a thriving and lively democracy where freedom of expression is protected.

The noble Baroness, Lady Fox, and the noble Lord, Lord Clement-Jones, asked about the counter-disinformation unit run, or rather led, by the Department for Science, Innovation and Technology. The unit works to understand attempts to artificially manipulate the information environment, and to understand the scope, scale and reach of misinformation and disinformation. It responds to acute information incidents, such as Russian information operations during the war in Ukraine, those we saw during the pandemic and those around important events such as general elections. It does not monitor individuals; rather, its focus is on helping the Government understand online misinformation and disinformation narratives and threats.

When harmful narratives are identified, the unit works with departments across Whitehall to deploy the appropriate response, which could involve a direct rebuttal on social media or awareness-raising campaigns to promote the facts. Therefore, the primary purpose is not to monitor for harmful content to flag to social media companies—the noble Baroness raised this point—but the department may notify the relevant platform if, in the course of its work, it identifies content that potentially violates platforms’ terms of service, including co-ordinated, inauthentic or manipulative behaviour. It is then up to the platform to decide whether to take action against the content, based on its own assessment and terms of service.

About this proceeding contribution

Reference: 829 cc2018-2021
Session: 2022-23
Chamber / Committee: House of Lords chamber