My Lords, these amendments are concerned with Ofcom’s powers under Clause 111 to issue notices to deal with terrorism content and child sexual exploitation and abuse content.
I acknowledge the concerns which have been aired about how these powers work with encrypted services. I want to make it clear that the Bill does not require companies to break or weaken encryption, and we have built in strong safeguards to ensure that users’ privacy is protected. Encryption plays an important role online, and the UK supports its responsible use. I also want to make it clear that we are not introducing a blanket requirement for companies to monitor all content for all harms, at all times. That would not be proportionate.
However, given the serious risk of harm to children from sexual abuse and exploitation online, the regulator must have appropriate, tightly targeted powers to compel companies to take the most effective action to tackle this reprehensible illegal activity taking place on their services. We must ask companies to do all that is technically feasible to keep children safe, subject to stringent legal safeguards.
The powers in the Bill are predicated on risk assessments. If companies are managing the risks on their platforms appropriately, Ofcom will not need to use its powers. As a last resort, however, where there is clear evidence of child sexual abuse taking place on a platform, Ofcom will be able to direct companies either to use, or to use best endeavours to develop or source, accredited and accurate technology to identify and remove this illegal content. To be clear, these powers will not give Ofcom or our law enforcement agencies automatic access to the content detected. It is simply a matter of making private companies take effective action to prevent child sexual abuse on their services.
Ofcom must consider a wide range of matters when deciding whether a notice is necessary and proportionate, including the impacts on privacy and freedom of expression of using a particular technology on a particular service. Ofcom will be able to require only the use of technology accredited as highly accurate in detecting illegal child sexual abuse or terrorism content, greatly reducing the risk that content is wrongly identified.
In addition to these safeguards, Ofcom, as a public body, is bound by the European Convention on Human Rights, including Articles 8 and 10, through the Human Rights Act 1998. Ofcom is obliged not to act in a way which unduly interferes with the rights to privacy and freedom of expression when carrying out its duties, and it is held to account for this.
If appropriate technology does not exist which meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to use best endeavours to develop or source a solution. It is right that we can require technology companies to use their considerable resources and expertise to develop the best possible protections for children in encrypted environments.
Despite the breadth of the existing safeguards, we recognise that concerns remain about these powers, and we have listened to the points that noble Lords raised in Committee about privacy and technical feasibility. That is why we are introducing additional safeguards. I am grateful for the constructive engagement I have had with noble Lords across your Lordships’ House on this issue, and I hope that the government amendments alleviate their concerns.
I turn first to our Amendments 250B, 250C, 250D, 255A, 256A, 257A, 257B, 257C and 258A, which require that Ofcom obtain a skilled person’s report before issuing a warning notice and exercising its powers under Clause 111. This independent expert scrutiny will supplement Ofcom’s own expertise to ensure that it has a full understanding of relevant technical issues to inform its decision-making. That will include issues specific to the service in question, such as its design and relevant factors relating to privacy.
We are confident that, in addition to Ofcom’s existing routes of evidence-gathering, Amendment 256A will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place. That will further help Ofcom to issue a notice which is targeted and proportionate.
Ofcom will need to appoint a skilled person and notify the provider about the appointment and the relevant matters to be explored in the report before issuing its final notice. Ofcom will have discretion over what should be included in the report, as this will depend on the specific circumstances. Under Amendments 257A and 257B, Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. That will enable the provider to make representations based on Ofcom’s own analysis and that of the skilled person.
I turn now to Amendments 257D, 257E and 257F. We have heard concerns about the impact that scanning technologies could have on journalistic content and sources. Any technology required by Ofcom must be highly accurate in detecting only terrorism content on public channels, or only child sexual exploitation and abuse content on public or private channels, so the likelihood of journalistic content or sources being compromised will be low. To reassure your Lordships further, however, we have expanded the matters that Ofcom must consider in its decision-making.
Amendment 257D requires Ofcom to consider the impact that the use of a particular technology on a particular service would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. It builds on the existing safeguards in Clause 113 regarding freedom of expression and privacy. I am grateful to the noble Lord, Lord Stevenson of Balmacara, for his constructive engagement on this issue. I beg to move.