My Lords, I shall speak in favour of Amendments 195, 239 and 263, tabled in the names of my right reverend friend the Bishop of Oxford, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, whom I thank for his comments.
My right reverend friend the Bishop of Oxford regrets that he is unable to attend today’s debate. I know he would have liked to be here. My right reverend friend tells me that the Government’s Centre
for Data Ethics and Innovation, of which he was a founding member, devoted considerable resource to horizon scanning in its early years, looking for the ways in which AI and tech would develop across the world. The centre’s analysis reflected a single common thread: new technologies are developing faster than we can track them and they bring with them the risk of significant harms.
This Bill has also changed over time. It now sets out two main duties: the illegal content duty and the children duty. These duties have been examined and debated for years, including by the joint scrutiny committee. They are refined and comprehensive. Risk assessments are required to be “suitable and sufficient”, which is traditional language from 20 years of risk-based regulation. That language ensures that the duties are fit for purpose and proportionate. The duties must be kept up to date and in line with any service changes. Recent government amendments now helpfully require companies to report to Ofcom and publish summaries of their findings.
However, in respect of harms to adults, in November last year the Government suddenly took a different tack. They introduced two new groups of duties as part of a novel triple shield framework, supplementing the duty to remove illegal harms with a duty to comply with their own terms of service and a duty to provide user empowerment tools. These new duties are quite different in style to the illegal content and children duties. They have not benefited from the prior years of consultation.
As this Committee’s debates have frequently noted, there is no clear requirement on companies to assess in the round how effective their implementation of these new duties is, or to keep track of their development. The Government have changed this Bill’s system for protecting adults online late in the day, but the need for risk assessments, in whatever system the Bill is designed around, has been repeated again and again across Committee days. Even at the close of day eight on Tuesday, the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, referred explicitly to the role of risk assessment in validating the Bill’s systems of press reforms. Surely this persistence across days and groups of debate reflects the systemically pivotal role of risk assessments in what is, after all, meant to be a systems and processes Bill rather than a content-orientated one.
But it seems that many people on many sides of this Committee believe that an important gap in risk assessment for harms to adults has been introduced by these late changes to the Bill. My colleague the right reverend Prelate is keen that I thank Carnegie UK for its work across the Bill, including these amendments. It notes:
“Harms to adults which might trickle down to become harms to children are not assessed in the current Bill”.
The forward-looking parts of its regime need to be strengthened to ensure that Parliament and the Secretary of State review the new ways in which harms manifest as technology races along, and to ensure that they then have the right advice for deciding what to do about them. To improve that advice, Ofcom needs to risk assess the future and then to report its findings.
12.30 pm
As the Committee can see, Amendment 195 is drawn very narrowly, out of respect for concerns about freedom of expression, even though the Government have still not explained how risk assessment poses any such threat. Ofcom would be able to request information from companies, using its information-gathering powers in Clause 91, to complete its future-proofing risk assessment. That is why, as Carnegie again notes,
“A risk assessment required of OFCOM for the purposes of future proofing alone could fill this gap”
in the Bill’s system,
“without even a theoretical threat to freedom of expression”.
Amendment 239 would require Ofcom to produce a forward-looking report, based on a risk assessment, to inform the Secretary of State’s review of the regime.
Amendment 263 would complete this systemic implementation of risk assessment by ensuring that future reviews of the regime by the Secretary of State include a broad assessment of the harms arising from regulated services, not just regulated content. This amendment would ensure ongoing consideration of risk management, including whether the regime needs expanding or contracting. I urge the Minister to support Amendments 195, 239 and 263.