My Lords, I am grateful for this short and focused debate, which has been helpful, and for the points made by the noble Lords, Lord Stevenson and Lord Allan, and the noble Baroness, Lady Kidron. I think we all share the same objective: ensuring that terms of service promote accountability and transparency, and empower users.
One of the Bill’s key objectives is to ensure that the terms of service of user-to-user platforms are suitable and effective. Under the Bill, companies will be required both to set out clearly how they will tackle illegal content and protect children, and to ensure that their terms of service are properly enforced. The additional transparency and accountability duties on category 1 services will further ensure that users know what to expect on the largest platforms. This will put an end to these services arbitrarily removing content or, conversely, failing to remove content that they profess to prohibit.
The Bill will also ensure that search services are clear to their users about how they are complying with their adult and child safety duties under this new law. Given the very different way in which search services operate, however, this will be achieved through a publicly available statement rather than through terms of service. The two are intended to be distinct.
Noble Lords are right to point to the question of intelligibility. It struck me that, if it takes 10 days to read terms of service, perhaps we should have a race during the 10 days allotted to this Committee stage to see which is quicker—but I take the point. The noble Lord, Lord Allan, is also right that the further requirements imposed through this Bill will only add to their length.
The noble Baroness, Lady Kidron, asked a fair question about what “accessibility” means. The Bill requires all platforms’ terms of service for illegal content and child safety duties to be clear and accessible. Ofcom will provide guidance on what that means, including ensuring that they are suitably prominent. The same applies to terms of service for category 1 services relating to content moderation.
I will focus first on Amendments 16, 21, 66DA, 75 and 197, which seek to ensure that both Ofcom and platforms consider the risks associated with platforms’ terms of service with regard to the illegal content and child safety duties in the Bill. We do not think that these amendments are needed. User-to-user services will already be required to assess the risks regarding their terms of service for illegal content. Clause 8 requires companies to assess the “design and operation” of a service in relation to illegal content. As terms of service are integral to how a service operates, they would be covered by this provision. Similarly, Clause 10 sets out that companies likely to be accessed by children will be required to assess the “design and operation” of a service as part of their child risk assessments, which would include the extent to which their terms of service may reduce or increase the risk of harm to children.
In addition to those risk assessment duties, the safety duties will require companies to take proportionate measures effectively to manage and mitigate the risk of harm to people whom they have identified through risk assessments. This will include making changes to their terms of service, if appropriate. The Bill does not impose duties on search services relating to terms of service, as search services’ terms of service play a less important role in determining how users can engage on a platform. I will explain this point further when responding to specific amendments relating to search services but I can assure the noble Lord, Lord Stevenson, that search services will have comprehensive duties to understand and mitigate how the design and operation of their service affects risk.
Amendment 197 would require Ofcom to assess how platforms’ terms of service affect the risk of harm that the sector presents to people. While I agree that this is an important risk factor which Ofcom must consider, it is already provided for in Clause 89, which requires Ofcom to undertake an assessment of risk across regulated services. That requires Ofcom to consider which characteristics of regulated services give rise to harm. Given how integral terms of service are to how many technology companies function, Ofcom will necessarily consider the risk associated with terms of service when undertaking that risk assessment.
However, elevating terms of service above the other systems and processes mentioned in Clause 89 would imply that Ofcom needs to take greater account of the risk of harm they pose on regulated services than of the risk posed by other safety-by-design systems and processes or by content moderation processes, for instance. That may not be suitable, particularly as service delivery methods will inevitably change over time. Instead, Clause 89 has been written to give Ofcom scope to organise its risk assessment, risk register and risk profiles as it thinks suitable. That is appropriate, given that Ofcom is best placed to develop detailed knowledge of the matters in question as they evolve over time.
Amendments 70, 71, 72, 79, 80, 81, 174 and 302 seek to replace the Bill’s references to publicly available statements, in relation to search services, with terms of service. This would mean that search services would have to publish how they are complying with their illegal content and child protection duties in terms of service rather than in publicly available statements. I appreciate the spirit in which the noble Lord has tabled and introduced these amendments. However, they do not consider the very different ways in which search services operate.
User-to-user services’ terms of service fulfil a very specific purpose. They govern a user’s behaviour on the service and set rules on what a user is allowed to post and how they can interact with others. If a user breaks these terms, a service can block his or her access or remove his or her content. Under the status quo, users have very few mechanisms by which to hold user-to-user platforms accountable to these terms, meaning that users can see their content arbitrarily removed with few or no avenues for redress. Equally, a user may choose to use a service because its terms and conditions lead them to believe that certain types of content are prohibited, while in practice the company does not enforce the relevant terms.
The Bill’s duties relating to user-to-user services’ terms of service seek to redress this imbalance. They will ensure that people know what to expect on a platform and enable them to hold platforms accountable. In contrast, users of search services do not create content or interact with other users. Users can search for anything without restriction from the search service provider, although a search term may not always return results. It is therefore not necessary to provide detailed information on what a user can and cannot do on a search service. The existing duties on such services will ensure that search engines are clear to users about how they are complying with their safety duties. The Bill will require search services to set out how they are fulfilling those duties in publicly available statements. Their actions must meet the standards set by Ofcom. Using these statements will ensure that search services are as transparent as user-to-user services about how they are complying with their safety duties.
The noble Lord’s Amendment 174 also seeks to expand the transparency reporting requirements to cover the scope and application of the terms of service set out by search service providers. This too is unnecessary because, via Schedule 8, the Bill already ensures transparency about the scope and application of the provisions that search services must make publicly available. I hope that gives the noble Lord some reassurance that the concerns he has raised are already covered. With that, I invite him to withdraw Amendment 16.