My Lords, this has been one of the most important debates we have had so far in Committee, covering most of the issues in Clause 12—effectively, the replacement of the legal but harmful provisions that were in the draft Bill with the user empowerment tools, introducing the new element of the triple shield, or the three-legged stool as we are now going to describe it thanks to the noble Baroness, Lady Fraser. It is about how we as adults are empowered to protect ourselves from harmful content and, most crucially, the amplification of the harm caused by the systems used on the platforms.
I welcome subsections (4) and (5) of Clause 12, on ease of use and ease of access to the tools. Many platforms already offer these sorts of tools. The noble Lord, Lord Clement-Jones, referred to the ParentZone research that has been circulated, which talked about a Facebook tool to prevent autoplay of ads. It took ParentZone’s tech-savvy researcher—not the noble Baroness, Lady Burt—three and a half hours to work out how to turn autoplay off. The research also found that 30% of tools had changed in the last year, so this is an ever-moving target for people to chase after.
The reality is that most of us do not have the time, even if we have the inclination, to deal with all these things. We already have user empowerment tools for unsubscribing from junk emails—and how many of us can be bothered to go through all that all the time? Sometimes I do but sometimes I just have to delete them and move on. We have to manage cookies; sometimes I do and sometimes I do not because I do not have time. That is why we need to look seriously at putting some of these tools on by default, with easily accessible settings to then turn them off if desired.
I therefore support Amendments 34 and 35, tabled by the noble Baroness, Lady Morgan, although I support those from the noble Lord, Lord Clement-Jones, more, which is why I put my name to them before the debate started. What the noble Baroness said about self-harm, suicide and eating disorders is really important. Again, this is less about people never being able to see individual items of content relating to those things and much more about restraining the platforms from bombarding us with similar content, as happened to Molly Russell and others. Here, of course, as many noble Lords have said, we should be mindful of the vulnerability of many young adults and other adults to the same experience that was implicated in Molly’s death.
According to Refuge’s research, which has been circulated, just over one in three UK women have experienced online abuse or harassment on social media, and perpetrators of domestic abuse are increasingly turning to technology as a tool to further their abuse. A briefing sent by the Royal College of Psychiatrists says that, according to NHS England, only 57.5% of 17 to 24-year-olds feel safe using social media in this country. Why not improve their safety as adults by having them opt in to seeing potentially harmful content—this is particularly important for some vulnerable adults with limited capacity to make decisions about internet and social media use—without limiting the freedom of adults to see this content if they want to?
The noble Lord, Lord Clement-Jones, with Amendments 36 and 37, to which I added my name, is essentially going back to some of the debate about safety by design. As the right reverend Prelate set out so powerfully, the platforms are designed to maximise engagement, time spent on their site, data collection and the targeting of advertising. It is about their business model, not our safety. Artificial intelligence has no ethical constraint, and these user empowerment tools allow us to shift the algorithm in our favour, including to make us safer. To toggle them off is to side with the business model regardless of adult safety; to toggle them on is to side with adults having a more pleasant but slightly less engaging experience. Whose side is the Minister on? We look forward to hearing.