My Lords, I thank Mencap and the Royal College of Psychiatrists for their briefings. I will speak against the change in the other place which waters down the protections offered to adults, and focus in particular on adults without capacity.
The original Bill included protections for adults under the umbrella of “legal but harmful”, which gave robust directions to platforms on what content to remove. These protections must be reinstated; the triple shield is not enough. Your Lordships are presented with a system under which social media platforms must filter only
“to the extent that it is proportionate to do so”,
assuming that all adults are capacitous all of the time and that they will be responsible for making their own choices to avoid seeing harmful content.
I recognise that there is an intended new duty for services to undertake a risk assessment of the impact of certain material on children, to tackle the promotion of sites which share harmful content and to prevent children from witnessing it, but these duties apply only to children. I agree with my noble friend Lady Kidron that tech companies must design for safety, just as we expect in the physical environment.
My main point is that there is no clear distinction between childhood and adulthood when it comes to mental health. I am concerned about the mental health consequences for anybody, whether child or adult, of seeing some of the images, messaging and push notifications which relentlessly pursue anyone who has ever engaged with one of the horrific sites like those seen by 14-year-old Molly Russell. These images are harmful to 14-year-olds; they are harmful to 24-year-olds; and they are harmful to 74-year-olds. Once seen, it is very hard to unsee them.
Misinformation and negative messaging are harmful to anyone who may struggle to belong and feel valued, whether at a vulnerable moment in their lives or as part of an ongoing struggle with depression. One in 20 Google searches is for health-related information. People in the UK apparently make 27 searches a minute for “depression”, 22 a minute for “stress”, and 21 a minute for “anxiety”. Given the waiting times for mental health support in the community, perhaps it is unsurprising that people seek help online. This Bill must place an emphasis on prevention. The Bill places duties on regulated providers but, as of June 2022, more than 500 hours of video were uploaded to YouTube every minute. This is content created and viewed by users at a rate at which any reactive approach is doomed to fall quickly behind.
As legislators we must think of society as a whole, not just the fully engaged, economically productive citizens who currently feel invulnerable. Making sure that legislation works for people with a learning disability, and for those who may not have the understanding needed to protect themselves from harmful content, should not be an add-on. Could the Minister suggest how the Bill could deliver greater protections to people with a learning disability, or with another cognitive or mental health condition that puts them at increased risk of online harm?
As I have said before, if we could get it right for people with learning disabilities, we could actually get it right for everyone.
6.19 pm