My Lords, I welcome the Bill, but regret the time it has taken to arrive. To make the UK the safest place in the world to be online, it must be strengthened, and I will support amendments that would ensure greater protection for children through proper age assurance. The damage to children from exploitation by social media cannot continue. The state must regulate, using severe penalties, to force platforms to behave with greater responsibility as they cannot be trusted to self-regulate. The rise in suicide and self-harm and the loss of self-esteem are ruining young lives. The platforms must take greater responsibility; they have the money and the technology to do this but need stronger incentives to act, such as the promised executive criminal liability amendment.
Ofcom faces a formidable challenge in policing companies so that they adhere to their own terms and conditions on content moderation. Heavy fines are not enough. Ofcom will need guidance in setting codes of practice not only from the three commissioners but also from NGOs, such as the Internet Watch Foundation, and from an advocacy body for children to advise continually on emerging harms. A new regulatory regime to address illegal and harmful content online is essential but, by removing "legal but harmful" from the original Bill, we lost the opportunity to detoxify the internet.
Concentrating on the big platforms will miss the growth of bespoke platforms that promote other harms such as incel culture, a threat to women but also to young men. Incels, involuntary celibates, use mainstream platforms such as YouTube to reel in unsuspecting young men before linking them to their own small, specialist websites, but these are outside the scope of the category 1 provisions and therefore of any minimum standards. These sites include not only sexist and misogynistic material but anti-Semitic, racist, homophobic and transphobic items, and even paedophilia. One of the four largest incel forums is dedicated to suicide and self-harm. HOPE not hate, the anti-fascist campaign, has warned that smaller platforms used by the far right to organise and radicalise should be under the same level of scrutiny as category 1 platforms.
User empowerment features, part of the triple shield, such as options to filter out content from unverified users and abusive content, put the onus on the user to filter out material rather than such filters being turned on by default. Ofcom must ensure that the largest platforms have a statutory duty to promote media literacy as part of their conditions of service. The Bill should make children's risk assessments consistent across all services, and should tackle the drivers of harm and the design of the service, not just the content.
I welcome the new offences targeting harmful behaviour, including epilepsy trolling, cyber flashing and the sending of manufactured deepfake intimate images without consent. Despite the Bill adding controlling or coercive behaviour to the list of priority offences, more needs to be done to protect women, one in three of whom has experienced online abuse. Ofcom must add a mandatory code of practice regarding violence against women and girls so that tech companies understand they have a duty to prioritise their safety.
The Bill must prevent the relentless promotion of suicide and self-harm that has destroyed the lives of young people and their families. I commend the bravery of Ian Russell, who is campaigning to prevent other deaths following the tragic suicide of his daughter, Molly. I back the amendments from the noble Baroness, Lady Kidron, to ensure that coroners and bereaved families can access social media content. I applaud all those campaigners who want to see the Bill implemented urgently, and I will work with other noble Lords to strengthen it.