My Lords, this is the first time that I have spoken in Committee. I know that we have 10 days, but it seems that we may go even further, because this is so important. I will speak to Amendments 250A and 250B.
I thank the noble Lords, Lord Russell of Liverpool and Lord Stevenson of Balmacara, and, of course, if I may be permitted to say so, the amazing noble Baroness, Lady Kidron, who is an absolute whizz on this subject, for adding their names to these amendments. I also thank the 5Rights Foundation, the Internet Watch Foundation and the UK Safer Internet Centre for their excellent briefings. I have spoken to these charities, and the work they do is truly amazing; I do not think that the Bill recognises just how much time and energy they give to supporting families and individuals. Put quite simply, we can agree that services’ internal complaint mechanisms are failing.
Let me tell your Lordships about Harry. Harry is an autistic teenager who was filmed by a member of the public in a local fast-food establishment when he was dysregulated and engaging in aggressive behaviour.
This footage was shared out of context across social media, with much of the response online labelling Harry as a disruptive teenager who was engaging in unacceptable aggression and vandalising public property. This was shared thousands of times over the course of a few weeks. When Harry and his mum reported it to the social media platforms, they were informed that it did not violate community guidelines and that there was a public interest in the footage remaining online. The family, quite rightly, felt powerless. Harry became overwhelmed at the negative response to the footage and the comments made about his behaviour. He became withdrawn and stopped engaging. He then tried to take his own life.
It was at this point that Harry’s mum reached out to the voluntary-run service Report Harmful Content, as she had nowhere else to turn. Report Harmful Content is run by the charity South West Grid for Learning. It was able to mediate between the social media sites involved to further explain the context and demonstrate the real-world harm that this footage, by remaining online, was having on the family and on Harry’s mental health. Only then did the social media companies concerned remove the content.
Sadly, Harry’s story is not an exception. In 2022, in cases where a platform had initially refused to take down content, Report Harmful Content successfully arbitrated its removal in 87% of cases, demonstrating that, even where content did not violate community guidelines, harm had clearly been done. There are countless cases of members of the public reporting a failure to remove content that was bullying them. This culture of inaction has led to apathy and a disbelief among users that their complaints will ever be redressed. Research published by the Children’s Commissioner for England found that 40% of children did not report harmful content because they felt that there
“was no point in doing so”.
The complaints mechanism in the video-sharing platform regulation regime is being repealed without an alternative mechanism to fill the gap. The current video-sharing platform regulation requires platforms to
“provide for an impartial out-of-court procedure for the resolution of any dispute between a person using the service and the provider”.
In its review of the first year of this regulation, Ofcom highlighted that the requirements imposed on platforms in scope are not currently being met in full. However, instead of strengthening existing appeals processes, the VSP regime is set to be repealed and superseded by this Bill.
The Online Safety Bill does not have an individual appeals process, meaning that individuals will be left without an adequate pathway to redress. The Bill establishes only a “super-complaints” process for issues concerning multiple cases or cases highlighting a systemic risk. It will ultimately fall to the third sector to highlight cases to Ofcom on behalf of individuals.
The removal of an appeals process, given the repeal of the VSP regime, would be in stark contrast with the direction of travel in other nations. Independent appeals processes already exist in Australia and New Zealand, and more countries are looking to adopt them. The new Irish Online Safety and Media Regulation Act includes provision
“for the making of a complaint to the Commission”.
The European Union’s Digital Services Act also puts such a process in place. There is clear precedent for these systems. It cannot be right that the Republic of Ireland and the UK and its territories have over 52 ombudsmen across 32 sectors, yet not one of them works in digital, at a time when online harm, especially to children, as we hear time and again in your Lordships’ House, is at unprecedented levels.
The Government’s response so far has been insufficient. During the seventh sitting of the Online Safety Bill Committee in another place, much of the discussion related to independent appeals, referred to there as the need for an ombudsman. The Digital Minister recognised:
“In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast”.
Dame Maria Miller MP said that
“it is not a good argument to say that this is such an enormous problem that we cannot have a process in place to deal with it”.—[Official Report, Commons, Online Safety Bill Committee, 9/6/22; col. 295-96.]
In response to the Joint Committee’s recommendation for an ombudsman, the Government said:
“An independent resolution mechanism such as an Ombudsman is relatively untested in areas of non-financial harm. Therefore, it is difficult to know how an Ombudsman service could function where user complaints are likely to be complex and where financial compensation is not usually appropriate. An Ombudsman service may also disincentivise services from taking responsibility for their users’ safety. Introducing an independent resolution mechanism at the same time as the new regime may also pose a disproportionate regulatory burden for services and confuse users … The Secretary of State will be able to reconsider whether independent resolution mechanisms are appropriate at the statutory review. Users will also already have a right of action in court if their content is removed by a service provider in breach of the terms and conditions. We will be requiring services to specifically state this right of action clearly in their terms and conditions”.
Delaying the implementation of an individual appeals process will simply increase the backlog of cases and allow the ripple effects of harm to go unreported, unaddressed and unaccounted for.
There is precedent for individual complaints systems, as I have mentioned, both in the UK and abroad. Frankly, the idea that an individual complaints process would disincentivise companies from taking responsibility does not hold water, given that these companies’ current appeal mechanisms are woefully inadequate. Users must not be left to the courts to have their complaints addressed. That route is cost-prohibitive for most and cannot be the only pathway to justice for victims, especially children.
To conclude, I have always personally vowed to speak up for those who endure horrific suffering and injustice at the hands of tormentors. I know how it feels to suffer the pain and trauma that come when systems that were set up to help are no longer fit for purpose. I therefore say this to my noble friend the Minister: nothing is too difficult if you really want to find a solution. The public have asked for this measure and there is certainly wide precedent for it. By denying individuals an appeals process, the Government simply encourage the tormentors and leave the tormented alone.