This is an incredibly important Bill. It has huge cross-party support and was subject to scrutiny by the Joint Committee, which produced a unanimous report; that unanimity shows the widespread feeling in both Houses and on both sides of this Chamber that we should legislate. I do feel, though, that I should respond to some of the remarks of the shadow Secretary of State, the hon. Member for Manchester Central (Lucy Powell), on the Joint Committee report.
I agree with the hon. Member that, unless this legislation covers the systems of social media companies as well as the content they host, it will not be effective, but it is my belief that it does that. Throughout the evidence that the Committee took, including from Ofcom and not just the Government, it was stated to us very clearly that the systems of social media companies are within scope and that, in preparing the risk registers for the companies, Ofcom can consider the risks those systems create. For Facebook, that could include the fact that the news feed recommends content to users, while for someone on TikTok using For You, it could be the fact that the company is selecting, by algorithmically ranking, content that someone might like. That could include, for a teenage girl, content promoting self-harm that was being actively recommended by the company’s systems, or, as Frances Haugen set out, extremist content and hate speech being actively promoted and recommended by those systems.
That would be in scope. The algorithms are within scope, and part of Parliament’s job will be to ensure on an ongoing basis that Ofcom is using its powers to audit the companies in that way, to gain access to information in that way, and to say that the active promotion of regulated content by a social media company is an offence. In passing this Bill, we expect that that will be fully in scope. If the legislation placed no obligation on a company proactively to identify copies of content that it had already judged should not be there and had taken down, we would have a very ineffective system. In effect, we would have what Facebook does to assess content today. If that were effective, we would not need this legislation, but it is woefully ineffective, so the algorithms and the systems are in scope. The Bill gives Ofcom the power to regulate on that basis, and we have to ensure that it does so in preparing the risk registers.
Following what my Joint Committee colleague, the hon. Member for Bristol North West (Darren Jones), said, the point about the codes of practice is really important. The regulator sets the codes of practice for companies to follow. The Government set out in their response to the Joint Committee report that the regulator can tell companies if their response is not adequate. If an area of risk has been identified where the company has to create policies to address that risk and the response is not good enough, the regulator can still find the company in breach. I would welcome it if the Minister wished to say more about that, either today or as the Bill goes through the House, because it is really important. When the regulator has identified a risk on a company’s platforms, the company’s response cannot be: “Oh, sorry, we don’t have a policy on that.” It has to be able to set those policies. We have to go beyond just enforcing the terms of service that companies have created for themselves. Making sure they do what they say they are going to do is really important, as the Secretary of State said, but we should be able to push them to go further.
I agree, though, with the hon. Member for Manchester Central and other hon. Members about regulation being based on risk and not just size. In reality, Ofcom will have to make judgment calls on smaller sites that are posing a huge risk or a new risk that has been identified.
The regulator will have the power to regulate Metaverse and VR platforms. Anything that is a user-to-user service is already in scope of the legislation. The challenge for the regulator will be in moderating conversations between two people in a virtual room, which is much harder than when people are posting text-based content. The technology will have to adapt to do that, but we should start that journey based on the fact that that is already in scope.
Finally, on the much-used expression “legal but harmful”, I am pleased the Government took one of our big recommendations, which is to write more offences clearly into the Bill, so that it is clear what is actually being regulated: the promotion of self-harm is regulated content, and hate speech is part of the regulated content. The job of the regulator then is to set the threshold at which intervention should come, and I think that should be based on case law. On many of these issues, such as the abuse of the England footballers after the final of the European championship, people have been sentenced in court for what they did. That creates good guidance and a good baseline for what hate speech is in that context and where we would expect intervention. I think it would be much easier for the Bill, for the services that are regulated and for the people who post content to know what the offences are and where the regulatory standard is. Rather than describing those things as “legal but harmful”, we should describe them as what they are: regulated offences based on existing offences in law.
The Government took an important step in their response by saying that, in seeking amendments to the codes of practice that bring new offences within the scope of these priority areas of harm, they should have to go through an affirmative procedure in both Houses. That is really important. Ultimately, the regulation should be based on our laws, and changes should be based on decisions taken in this House.