
Online Safety Bill

My Lords, I will speak to my Amendment 232, as well as addressing issues raised more broadly by this group of amendments. I want to indicate support from these Benches for the broader package of amendments spoken to so ably by the noble Baroness, Lady Kidron. I see my noble friend Lord Clement-Jones has returned to check that I am following instructions during my temporary occupation of the Front Bench.

My comments will focus on an aspect that I think we have not talked about so much in this debate: age assurance in the context of general purpose, user-to-user and search services, the so-called Part 3 services (because we like to use confusing language in this Bill), rather than the dedicated pornography sites about which other noble Lords have spoken so powerfully. We have heard a number of contributions on that, and we have real expertise in this House, not least from my noble friend Lady Benjamin.

In the context of age assurance more generally, I start with a pair of propositions that I hope all participants in the debate will agree to, and which build on what I thought was a very balanced and highly informative introduction from the noble Baroness, Lady Kidron. The first proposition is that knowledge about the age of users can help all online platforms develop safer services than they could absent that information—a point made by the right reverend Prelate the Bishop of Oxford earlier. The second is that there are always some costs to establishing age, including costs to the privacy of users and the friction they encounter when they wish to use a service. The task before us is to create mechanisms for establishing age that maximise the safety benefits to users while minimising the privacy and other costs. That is what I see laid out in the amendment that the noble Baroness, Lady Kidron, has put before us.

My proposed new clause seeks to inform the way we construct that balance by tasking Ofcom with carrying out regular studies into a broad range of approaches to age assurance. This thinking is complementary to that in Amendment 142, not an alternative to it. We may end up with varying views on exactly where that balance should be struck. Again, I am talking about general purpose services, many of which seek to prohibit pornography; whether or not they do so 100%, a different set of arguments applies to them from those that apply to services explicitly dedicated to pornography. We may come to different views about where we eventually strike the balance, but I think we probably have a good, shared understanding of the factors that should be in play. I certainly appreciate the conversations I have had with the noble Baroness, Lady Kidron, and others about that, and I think we have a common understanding of what we should be considering.

If we can get this formulation right, age assurance may be one of the most significant measures in the Bill in advancing online safety, but if we get it wrong, I fear we may create a cookie banner scenario, such as the one I warned about at Second Reading. That is my shorthand for a regulatory measure that brings significant costs without delivering its intended benefits. However keen we are to press ahead, we must keep in mind that we do not want legislation that is well intended but fails to have the beneficial effect that everyone in this Committee wants.

Earlier, the noble Baroness, Lady Harding, talked about the different roles that we play. I think mine is to try to think about what will actually work, and whether the Bill will work as intended, and to try to tease out any grit in it that may get in the way. I want in these remarks to flag what I think are four key considerations that may help us to deliver something that is actually useful and avoid that cookie banner outcome, in the context of these general purpose, Part 3 services.

First, we need to recognise that age assurance is useful for enabling as well as disabling access to content—a point that the noble Baroness, Lady Kidron, rightly made. We rightly focus on blocking access to bad content, but other things are also really important. For example, knowing that a user is very young might mean that the reporting system gets to that user’s report within one hour, rather than the 24 hours for a regular report. Knowing that a young user is being contacted by an older user may trigger what is known as a grooming protocol. Certainly at Facebook we had that: if we understood that an older user was regularly contacting younger users, we could trigger a review of those accounts to understand whether something problematic was happening—something that the then Child Exploitation and Online Protection unit in the UK encouraged us to implement. A range of different things can then be enabled. The provision of information in terms that a 13 year-old would understand can be triggered if you know the age of that user.

Equally, perfectly legitimate businesses, such as alcohol and online gambling businesses, can use age assurance to make sure that they exclude people who should not have access to their products. We in this House are considering measures such as junk food advertising restrictions, which again depend on age being known, so that junk food which can legitimately be marketed to older people is not marketed to young people. In a sense, age assurance enables those businesses to be online because, absent the age-gating, they would struggle to meet their regulatory obligations.

Secondly, we need to focus on outcomes, using the risk assessment and transparency measures that the Bill creates for the first time. We should not lose sight of those. User-to-user and search services will have to do risk assessments and share them with Ofcom, and Ofcom now has incredible powers to demand information from them. Rather than asking, “Have you put in an age assurance system?”, we can ask, “Can you tell us how many 11 year-olds or 15 year-olds you estimate access the wrong kind of content?”, and, “How much pornography do you think there is on your service despite the fact that you have banned it?” If the executives of those companies mislead Ofcom or refuse to answer, there are criminal sanctions in the Bill.

The package for user-to-user and search services enables us to really focus on those outcomes and drill down. In many cases, that will be more effective. I do not care whether they have age-assurance type A or type B; I care whether they are stopping 99.9% of 11 year-olds accessing the wrong kind of content. Now, using the framework in the Bill, Ofcom will be able to ask those questions and demand the answers, for the first time ever. I think that a focus on outcomes rather than inputs—the tools that they put in place—is going to be incredibly powerful.

7.45 pm

The third consideration is quite a difficult one. We need to plan for actual behaviour, how it will change over time and how we adapt to that, rather than relying on assumptions that we make today. My experience is that actual behaviour often differs from what we expect. Cookie banners are an example. The regulator’s assumption was, “We’ll put these cookie banners in place, and it will be so off-putting that people will stop using cookies”. The reality is completely different: everyone has just carried on using cookies, and people simply click through. The behaviour has not matched up to expectations.

You can put a lot of these tools in place, such as age assurance and age-restricted services. If you build an age-restricted version of your service—YouTube Kids exists, along with many other kids’ services—then you can see whether or not it is going to be acceptable. If people are rejecting it, you need to adapt. There is no point saying, “Well, you should go and use YouTube Kids”, if people are signing up for it but finding it too restrictive and going elsewhere; we need to think about how we can adapt to that and work with it.

The reality today, as the right reverend Prelate the Bishop of Oxford noted, is that 60% of kids are on social media, and in many cases their parents have bought the phone and enabled that access. How do we deal with that? We cannot just bury our heads in the sand and ignore it; we have to adapt to that behaviour and think about what tools work in that environment.

My last point on adaptation is that we found, working in the industry, that sometimes the law incentivised ignorance, which is the worst possible outcome. We had age-estimation tools that allowed us to understand that children were 11 or 12. They may have passed an age check by providing ID that showed they were old enough, and their parents may have helped them, but we knew they were not. That knowledge itself created legal risk, so we would bury the knowledge. If kids are going on these online platforms, my view is that I would much rather the platforms use all the tools available—we should not discourage them from understanding that these are 11 year-olds—and we find a way to work with that and make the service as safe as possible. These are hard questions because, while we want the law to work in an absolute way, in practice people are very creative and will work around and through systems.

The final point about making all this work is understanding the key role of usability. I was struck by the compelling vision of the noble Baroness, Lady Kidron, of low-friction age assurance. There are issues of principle and practice when it comes to making sure that age assurance is usable. The argument of principle is that we as a free society do not want to create unnecessary friction for people accessing information online. We can put measures in place that impinge on freedom of expression where that is necessary and proportionate. With regard to all the arguments that we have heard about access to pornography sites, the case is of course clear and absolute—there is a strong case for putting in place very restrictive measures—but for other general purpose services that younger people may want to use to connect with family and friends, we need to tread quite carefully if we are putting in place things that might infringe on their rights. The UN Convention on the Rights of the Child also talks about the right to express oneself. We need to be careful about how we think that through, and usability is critical to that.

I share the noble Baroness’s vision, in a sense. A lot of it could happen at the phone level, when a parent sets up a phone for their 11 year-old. The phone knows that its user is an 11 year-old, and there is a system in place for the phone to tell the apps that are installed that the user is 11. That works in a smooth way and is potentially low friction but very effective. There are exceptions but, by and large, teenagers have their own phone and the phone is tied to them; if you know the age attached to the phone, then you have a highly reliable understanding of the age of the user. It is those kinds of low-friction measures that we should be looking for.

At the other end of the spectrum, if every app that you use asks you to upload your passport, that is not going to work. People will simply bypass it, or they will stick to the apps they already know and never install new ones. We would end up concentrating the market, and our friends at the Competition and Markets Authority would not be very pleased with that outcome. Usability matters for both a principled reason—not blocking access to legitimate services—and a pragmatic one: making sure we do not create an incentive to circumvent the system. One of the phrases the team doing age verification at Facebook would use was, “Our competition is lying”. If people do not like what you are doing, they will simply lie and find their way around it. We need to bear that in mind, even under the new regime.

If we are mindful of these practical considerations, we should be able to deliver very useful age-assurance tools. I hope the Minister will agree with that. I look forward to hearing the Government’s description of how they think all this will work, because that piece is missing from the debate. I know other noble Lords will have heard from Ofcom, which has started putting out information about how it thinks this will work, as has the age assurance community. The gap is that it is not clear what the Government think this future world will look like. I hope they can at least start to fill that gap today by explaining to us, in response to these amendments, their vision for the new world of age assurance, not just for the pornography sites but, critically, for all these other sites that millions of us use, which aim to be gateways not to pornography but to other forms of communication and information.

About this proceeding contribution

Reference: 830 cc811-5
Session: 2022-23
Chamber / Committee: House of Lords chamber