My Lords, it is a privilege to introduce Amendments 125A, 142, 161 and 184 in my name and those of the noble Lords, Lord Bethell and Lord Stevenson, and the right reverend Prelate the Bishop of Oxford. These amendments represent the very best of your Lordships’ House and, indeed, the very best of Parliament and the third sector because they represent an extraordinary effort to reach consensus between colleagues across the House including both opposition parties, many of the Government’s own Benches, a 40-plus group of Back-Bench Conservatives and the Opposition Front Bench in the other place. Importantly, they also enjoy the support of the commercial age check sector and a vast array of children’s charities and, in that regard, I must mention the work of Barnardo’s, CEASE and 5Rights, which have really led the charge.
I will spend the bulk of my time setting out in detail the amendments themselves, and I will leave my co-signatories and others to make the arguments for
them. Before I do, I once again acknowledge the work of the noble Baroness, Lady Benjamin, who has been fighting this fight for many years, and the noble Baroness, Lady Harding, whose characteristic pragmatism was midwife to the drafting process. I also acknowledge the time spent talking about this issue with the Secretary of State, the noble Lord the Minister and officials at DSIT. I thank them for their time and their level of engagement.
Let me first say a few words about age assurance and age verification. Age assurance is the collective term for all forms and levels of age verification, which means an exact age, and age estimation, which is an approximate or probable age. Age assurance is not a technology; it is any system that seeks to achieve a level of certainty about the age or age range of a person. Some services with restricted products and services have no choice but to have the very highest level of assurance or certainty—others less so.
To be clear at the outset, checking someone’s age, whether by verification or estimation, is not the same as establishing identity. While it is absolutely the case that you can establish age as a subset of establishing someone’s identity, the reverse is not necessarily true. Checking someone’s age does not need to establish their identity.
Age assurance strategies are multifaceted. As the ICO’s guidance in the age-appropriate design code explains, online services can deploy a range of methods to achieve the necessary level of certainty about age or age range. For example, self-verification, parental authentication, AI estimation and/or the use of passports and other hard identifiers may all play a role in a single age assurance strategy, or any one of them may be a mechanism in itself in other circumstances. This means that the service must consider its product and make sure that the level of age assurance meets the level of risk.
Since we first started debating these issues in the context of the Digital Economy Act 2017, the technology has been transformed. Today, age assurance might just as effectively be achieved by assessing the fluidity of movement of a child dancing in a virtual reality game as by collecting their passport. The former is over 94% accurate within five seconds and is specific to that particular child, while a passport may be absolute but less reliable in associating the check with a particular child. So, in the specific context of that dancing child, it is likely that the former gives the greater assurance. When a service’s risk profile requires absolute or near absolute certainty—for example, any of the risks that are considered primary priority harms, including, but not limited to, pornography—having the highest possible level of assurance must be a precondition of access.
Age assurance can also be used to ensure that children who are old enough to use a service have an age-appropriate experience. This might mean disabling high-risk features such as hosting, livestreaming or private messaging for younger children, or targeting child users or certain age groups with additional safety, privacy and well-being interventions and information. These amendments, which I will get to shortly, are designed to ensure both. To achieve the levels of certainty and privacy which are widely and rightly
demanded, the Bill must both reflect the current state of play and anticipate nascent and emerging technology that will soon be considered standard.
That was a long explanation, for which I apologise, but I hope it makes it clear that there is no single approach, but, rather, a need to clearly dictate a high bar of certainty for high-risk services. A mixed economy of approaches, all geared towards providing good outcomes for children, is what we should be promoting. Today we have the technology, the political will and the legislative mechanism to make good on our adult responsibilities to protect children online. While age assurance is eminently achievable, those responsible for implementing it and, even more importantly, those subject to it need clarity on standards; that is to say, rules of the road. In an era when data is a global currency, services have shown themselves unable to resist the temptation to repurpose information gleaned about the age of their users, or to facilitate access to industrial amounts of material harmful to children for commercial gain. As with so many of tech’s practices, this has eroded trust and heightened the need for absolute clarity on how services build their age-assurance systems, what they do—and do not do—with the information they gather, and the efficacy and security of the judgments they make.
Amendment 125A simply underlines the point made frequently in Committee by the noble Baroness, Lady Ritchie of Downpatrick, that the Bill should make it clear that pornography should not be judged by where it is found but by the nature of the material itself. It would allow Ofcom to provide guidance on pornographic material that should be behind an age gate, either in Part 3 or Part 5.
Amendment 142 seeks to insert a new clause setting out matters that Ofcom must reflect in its guidance for effective age assurance; these are the rules of the road. Age assurance must be secure and maintain the highest levels of privacy; this is paramount. I do not believe I need to give examples of the numerous data leaks but I note the excessive data harvesting undertaken by some of the major platforms. Age assurance must not be an excuse to collect users’ personal and sensitive information unnecessarily, and it should not be sold, stored or used for other purposes, such as advertising, or offered to third parties.
Age assurance must be proportionate to the risk, as per the results of the child risk assessment, and let me say clearly that proportionality is not a route to allow a little bit of porn or a medium amount of self-harm, or indeed a lot of both, to a small number of children. In the proposed new clause, proportionality means that if a service is high-risk, it must have the highest levels of age assurance. Equally, if a service is low-risk or no-risk, it may be that no age assurance is necessary, or it should be unobtrusive in order to be proportionate. Age-assurance systems must provide mechanisms to challenge or change decisions, to ensure that everyone can have confidence in their use and that they do not keep individuals—adults or children—out of spaces they have the right to be in. Age assurance must be inclusive and accessible, so that children with specific accessibility needs are considered at the point of its design, and it must provide meaningful information so that users
can understand the mode of operation. I note that the point about accessibility is of specific concern to the 5Rights young advisers. Systems must be effective. It sounds foolish to say so, but look at where we are now, when law in the US, Europe, the UK and beyond stipulates age restrictions and they are ignored to the tune of tens of millions of children.
Age assurance is not to rely solely on the user to provide information; a tick box confirming “I am 18” is not sufficient for any service that carries a modicum of risk. It must be compatible with the following laws: the Data Protection Act, the Human Rights Act, the Equality Act and the UNCRC. It must have regard to the risks and opportunities of interoperable age assurance, which, in the future, will see these systems seamlessly integrated into our services, just as opening your phone with your face, or using two-factor authentication when transferring funds, are already normalised. It must consult with the Information Commissioner and other persons relevant to technological expertise and an understanding of child development.
On that point, I am in full support of the proposal from the noble Lord, Lord Allan, to require Ofcom to produce regular reports on age-assurance technology, and see his amendment as a necessary companion piece to these amendments. Importantly, the amendment stipulates that the guidance should come forward in six months and that all systems of age assurance, whether estimated or verified, whether operated in-house or by third-party providers, and all technologies must adhere to the same principles. It allows Ofcom to point to technical standards in its guidance, which I know that the ISO and the IEEE are currently drafting with this very set of principles in mind.
6.15 pm
Amendment 161, which I promise I will get through a little more quickly, simply sets out the need for any regulated service to have an appropriate level of confidence in the age or age range of a child relative to risk. It makes clear under what circumstances an age-assurance strategy is required and that any methodology is permitted provided it is adequate to the risk inherent in the service and meets Ofcom’s guidance, which I have already spoken to. Paragraph 3 of the proposed new schedule specifies that the highest standard of age assurance is required for pornographic services covered by Part 5; that is, they must confirm beyond reasonable doubt that the user is not a child. It also makes provision for auditing systems that age-check children. Paragraph 4 deals expressly with pornography accessed via Part 3 services and, crucially, it requires the same high bar of age assurance to access pornography.
I pause for a moment to underline the fact that the impact on children from pornography, which I know other noble Lords will talk to, is not lessened by the route by which they access it. Arguably, pornography that a child sees in the context of a Part 3 service of news, chatter and shopping is normalised by that context and, therefore, worse. So while we are clear that a Part 3 service must put material that meets the definition of porn in Clause 70(2) behind an age gate, we are not, as some would suggest, age-gating the internet.
Paragraph 5 of the proposed new schedule makes it clear that a company must consider all parts of the service separately; for example, those that are high-risk may require a higher level of age assurance than those that are not. Paragraph 6 makes it clear that existing users, as well as new users, should be given the benefit of the schedule. Paragraph 7 refers to the definition and paragraph 8 is a commencement clause that requires this coming into effect within 12 months of the Act receiving Royal Assent, a subject to which I will return in a moment.
Amendments 142 and 161, together and separately, give the services, Ofcom and children the very best chance of introducing effective, privacy-preserving age verification and estimation. There will be no more avoiding, no more excuses, no more using age checking as a data point for commercial purposes. While the Bill requires age assurance under certain circumstances, age checking as a concept is not brought in by this Bill. It is already widely demanded by US, EU and UK laws, but it is poorly done and largely unregulated, so we continue to see children in their millions accessing platforms they are too young to be on and children who are 13, but do not yet have adult capacity, being offered services designed for adults which do not account for their vulnerabilities.
That brings me to the idea of a commencement clause in both amendments. The failure to implement Part 3 of the DEA means there is simply no trust left in the community that the Government will do as they say or even that the Online Safety Act will do as it says. That is not helped by the repealing of Part 3 last week, hidden in a group of government amendments on devolution. Putting a time limit will, if history repeats itself, allow campaigners to go to the courts.
I am encouraged by indications that the Government will bring forward their own amendments and by their willingness to put some of this in the Bill; however, I must express a certain level of frustration at the tendency to reject what we have written in favour of something that clearly does less. I struggle to see why we need to argue that age-assurance systems should be secure, that they should not use data for other purposes, that users should have a way of challenging age-assurance decisions, or that they should be inclusive and accessible and take account of a child’s need to access certain information. These are not aspirational; they are critical interventions needed to make age assurance workable. They have had a great deal of expert input over many years, so much so that they have been adopted voluntarily by the commercial age check sector. More importantly, without a transparent and trusted system of age assurance, all the provisions in the Bill aimed at ensuring that children have heightened protections will fall down.
The bereaved parents of Olly, Molly, Breck, Frankie and Sophie have come to your Lordships’ House to urge noble Lords to back these amendments. As they said when meeting Minister Scully, each of them had parental controls, each of them reported problems to companies, schools and/or the police—and still their children are dead. We need this regime for self-harm and pro-suicide material as much as we need it for pornography. If it were not against parliamentary rules, I would get down on my knees and beg the
Minister to go back to the department and say that these amendments must be passed with the full force of their meaning. This is robust, practical and much needed to make the Bill acceptable to adults and safe for children.
Amendment 184, also in my name, which seeks to establish the age of porn performers, will be spoken to by others at greater length, but I take the opportunity to tell your Lordships’ House that this is already the case in the United States and it works very well. I suggest we follow suit. Finally, this group should be seen as a companion piece to the harms schedule put forward by the same group of noble Lords, and which has the full support of all those groups and people I mentioned at the outset. The scale and range of expertise that has gone into this package of amendments is astonishing and between them, they would go a very long way to ensure that companies do not profit from purveying harms to children. I beg to move.