
Online Safety Bill

My Lords, I rise to introduce this group. On Tuesday, I said that having reached day 8 of the Committee we had all found our roles; now, I find myself in a different role. The noble Baroness, Lady Kidron, is taking an extremely well-earned holiday and is unable to be in the House today. She has asked me to introduce this group and specifically to speak to Amendment 125 in her name.

I strongly support all the amendments in the group, particularly those that would result in a review, but will limit my words to Amendment 125. I also thank the other co-signatories: the noble Baroness, Lady Finlay, who is in her place, and my noble friend Lord Sarfraz, who made such a compelling speech at Second Reading on the need for the Bill to consider emerging technologies but who is also, sadly, abroad on government business.

I start with something said by Lord Puttnam, and I paraphrase: we were forbidden from incorporating the word “digital” throughout the whole process of scrutiny of what became the Communications Act 2003. As a number of us observed at the time, he said, it was a terrible mistake not to address or anticipate these issues when it was obvious that we would have to return to them at some later date. The Online Safety Bill is just such a moment: “Don’t close your eyes and hope”, he said, “but look to the future and make sure that it is represented in the Bill”.

With that in mind, this amendment is very modest. I will be listening carefully, as I am sure the noble Baroness, Lady Kidron, will from a distance, to my noble friend the Minister, because if each aspect of this amendment is already covered in the Bill, as I suspect he will want to say, then I would be grateful if he could categorically explain how that is the case at the Dispatch Box, in sufficient detail that a future court of law can clearly understand it. If he cannot state that, then I will be asking the House, as I am sure the noble Baroness, Lady Kidron, would, to support the amendment’s inclusion in the Bill.

There are two important supporters of this amendment. If the Committee will forgive me, I want to talk briefly about each of them because of the depth of their understanding of these issues. The first is an enforcement officer whom I shall not name, but I and the noble Baroness, Lady Kidron, want to thank him and his team for the extraordinary work that they do, searching out child sexual abuse in the metaverse. The second, whom I will come to in a little while, is Dr Geoffrey Hinton, a pioneer of neural networks who is most often referred to as “the godfather of AI”, whom the noble Baroness, Lady Kidron, met last week. Both are firm supporters of this amendment.

The amendment is part of a grouping labelled future-proofing but, sadly, this is not in the future. It is with us now. Child sexual abuse in the metaverse is growing phenomenally. Two months ago, at the behest of the Institution of Engineering and Technology, the noble Baroness, Lady Kidron, hosted a small event at which members of a specialist police unit explained to colleagues from both Houses that what they were finding online was amongst the worst imaginable but was not adequately caught by existing laws. I should warn those listening to or reading this—I am looking up at the Public Gallery, where I see a number of young people listening to us—that I am about to briefly recount some really horrific material from what we saw and heard.

The quality of AI imagery is now at the point where a realistic AI image of a child can be produced. Users are able to produce or order indecent AI images based on a child known to them. Simply by uploading a picture of a next-door neighbour’s child or a family member, or taking a child’s image from social media and putting that face on existing abuse images, they can create a body for that picture or, increasingly, make it 3D and take it into an abuse room. The type of imagery produced can vary from suggestive or naked to penetrative sex; for the most part, I do not think I should repeat in this Chamber the scenarios that play out.

VR child avatars can be provided with a variety of bespoke abuse scenarios, which the user can then interact with. Tailor-made VR experiences are being advertised for production on demand. They can be made to meet specific fetishes or to feature a specific profile of a child. The production of these VR abuse images is a commercial venture. Among the many chilling facts we learned was that the Meta Quest 2 (formerly the Oculus Quest 2), which is the best-selling VR headset in the UK, links up to an app that is downloaded on to the user’s mobile phone. Within that app, the user can search for other users to follow and engage with—either through the VR headset or via instant messaging in their mobile app. A brief search through the publicly viewable user profiles on this app shows a huge number of profiles with usernames indicative of a sexual interest in children.

Six weeks after the event, the noble Baroness, Lady Kidron, spoke to the same officer. He said that the technology was already a generation on—in just six weeks. The officer made a terrible and terrifying prediction: he said that in a matter of months this violent imagery, based on and indistinguishable from an actual known child, would evolve to include moving 3D imagery, and that at that point the worlds of VR and AI would meet and herald a whole new phase in offending. I will quote this enforcement officer. He said:

“I hate to think where we will be in six months from now”.

While this group is labelled as future-proofing the Bill, I remind noble Lords that in six months’ time, the provisions of the Bill will not have been implemented. So this is not about the future; it is actually about the now.


Even though what I am describing is abhorrent, to some it may appear to be a victimless crime, or a thought crime that might take the place of real crimes, since it could be argued that nobody gets hurt. There are three points to make against that. First, evidence shows that rehearsing child-abuse fantasies online radically accelerates the offender pathway—the length of time between looking at images and abusing a child. Secondly, the relative anonymity of the online world has enabled and supercharged the spread of such content, and risks normalising its production and consumption. Thirdly, the current advances in AI allow perpetrators to create and share thousands of images of a child in a matter of minutes. That leaves the police overwhelmed with the impossible task of distinguishing between AI-created children and the real children who are being abused. Within that sheer volume of abuse imagery, real children can remain undiscovered and therefore unreached. This is a perverse and chilling game of whack-a-mole.

A small band of enforcement officers are crying out for our help because they are concerned that existing law does not reach this material and that blurring the role of machine and master risks undermining their ability to enforce the law. While Sections 62 to 69 and Schedule 13 of the Coroners and Justice Act 2009 go some way towards bringing certain computer-generated images into the scope of the law, much of the sexual offences law fails to reach the online world. As a result, the enforcement community is struggling to deal with the new generation of automated and semi-automated systems that create not only abuse images but abusive scenarios at the touch of a button. As the police officer explained to us, the biggest change required is the provision of specific offences covering virtual abuse in the VR social environment, to protect children in those areas against the psychological impact of virtual abuse.

This amendment makes a small change to the definition of “content”, to make clear that machine-generated content is to be regarded as user-generated content of a service, under the following circumstances: first, if the creation or use of the content interacts with user-generated content; secondly, if it takes the form or identity of a user; thirdly, if it provides content that would reach the bar of illegal primary priority content or priority content in another format; and finally, if a user has in any way facilitated any element of the generation by way of a command prompt or any other instruction, however minimal. This would go a long way to support the police in their unenviable task.

When my noble friend the Minister responds, I would ask that he confirm that the scope of the Bill—user-to-user services and search—does not fetter law enforcement. Earlier in Committee, when debating Amendment 2, we discussed services of limited functionality being out of scope. For example, would a person or an automated process creating this material at scale, with no user-to-user functionality, be out of scope? The concern must be that existing laws covering child sexual abuse do not address the current state of technology, and that this Bill may be drawn too narrowly to catch the abuse that is happening at ever-increasing scale.

Finally, this brings me to Dr Geoffrey Hinton. After a decade at Google, he retired and has chosen to speak freely about his profound worries concerning the future of AI, joining the chorus of those on the front line who are demanding that we regulate it before it is too late. I am a keen and enthusiastic early adopter of new technology, but we should listen very carefully to his concerns. He says that AI systems can learn and provide a compelling view of the world at such speed and scale that, in the hands of bad actors, they will in the very near future obliterate any version of a common reality. A deluge of fake images, videos and texts will be the data upon which future AI-driven communication will be built, leaving all of us unable to distinguish between fact and fiction. That is a very scary view of the world, and we should take his professional concern very seriously, particularly when we focus on this Bill and how we protect our children in this world.

Given the scope of the Bill, we obviously will not be able to address every one of the hopes or fears of AI as it stretches out ahead of us, but it is a huge mistake for the Online Safety Bill to pretend that this future is not already with us. In this amendment and the whole group, we are attempting to put in the Bill the requirements to recognise those future dangers. As Dr Hinton has made clear, it is necessary to treat the fake as if it were real today, because we are no longer certain what is fake and what is real. We do a disservice to our children if we do not recognise that reality today.

I appreciate that I have spoken for far too long on this very small amendment. It closes a loophole, ensuring that machine-generated material which imitates user-to-user behaviour, takes the form of a user, or would in another context meet the bar of illegal primary priority content or priority content is treated as such under the safety duties of the Bill. That is all it does. This would prevent the police standing by as the horrific rise in the use of abuse rooms—which act as a rehearsal for abusing children—continues. It is much needed and an essential first step down this road. I beg to move.

About this proceeding contribution

Reference: 830 cc989-993
Session: 2022-23
Chamber / Committee: House of Lords chamber