
Online Safety Bill

My Lords, I refer the Committee to my interests as put in the register and declared in full at Second Reading. I will speak to Amendment 2 in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Harding, to Amendments 3 and 5 in my name, and briefly to Amendments 19, 22, 298 and 299 in the name of the noble Baroness, Lady Harding.

The digital world does not have boundaries in the way that the Bill does. It is an ecosystem of services and products that are interdependent. A user journey is made up of incremental signals, nudges and enticements that mean that, when we use our devices, we very often do not end up where we intended. The current scope covers user-to-user, search and commercial porn services, but a blog or website that valorises self-harm and depression or suggests starving yourself to death is still exempt because it has limited functionality. So too are games without a user-to-user function, in spite of the known harm associated with game addiction highlighted recently by Professor Henrietta Bowden-Jones, national expert adviser on gambling harms, and by the World Health Organization, which in 2019 designated gaming disorder as a behavioural addiction.

There is also an open question about immersive technologies, whose protocols are still very much in flux. I am concerned that the Government are willing to assert that these environments will meet the bar of user-to-user when those that are still building immersive environments make quite clear that that is not a given. Indeed, later in Committee I will be able to demonstrate that already the very worst harms are happening in environments that are not clearly covered by the Bill.

Another unintended consequence of the current drafting is that the task of working out whether you are on a regulated or unregulated service is left entirely to children. That is not what we had been promised. In December the Secretary of State wrote in a public letter to parents,

“I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders”.

It is likely that the Minister will suggest that limited-functionality services will be caught by the gatekeepers. But, as in the case of immersive technology, it is dangerous to suggest that, just because search and user-to-user are the primary access points in 2023, that will remain the case. We must be more forward-thinking and ensure that services that are likely to be accessed by children and that promote harm are in scope by default.

Amendments 3 and 5 are consequential, so I will not debate them now. I have listened to the Government and come back with a reasonable and implementable amendment that applies only to services that are likely to be accessed by children and that enable harm. I now ask the Government to listen and do likewise.

Amendments 92 and 193 cover the child user condition. The phrase “likely to be accessed”, introduced in this House into what became the Data Protection Act 2018, is one of the most unlikely successful British exports. Both the phrase and its definition, set out by the ICO, have been embedded in regulations in countries the world over—yet the Bill replaces this established language while significantly watering down the definition.

The Bill requires “a significant number of children” to use the service, or for the service to be “likely to attract a significant number of users who are children”.

“Significant” in the Bill is defined relative to the overall UK user base, which means that extremely large platforms could deem a few thousand child users not significant compared with the several million-strong user base. Since only services that cross this threshold need comply with the child safety duties, thousands of children will not benefit from the safety duties that the Minister told us last week were at the heart of the Bill.

Amendment 92 would put the ICO’s existing and much-copied definition into the Bill. It says a service is “likely to be accessed by children” if “the service is designed or intended for use by children … children form a substantive and identifiable user group … the possibility of a child accessing the service is more probable than not, taking into consideration … the nature and content of the service and whether that has particular appeal for children … the way in which the service is accessed and any measures in place to prevent children gaining access … market research, current evidence on user behaviour, the user base of similar or existing services” that are likely to be accessed.

Having two phrases and two definitions is bad for business and even worse for regulators. The ICO has first-mover advantage and a more robust test. It is my contention that parents, the media and perhaps even our own colleagues would be shocked to learn that the definition in the Bill could leave many thousands, possibly tens of thousands, of children without the protections that the Bill brings forward. Perhaps the Minister could explain why the Government have not chosen regulatory alignment, which is good practice.

Finally, I will speak briefly in support of Amendments 19, 22, 298 and 299. I am certain that the noble Baroness, Lady Harding, will spell out how the app stores of Google and Apple are simply a subset of “search”, in that they are gatekeepers to more than 5 million apps worldwide and the first page of each is indeed a search function. Their inclusion should be obvious, but I will add a specific issue about which I have spoken directly with both companies and about which the 5Rights Foundation, of which I am chair, has written to the ICO.

When we looked at the age ratings of apps across the Google Play Store and Apple’s App Store, four things emerged. First, apps are routinely given age ratings far lower than their own terms and conditions stipulate: for example, Amazon Shopping’s terms say 18, but it has an age rating of 4 on Apple. This pattern holds across both platforms, covering social sites, gaming, shopping, et cetera.

Secondly, the same apps and services did not have the same age rating across the two services, which, between them, are gatekeepers for more than 95% of the app market. In one extreme case, an app rated 4 on one was rated 16 on the other, and other significant anomalies were extremely frequent.

Thirdly, almost none of the apps considered their data protection duties in coming to a decision on their age rating, which is a problem, since privacy and safety are inextricably linked.

Finally, in the case of Apple, using a device registered to a 15-year-old, we were able to download age-restricted apps, including a dozen or more 18-plus dating apps. In fairness, I give a shout-out to Google, which, because of the age-appropriate design code, chose more than a year ago not to show 18-plus content to children in its Play Store. So this is a political and business choice, not a question of technology. Millions of services are accessed via these app stores. Given the Government’s position that gatekeepers have specific responsibilities in relation to harmful content and activity, surely the amendments in the name of the noble Baroness, Lady Harding, are necessary.

My preference was for a less complicated Bill based on principles and judged on outcomes. I understand that that ship has sailed, but it is not acceptable for the Government now to use the length and complexity of the Bill as a reason not to accept amendments that would fill loopholes where harm has been proven. It is time to deliver on the promises made to parents and children, and to put the onus for keeping young people safe online squarely on tech companies’ shoulders. I beg to move.

About this proceeding contribution

Reference: 829 cc1112-5
Session: 2022-23
Chamber / Committee: House of Lords chamber