Online Safety Bill

My Lords, the noble Lord, Lord Allan of Hallam, hinted at the fact that there have been a plethora of government amendments on Report and, to be honest, it has been quite hard fully to digest most of them, let alone scrutinise them. I appreciate that the vast majority have been drawn up with opposition Lords, who might have found it a bit easier. But some have snuck in and, in that context, I want to raise some problems with the amendments in this group, which are important. I, too, am especially worried about that government amendment on facilitating remote access to services and to equipment used by services. I am really grateful to the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, for tabling Amendment 247B, because I did not know what to do—and they did it. At least it raises the issue to the level of it needing to be taken seriously.

The biggest problem that I had when I originally read this provision was that facilitating remote access to services, and to as yet undefined equipment used by a service, seems like a very big decision, and potentially a disproportionate one. It certainly has the potential for regulatory overreach, and it creates real risks around privacy. It feels as though the Government have not even flagged up strongly enough what it could mean.

I listened to what the Minister said, but I still do not fully understand why this is necessary. Have the Government considered the privacy and security implications that have already been discussed? Through Amendment 252A, the Government now have the power to enter premises for inspection—it rather feels as if there is the potential for raids, but I will put that to one side. They can go in, order an audit and so on. Remote access as a preliminary way to gather information seems heavy-handed. Why not leave it as the very last thing to do in a dialogue between Ofcom and a given platform? We have yet to hear a proper justification of why Ofcom would need this as a first-order thing to do.

The Bill does not define exactly what

“equipment used by the service”

means. Does it extend to employees’ laptops and phones? If it extends to external data centres, have the Government assessed the practicalities of that and the substantial security implications, as have been explained, for the services, the data centre providers and those of us whose data they hold?

I am also concerned that this will force companies to think very hard about their internal privacy and security controls to deal with the possibility of remote access, and that this will place a disproportionate burden on smaller and mid-sized businesses that do not have the resources available to the biggest technology companies. I keep raising this because in other parts of government there is a constant attempt to say that the UK will be the centre of technological innovation and that we will be a powerhouse in new technologies, yet I am concerned that so much of the Bill could damage that innovation. That is worth considering.

It seems to me that Amendment 252A on the power to observe at the premises ignores decentralised projects and services—the very kind of services that can revolutionise social media in a positive way. Not every service is like Facebook, but this amendment misses that point. For example, you will not be able to walk into the premises of the UK-based Matrix, the provider of the encrypted chat service Element that allows users to host their own service. Similarly, the non-profit Mastodon claims to be the largest decentralised social network on the internet and to be built on open-web standards precisely because it does not want to be bought or owned by a billionaire. So many of these amendments seem not to take those issues into account.

I also have a slight concern about researcher access to data. When we discussed this in Committee, the tone was very much—as it is in these amendments now—that these special researchers need to be able to find out what is going on in these big, bad tech companies that are trying to hide dangerous information away from us. Although we are trying to ensure that there is protection from harms, we do not want to demonise the companies so much that, every time they raise privacy issues or say, “We will provide data but you can’t access it remotely” or “We want to be the ones deciding which researchers are allowed to look at our data”, we assume that they are always up to no good. That sends the wrong message if we are to be a tech-innovative country or if there is to be any working together.

5.15 pm

My final point is to be a bit more positive. I am very keen on the points made by the Minister on the importance of transparency in algorithms, particularly in Amendments 196 and 199. This raises an important point. These amendments are intended to mean that providers of user-to-user services and search services would have to include in their transparency reports details about their algorithms, so that we can see how they work, particularly in relation to illegal content and content that is harmful to children. I should like that to be understood more broadly, because for me there is a constant tension where people do not know what the algorithms are doing. When content is removed, deboosted or whatever, they do not know why. More transparency there would be positive.

The Minister knows this, because I have written to him on the subject, but many women, for example, are regularly being banned from social media for speaking out on sex-based rights, and gender-critical accounts constantly tell me, and discuss among themselves, that they have been shadow banned: that the algorithms are not allowing them to get their points across. This is, it is alleged, because of the legacy of trans activists controlling the algorithms.

Following on from the point of the noble Lord, Lord Allan of Hallam, there is always a danger here of people being conspiratorial, paranoid and thinking it is the algorithms. I made the point in an earlier discussion that sometimes you might just put up a boring post and no one is interested, but you imagine someone behind the scenes. But we know that Facebook continues to delete posts that state that men cannot be women, for example.

I would like this to be demystified, so the more Ofcom can ask the companies to demystify their algorithmic decisions and the more users can be empowered to know about it, the better for all of us. That is the positive bit of the amendments that I like.

About this proceeding contribution

Reference: 831 cc2078-2081
Session: 2022-23
Chamber / Committee: House of Lords chamber