My Lords, I propose Amendment 14 on behalf of my noble friend Lord Clement-Jones and the noble Lord, Lord Hunt of Kings Heath, who are unable to be present today due to prior commitments. I note that the amendment has also been signed by the noble Baroness, Lady Fox, who I am sure will speak to it herself. I shall speak to the group of amendments as a whole.
I shall need to speak at some length to this group, as it covers some quite complex issues, even for this Bill, but I hope that the Committee will agree that this is appropriate given the amendments’ importance. I also expect that this is one area where noble Lords are receiving the most lobbying from different directions, so we should do it justice in our Committee.
We should start with a short summary of the concern that lies behind the amendments: that the Bill, as drafted, particularly under Clause 110, grants Ofcom the power to issue technical notices to online services that could, either explicitly or implicitly, require them to remove privacy protections—and, in particular, that this could undermine a technology increasingly deployed on private messaging services: end-to-end encryption. The amendments in this group use various mechanisms to reduce the likelihood of that outcome. Amendments 14 and 108 seek to make it clear in the Bill that end-to-end encryption would be out of scope—and, as I understand it, Amendment 205, tabled by the noble Lord, Lord Moylan, seeks to do something similar.
A second set of amendments would add extra controls over the issuing of technical notices. While not explicitly saying that these could not target E2EE—if noble Lords will excuse the double negative—they would make that outcome less likely. They include a whole series of amendments—Amendments 202 and 206, tabled by the noble Lord, Lord Stevenson, and Amendment 207—that have the effect of ensuring that there is more scrutiny of, and input into, the issuing of such a notice.
The third set of amendments aims to ensure that Ofcom gives weight to privacy more generally, in all the actions it takes. In particular, Amendment 190 provides for a broader privacy duty, and Amendment 285—which I think the noble Lord, Lord Moylan, will be excited about—seeks to restrict general monitoring.
I will now dig into why this is important. Put simply, there is a risk that under the Bill a range of internet services will feel that they are unable to offer their products in the UK. This speaks to a larger question as we debate the Bill’s measures: it can sometimes feel as though we are comfortable ratcheting up its requirements on the assumption that services will have no choice but to meet them and carry on. While online services will not have a choice about complying if they wish to be lawfully present in the UK, they will be free to exit the market altogether if they believe that the requirements are excessively onerous or impossible to meet.
In the Bill, we are constructing, in effect, a de facto licensing mechanism, where Ofcom will contact in-scope services—the category 2A, category 2B, Part 3 and Part 5 services we discussed in relation to the previous group of amendments—will order them to follow all the relevant regulation and guidance and will instruct them to pay a fee for that supervision. We have to consider that some services, on receipt of that notice, will take steps to restrict access by people in the UK rather than agree to such a licence. Where those are rogue services, this reaction is consistent with the aims of the Bill. We do not want services which are careless about online safety to be present in the UK market. But I do not believe that it is our aim to force mainstream services out of the UK market and, if there is a chance of that happening, it should give us pause for thought.
As a general rule, I am not given to apocalyptic warnings, but I believe there is a real risk that some of the concerns that noble Lords will be receiving in their inboxes are genuine, so I want to unpick why that may be the case. We should reflect for a moment on the assumptions we may have about the people involved in this debate and their motivations. We often see tech people characterised as oblivious to harms, and security services people as uncaring about human rights. In my experience, both caricatures are off the mark: tech people hate to see their services abused, and security service representatives understand that they need to be careful about how they exercise the great powers we have given them. We should note that, much of the time, those two communities work well together in spaces such as the Global Internet Forum to Counter Terrorism.
If this characterisation is accurate, why do I think we may have a breakdown over the specific technology of end-to-end encryption? To understand this subject, we need to spend a few moments looking at trends in technology and regulation over recent years. First, we can look at the growth of content-scanning tools, which I think may have been in the Government’s mind when they framed and drafted the new Clause 110 notices. As social media services developed, they had to consider the risks of hosting user-uploaded content on their services. That content could be illegal in all sorts of ways, including serious forms, such as child sexual abuse material and terrorist threats, as well as things such as copyright infringement, defamatory remarks and so on. Platforms have strong incentives to keep that material off their servers for both moral and legal reasons, so they began to develop and deploy a range of tools to identify and remove it. As a minimum, most large platforms now deploy systems to capture child sexual abuse material and copyright-infringing material, using technologies such as PhotoDNA and Audible Magic.
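To make that mechanism concrete, here is a minimal illustrative sketch in Python of the hash-matching flow these tools perform. Everything in it is hypothetical: the digest is a placeholder of my own, and real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding, where the plain SHA-256 used here catches only byte-identical copies.

```python
import hashlib

# Hypothetical sketch of matching an upload against known-bad digests
# supplied by a hash-sharing body. Real deployments (e.g. PhotoDNA) use
# perceptual hashes; SHA-256 is a simple stand-in for illustration.
KNOWN_BAD_DIGESTS = {
    hashlib.sha256(b"example known-bad file").hexdigest(),  # placeholder
}

def scan_upload(data: bytes) -> bool:
    """Return True if the uploaded bytes match a known-bad digest."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_DIGESTS
```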
I stress again that, in the context of our debate on these amendments, a key element in the rationale for deploying these tools voluntarily—not because they are required to do so by law—is the fact that social media services are acting as hosts for content on their servers, so they feel partially liable for it; in fact, in legal terms, they may well be strictly liable for it. By contrast, modern private messaging services tend to have quite a different architecture, where the provider does not host content on its servers but simply moves it from one device on the network to another. There are some exceptions to that with legacy services, such as Facebook Messenger and the caching of large files—we could go into that subject, if noble Lords are interested. But the key point is that there has been a trend towards more functionality at the edge—namely, on the device in your pocket—as we move from classic social media, which depended on servers, to messaging. That distinction is critical when we consider what is commonly referred to as client-side scanning. The scanning that takes place today generally takes place on platform servers, on content the platforms are hosting themselves. The introduction of scanning on to people’s own devices is a different beast in technical, legal and ethical terms; I am sure we will want to tease that out in the debate.
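To illustrate that distinction, here is a second hypothetical sketch in the same vein as the one above, showing where the scan sits in each architecture. Every function name here is invented for illustration and is not any vendor’s actual pipeline.

```python
# Hypothetical contrast between the two architectures. scan() stands in
# for a hash or classifier check of the kind sketched earlier.
def scan(data: bytes, known_bad: set[bytes]) -> bool:
    return data in known_bad  # illustrative stand-in for a real check

def server_side_flow(upload: bytes, known_bad: set[bytes]) -> str:
    # Classic social media: the platform hosts the file on its own
    # servers, so it can scan its own copy before publishing it.
    return "blocked and reported" if scan(upload, known_bad) else "published"

def client_side_flow(message: bytes, known_bad: set[bytes]) -> str:
    # Client-side scanning: the check runs on the sender's own device,
    # before encryption, because with E2EE the server never sees the
    # plaintext at all. That relocation is what makes it a different beast.
    if scan(message, known_bad):
        return "flagged on the user's own device"
    return "sent onward as ciphertext only"
```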
The second trend we have seen is the concern over government surveillance. Back in the day, we may have been comfortable with the security services having a desk in the telephone exchange or asking their mate Bob, who does the filing at some company, to pass them information about a dodgy character—but the landscape has shifted. The Snowden revelations triggered a huge debate about the reach of Governments into our online lives—even those whom we think are on our side, such as the UK Government or the US Government—and we are increasingly concerned about foreign surveillance at home, to the extent that we are willing to spend a fortune pulling Huawei devices out of core UK telecom networks to mitigate the risk of Chinese government access. If you think that a foreign Government have gained access to the UK’s telecom networks, using an end-to-end encrypted service is one of the best ways to protect yourself, which, I am sure, is on the minds of the technical staff of UK political parties when they choose to put their teams on encrypted apps such as WhatsApp.
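For noble Lords who want to see why that protection holds, here is a minimal sketch using the open-source PyNaCl library; the message itself is invented. The point is that the private keys are generated and kept on the two devices, so anyone with access to the network in between carries only ciphertext.

```python
from nacl.public import PrivateKey, Box

# Each device generates its own key pair; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob's *public* key. A network eavesdropper, however
# well placed in the telecom infrastructure, sees only this ciphertext.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"hello, Bob")

# Only Bob's device, holding his private key, can recover the plaintext.
assert Box(bob_key, alice_key.public_key).decrypt(ciphertext) == b"hello, Bob"
```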
Thirdly, there is a general trend in privacy expectations and legislation, which are all heading in one direction: improving transparency over what is being done with data and giving people more power to withhold or grant consent. This reflects the fact that more of our lives are moving online, so being able to control what happens to our information becomes more critical to us all. We see this trend playing out in multiple pieces of legislation, such as the general data protection regulation and the privacy regulation, as well as in actions taken by regulators to step up enforcement.
Far from being an irrational move by platforms careless as to its negative impacts, the adoption of end-to-end encryption is an entirely rational response to these three powerful regulatory and societal trends. It can help to mitigate the ever-increasing risks related to content liability—which the Bill, in fact, adds to—it makes hostile government surveillance much harder, and it is a key safeguard against privacy violations.
If this is where we have been, with regulation incentivising the adoption of end-to-end encryption, how might this play out as we introduce a new element into the mix with the Online Safety Bill? I can see three scenarios as the Bill comes into force and Ofcom gains powers to issue directions to platforms. First, the Government could declare that their intent is to impose technical requirements that would mean that people in the UK will no longer be able to use truly secure end-to-end encrypted products. That could be done either through explicit instructions to messaging service providers to remove the end-to-end encryption, or through requiring client-side scanning to be installed on user devices in the UK, which would, in effect, render them less secure. That is not my preferred option, but it would at least allow for an orderly transition if services choose to withdraw products from the UK market rather than operate here on these terms. It might be that there are no significant withdrawals, and the UK Government could congratulate themselves on calling the companies’ bluff and getting what they want at little cost, but I doubt that this would be the case, given the strength of feeling out there—which, I am sure, we have all seen. We would at least want to know, one way or the other, how that would go before adopting that course of action.
The second course is for the Government to continue with the posture of intentional ambiguity, as they have done to date. They are careful to say that they have no intention of banning end-to-end encryption, and I expect to hear that again in the Minister’s response today, but at the same time refuse to confirm that they could not do so under the new powers in the Bill. This creates a high-stakes game of chicken, where the Government think companies will give them more if they hold the threat of drastic technical orders over them. That “more” might include providing more metadata—who messaged whom, when and from where—or tools to identify patterns of suspicious behaviour without reading message content. These are all things we can expect the Government to be discussing with companies, as well as urging them to deploy forms of client-side scanning voluntarily.
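To make concrete what that “more metadata” might mean in practice, here is a hypothetical sketch. The field names are mine, not any provider’s schema; the point to note is the absence of any content field at all.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical metadata record: who messaged whom, when and from where.
# The field names are illustrative; the key point is that there is no
# message-content field, because the body stays end-to-end encrypted.
@dataclass
class MessageMetadata:
    sender: str            # account identifier
    recipient: str
    sent_at: datetime
    coarse_location: str   # e.g. network-level location, not precise GPS

record = MessageMetadata("alice", "bob", datetime(2023, 5, 1, 12, 45), "London")
```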
As a veteran of a thousand psychic wars of this kind, I have to say that I do not think it is as productive a way to proceed as some in government may believe. It is all too common to have meetings with government representatives where you are working together on responding to terrorist content only to find a Minister going out the next day to say that your platform does not care about terrorism. I get it; this is politics. However, it is hard to explain to engineers who you are asking to go the extra mile to build new safety tools why they should do so when the Government who asked for the tools give them no credit for this. I understand the appeal from the government side of going into a negotiation with a big regulatory stick that you can show to the other side, but I think it is misguided.
The Government’s hope is that companies will blink first in the game of chicken and give them what they want, but it is at least as likely that the Government will blink first and have to abandon proposals, which risks discrediting their efforts as a whole. If nobody blinks, and we allow an unstoppable force to hit an immovable object, we could end up with the complete breakdown of key relationships and years of unproductive litigation. I believe that the interests of people in the UK lie in government being able to work with the services that millions of us use to find the best ways to combat harms—harms that everybody, on both sides, agree are a priority.
That brings me to my third and final scenario, and the one that these amendments are seeking to create. This is where the Government accept that end-to-end encrypted communication services are a legitimate part of the modern online environment that should not be undermined or pushed out of the UK market. The Government would explicitly rule out any intention to use orders under Clause 110 to weaken end-to-end encrypted services and instead focus their efforts on making it clear to people that end-to-end encryption does not mean impunity.
I was talking to my children as I came in about the fact that end-to-end encryption is not entirely secure and does not grant absolute privacy, and they said, “Of course—everyone should do the online safety classes we do at school”. These offer the simple message that it is foolish to send things over any internet service that you would not want to be shared widely, and the training tells you that any message can be screenshotted and passed around. Rather than talking up the fact that end-to-end encryption is protecting people sharing bad content, we should be talking up the ways in which you remain exposed.
Sadly, we have become used to reading stories about awful content being shared in groups on messaging services used by serving police officers—these were WhatsApp end-to-end encrypted messages. If there is a legitimate interest in investigating content, that interest will be served, whether or not the content is shared on an encrypted service. Unless people are communicating only with themselves, there are multiple ways that their content, if illegal, might come to the attention of the authorities. The most obvious is that someone who is privy to the content hands it over, either voluntarily or because they are themselves under investigation. But the police and security services also have a range of intrusive surveillance tools at their disposal which can compromise the devices of their targets under properly warranted authority, and all the content on any apps those targets use can be provided to the security services properly, under the controls in the Regulation of Investigatory Powers Act. There are long-standing powers, sometimes used controversially, to require people to grant access to their own devices if there are grounds to think it is necessary to investigate some types of offence.
I hope the Government will give serious consideration to moving in this direction and to accepting the force of the amendments that have been put forward today. This is not about weakening the fight against appalling harms such as child sexual abuse material and terrorism, but rather about finding the most practical way to wage that fight in a world where end-to-end encryption exists and is being used to mitigate other material risks to our online lives. I beg to move.