UK Parliament / Open data

Online Safety Bill

My Lords, this is again a very helpful set of amendments. I want to share some experience that shows that legality tests are really hard. From the outside there is often an assumption that it is easy to understand what is legal and illegal in terms of speech, but in practice that is very rarely the case. There is almost never a bright line, except for a small class of child sexual abuse material, which is always illegal: as soon as you see the material, you know it is illegal and you can act on it. In pretty much every other case, you have to look at what is in front of you.

I will take a very specific example. Something we had to deal with was images of Abdullah Öcalan, the leader of the PKK in Turkey. If somebody shared a picture of Abdullah Öcalan, were they committing a very serious offence, which is the promotion of terrorism? Were they indicating support for the peace process that was taking place in Turkey? Were they showing that they support his socialist and feminist ideals? Were they supporting the YPG, a group in Syria to which we were sending arms, that venerates him? This is one example of many I could give where the content in front of you does not tell you very clearly whether or not the speech is illegal or speech that should be permitted. Indeed, we would take speech like that down and I would get complaints, including from Members of Parliament, saying, “Why have you removed that speech? I’m entitled to talk about Abdullah Öcalan”, and we would enter into an argument with them.

We would often ask lawyers in different countries whether they could tell us whether a piece of speech was legal or illegal. The answer would come back as probably illegal, likely illegal, maybe illegal and, occasionally, definitely not illegal, but it was nearly always somewhere on that spectrum. The amendments we are proposing today are to try to understand where the Government intend people to draw that line when they get that advice. Let us assume the company wants to do the right thing, follow the instructions of the Bill and remove illegal content. At what level does it say the test has been sufficiently met, given that in the vast majority of cases, apart from that small class of clearly illegal content, it is going to be given only a likelihood or a probability? As the noble Lord, Lord Moylan, pointed out, we are trying to insert this notion of sufficient evidence with Amendments 273, 275, 277, 280 and 281 in the names of my noble friend Lord Clement-Jones and the noble Viscount, Lord Colville, who is unable to be in his place today. I think the noble Baroness, Lady Kidron, may also have signed them. We are trying to flesh out the point at which that illegality standard should kick in.
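To make the point concrete, the dilemma described above can be sketched in a few lines of code. This is a purely hypothetical illustration: the Bill sets no numeric threshold, and the mapping from forms of legal advice to probabilities below is an assumption for illustration only. It simply shows why "remove illegal content" still needs a decision rule when advice arrives as a likelihood rather than a yes or no.

```python
# Hypothetical sketch only: neither the threshold value nor the
# advice-to-probability mapping comes from the Bill or any platform.

def should_remove(likelihood_illegal: float, threshold: float = 0.7) -> bool:
    """Remove when the estimated likelihood of illegality meets or
    exceeds the platform's chosen threshold."""
    return likelihood_illegal >= threshold

# A rough, assumed mapping of the kinds of advice a platform receives:
advice = {
    "definitely illegal": 0.99,
    "probably illegal": 0.8,
    "likely illegal": 0.7,
    "maybe illegal": 0.4,
}

decisions = {label: should_remove(p) for label, p in advice.items()}
```

Wherever the threshold is set, some "maybe illegal" speech stays up and some legal speech comes down; the amendments ask the Government to say where on the spectrum that line is meant to sit.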

Just to understand how this often works when the law gets involved, there is a law in Germany; the short version of its name is NetzDG. If there are any German speakers who can pronounce the compound noun that is its full title, there will be a prize. It is a long compound word that means “network enforcement Act”. It has been in place for a few years and it tells companies to do something similar—to remove content that is illegal in Germany. There would be cases where we would get a report from somebody saying, “This is illegal”, and we would take action; then it went into the German system and three months later we would finally be told whether it was actually illegal, in a 12-page judgment that a German court had worked out. In the meantime, all we could do was work on our best guess while that process was going on. We need to be very clear that illegality is hard.

Cross-jurisdictional issues present us with another set of challenges. If both the speaker and the audience are in the United Kingdom, it is fairly clear. But in many cases on online platforms, one or other, or even both, of the speaker and the audience may be outside the United Kingdom. Again, when does the speech become illegal? It may be entirely legal speech between two people in the United States. I think—and I would appreciate clarification from the Minister—that the working assumption is that if the speech were reported by someone in the UK rather than in the United States, the platform would be required to restrict access to it from the UK, even though the speech is entirely legal in the jurisdiction in which it took place. Because the person in the UK encountered it, there would be a duty to restrict it. It has also been clarified that there is certainly not a duty to take the speech down, because it is entirely legal speech outside the UK. These cross-jurisdictional issues are interesting; I hope the Minister can clarify them.

The amendments also try to think about how this would work in practice. Amendment 287 talks about how guidance should be drawn up in consultation with UK lawyers. That is to avoid a situation where platforms are guessing too much at what UK lawyers want; they should at least have sought UK legal advice. That advice will then be fed into the guidance given to their human reviewers and their algorithms. That is the way, in practice, in which people will carry out the review. There is a really interesting practical question—which, again, comes up under NetzDG—about the extent to which platforms should be investing in legal review of content that is clearly against their terms of service.

There will be two kinds of platform. There will be some platforms that see themselves as champions of freedom of expression and say they will only remove stuff that is illegal in the UK, and everything else can stay up. I think that is a minority of platforms—they tend to be on the fringes. As soon as a platform gets a mainstream audience, it has to go further. Most platforms will have terms of service that go way beyond UK law. In that case, they will be removing the hate speech, and they will be confident that they will remove UK-illegal hate speech within that. They will remove the terrorist content. They will be confident and will not need to do a second test of the legality in order to be able to remove that content. There is a practical question about the extent to which platforms should be required to do a second test if something is already illegal under their terms.

There will be, broadly speaking again, four buckets of content. There will be content that is clearly against a platform’s terms, which it will want to get rid of immediately. It will not want to test it again for legality; it will just get rid of it.

There will be a second bucket of content that is not apparently against a platform’s terms but is clearly illegal in the UK. That is a very small subset of content. In Germany, that is Holocaust denial content; in the United Kingdom, this Parliament has looked at Holocaust denial and chosen not to criminalise it, so that will not be there, but an equivalent for us would be migration advice. Migration advice will not be against the terms of service of most platforms, but the Government’s intention in the Illegal Migration Bill is to make it illegal and require it to be removed, and the consequential effect is that it will have to be removed under the terms of this Bill as well. So there will be that small set of content that is illegal in the UK but not against terms of service.

There will be a third bucket of content that is not apparently against the terms or the law, and that actually accounts for most of the complaints that a platform gets. I will choose my language delicately: complaint systems are easy to use, and people complain to make a point, using the report button as a kind of dislike button. The reality is that one of the most common patterns of complaint is that, when there is a football match, supporters of the two opposing teams report the content on each other’s pages as illegal. They will do that every time; you get used to it, and that is why you learn to discount mass-volume complaints. But again, we should be clear that a great many complaints are merely vexatious.

The final bucket is of content that is unclear and legal review will be needed. Our amendment is intended to deal with those. A platform will go out and get advice. It is trying to understand at what point something like migration advice tips over into the illegal as opposed to being advice about going on holiday, and it is trying to understand that based on what it can immediately see. Once it has sought that advice, it will feed that back into the guidance to reviewers and the algorithms to try and remove content more effectively and be compliant with the Bill as a whole and not get into trouble with Ofcom.
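The four buckets just described amount to a simple triage rule. A minimal sketch follows, with hypothetical names and an assumed order of checks; it is an illustration of the argument above, not any platform's real pipeline.

```python
# Hypothetical sketch of the four buckets of reported content.
# Names and check order are assumptions for illustration only.
from enum import Enum, auto

class Bucket(Enum):
    AGAINST_TERMS = auto()       # 1: removed under terms, no second legality test
    ILLEGAL_UK_ONLY = auto()     # 2: within terms but clearly illegal in the UK
    VEXATIOUS = auto()           # 3: against neither terms nor law; no action
    NEEDS_LEGAL_REVIEW = auto()  # 4: legality unclear; seek legal advice

def triage(against_terms: bool, clearly_illegal_uk: bool,
           legality_unclear: bool) -> Bucket:
    if against_terms:
        return Bucket.AGAINST_TERMS
    if clearly_illegal_uk:
        return Bucket.ILLEGAL_UK_ONLY
    if legality_unclear:
        return Bucket.NEEDS_LEGAL_REVIEW
    return Bucket.VEXATIOUS
```

The order of the checks reflects the practical point being made: content against a platform's terms is removed first, without a second legality test, and only the residue ever reaches legal review.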

Some areas are harder than others. The noble Lord, Lord Moylan, already highlighted one: public order offences, which are extremely hard. If somebody says something offensive or holds an offensive political view—I suspect the noble Baroness, Lady Fox, may have something to say on this—people may well make contact and claim that it is in breach of public order law. On the face of it, they may have a reasonably arguable case but again, as a platform, you are left to make a decision.

4 pm

There is a really interesting potential role for Ofcom here. One thing that is frustrating if you work at a platform is that you will often get stuck and, when you go out and look for advice, you find it is hard to get. When I ran a working group with some French lawyers, including quite senior judges, they came into the working group saying, “This is all straightforward—you’re just not removing the illegal stuff”. So we gave them real cases, and it was interesting to see how half of the lawyers in the room would be on one side, saying, “It must come down—it’s against French law”, while the other half was saying, “How could you possibly take this down in France?”, because it was protected speech. It is really difficult to get that judgment but, interestingly, an unintended consequence of the Bill may be that Ofcom will ultimately get stuck in that position.

The Bill is not about Ofcom making rulings on individual items of content but if—as in the example I shared with the noble Lord, Lord Moylan, earlier—the police have said to a platform, “You must remove this demonstration. It is illegal”, and the platform said, “No, we judge it not to be illegal”, where are the police going to go? They will go to Ofcom and say, “Look, this platform is breaching the law”, so Ofcom is going to get pulled into that kind of decision-making. I do not envy it that but, again, we need to plan for that scenario because people who complain about illegality will go wherever they think they can get a hearing, and Ofcom will be one of those entities.

A huge amount on this illegal content area still needs to be teased out. I ask the Minister to respond specifically to the points I have raised around whose jurisdiction it is. If the speaker is speaking legally, because they are in a country outside the United Kingdom, what is the Government’s expectation on platforms in those circumstances? Will he look at the issue of the tests and where on this spectrum, from probably illegal through to likely to be illegal and may be illegal, the Government expect platforms to draw the line? If platforms have removed the bad content, will he consider carefully to what extent the Government think that the platforms should have to go through the process of investing time and energy to work out whether they removed it for illegality or for a terms of service breach? That is interesting but if our focus is on safety, frankly, it is wasted effort. We need to question how far we expect the platforms to do that.

About this proceeding contribution

Reference: 829 cc1343-6
Session: 2022-23
Chamber / Committee: House of Lords chamber