UK Parliament / Open data

Online Safety Bill

My Lords, I also support Amendment 157, which stands in the name of the noble Lord, Lord Pickles, and others, including my own. As the noble Baroness, Lady Deech, indicated, it is specific in the nature of what it concentrates on. The greatest concern that arises through the amendment is with reference to category 2A. It is not necessarily incompatible with what the noble Lord, Lord Moylan, proposes; I do not intend to make any direct further comment on his amendments. While the amendment is specific, it has a resonance with some of the other issues raised on the Bill.

I am sure that everyone within this Committee would want to have a Bill that is as fit for purpose as possible. The Bill was given widespread support at Second Reading, so there is a determination across the Chamber to have that. Where we can make improvements to the Bill, we should do that and, as much as possible, try to future-proof the Bill. The wider resonance is the concern that if the Bill is to be successful, we need as much consistency and clarity within it as possible, particularly for users. Where we have a false dichotomy in the regulations, that runs contrary to the intended purposes of the Bill and creates inadvertent opportunities for loopholes. As such, and as has been indicated, the concern is that in the Bill at present, major search engines are effectively treated in some of the regulations on a different basis from user-to-user services. For example, some of the provisions around risk assessment, the third shield and the empowerment tools are different.

As also indicated, we are not talking about some of the minor search engines. We are talking about some of the largest companies in the world, be it Google, Microsoft through Bing, Amazon through its devices or Apple through its Siri voice tool, so it is reasonable that they are brought into line with what is there for user-to-user services. The amendment is therefore appropriate and the rationale for it is that there is a real-world danger. Mention has been made—we do not want to dwell too long on some of the examples, but I will use just one—of the realms of anti-Semitism, where I have a particular interest. For example, on search tools, a while ago there was a prompt within one search engine that Jews are evil. It was found that when that prompt was there, searches of that nature increased by 10%, and when it was removed, they were reduced. It is quite fixable, and it goes into a wide range of areas.

One of the ways in which technology has changed, I think for us all, is the danger that it can be abused by people who seek to radicalise others and make them extreme, particularly young children. Gone are the days when some of these extremists or terrorists were lonely individuals in an attic, with no real contact with the outside world, or hanging around occasionally in the high street while handing out poorly produced A4 papers with their hateful ideology. There is a global interconnection here and, in particular, search engines and user-to-user services can be used to try to draw young people into their nefarious activities.

I mentioned the example of extremism and radicalisation when it comes to anti-Semitism. I have seen it from my own part of the world, where there is at times an attempt by those who still see violence as the way forward in Northern Ireland to draw new generations of young people into extremist ideology and terrorist acts. There is an attempt to lure in young people and, sadly, search engines have a role within that, which is why we need to see that level of protection. Now, the argument from search engines is that they should have some level of exemptions. How can they be held responsible for everything that appears through their searches, or indeed through the web? But in terms of content, the same argument could be used for user-to-user services. It is right, as the proposer of this amendment has indicated, that there are things such as algorithmic indexing and prompt searches where they do have a level of control.

The use of algorithms has moved on considerably since my schooldays, as it surely has for everyone in this Committee, and I suspect that none of us felt that they would be used in such a fashion. We need a level of protection through an amendment such as this and, as its proposers, we are not doctrinaire on the precise form in which this should take place. We look, for example, at the provisions within Clause 11—we seek to hear what the Government have to say on that—which could potentially be used to regulate search engines. Ensuring that that power is given, and will be used by Ofcom, would go a long way to addressing many of the concerns.

I think all of us in this Committee are keen to work together to find the right solutions, but we feel that there is a need to make some level of change to the regulations that are required for search engines. None of us in this Committee believes that we will ultimately have a piece of legislation that reflects perfection, but there is a solemn duty on us all to produce legislation that is as fit for purpose and future-proofed as possible, while providing children in particular with the maximum protection in what is at times an ever-changing and sometimes very frightening world.

About this proceeding contribution

Reference

829 cc1288-1290 

Session

2022-23

Chamber / Committee

House of Lords chamber