
Online Safety Bill

Proceeding contribution from Baroness Finlay of Llandaff (Crossbench) in the House of Lords on Wednesday, 1 February 2023, during debate on the Online Safety Bill.

My Lords, I thank my noble friend Lady Kidron for her tenacious moral leadership on this issue. I remind noble Lords that, when we passed the Tobacco Advertising and Promotion Act, none of us predicted tobacco companies’ development and marketing of vapes with higher and more addictive nicotine content than that in cigarettes. It was a simple lesson.

A gap in this Bill now is the difficult issue of “legal but harmful”. We should focus not on the difficulty of defining it, but on the design and standards of the algorithms that internet platforms use to commercial advantage, dodging any responsibility for what happens and blaming the end user.

Before the Government amended Clauses 12 and 13, category 1 service providers would have been forced to risk-assess across their sites and provide information on this in their terms of service, including how harmful content was to be managed. But this is now gone and, as a result, the digital environment will not be detoxified as originally intended. What pressures, if any, were exerted on the Government by commercial and other sources to amend these clauses?

It matters that the Bill now treats people under 18 and over 18 very differently, because the brain’s development and peak addictive potential from puberty do not stop at 18. Those in their 20s are at particular risk.

The social media platforms act commercially, pushing out more content, including online challenges, as their algorithms pick up a keyword—whether spelled correctly or incorrectly—a mouse hovering over an image or a like response. Currently, platforms judge addiction and profit by the time spent on a platform, but that is not how addictions work. Addiction is the reward-reinforcing behaviour that evokes a chemical response in the brain that makes you want more. Hence the alcoholic, the gambling addict, the drug addict and so on keep going back for more; the sex addict requires ever more extreme images to gain stimulation; the user will not switch off access.

Those whose emotional expression is through abuse and violent behaviour find more ways to abuse to meet their urge to control and vent feelings, often when adverse childhood experiences were the antecedent to disastrous destructive behaviour. The unhappy young adult becomes hooked in by the images pushed to them after an internet search about depression, anorexia, suicidal ideation and so on. The algorithm-pushed images become compulsive viewing, as ever more are pushed out, unasked for and unsearched for, entrapping them into escalating harms.

Now, the duties in Clause 12 are too vague to protect wider society. The user should be required to opt in to content so that it can be followed, not opt out. The people controlling all this are the platform companies. They commission the algorithms that push content out. These could be written completely differently: they could push sources of support in response to searches for gambling, eating disorders, suicidal ideation, dangerously extreme sex and so on. Amending the Bill to avoid escalating harms is essential. Some of the harms are ones we have not yet imagined.

The platform companies are responsible for their algorithms. They must be made responsible for taking a more sophisticated, balanced-risk approach: the new technology of artificial intelligence could detect those users of their platforms who are at particular risk. In daily life offline, we weigh up risk, assessing harms and benefits in everything, filtering what we say or do. Risk assessment is part of life. That does not threaten freedom of speech, but it would allow “legal but harmful” to be addressed.

The Bill presents a fantastic opportunity. We must not throw it away.

6.33 pm

About this proceeding contribution

Reference: 827 cc711-2
Session: 2022-23
Chamber / Committee: House of Lords chamber