
Online Safety Bill

My Lords, it is a great honour to rise after the noble Baroness, Lady Merron, who spoke so clearly about Amendment 52 and the group of amendments connected with health misinformation, some of which stand also in my name.

As the noble Baroness rightly pointed out, we have long known the negative impact of social media, with all its doomscrolling, algorithms and rabbit holes, on vaccine uptake. In 2018, the University of Southampton conducted a study of pregnant women and found that those who reported using social media to research antenatal vaccinations were 58% less likely to accept the whooping cough vaccine. Since then, things have only got worse.

3.45 pm

As a junior Health Minister during the pandemic, I saw how the successful vaccine rollout was at severe risk of being undermined by misinformation, amplified by foreign actors and monetised by cynical commercial interests. The challenge was enormous. The internet, as we know, is a highly curated environment that pushes content, functions and services that create an emotional response and retain our attention. Social media algorithms are absolutely the perfect tool for conspiracy theorists, and a pandemic necessarily raises everyone’s concerns. It was unsurprising that a lot of people went down various rabbit holes on health information.

The trust between our clinical professionals and their patients relies on a shared commitment to evidence-based science. That can quickly go out of the window if the algorithms are pushing rousing content that deliberately plays into people’s worst fears and anxieties, thereby displacing complex and nuanced analysis with simplistic attention-seeking hooks, based sometimes on complete nonsense. The noble Baroness, Lady Merron, mentioned lemons for cancer as a vivid example of that.

At the beginning of the vaccine programme, a thorough report by King’s College London, funded by the NIHR Health Protection Research Unit, found that 14% of British adults believed the real purpose of mass vaccination against coronavirus was to track and control the population. That rose to an astonishing 42% among those who got their information from WhatsApp, 39% among YouTubers, 29% among the Twitterati and 28% among Facebookers. I remember that, when those statistics came through, they put this important route out of the pandemic in jeopardy.

I remind the Committee that a great many people make money out of such fear. I highly recommend the Journal of Communication article on digital profiteering, published by Oxford University Press, for a full and nuanced guide to the economics of the health misinformation industry. I also remind noble Lords that foreign actors and states are causing severe trouble in this area. “Foreign disinformation” social media campaigns are linked to falling vaccination rates, according to an international time-trend analysis published in BMJ Global Health.

As it happens, in the pandemic, the DHSC, the Cabinet Office and a wide group throughout government worked incredibly thoughtfully on a communications strategy that sought to answer people’s questions and apply the sunlight of transparency to the vaccine process. It balanced the right to freedom of expression with protecting our central strategy for emerging from the pandemic through the vaccine rollout. I express considerable thanks to those officials, and to the social media industry, which leant into the issue more out of a sense of good will than any legal obligation; I was aware of some of the legal ambiguities at the time.

Since then, things have gone backwards, not forwards. Hesitancy in the UK has risen, with a big impact on vaccine take-up rates. We are behind on 13 of the 14 routine vaccine programmes, well short of the 95% target set by the World Health Organization. The results are clear: measles is rising because vaccine uptake is falling, and that is true of many common, avoidable diseases. As for the platforms, Twitter’s decision at the end of last year suddenly to stop enforcing its Covid-19 misinformation policy was a retrograde step and possibly the beginning of a worrying trend of which we should all be conscious; it is one of the motivating reasons for this amendment.

Unfortunately, the Government’s decision to remove from the Bill the provisions on content harmful to adults, and with that the scope to include harmful health content, has had unintended consequences and left a big gap. We will have learned nothing from the pandemic if we do not act to plug that gap. The amendment and the associated amendments in the group seek to address this by introducing three duties, as the noble Baroness, Lady Merron, explained.

The first requirement is an assessment of the risks presented by harmful health disinformation and misinformation. Anyone who has been listening to these debates will recognise that this very much runs with the grain of the Bill’s approach and is consistent with many of the good things already in the Bill.

Risk assessments are a very valuable tool in our approach to misinformation. I remind noble Lords that, for this Bill, “content” has a broad meaning that includes the services and functions of a site, including the financial exploitation of that content. Secondly, the amendment would require large platforms to publish a policy setting out their approach to health misinformation. Each policy would have to explain how it is designed to mitigate or manage those risks, and would have to be kept up to date. That kind of transparency is at the heart of how we hold platforms to account. Lastly, platforms would be required to summarise their health misinformation policy in terms that consumers can properly understand.

This approach is consistent with the spirit of the Bill’s treatment of many harms: we are seeking transparency and we are creating accountability, but we are not mandating protocols. The consequences are clear. Users, health researchers and internet analysts would be able to see clearly how a platform proposes to deal with health misinformation that they may encounter on a particular service and make informed decisions as a result. The regulator would be able to see clearly what the nature of these risks is.

May I briefly tackle some natural concerns? On the question of protection of freedom of expression, my noble friend Lord Moylan rightly reminded us on Tuesday of Article 19 of the UN Universal Declaration of Human Rights: everyone has the right to freedom of opinion and expression. On this point, I make it clear that this amendment would not require platforms to remove health misinformation from their service or prescribe particular responses. In fact, I would go further. I recognise that it is important to have a full debate about the efficacy, safety and financial wisdom of treatments, cures and vaccines. This amendment would do nothing to close down that debate. It is about clarity. The purpose of the amendment is to prevent providers ducking the question of how they handle health misinformation. To that extent, it would help both those who are worried about health misinformation and those who are worried about being branded as sharing health misinformation to know where the platforms are coming from. It would ensure that providers establish what is happening on their service and what the associated risks to their users are, and then shine a light on how they intend to deal with it.

I also make it clear that this is not just about videos, articles and tweets. We should also be considering whether back-end payment mechanisms, including payment intermediaries, donation collection services and storefront support, are being used to monetise health misinformation and enable bad actors. During the pandemic, for instance, the platforms endorsed the principle that no company should profit from Covid-19 vaccine misinformation. It is vital that this is considered as part of the platforms’ response to health misinformation. We should have transparency about whether platforms such as PayPal and Google are accepting donations, membership fees or merchandise payments from known misinformation businesses. Is Amazon, for instance, removing products that are used to disseminate health misinformation? Are crowdfunding websites hosting health misinformation campaigns from bad actors?

To anticipate my noble friend the Minister, I say that he will likely remind us that measures are already in place in the Bill if the content is criminal or likely to be viewed by children, and I welcome those provisions. However, as the Bill stands, the actual policies on misinformation, and the financial exploitation of that content, will be a matter of platform discretion, with no clarity for users or the regulator. It will be out of sight of clear regulatory oversight. That is a mistake, as Twitter has just shown, and that is why we need this change.

Senior clinicians including Sir Jeremy Farrar, Professor John Bell and the noble Lord, Lord Darzi, have written to the Secretary of State to raise their concerns. These are serious players voicing serious concerns. The approach in Amendment 52 is, in my view, the best and most proportionate way to protect those who are most vulnerable to false and misleading information.

About this proceeding contribution

Reference: 829 cc2004-7
Session: 2022-23
Chamber / Committee: House of Lords chamber