I support Amendment 44. I am pleased that, as part of the new triple shield, the Government have introduced Clause 12 on “User empowerment duties”, which allow users to protect themselves, not just from abusive posts from other users but from whole areas of content. In the Communications and Digital Committee’s inquiry, we had plenty of evidence from organisations representing minorities and people with special characteristics who are unable adequately to protect themselves from the hate they receive online. I am glad that subsections (10) to (12) recognise specific content and users with special characteristics who are targets of abuse and need to be able to protect themselves, but subsection (3) requires that these features be
“designed to effectively … reduce the likelihood of the user encountering content”
they want to avoid. I am concerned that “effectively” will be interpreted subjectively by platforms in scope and that each will interpret it differently.
At the moment, it will not be possible for Ofcom to assess how thoroughly the platforms have been providing these empowerment tools to protect users. If the features are to work, there must be oversight of how effective they are in practice. When the former Secretary of State, Michelle Donelan, was asked about this, she said that there was nothing in this clause to pin an assessment on. It seems to me that the lists in Clause 12 create plenty of criteria on which to hang an assessment.
The new duties in Clause 12 provide for control tools for users against very specific content that is abusive or incites hatred on the basis of race, ethnicity, religion, disability, sex, gender reassignment or sexual orientation. However, this list is not exhaustive. There will inevitably be areas of content for which users have not been given blocking tools, including pornography, violent material and other material that is subject to control in the offline world.
Not only will the present list for such tools need to be assessed for its thoroughness in allowing users to protect themselves from specific harms, but surely the types of harm from which they need to protect themselves will change over time. Ofcom will need regularly to assess where these harms lie and make sure that service providers regularly update their content-blocking tools. Without such an assessment, it will be hard for Ofcom and civil society to understand what the emerging concerns with the tools are.
The amendment would provide a transparency obligation, which would demand that service providers inform users of the risks present on the platform. Surely this is crucial when users are deciding what to protect themselves from.
The assessment should also look for unintended restrictions on freedom of expression created by the new tools. If the tools are overprotective, they could surely create a bubble and limit users’ access to information that they might find useful. For example, the user might want to block material about eating disorders, but the algorithm might interpret that to mean limiting the user’s access to content on healthy lifestyles or nutrition. We are also told that the algorithms do not understand irony and humour. When the filters are used to stop content that is abusive or incites hatred on the basis of users’ particular characteristics, they might also remove artistic, humorous or satirical content.
Repeatedly, we are told that the internet creates echo chambers, where users read only like-minded opinions. These bubbles can create an atmosphere where freedom of expression is severely limited and democracy suffers. A freedom of expression element to the assessment would also, in these circumstances, be critical. We are told that the tech platforms often do not know what their algorithms do and, not surprisingly, they often evolve beyond their original intentions. The tools demanded by Clause 12 need to be carefully assessed to ensure that they are keeping up to date with the trends of abuse on the internet, but also to uncover the unintended consequences they might create in curbing freedom of expression.
Throughout the Bill, there is a balancing act between freedom of expression and protection from abuse. The user empowerment tools are potentially very powerful, and neither the service providers, the regulators nor the Government know what their effects will be. It is incumbent on the Government to introduce an assessment to check regularly how the user empowerment duties are working; otherwise, how can they be updated, and how can Ofcom discover what content is being unintentionally controlled? I urge the Minister, in the name of common sense, to ensure that these powerful tools unleashed by the Bill will not be misused or become outdated in a fast-changing digital world.