My co-sponsors will deal with some of the more detailed elements of the 30 amendments before us. These will include safety duties, functionality and harm, and codes of practice. I am sure that the noble Lords, Lord Stevenson and Lord Knight, and the right reverend Prelate the Bishop of Oxford will speak to their own amendments.
I will provide a brief overview of why we are so convinced of the paramount need for a safety by design approach to protect children and to remind digital companies and platforms, forcefully and legally, of their
obligation to include the interests and safety of children as a paramount element within their business strategies and operating models. These sites and services are artificial environments: they were designed, and they can be redesigned.
In her testimony to the US Senate in October 2021, the Facebook whistleblower Frances Haugen put her finger on it rather uncomfortably when talking about her erstwhile employer:
“Facebook know that they are leading young users to anorexia content … Facebook’s internal research is aware that there are a variety of problems facing children on Instagram … they know that severe harm is happening to children”.
What she was describing was happening, probably, three years ago.
On the first day of Committee, the noble Lord, Lord Allan, who is not with us today, used the analogy of the legally mandated and regulated safe design of aeroplanes and automobiles and the different regimes that cover their usage to illustrate some of our choices in dealing with regulation. We know why aeroplanes and cars have to be designed safely; we also know that either form of transportation could be used recklessly and dangerously, which is why we do not allow children to fly or drive them.
First, let us listen to the designers of these platforms and services through some research done by the 5Rights Foundation in July 2021. These are three direct quotes from the designers:
“Companies make their money from attention. Reducing attention will reduce revenue. If you are a designer working in an attention business, you will design for attention … Senior stakeholders like simple KPIs. Not complex arguments about user needs and human values … If a senior person gives a directive, say increase reach, then that’s what designers design for without necessarily thinking about the consequences”.
Companies know exactly what they need to do to grow and to drive profitability. However, they mostly choose not to consider, mitigate or prioritise avoiding some of the potentially harmful consequences. What they design and prioritise instead are strategies to maximise consumption, activity and profitability. They are very good at it.
Let us hear what the children say, remembering that some recent research indicates that 42% of five to 12-year-olds in this country use social media. The Pathways research project I referred to earlier worked closely with 21 children aged 12 to 18, who said: “We spend more time online than we feel we should, but it’s tough to stop or cut down”. “If we’re not on social media, we feel excluded”. “We like and value the affirmations and validations we receive”. “We create lots of visual content, much of it about ourselves, and we share it widely”. “Many of us are contacted by unknown adults”. “Many of us recognise that, through using social media, we have experienced body image and relationships problems”.
To test whether the children in this research project were accurately reporting their experiences, the project placed a series of child avatars—ghost children, in effect—on the internet, with profiles that stated very clearly that they were children.
They found—in many cases within a matter of hours of the profiles going online—proactive contact by strangers and rapid recommendations to engage more and more. If searches were conducted for eating disorders or self-harm, the avatars were quickly able to access content irrespective of their stated ages and clearly evident status as children. At the same time as they were being sent harmful or inappropriate content, they also received age-relevant advertising for school revision and for toys—the social media companies knew that these accounts were registered as children.
This research was done two years ago. Has anything improved since then? It just so happens that 5Rights has produced another piece of research, about to be released, which used exactly the same technique of creating avatars to see what they would experience online. They used 10 avatars based on real children aged between 10 and 16. So what happened? For an 11-year-old avatar, Instagram was recommending images of knives with the caption “This is what I use to self-harm”; design features were leading children from innocent searches to harmful content very quickly.
I think any grandparents in the Chamber will be aware of an interesting substance known as “Slime”—a form of particularly tactile playdough which one’s grandchildren seem to enjoy. On Reddit, typing in “Slime” left the avatar one search, and one click, away from pornography; exactly the same thing happened when the avatar typed in “Minecraft”, another game very popular with our children and grandchildren. A 15-year-old female avatar was messaged privately on Instagram by a user she did not follow—an unknown adult who encouraged her to follow a link to pornographic content on Telegram, another instant messaging service. On the basis of this evidence, it appears that little or nothing has changed; it may even have got slightly worse.
By an uncomfortable coincidence, last week Meta, the parent company of Facebook and Instagram, published better-than-expected results and saw its market value increase by more than $50 billion in after-hours trading. Mark Zuckerberg, the founder of Meta, proudly announced that Meta is pouring investment into artificial intelligence tools to make its platform more engaging and its advertising more effective. Of particular interest and concern, given the evidence of the avatars, was his announcement that, since the introduction of Reels, a short-form video feed designed specifically to respond to competition from TikTok, its AI-driven recommendations had boosted the average time people spend on Instagram by 24%.
To return to the analogy of planes and cars used by the noble Lord, Lord Allan, we are dealing here with planes and cars in the shape of platforms and applications which we know are flawed in their design. They are not adequately designed for safety, and we know that they can put users, particularly children and young people, in the way of great harm, as many grieving families can testify.
In conclusion, our amendments propose that companies must design digital services that cater by default for the vulnerabilities, needs and rights of children and young people; children’s safety cannot and must not be an afterthought or a casualty of their business models. We are asking that safety by design to protect children become the mandatory standard. What we have today is unsafe design by default, driven by commercial strategies which can lead to children becoming collateral damage.
Given that it is the noble Baroness’s birthday, I am sure we can feel confident that the Minister will have a positive tone when he replies. I beg to move.
4.30 pm