My Lords, I know that the Minister has carefully considered the definition of “commercial pornography”, and I am grateful that he has engaged with my comments on previous drafts of the regulations and that we have met in person to discuss these. Further to those conversations, I am happy to say that I support the regulations and the guidance, and certainly encourage other noble Lords to do the same, although I have a number of concerns I would like to highlight.
First, I note that it has taken more than 18 months since Third Reading to get to the point where this House and the other place are considering the regulations to determine what is deemed commercial pornography and the regulator’s guidance on age verification. I hope the Minister can assure us that the full implementation of age verification for pornographic websites is now very close. Indeed, it would be even better if he could tell the House when he expects it to be operational.
Secondly, I note that in its report on these regulations, Sub-Committee B of the Secondary Legislation Scrutiny Committee said that the measures available to the BBFC, as the age-verification regulator, should be applied “fairly and transparently”. I certainly hope that they will be. To this end, I ask the Minister to place a letter in the Library nine months after age verification goes live, with an update on the number of websites with age verification in place and how many enforcement actions have taken place. I hope that that will be possible.
Thirdly, I cannot address the regulations and guidance that will help give effect to Part 3 of the Digital Economy Act without reflecting on the fact that, thanks to amendments introduced by your Lordships’ House, Part 3 will no longer address some very serious problems as effectively as it would have done. When Part 3, as amended, is implemented, there will be nothing in it to prevent non-photographic and animated child sex abuse images, which are illegal to possess under Section 62 of the Coroners and Justice Act 2009, from being accessed behind age verification. This is a serious problem. In 2017, 3,471 reports of alleged non-photographic images of child sexual abuse were made to the Internet Watch Foundation, but since none of these images was hosted in the UK, it was unable to act.
Of course I appreciate that technically the amendments to the Digital Economy Bill, which removed from the regulator the power to take action against such material when it is behind age verification, did not have the effect of legalising possession of this material. The 2009 Act remains in place. However, as virtually all this material is beamed into the UK from other jurisdictions, the arrival of the Digital Economy Bill in your Lordships’ House meant that for the first time we had a credible means of enforcing that law online. There is no need for a regulator to be in the same jurisdiction as a website that it determines to block.
As I said at the time, greeting the first really credible means of enforcing that law online by removing the relevant enforcement mechanism from key parts of the Bill inevitably called into question our commitment to the law. I appreciate that there is arguably a mechanism for trying to enforce the law: the National Crime Agency can work with overseas agencies if websites with this material are identified. However, the mechanism is slow and expensive, and it remains unclear how it can have any effect if the domestic laws of the countries in question permit non-photographic child sex abuse images. To this extent, it was no surprise to me that in response to a Written Parliamentary Question in September 2018, the Government were unable to say whether the NCA had taken action against any websites, or whether any sites had been removed by overseas jurisdictions. ComRes polling published in the summer shows that 71% of MPs think that the regulator should be empowered to block these sites. Only 5% disagree.
The other loophole, of course, relates to all but the most extreme forms of violent pornography. Given that under the Video Recordings Act 1984 it is not legal to supply this material, it was entirely proper that the Digital Economy Bill, as it entered your Lordships’ House, did not accommodate such material. However, amendments were introduced in this House to allow it behind age verification. As I observed at the time, this sent out the message loud and clear that violence against women—unless it is “grotesque”, to quote the Minister on Report, at col. 1093—is, in some senses, acceptable.
My concerns about the impact of such material remain and have been mirrored by those of the Women and Equalities Select Committee in its report, which I referred to earlier. Of great importance, it states:
“There is significant research suggesting that there is a relationship between the consumption of pornography and sexist attitudes and sexually aggressive behaviour, including violence. The Government’s approach to pornography is not consistent: it restricts adults’ access to offline pornography to licensed premises and is introducing age-verification of commercial pornography online to prevent children’s exposure to it, but it has no plans to address adult men’s use of mainstream online pornography”.
I appreciate that we cannot deal with these problems today. The Government must, however, urgently prioritise how to address them. They could do so very quickly if they were to make time for my very short two-clause Digital Economy Act amendment Bill, which addresses the matter in full. With these caveats, I warmly welcome the regulations and the guidance.