
Data Protection Bill [HL]

My Lords, as the economy becomes more digitised, the politics of data become centrally important. As the Minister himself said, data is the fuel of the digital economy, and public policy now needs an agile framework around which to balance the forces at play. We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data. The recent theft of the personal details of 143 million Americans in the Equifax hack and the unfolding story of the abuse of social media in the US elections by Russian agents make the obvious case for data protection.

This Bill attempts to help us tackle some big moral and ethical dilemmas, and we as parliamentarians have a real struggle to be sufficiently informed in a rapidly changing and innovative environment. I welcome the certainty that the Bill gives us in implementing the GDPR in this country in a form that anticipates Brexit and the need to continue to comply with EU data law regardless of membership of the EU in the future.

However, we need e-privacy alongside the GDPR. For example, access to a website being conditional on accepting tracking cookies should be outlawed; we need stricter rules on wi-fi location tracking; browsers should have privacy set high by default; and we need to look at extending the protections around personal data to metadata derived from personal data.

But ultimately I believe that the GDPR is an answer to the past. It is a long-overdue response to past and current data practice, but it is a long way from what the Information Commissioner’s briefing describes as,

“one of the final pieces of much needed data protection reform”.

I am grateful to Nicholas Oliver, the founder of people.io, and to Gi Fernando from Freeformers for helping my thinking on these very difficult issues.

The Bill addresses issues of consent, erasure and portability to help protect us as citizens. I shall start with consent. A tougher consent regime is important but how do we make it informed? Even if 13 is the right age for consent, how do we inform that consent with young people, with parents, with adults generally, with vulnerable people and with small businesses which have to comply with this law? Which education campaigns will cut through in a nation where 11 million of us are already digitally excluded and where digital exclusion does not exclude significant amounts of personal data being held about you? And what is the extent of that consent?

As an early adopter of Facebook 10 years ago, I would have blindly agreed to its terms and conditions that required its users to grant it,

“a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content”,

that I posted on the site. It effectively required me to give it the right to use my family photos and videos for marketing purposes and to resell them to anybody. Thanks to this Bill, it will be easier for me to ask it to delete that personal data, and easier for me to take that data away and put it goodness knows where else, with whatever level of security I deem fit, if I can trust it. That is welcome, although I still quite like Facebook, so I will not do it just yet.

But what about the artificial intelligence generated from that data? If, in an outrageous conflation of issues around fake news and election-fixing by a foreign power to enable a reality TV star with a narcissistic personality disorder to occupy the most powerful executive office in the free world, I take against Facebook, can I withdraw consent for my data to be used to inform artificial intelligences that Facebook can go on to use for profit and for whatever ethical use it sees fit? No, I cannot.

What if, say, Google DeepMind got hold of NHS data and its algorithms were used with bias? What if Google gets away with breaking data protection as part of its innovation and maybe starts its own ethics group, marking its own ethics homework? Where is my consent and where do I get a share of the revenue generated by Google selling the intelligence derived in part from my data? And if it sells that AI to a health company which sells a resulting product back to the NHS, how do I ensure that the patients are advantaged because their data was at the source of the product?

No consent regime can anticipate future use or the generation of intelligent products by aggregating my data with that of others. The new reality is that consent in its current form is dead. Users can no longer reasonably comprehend the risk associated with data sharing, and so cannot reasonably be asked to give consent.

The individual as a data controller also becomes central. I have plenty of names, addresses, phone numbers and email addresses, and even the birthdays of my contacts in my phone. Some are even Members of your Lordships’ House. If I hire a car, say, and connect my phone to it over Bluetooth so that I can have hands-free calls and music from my phone, I may end up sharing that personal contact data with the car, and thereby with every subsequent hirer of the car. Perhaps I should be accountable, with the car’s owner, for that breach.

Then, thanks to AI, in the future we will also have to resolve the paradox of consent. If AI determines that you have heart disease by facial recognition or by reading your pulse, it starts to draw inferences outside the context of consent. The AI knows something about you, but how can you give consent for it to tell you when you do not know what it knows? Here, we will probably need to find an intermediary to represent the interests of the individual, rather than the state or wider society. If the AI determines that you are in love with someone based on text messages, does the AI have the right to tell you or your partner? What if the AI is linked to your virtual assistant—to Siri or Google Now—and your partner asks Siri whether you are in love with someone else? What is the consent regime around that? Clause 13, which deals with a “significant decision”, may help with that, but machine learning means that some of these technologies are effectively a black box where the creators themselves do not even know the potential outcomes.
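
One plausible safeguard, sketched below with entirely hypothetical names, is to route any significant decision taken solely by automated means into a human-review queue, which is the spirit of that clause:

```python
# Sketch: routing "significant decisions" made solely by automated means to
# human review, in the spirit of the Bill's Clause 13. All names here are
# hypothetical; this is illustrative, not a statement of the Bill's mechanism.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Decision:
    subject: str
    outcome: str
    significant: bool  # e.g. a legal or similarly substantial effect

def queue_for_human_review(decision: Decision) -> None:
    # Placeholder: in practice, notify the data subject and assign a person
    # to reconsider the decision with meaningful human involvement.
    print(f"Flagged for human review: {decision.subject} -> {decision.outcome}")

def decide(model: Callable[[dict], Decision], record: dict) -> Decision:
    decision = model(record)
    if decision.significant:
        queue_for_human_review(decision)
    return decision
```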

The final thing I want to say on consent concerns the sensitive area of children. Schools routinely use commercial apps for things such as recording behaviour, profiling children, cashless payments, reporting and so on. I am an advocate of the uses of these technologies. Many integrate seamlessly with school management information systems and thereby expose children’s personal data to third parties on the basis of digital contracts. Schools desperately need advice on how to comply with this Bill, and with the GDPR, when it becomes law.

Then there is the collection of data by schools to populate the national pupil database held by the Department for Education. This database contains highly sensitive data about more than 8 million children in England and is routinely shared with academic researchers and other government departments. The DfE does not make the justification for this collection clear, and it creates a big workload problem in schools. Incidentally, this is the same data about pupils that was shared with the Home Office for it to pursue deportation investigations. I am talking about data collected by teachers for learning being used for deportation. Where is the consent in that?

I have here a letter from a Lewisham school advising parents of its privacy policy. It advises parents to go to a government website to get more information about how the DfE stores and uses the data, if they are interested. That site then advises that the Government,

“won’t share your information with any other organisations for marketing, market research or commercial purposes”.

That claim does not survive any scrutiny. For example, Tutor Hunt, a commercial tutoring company, was granted access to the postcode, date of birth and unique school reference number of all pupils. This access was granted for two years, up to the end of March this year, to give parents advice on school choice. Similar data releases have been made to journalists and others. It may be argued that this data is still anonymous, but it is laughable to suggest that identity cannot then be re-engineered, or engineered in the first place, from birth date, postcode and school. The Government need to get their own house in order to comply with the Bill.
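
The point is easy to demonstrate. A minimal sketch in Python, using entirely hypothetical pupil records, shows how readily those three fields single an individual out:

```python
# Sketch: how unique is (date of birth, postcode, school)?
# Hypothetical extract of pupil records; illustrative only.
from collections import Counter

pupils = [
    # (date_of_birth, postcode, school_reference): made-up rows
    ("2006-03-14", "SE13 5AB", "URN100001"),
    ("2006-03-14", "SE13 5AB", "URN100002"),
    ("2007-11-02", "SE6 4XY",  "URN100001"),
    ("2005-07-21", "SE13 5AB", "URN100001"),
]

counts = Counter(pupils)
unique = [row for row, n in counts.items() if n == 1]
print(f"{len(unique)} of {len(pupils)} records are unique on these three fields")
# In a real population, almost every child is the only person with a given
# birth date at a given postcode attending a given school, so an "anonymous"
# release of these fields identifies them.
```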

That leads me to erasure, which normally means removing all data that relates to an individual, such as name, address and so on. The remaining data survives with a unique numeric token as an identifier. Conflicting legislation will continue to require companies to keep data for accounting purposes. If that includes transactions, there will normally be enough data to re-engineer identity from an identity token number. There is a clause in the Bill to punish that re-engineering, which needs debating to legitimise benign attempts to test research and data security, as discussed by the noble Baroness, Lady Manningham-Buller.
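
A minimal sketch of that tokenisation, again with hypothetical data, makes the weakness plain: the direct identifiers go, but the retained transactions remain a behavioural fingerprint that can be matched against any other dataset that still names the person.

```python
# Sketch: pseudonymisation by opaque token, on a toy customer record.
# Hypothetical names and data; illustrative only.
import hashlib

def tokenise(name: str, salt: str = "secret") -> str:
    # Replace direct identifiers with an opaque token.
    return hashlib.sha256((salt + name).encode()).hexdigest()[:12]

# "Erased" record: name and address removed, token and transactions retained
# to satisfy accounting law.
erased = {
    "token": tokenise("Jane Example"),
    "transactions": [("2017-09-01", "Lewisham newsagent", 2.40),
                     ("2017-09-02", "SE13 pharmacy", 8.99)],
}
# The retained transactions (times, places, amounts) form a fingerprint:
# matched against any dataset that names the customer, the token, and so the
# whole history, is re-identified.
print(erased["token"])
```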

The fact that the Bill acknowledges how easy it is to re-identify from anonymous data points to a problem. The examples of malign hacking from overseas are countless. How do we prevent that with UK law? What are the Government’s plans, especially post Brexit, to address this risk? How do we deal with the risk of a benign UK company collecting data with consent—perhaps Tutor Hunt, which I referred to earlier—that is then acquired by an overseas company, which then uses that data free from the constraints of this legislation?

In the context of erasure, let me come to an end by saying that the Bill also allows for the right to be forgotten for children as they turn 18. This is positive, as long as individuals can choose what they want to keep for themselves. Otherwise, it would be like suggesting you burn your photo albums to stop an employer judging you.

Could the Minister tell me how the right to be forgotten works with the blockchain? These decentralised encrypted trust networks are attractive to those who do not trust big databases for privacy reasons. By design, data is stored in a billion different tokens and synced across countless devices. That data is immutable. Blockchain is heavily used in fintech, and London is a centre for fintech. But the erasure of blockchain data is impossible. How does that work in this Bill?
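
For illustration, a minimal hash-chain sketch, with hypothetical payloads, shows why erasing a single entry invalidates everything recorded after it:

```python
# Sketch: why erasure conflicts with a blockchain. Each block commits to its
# predecessor's hash, so editing or deleting any payload breaks every
# subsequent block. Hypothetical payloads; illustrative only.
import hashlib

def block_hash(prev_hash: str, payload: str) -> str:
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

payloads = ["alice pays bob", "bob pays carol", "carol pays dan"]
chain, prev = [], "0" * 64  # genesis
for p in payloads:
    h = block_hash(prev, p)
    chain.append({"payload": p, "hash": h})
    prev = h

chain[1]["payload"] = ""  # attempt to "erase" the middle record

prev = "0" * 64
for block in chain:
    recomputed = block_hash(prev, block["payload"])
    status = "valid" if recomputed == block["hash"] else "BROKEN"
    print(block["hash"][:8], status)
    prev = recomputed  # the damage propagates down the chain
```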

There is more to be said about portability, law enforcement and the intelligence services, but thinking about this Bill makes my head hurt. Let me close on a final thought. The use of data to fuel our economy is critical. The technology and artificial intelligence it generates have huge power to enhance us as humans and to do good. That is the utopia we must pursue. Doing nothing heralds a dystopian outcome, but the pace of change is too fast for us legislators, and the technology too complex for most of us to fathom. We therefore need to devise a catch-all for automated or intelligent decision-making by future data systems. Ethical and moral clauses could and should, I argue, be forced into terms of use and privacy policies. That is the only feasible way to ensure that the intelligence resulting from the use of one’s data is not subsequently used against us as individuals or against society as a whole. This needs urgent consideration by the Minister.

8.03 pm

About this proceeding contribution

Reference: 785 cc183-7
Session: 2017-19
Chamber / Committee: House of Lords chamber