My Lords, I start by thanking noble Lords for their amendments, which bring us back to the important issues around the use of automated processing in what is an increasingly digital world. I apologise if my smile was misleading; I was just very pleased to see the noble Baroness in her place. It did not indicate anything other than that.
Automated processing is applied in contexts ranging from suggested videos on YouTube to quotes for home insurance and beyond. In considering these amendments it is important to bear in mind that automated decision-making can bring benefits to data subjects, so we should not view these provisions simply through the prism of threats to data subjects’ rights. The Government are conscious of the need to ensure that stringent provisions are in place to regulate appropriately decisions based solely on automated processing. We have included in the Bill the necessary safeguards, such as the right to be informed of automated processing as soon as possible, along with the right to challenge an automated decision made by a data controller or processor. We have considered the amendments proposed by noble Lords and believe that Clauses 13, 43, 48, 94, 95, 111 and 189 provide sufficient safeguards to protect data subjects of all ages, adults and children alike.
5.15 pm
Let me respond first to Amendments 34 and 92. They seek to insert into Parts 2 and 3 a definition of “significant decision” that includes a decision having a legal or similar effect for the data subject or for a group, sharing one of the nine protected characteristics under the Equality Act 2010, to which the data subject belongs. Of course, all types of discrimination, including discriminatory profiling via the use of algorithms and automated processing, endanger an individual’s rights and liberties. However, we note that the Equality Act already provides safeguards to ensure that individuals are not discriminated against on the basis of a protected characteristic. In addition, recital 71 of the GDPR states that data controllers must use appropriate mathematical or statistical procedures so that factors resulting in inaccuracies are minimised, and so that discriminatory effects on individuals on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation are prevented. We therefore do not believe that a further provision is required.
Amendments 35, 93 and 100 are designed to prevent any automated decision being taken if the decision engages the rights of the data subject under the Human Rights Act. The GDPR and the Bill permit automated decision-making where it is authorised by law, subject to the safeguards provided. The noble Baroness, and Liberty in its briefing, argued both that we should lower the threshold of the types of decision to which the prohibition on automated decision-making applies, so as to include a broader range of decisions, namely those that engage the Convention rights, and that we muddled types of decision and types of processing in resisting the amendments in Committee. I assure the noble Baroness that this is not the case. Clause 13 merely replicates the threshold in article 22 of the GDPR. Similarly, in Part 3, Clauses 47 and 48 faithfully give effect to article 11 of the Law Enforcement Directive. We believe that these amendments would strike the wrong balance because they would mean that practically all decisions would be caught by the prohibition, even if authorised by law. As I have indicated, the safeguards in the Bill will apply where significant decisions are based on automated processing.
It may assist if I provide a practical example of the effect of these amendments in the context of Part 4. The intelligence services may use automated processing in their investigations, perhaps in a manner akin to a triage process to narrow down a field of inquiry. The decision arising from such a process may be to conduct a further search of their systems; arguably, that decision significantly affects a data subject and engages that individual’s human rights. As such, it would be prohibited by the amendment, potentially impeding appropriate investigative work to identify national security threats, where the alternative of trawling through records by hand would be quite impossible.
Amendment 36 seeks to clarify what is meant by a decision,
“based solely on automated processing”,
to ensure that human intervention must be meaningful. I am sympathetic to the intention behind the amendment, but the phrase, especially when read with recital 71 of the GDPR, already provides for this. As my noble friend Lord Ashton indicated in Committee, mere human presence or incidental human involvement is not sufficient: the human input must be meaningful. The level of human intervention required is therefore already clear from the text, and I am confident that further elaboration of this provision is not required.
Amendments 37, 38 and 91 seek to enable the data subject to receive meaningful information about the qualifying decision taken so that they can determine whether it will benefit or harm their interests. The GDPR already provides for this in articles 13 and 14, which require data controllers to provide data subjects with meaningful information about the logic involved, as well as the significance and the envisaged consequences for the data subject, when data is collected from them and whenever it is processed for a new purpose. The provision of information to data subjects is addressed in the Article 29 Working Party guidelines published in October, which emphasise the need for the information to be meaningful and include an example relating to credit scoring. I would be happy to send the noble Lord a copy of the guidelines.
Similarly, we consider that Part 3 already provides for adequate information to be given to the data subject. Where a significant decision is based solely on automated processing, Clause 48 places a duty on the controller to notify the data subject and confers a right on the data subject to request that the controller reconsider the decision or take a new decision that is not based solely on automated processing. Given these provisions, we consider that the existing rights in the GDPR and in Part 3 are sufficient to enable a data subject to access information about processing related to them and to use such information as the basis to challenge any decisions.
Amendments 39, 40 and 94 would require the data controller to provide specified information, some of it technical in nature, regarding automated processing. We consider such a provision unnecessary, not least because of the substantial burden it would place on the data controller. Furthermore, a data subject would gain little from receiving technical information that is arguably of little practical use. As I have indicated, the GDPR and the Bill already provide that where automated processing results in a significant decision affecting the data subject, they must be informed of it. The GDPR and the LED do not require data controllers to provide such technical information to data subjects, and I am not persuaded that the Bill should take a different route.
Amendment 41 seeks to require a data controller to deposit an impact assessment with the Information Commissioner before engaging in automated decision-making concerning a child. I fear I am similarly unpersuaded of the case for this amendment. Article 35 already makes comprehensive provision for a data controller to conduct an impact assessment where a type of processing of personal data is likely to result in a high risk to the rights and freedoms of natural persons, which will include children. Furthermore, article 36 requires the controller to consult the commissioner prior to processing if the impact assessment reveals that the processing would result in a high risk. On that basis, the commissioner will provide the controller with appropriate written advice. We consider these measures comprehensive and believe that no further requirements are needed in the Bill to supplement these articles.
Finally, Amendments 101 and 102 seek to strike out the provisions in Part 4 regarding automated decision-making relating to contracts. Although, of course, Part 4 derives from the modernised Council of Europe Convention 108 rather than the GDPR, it is instructive that recital 71 of the GDPR expressly refers to processing,
“necessary for the entering into or performance of a contract between the data subject and a controller”,
as one example of automated processing permitted where authorised by law. That being the case, I do not see why these provisions should be regarded as vexing in any way. Again, as I indicated in Committee, running a job applicant’s or prospective contractor’s name through a database as part of an initial sift could be one step in a selection or procurement process. The result of such processing could help determine subsequent stages of the process and, therefore, whether to enter into a contract.
The drafters of the GDPR, the LED and Convention 108 were sensitive to the need to apply appropriate safeguards around the use of automated decision-making. The GDPR and the Bill give effect to such safeguards. In particular, data subjects must be notified of such decisions and may request that they be reconsidered. Given this and the other safeguards provided in the Bill, including the monitoring and enforcement role of the Information Commissioner, I am satisfied that the Bill already makes adequate and proportionate provision, and I therefore invite noble Lords not to press their amendments.