My Lords, these amendments return us to the issue of automated decision-making, which we debated on Monday, albeit principally in the context of Part 2.
The noble Baroness, Lady Hamwee, has indicated that the purpose of Amendment 134A is to probe why Clause 48(1)(b) is required. Clauses 47 and 48 should be read together. Clause 47 essentially operates to prohibit the controller from making a significant decision based solely on automated processing, unless such a decision is required or authorised by law. Where automated decision-making is authorised or required by law, Clause 48 permits the controller to make a qualifying significant decision, subject to the specified safeguards.
A significant decision based solely on automated processing which is not required or authorised by law is an unlawful decision and therefore null and void. That being the case, we should not seek to legitimise an unlawful decision by conferring a right on a data subject to request that such a decision be reconsidered. Should such a decision be made contrary to Clause 47(1), the proper way to deal with it is through enforcement action by the Information Commissioner, not through the provisions of Clause 48.
Amendments 135 and 144 seek to prevent any decision being taken on the basis of automated decision-making where the decision would engage the rights of the data subject under the Human Rights Act. As my noble friend Lord Ashton indicated on Monday when the Committee debated Amendment 75, which was framed in similar terms, such a restriction would arguably wholly negate the provisions in respect of automated decision-making as it would be possible to argue that any decision based on automated decision-making would, at the very least, engage the data subject’s right to respect for privacy under Article 8 of the European Convention on Human Rights.
At the same time, the unintended consequences of this could be very damaging. For example, any intelligence work by the intelligence services relating to an individual would almost certainly engage the right to respect for private life. The effect of the amendment on Part 4 would therefore be to prevent the intelligence services taking any further action based on automated processing, even if that further action was necessary, proportionate, authorised under the law and fully compliant with the Human Rights Act. Where a decision will have legal or similarly significant effects for a data subject, data controllers will be required to notify data subjects to ensure that they can seek the remaking of that decision with human intervention. We believe that this affords sufficient safeguards.
Turning to Amendment 135A, I can assure the noble Baroness, Lady Hamwee, that automated processing does indeed include profiling. This is clear from the definition of profiling in Clause 31 which refers to,
“any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to an individual”.
Given that, I do not believe more is needed, but I confirm that there is no significance in omitting the word “profiling”. We did not include a reference to profiling as an example of automated decision-making on the grounds that it is just that, an example, and therefore an express reference to including profiling would add nothing.
Amendment 135B would require controllers to notify data subjects within 72 hours where a qualifying significant decision has been made based solely on automated processing. While it is appropriate elsewhere in the Bill to require controllers to report data breaches to the Information Commissioner, where feasible, within 72 hours, we consider that the existing requirement to notify data subjects of what is a lawful qualifying significant decision as soon as reasonably practicable strikes the right balance: it establishes the need for prompt notification while recognising that there needs to be some flexibility to reflect the operational environment.
Amendment 136A seeks to require the Information Commissioner to appoint an independent person to oversee the operation of automated decision-making under Part 3. I am unpersuaded of the case for this amendment. The Information Commissioner is, of course, already an independent regulator with express statutory duties to, among other things, monitor and enforce the provisions in Part 3, so it is unclear to me why the commissioner should be obliged to, in effect, subcontract her functions in so far as they relate to automated decision-making. Such processing is subject to the commissioner’s oversight functions as much as any other processing, so I do not see why we need to single it out for special treatment. If the argument is that automated processing can have a more acute impact on data subjects than any other forms of processing, then it is open to the commissioner to reflect this in how she undertakes her regulatory functions and to monitor compliance with Clauses 47 and 48 more closely than other aspects of Part 3, but this should be left to the good judgment of the commissioner rather than adding a new layer of regulation.
The noble Baroness asked whether it is 21 days from receipt of notification or another time. Clause 48(2)(b) makes it clear that it is 21 days from receipt.
I have some sympathy for Amendment 137, which requires controllers subject to Part 3, on request, to provide data subjects with the reasons behind the processing of their personal data. I agree that data subjects should, in general, have the right to information about decision-making which affects them, whether or not that decision-making derives from automated processing. However, this is not straightforward. For example, as with the rights to information under Clauses 42 and 43, this cannot be an absolute right otherwise we risk compromising ongoing criminal investigations. If the noble Baroness will agree not to move Amendment 137, I undertake to consider the matter further ahead of Report.
Amendments 142C and 143B in the name of the noble Lord, Lord Stevenson, seek to confer a new duty on controllers to inform data subjects of their right to intervene in automated decision-making. I believe the Bill already effectively provides for this. Clause 95(3) already places a duty on a controller to notify a data subject that a decision about them based solely on automated processing has been made.
Amendments 145 and 146 seek to strike out the provisions in Part 4 that enable automated decision-making in relation to the consideration of contracts. The briefing issued by Liberty suggested that there was no like provision under the GDPR, but recital 71 to the GDPR expressly refers to processing,
“necessary for the entering or performance of a contract between the data subject and a controller”,
as one example of automated processing which is allowed when authorised by law. Moreover, we envisage the intelligence services making use of this provision—for example, considering whether to enter into a contract may initially require a national security assessment whereby an individual’s name is run through a computer program to determine potential threats.
Finally, Amendment 146A would place a duty on the intelligence services to inform the Information Commissioner of the outcome of their consideration of a request by a data subject to review a decision based solely on automated processing. We are not persuaded that a routine notification of this kind is necessary. The Information Commissioner has a general function in relation to the monitoring and enforcement of Part 4 and in pursuance of that function can seek necessary information from the intelligence services, including in respect of automated processing.
I hope again that my detailed explanation in response to these amendments has satisfied noble Lords, and as I have indicated, I am ready to consider Amendment 137 further ahead of Report. I hope that on that note, the noble Baroness will withdraw the amendment.