Baroness Anelay of St Johns moved Amendment No. 102A:
102A: Clause 61, page 32, line 20, leave out "preventing" and insert "detecting actual or attempted"
The noble Baroness said: With this amendment I shall also speak to Amendments Nos. 104B, 106A, 107A, 110C and 116A. We are now going to change gear rather and move on to consideration of Part 3 of the Bill, which, unlike Part 2, is controversial.
The consequence of accepting our amendments would be that the Bill would not authorise the sharing or mining of data for the purposes of preventing fraud or other forms of crime that have not yet been committed. The amendments would not, however, prevent data sharing or matching being carried out to detect fraud or other criminal activity that has already occurred or been attempted.
Part 3 proposes increased data-sharing powers across public and private sector bodies for the purposes of identifying and preventing fraud. It would also give express statutory authorisation for the practice of data matching, or data mining, which involves computerised fishing expeditions into the personal data of huge numbers of people, most of whom there is no reason to suspect have ever been involved in any kind of fraudulent activity. The Audit Commission is already conducting data-matching exercises on a bi-annual basis to identify fraud, as part of the national fraud initiative.
As we on these Benches said at Second Reading, it is important that we should make the best use of modern data systems to detect fraud. The methods adopted, however, must be not only effective but proportionate. We have serious reservations about some of the detail of the privacy implications of Part 3, which are reflected by the amendments tabled by my noble friends and me—not only my noble friend Lord Henley on the Front Bench but my noble friends on the Back Benches.
Part 3 would increase the scope of the existing practice of data matching by giving the Home Secretary the power to extend the purposes for which data matching can be undertaken, by increasing the involvement of private bodies in data-matching exercises and by amending the terms of the Data Protection Act 1998. This kind of mass data collection, data sharing and data mining is, we find, a familiar theme in Home Office legislation. It raises serious ethical and constitutional issues.
These schemes have the potential to change the nature of the relationship between the state and the citizen, turning us, if we are not careful, from a nation of citizens into a nation of suspects. We are particularly concerned that these fishing expeditions could and would be used to identify patterns, trends or profiles that suggested the possibility of future criminal behaviour. That would be permitted by the Bill, which would authorise data sharing and data mining for the purposes of preventing fraud or other criminal behaviour, not merely investigating crimes and fraud that have already been committed or attempted.
We are not yet persuaded that it would be appropriate to mine data to predict the likelihood of fraud or other types of criminal behaviour with the aim of preventing them before they occur or are even attempted. I return to my enjoyment of some of the films one can see these days. The process we are being invited to agree to in this part of the Bill reminds me of "Minority Report". Perhaps noble Lords will remember the Tom Cruise film. It depicts a scary future where individuals are arrested before any offence has even been committed. Through a process of personality profiling, the hero finds himself about to be arrested for a murder he has not committed and had not even thought of committing.
Some patterns, trends or personal profiles identifiable by any data-mining exercise might well suggest that a type of future behaviour is likely. Why should this be used to justify preventive action, particularly where this could be detrimental in some way to the person concerned? Not everyone follows normal or typical patterns, trends or profiles. We do not easily fit the categories that those who write the software programs would like us to comply with. Just because a person grows up in an area where nine out of 10 young people may commit crime at some time in their lives, it does not follow that he or she will follow suit. Individuals surely should be judged on the basis of what they do rather than what others who live near them or who may be like them have done in the past. Data mining to identify patterns of behaviour indicative of future risks cannot be 100 per cent successful. The fear is that it could lead to innocent people being unjustifiably identified and targeted. I beg to move.
Serious Crime Bill [HL]
Proceeding contribution from Baroness Anelay of St Johns (Conservative) in the House of Lords on Wednesday, 21 March 2007. It occurred during Committee of the Whole House (HL) and Debate on bills on Serious Crime Bill [HL].
Reference: 690 c1268-70, Session 2006-07, House of Lords chamber.