Schedule 6 [Data matching]:
moved Amendment No. 110B:
Schedule 6, page 64, line 22, leave out “(including the identification of any patterns and trends)”
The noble Lord said: I shall also speak to Amendment No. 112. Clause 65 provides for Schedule 6. Paragraph 2 of the schedule inserts a new Part 2 into the Audit Commission Act 1998.
New Section 32A provides for the Audit Commission to carry out data-matching exercises or to arrange for another organisation to do that on its behalf. New subsection (2) defines a data-matching exercise, while subsection (4) provides that such,
“assistance may, but need not, form part of an audit”.
Amendment No. 110B amends this definition to restrict it to,
“the comparison of sets of data to determine how far they match”,
excluding the identification of any patterns and trends.
New Section 32C in Schedule 6 provides that, where the Audit Commission thinks it appropriate, it may,
“conduct a data matching exercise using data held by or on behalf of bodies not subject to section 32B”,
which sets out the bodies that may be required to provide information, including sensitive personal data, to the commission in order to conduct a data-matching exercise.
The Explanatory Notes highlight that those voluntary bodies could include central government departments and some private sector bodies, such as mortgage providers. Can the Minister confirm whether that provision could provide access to the children's index or the national identity register? Is it correct that information disclosed to the commission for matching could then be disclosed to an unrestricted range of bodies for fraud detection and prevention purposes, or is there another statutory duty to disclose the information?
Amendment No. 112, our second amendment, would remove Section 32C in its entirety.
The amendments are intended to explore what the Government really mean by data-matching in this context and to probe the extent of data-sharing that they expect to take place, with all the questions that spiral from that. I hope that the Minister will take the opportunity to explain in some detail exactly what the Audit Commission does with the personal data with which it deals as part of the national fraud initiative. Is it all for audit purposes? We need to question what the Government might want the Audit Commission to do with access to a greater mass of personal data under the provisions. Should we consider why the Audit Commission should be empowered to match data that do not form part of an audit?
On one level, data-matching could involve little more than the comparison of two or more sets of data to see whether there are overlaps. For example, the commission compares data to identify whether someone is claiming two benefits that are supposed to be mutually exclusive—in the old days, whether they were claiming income support on one hand and unemployment benefit on the other. Payrolls are another example. Data on individuals can be matched to see whether someone is claiming benefits in one borough but working in another.
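The kind of comparison described here, matching two record sets on a shared identifier to find overlaps, can be illustrated in outline. This is a minimal sketch only, with invented field names and sample records, not a description of the Audit Commission's actual systems:

```python
# Illustrative sketch of a simple data-matching exercise: find identifiers
# that appear in two separate data sets. Field names are hypothetical.

def match_records(benefit_claims, payroll, key="ni_number"):
    """Return identifiers present in both data sets, sorted.

    A match is not proof of fraud; as noted above, each match must be
    followed up to distinguish a simple mistake from fraudulent use.
    """
    claimants = {record[key] for record in benefit_claims}
    employees = {record[key] for record in payroll}
    return sorted(claimants & employees)

if __name__ == "__main__":
    claims = [{"ni_number": "AB123456C"}, {"ni_number": "CD789012E"}]
    wages = [{"ni_number": "CD789012E"}, {"ni_number": "EF345678G"}]
    print(match_records(claims, wages))
```

The point of the sketch is that plain matching of this kind only reports overlaps between named data sets; it involves no searching for patterns or trends, which is the wider activity the amendment seeks to exclude.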
However, Liberty highlights that, in reality, the definition of data-matching goes much further. The Bill states that it is to include the identification of any patterns and trends. There are serious concerns that that is more akin to data-mining than data-matching.
As my noble friend Lady Anelay highlighted at Second Reading,
“the Bill could open the way for operations under which software was used to search several databases to identify suspicious patterns of activity that simply could not be spotted when the data were seen individually”.—[Official Report, 7/2/07; col. 736.]
In essence, the Bill enables what are commonly termed fishing expeditions—data-mining that does not have to be founded on any suspicion or intelligence that a person or company has done anything wrong.
The Government’s consultation acknowledged that there would be concerns about the legality of data-mining. They were right, and they have so far failed to convince commentators on the Bill that it is a proportionate measure. Liberty believes that data-mining, by its very nature, will be neither as targeted nor as well sifted for intelligence as the Minister suggested at Second Reading. In fact, it has significant concerns about the Bill’s compliance with the principles of both the Human Rights Act and the Data Protection Act. As we have discussed, huge quantities of data could be analysed; while that may help to identify a few criminals, is that enough justification to subject the majority of the innocent population to such measures?
As well as those points of principle, there are practical considerations. As the noble Lord, Lord Thomas of Gresford, said on Second Reading, there is no guarantee that any patterns or trends thrown up by data-matching are meaningful or significant. There is a considerable amount of luck—one could say chance—involved. Who will interpret the results of the data-mining: the Audit Commission or the organisation to which it releases the data? There are not that many steps from a trawl of data dictating who will be investigated, because of their characteristics or behaviours, to the justification that the Government need under Part 1 for a serious crime prevention order.
I understand that, in the presentation to our researchers last Monday, the Audit Commission explained that it merely matches the data; it is then up to the local authority, for example, to follow up that match to establish whether there has been a simple mistake or fraudulent use. How do, and how will, the Government ensure that there is adequate training for and checks on those who are provided with the data in interpreting and handling the information? What hoops do organisations have to jump through to check the status of the information? Will they stop an individual’s benefits and then check his status, or will they check and then stop his benefits? This, once again, goes back to the need for a code of practice across the board.
Will there be a process of complaint for individuals if the bodies or the Audit Commission get it wrong? I understand that twins, for example, especially those who have the same initials, can prove particularly tricky in these circumstances.
We need only look at the inaccurate results thrown up by the data-mining exercises conducted on our shopping practices by private sector bodies such as Tesco, based on loyalty cards, to see that data-mining is not infallible. When it leads to people being sent vouchers for a brand that they would never buy, it is merely an annoyance; if, however, it led to an innocent person being subjected to a police investigation or preventive measures, the personal cost would be much greater and would be unacceptable.
Even if it were acknowledged that the investigation was a mistake, would not the record that there had been an investigation be kept on file? Would that record link into the national identity register, for example, so that those using it to verify personal details would see that someone had been investigated, regardless of the fact that it was an error or that the person was found innocent? Again, at this point, the adage that mud sticks or that there is no smoke without fire would hold. It would certainly indicate the reaction that many people might have to such information.
We have in the past drawn a comparison with the more stringent regime of Germany. Data there may be mined only with the authorisation of the court—something that is missing here—and for the following purposes. First, there must be evidence that a crime may have been committed. Secondly, the crime in question must be serious and one of the specific criminal offences set out in the criminal procedure rules, such as the trafficking of drugs or weapons, endangering the safety of the public or creating risk to life or limb. Thirdly, the investigation of the crime would be seriously impaired if the public authorities were denied the right to carry out the data-mining exercise. What assessment have Her Majesty’s Government made of the regime in Germany, and what consideration did they give to applying similar stringent restrictions to data-mining in this country?
New Section 32C(7) in Schedule 6 enables a data-matching exercise to include,
“data provided by a body or person outside England and Wales”.
Whom do the Government have in mind in that provision? Would data from anyone anywhere in the world be included? Could the body or person in theory include offshore bank accounts or even organisations such as the CIA? Will an organisation have to meet any criteria before it can take part in the voluntary provision of data under subsection (7)? Will an organisation have to adhere to rules on what it does with the data once they are matched?
There are concerns that parliamentary approval for data-mining in the context of protection against fraud will be open to function creep, an area where, dare I say it, the Government’s record does not inspire trust.
There are concerns that such approval will be treated as a green light for the use of data-mining processes in many other contexts. I hope that the Minister will be able to address these and other concerns, which will no doubt be expressed by Members on other sides of the House. I look forward to hearing her reply. I beg to move.
Serious Crime Bill [HL]
Proceeding contribution from Lord Henley (Conservative) in the House of Lords on Monday, 26 March 2007.
It occurred during Debate on bills on Serious Crime Bill [HL].

About this proceeding contribution
Reference: 690 c1530-3
Session: 2006-07
Chamber / Committee: House of Lords chamber