
Higher Education and Research Bill

My Lords, to pick up on the recently finished speech of the noble Baroness, Lady Wolf, I thoroughly agree with the three main points she made. First, producing a mixed indicator, as the Government propose, would not be useful to students or others looking at the quality of a university or a course. It would be like composing a meal out of mincemeat, cornflakes and cleaning fluid. Each of those things is useful in its own right, but mix them together and they have no function. Keep them separate, as the noble Baroness advocated, and you get some very useful data on which students can judge the quality of a university in their own terms.

Secondly, let these things be criterion-referenced. We have a real problem at the moment with GCSEs: we are saying that every child should get English and Maths, but we are making that impossible, because we make these exams harder as students do better. About 30% are required to fail in order to meet the requirements of Ofqual. We have to be careful about this when we are looking at a bronze, silver or gold indicator. If we do not make these indicators criterion-referenced, we are saying that, whatever happens, however well our universities do, we will always call 20% of them bronze. In other words, we will put them into an international students’ “avoid at all costs” category. That seems a really harmful thing to do. If these criteria mean anything, if there is a meaning to any of the elements going into the TEF, we should be able to say, “We want you to hit 60%.” Why not? Why do the criteria have to be relative? They do not mean anything as relative criteria. They must have absolute meanings and they must be absolute targets.

Thirdly, the imprecision really adds up. The noble Lord, Lord Liddle, made it clear that gold, silver and bronze indicators, with this big step change between the three grades, are not suited to a collection of imprecise measures. You do not know whether an institution that you have placed towards the bottom of silver is actually bronze or, worse, whether something in bronze is actually in the middle of silver. The measures are not that exact. You have to do what the Government do elsewhere in education statistics, for example in value-added measures for schools, which is, yes, to publish a value, but to publish a margin of error too. That way, people learn that what you are really saying is: “This is actually 957 on your scale of 1,000, but the margin of error puts it somewhere between 900 and 1,010.” You get used to the imprecision and understand that the figure is not exact, so you can put a proper value on the information you are being given.

About this proceeding contribution

Reference: 778 c476
Session: 2016-17
Chamber / Committee: House of Lords chamber