The Haldane principle: remembering Fisher and getting that definition right

Opinion piece by G. R. Evans

It is very welcome news that the Government has decided to include a definition of the Haldane Principle on the face of the Bill. Jo Johnson made a special point of this in his speech to Universities UK on 24 February.  An accompanying document was published jointly by both the Departments of State that will in future be responsible for higher education. It proudly states that:

the amendment that we have tabled will, for the first time in history, enshrine the Haldane Principle in law.

This document did not, however, give more details. The actual amendment to Clause 99 containing the proposed definition is to be found in yet another document:

Page 64, line 10, at end insert -

“The ‘Haldane principle’ is the principle that decisions on individual research proposals are best taken following an evaluation of the quality and likely impact of the proposals (such as a peer review process).”

Note that this definition does not stipulate an exercise of academic judgement, merely an ‘evaluation’ including the ‘likely impact’ of research to be funded. Furthermore, the definition does not mention that infrastructure funding will come from Research England. Rather, an earlier statement merely stipulates that the Councils (a sub-set of UKRI) will be responsible for the disbursement of project funding:

Page 64, line 7, at end insert -

“the Haldane principle, where the grant or direction mentioned in subsection (1) is in respect of functions exercisable by one or more of the Councils mentioned in section 91(1) pursuant to arrangements under that section,”

This took me back to the question of what Haldane actually called for and the context in which he did so. His thoughts on higher education matters are chiefly to be found in some collected writings put together in a period when he was actively involved in fostering the development of the new ‘redbrick’ universities. He developed a special enthusiasm for technical education, but essentially he was interested in the work of a university as a whole, not merely its research.

He recognised that if higher education was going to expand successfully something would have to be done about the funding that would be needed:

‘the truth is that work of this kind must be more largely assisted and fostered by the State than is the tradition of today if it is to succeed’

(Education and Empire: Addresses on certain topics of the day (London, 1902), p.38).

The new universities began to accept state funding but it was not at first expected that Oxford or Cambridge would need to apply. The First World War upset many expectations.

A decisive correspondence followed between November 1918 and May 1919, between the then President of the Board of Education, H. A. L. Fisher, and the Vice-Chancellor of the University of Oxford. This was published in full in May in the Oxford University Gazette, under the heading Applications for Government Grants (Oxford University Gazette, 49 (1918-9), p.471-8).

A deputation from the universities ‘asking for larger subsidies from the State’ met Fisher on 23 November. Oxford and Cambridge consulted one another and agreed that it would be wise to join in, but without committing themselves. Oxford was understandably nervous about accepting state funding because of the likelihood that it would bring State control.

But the Oxford scientists, scenting money, put in their own bids for specific sums for particular purposes. The heads of departments of the University Museum wrote on 3 March, 1919 with a list of such ‘needs’, identifying sums for capital outlay and salaries and pensions for Heads of Department and scholarships for what would now be called STEM subjects.

It was in this context that Fisher seems to have made his far-reaching policy decision and stated the ‘Fisher Principle’, that the state would not interfere in the allocation of funds within universities. It would not decide directly whether to fund, say, science at Oxford, or History at Manchester. It would give funding in the form of ‘Block Grants’ and allow the universities themselves to decide how to use the money.

He wrote to the Oxford Vice-Chancellor on 16 April:

‘Henceforth…each University which receives aid from the State will receive it in the form of a single inclusive grant, for the expenditure of which the University, as distinguished from any particular Department, will be responsible. Both the Government and, I think, the great majority of the Universities are convinced that such an arrangement is favourable not only to the preservation of University autonomy but also to the efficient administration of the University funds.’

The University’s Council (then the Hebdomadal Council, meeting weekly in term-time) requested an interview with Fisher and on May 15 a deputation of five, led by the Vice-Chancellor, had a meeting with him. The Memorandum of the Interview ‘kindly furnished by Mr. Fisher’s Secretary’ is also published in the Gazette. It repeated the policy principle arrived at in November, that ‘the English Universities in receipt of State-aid favoured …a general Block Grant’. It was explained that a Standing Committee was in process of ‘formation’ and that ‘henceforward, practically all the money for University Education would be borne on the Treasury Vote and would be allocated in annual Block Grants’ as the Standing Committee recommended.

This Standing Committee developed into the University Grants Committee, which was replaced a quarter of a century ago by first one then four Funding Councils. One of those, HEFCE, is now to be replaced as distributor of the remnant of that Block Grant mainly by Research England within UKRI, with only a vestige of the element previously used to fund teaching still remaining.

So there seem to be features of the Government Amendment to Clause 99 which would bear further thought if a definition of the ‘Haldane Principle’ is to enter statute.

The Haldane Principle arguably needs to be understood as it was developed in the ‘Fisher Principle’ and has been maintained for a century since. That placed a ‘buffer’ body between State and university and protected the freedom of the university to choose how to use its block grant on academic not political principles. That is not quite the thrust of the definition as it stands at present.

Nor did the ‘Fisher-Haldane Principle’ apply to the buffer body itself. The buffer stood between academic freedom and state control. It was not itself subject to that principle. It merely ensured that it was respected.

It is to be hoped that the legal draftsmen working on the Bill will try again. The version in the current Amendment, if it passes into law, will fail to protect the autonomy of the providers receiving funding from UKRI. Nor will it require funding decisions to be taken by academics or by autonomous institutions. The ‘peer review process’ is given as a mere example. There seems nothing to prevent a Minister or Secretary of State conducting ‘an evaluation of the quality and likely impact of the proposals’. Haldane and Fisher could both be turning in their graves.


Why the NSS is garbage

A note by Lord Lipsey    

The National Student Survey results matter. First, they are used by students to evaluate institutions by comparison with rival institutions. Secondly, they are one of the metrics to be used in the TEF, in awarding gold, silver or bronze markings to institutions which apply to take part. These ratings will decide whether or not an institution can raise its fees beyond £9K.

The idea that student satisfaction should play a major role in the rating of universities is controversial. Research shows that there is no correlation between student satisfaction and student results in terms of degree grade. However, the government has opted to increase the importance of student choice, competition and satisfaction in the higher education landscape; and this short note does not seek to address the rights and wrongs of that.

Rather it focusses on a narrow point: whether the National Student Survey (NSS), the chosen instrument to measure student satisfaction, is fit for purpose. Here the evidence is unequivocal: the NSS is statistical garbage. The reasons are widely understood by the statistical community and were set out, inter alia, by the Royal Statistical Society in its response to the government’s technical consultation about the TEF: http://www.rss.org.uk/Images/PDF/influencing-change/2016/RSS-response-to-BIS-Technical-Consultation-on-Teaching-Excellence-Framework-year-2.pdf.

  1. The NSS is not based on a random or representative sample of students. It is more akin to a self-selecting census: it includes everyone who chooses to complete the form but omits those who don’t. Some groups, e.g. ethnic minorities, are seriously underrepresented. As the final report of the ONS Review of Data Sources for the TEF 2016 said: “under-reporting of certain groups and over-coverage of others …could lead to bias in use of the data.” https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/556352/Teaching-Excellence-Framework-review-of-data-sources.pdf

I chair Trinity Laban Conservatoire of Music and Dance, a leading London conservatoire. This year Trinity Laban’s response rate to the NSS rose from about 60% to about 80%. This makes any comparison between last year’s results and this year’s results invalid. We know nothing about the 20% who did not respond; so a 50% satisfaction rate amongst respondents could in reality correspond to anything from a 40% to a 60% satisfaction rate amongst students as a whole.

  2. Even if the responses are treated as a random or representative sample, making calculations of statistical significance possible, the margins of error in the NSS figures are large. The ONS concluded that “differences between institutions at the overall level are small and are not significant” (https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/556352/Teaching-Excellence-Framework-review-of-data-sources.pdf). Yet student decisions on where to go, and the level of fees institutions can charge, are being based on these figures.

So, for example, Trinity Laban’s music results – and music is our biggest group of students – are based on 112 students. To take one example, only 49% of these students say that marking on their course has been fair. Treating the returns as if they were a sample, the 95% confidence interval runs from 34% to 64%. This perhaps exaggerates the unreliability of the NSS insofar as, if response rates are high, the results may be less unreliable than they would be for a sample. But the point still applies. And in cases of smaller sub-samples the low response renders the results vacuous. For example, the poor results for Trinity Laban Musical Theatre were based on responses from just 23 students.

The small sample size is a particular problem for small institutions. It is also a grave problem in that the most valid comparisons are not between institutions, but between particular departments in institutions teaching the same or similar courses. For individual courses, there will tend to be only a fraction of the responses obtained for institutions as a whole.
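The arithmetic behind these margins of error can be sketched with the textbook normal approximation for a proportion (a simplification of the methods the ONS and RSS actually use; the figures below are purely illustrative, not the NSS’s own, and the true interval for any question depends on how many students actually answered it):

```python
import math

def margin_95(p, n):
    """Half-width of an approximate 95% confidence interval for a
    proportion p observed among n respondents: 1.96 * sqrt(p(1-p)/n)."""
    return 1.96 * math.sqrt(p * (1 - p) / n)

# 49% satisfaction observed among 112 respondents: roughly +/- 9 points
print(round(margin_95(0.49, 112), 3))  # -> 0.093

# The same 49% among just 23 respondents: roughly +/- 20 points
print(round(margin_95(0.49, 23), 3))   # -> 0.204
```

The margin of error shrinks only with the square root of the number of respondents, which is why department-level and small-institution figures are so much less reliable than whole-sector ones.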

  3. The results can be greatly affected by happenstance. Results in the TL musical theatre department have in the past been good, which suggests that there was some peculiar chemistry about this year (there is also some suggestion that students coordinated their responses to make a point). A survey of dance satisfaction for Trinity Laban was completed the day after a one-off cock-up over room bookings; had it been done two days earlier the results might have been much better.
  4. There is scope for “gaming” the results by encouraging students to give positive responses. Trinity Laban does not do this. Anecdotal evidence, as cited in the RSS paper above, suggests that less scrupulous institutions do, and the incentive to do so will be greatly increased now that the NSS has new significance as a metric for the TEF.
  5. Less concrete but no less important, an emphasis on the student experience may have undesirable effects on what students are taught. Institutions focussed on a high NSS score will tend to dumb down degrees or go for safe options on content.

Here is an example from Trinity Laban. TL is the product of an amalgamation of Trinity College of Music and Laban. A few years ago the Principal introduced CoLab, a compulsory two-week creative programme in which dance and music students work together. At first students were furious at losing two weeks from the focus of their studies – and no doubt this would have been reflected in their satisfaction ratings. But time has passed, familiarity has grown, and CoLab is now one of the most popular things we offer our students.

In addition, during the ONS review of the NSS, respondents expressed reservations about wider issues related to the use of information from the NSS and the DLHE. Concerns included:

  • limited variation between institutions of the raw scores from the student responses
  • difficulty in trying to compare widely differing institutions
  • difficulty in capturing the wider benefits beyond academic results of attending a higher education institution

The government has already downgraded the importance of the NSS in the TEF – the so-called LSE amendment, made when it was pointed out that the NSS suggested that LSE and a number of other prestigious institutions were rated low by students. The metric should be subject to intense scrutiny when the House of Lords debates the Higher Education and Research Bill in committee; and I am tabling amendments to make sure it is.

The spurious precision of the NSS has the capacity to damage staff morale, to put students off certain institutions and to affect the validity of the TEF. I know of at least one case where a head of department at a major institution came close to resigning because of a disappointing NSS result. I know of another where a Principal was dismissed inter alia because of the failure of that institution’s NSS results to improve. This is terrifying.

Lord Lipsey is joint Chair of the All Party Statistics Group, a Fellow of the Academy of Social Science, a Board member of Full Fact the fact checking charity, a former member of the advisory council of the National Centre for Social Research,  founder of Straight Statistics and a former adviser on opinion polling to James Callaghan as prime minister. In other words, though he may talk nonsense on many subjects, this is not one of them!