Opinion Piece by Derek Sayer*
Editorial Note: In our previous blogpost, we noted that while there was agreement that REF2014 was problematic, there was less agreement about alternatives. To make progress, we need more debate. We hope that this piece by Derek Sayer will stimulate this, and we welcome comments. Please note that comments are moderated and will not be published immediately.
The rankings produced by Times Higher Education and others on the basis of the UK’s Research Assessment Exercises (RAEs) have always been contentious, but accusations of universities’ gaming submissions and spinning results have been more widespread in REF2014 than any earlier RAE. Laurie Taylor’s jibe in The Poppletonian that ‘a grand total of 32 vice-chancellors have reportedly boasted in internal emails that their university has become a top 10 UK university based on the recent results of the REF’[i] rings true in a world in which Cardiff University can truthfully[ii] claim that it ‘has leapt to 5th in the Research Excellence Framework (REF) based on the quality of our research, a meteoric rise’ from 22nd in RAE2008. Cardiff ranks 5th among universities in the REF2014 ‘Table of Excellence,’ which is based on the GPA of the scores assigned by the REF’s ‘expert panels’ to the three elements in each university’s submission (outputs 65%, impact 20%, environment 15%)—just behind Imperial, LSE, Oxford and Cambridge. Whether this ‘confirms [Cardiff’s] place as a world-leading university,’ as its website claims, is more questionable.[iii] These figures are a minefield.
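The weighted GPA that underlies the 'Table of Excellence' is simple to compute. A minimal sketch follows, using the REF2014 weightings given above (outputs 65%, impact 20%, environment 15%); the element scores in the example are invented for illustration only, not any university's actual results:

```python
# REF2014 overall GPA: a weighted average of the three profile GPAs,
# each scored by the expert panels on a 0-4 scale.
WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

def overall_gpa(scores):
    """scores: dict mapping element name -> GPA on the 0-4 scale."""
    return sum(WEIGHTS[element] * scores[element] for element in WEIGHTS)

# Hypothetical element GPAs, for illustration only:
example = {"outputs": 3.2, "impact": 3.5, "environment": 3.4}
print(round(overall_gpa(example), 2))  # 3.29
```

Because outputs carry 65% of the weight, differences in output quality dominate the overall score, but the impact and environment elements can still shift a university several places in a tightly bunched table.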
Although HEFCE encouraged universities to be ‘inclusive’ in entering their staff in REF2014, they were not obliged to return all eligible staff and there were good reasons for those with aspirations to climb the league tables to be more ‘strategic’ in staff selection than in previous RAEs. Prominent among these were (1) HEFCE’s defunding of 2* outputs from 2011, which meant outputs scoring below 3* would now negatively affect a university’s rank order without any compensating gain in QR income, and (2) HEFCE’s pegging the number of impact case studies required to the number of staff members entered per unit of assessment, which created a perverse incentive to exclude research-active staff if this would avoid having to submit a weak impact case study.[iv] Though the wholesale exclusions feared by some did not materialize across the sector, it is clear that some institutions were far more selective in REF2014 than in RAE2008.
Unfortunately, data that would have permitted direct comparisons with the numbers of staff entered by individual universities in RAE2008 were never published, but Higher Education Statistics Agency (HESA) figures for FTE staff eligible to be submitted allow broad comparisons across universities in REF2014. It is evident from these that selectivity, rather than an improvement in research quality per se, played a large part in Cardiff’s ‘meteoric rise’ in the rankings. The same may be true for some other schools that significantly improved their positions, among them King’s (up to 7th in 2014 from 22= in 2008), Bath (14= from 20=), Swansea (22= from 56=), Cranfield (31= from 49), Heriot-Watt (33 from 45), and Aston (35= from 52=). All of these universities except King’s entered fewer than 75% of their eligible staff members, and King’s has the lowest percentage (80%) of any university in the REF top 10 other than Cardiff itself.