<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>CDBU &#187; peer review</title>
	<atom:link href="http://cdbu.org.uk/tag/peer-review/feed/" rel="self" type="application/rss+xml" />
	<link>http://cdbu.org.uk</link>
	<description>Council for the Defence of British Universities</description>
	<lastBuildDate>Fri, 23 Jun 2017 21:43:20 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1.18</generator>
	<item>
		<title>The Haldane principle: remembering Fisher and getting that definition right</title>
		<link>http://cdbu.org.uk/the-haldane-principle-remembering-fisher-and-getting-that-definition-right/</link>
		<comments>http://cdbu.org.uk/the-haldane-principle-remembering-fisher-and-getting-that-definition-right/#comments</comments>
		<pubDate>Mon, 27 Feb 2017 18:21:33 +0000</pubDate>
		<dc:creator><![CDATA[CDBU Admin]]></dc:creator>
				<category><![CDATA[Guest blog]]></category>
		<category><![CDATA[H.A.L. Fisher]]></category>
		<category><![CDATA[Haldane Principle]]></category>
		<category><![CDATA[higher education and research bill]]></category>
		<category><![CDATA[history]]></category>
		<category><![CDATA[peer review]]></category>
		<category><![CDATA[Universities]]></category>

		<guid isPermaLink="false">http://cdbu.org.uk/?p=2149</guid>
		<description><![CDATA[Opinion piece by G. R. Evans It is very welcome news that the Government has decided to include a definition of the Haldane Principle on the face of the Bill. Jo Johnson made a special point of this in his &#8230; <a href="http://cdbu.org.uk/the-haldane-principle-remembering-fisher-and-getting-that-definition-right/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><strong>Opinion piece by G. R. Evans</strong></p>

<p>It is very welcome news that the Government has decided to include a definition of the Haldane Principle on the face of the Bill. Jo Johnson made a special point of this in his <a href="https://www.gov.uk/government/speeches/jo-johnson-higher-education-and-research-bill">speech to Universities UK on 24 February</a>. An accompanying document was published jointly by the two Departments of State that will in future be responsible for higher education. It proudly states that:</p>

<p><em>the amendment that we have tabled will, <strong>for the first time in history, </strong></em><a href="https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/594554/HERB_Lords_report_government_amendments.pdf"><strong><em>enshrine the Haldane Principle in law.</em></strong></a></p>

<p>This document did not, however, give more details. The <a href="https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/594554/HERB_Lords_report_government_amendments.pdf">actual Amendment</a> of Clause 99 proposing and containing the definition is to be found in yet another document:</p>

<p>Page 64, line 10, at end insert -</p>

<p><strong><em>&#8220;The &#8216;Haldane principle&#8217; is the principle that decisions on individual research proposals are best taken following an evaluation of the quality and likely impact of the proposals (such as a peer review process).&#8221;</em></strong></p>

<p>Note that this definition does not stipulate an exercise of academic judgement, merely an ‘evaluation’ including the ‘likely impact’ of the research to be funded. Furthermore, the definition does not mention that infrastructure funding will come from Research England. Rather, an earlier statement merely stipulates that the Councils (a sub-set of UKRI) will be responsible for the disbursement of project funding:</p>

<p>Page 64, line 7, at end insert -</p>

<p><strong><em>“the Haldane principle, where the grant or direction mentioned in subsection (1) is in respect of functions exercisable by one or more of the Councils mentioned in section 91(1) pursuant to arrangements under that section,” </em></strong></p>

<p>This took me back to the question of what Haldane actually called for and the context in which he did so. His thoughts on higher education matters are chiefly to be found in writings collected in a period when he was actively involved in fostering the development of the new ‘redbrick’ universities. He developed a special enthusiasm for technical education, but essentially he was interested in the work of a university as a whole, not merely its research.</p>

<p>He recognised that if higher education was going to expand successfully something would have to be done about the funding that would be needed:</p>

<p><strong><em>‘the truth is that work of this kind must be more largely assisted and fostered by the State than is the tradition of today if it is to succeed’</em></strong></p>

<p>(<em>Education and Empire: Addresses on certain topics of the day</em> (London, 1902), p.38).</p>

<p>The new universities began to accept state funding but it was not at first expected that Oxford or Cambridge would need to apply. The First World War upset many expectations.</p>

<p>A decisive correspondence followed between November 1918 and May 1919, between the then President of the Board of Education, H. A. L. Fisher, and the Vice-Chancellor of the University of Oxford. This was published in full in May in the <em>Oxford University Gazette</em>, under the heading <em>Applications for Government Grants</em> (<em>Oxford University Gazette</em>, 49 (1918-19), pp. 471-8).</p>

<p>A deputation from the universities ‘asking for larger subsidies from the State’ met Fisher on 23 November. Oxford and Cambridge consulted one another and agreed that it would be wise to join in, but without committing themselves. Oxford was understandably nervous about accepting state funding because of the likelihood that it would bring State control.</p>

<p>But the Oxford scientists, scenting money, put in their own bids for specific sums for particular purposes. The heads of departments of the University Museum wrote on 3 March, 1919 with a list of such ‘needs’, identifying sums for capital outlay and salaries and pensions for Heads of Department and scholarships for what would now be called STEM subjects.</p>

<p>It was in this context that Fisher seems to have made his far-reaching policy decision and stated the ‘Fisher Principle’, that the state would not interfere in the allocation of funds within universities. It would not decide directly whether to fund, say, science at Oxford, or History at Manchester. It would give funding in the form of ‘Block Grants’ and allow the universities themselves to decide how to use the money.</p>

<p>He wrote to the Oxford Vice-Chancellor on 16 April:</p>

<p><strong><em>‘Henceforth…each University which receives aid from the State will receive it in the form of a single inclusive grant, for the expenditure of which the University, as distinguished from any particular Department, will be responsible. Both the Government and, I think, the great majority of the Universities are convinced that such an arrangement is favourable not only to the preservation of University autonomy but also to the efficient administration of the University funds.’</em></strong></p>

<p>The University’s Council (then the Hebdomadal Council, meeting weekly in term-time) requested an interview with Fisher and on May 15 a deputation of five, led by the Vice-Chancellor, had a meeting with him. The Memorandum of the Interview ‘kindly furnished by Mr. Fisher’s Secretary’ is also published in the <em>Gazette</em>. It repeated the policy principle arrived at in November, that ‘the English Universities in receipt of State-aid favoured …a general Block Grant’. It was explained that a Standing Committee was in process of ‘formation’ and that ‘henceforward, practically all the money for University Education would be borne on the Treasury Vote and would be allocated in annual Block Grants’ as the Standing Committee recommended.</p>

<p>This Standing Committee developed into the University Grants Committee, which was replaced a quarter of a century ago first by one and then by four Funding Councils. One of those, HEFCE, is now to be replaced as distributor of the remnant of that Block Grant, mainly by Research England within UKRI, with only a vestige of the element previously used to fund teaching still remaining.</p>

<p>So there seem to be features of the Government Amendment to Clause 99 which would bear further thought if a definition of the ‘Haldane Principle’ is to enter statute.</p>

<p>The Haldane Principle arguably needs to be understood as it was developed in the ‘Fisher Principle’ and has been maintained for a century since. That placed a ‘buffer’ body between State and university and protected the freedom of the university to choose how to use its block grant on academic not political principles. That is not quite the thrust of the definition as it stands at present.</p>

<p>Nor did the ‘Fisher-Haldane Principle’ apply to the buffer body itself. The buffer stood between academic freedom and state control. It was not itself subject to that principle. It merely ensured that it was respected.</p>

<p>It is to be hoped that the legal draftsmen working on the Bill will try again. The version in the current Amendment, if it passes into law, will fail to protect the autonomy of the providers receiving funding from UKRI. Nor will it require funding decisions to be taken by academics or by autonomous institutions. The ‘peer review process’ is given as a mere example. There seems nothing to prevent a Minister or Secretary of State conducting ‘an evaluation of the quality and likely impact of the proposals’. Haldane and Fisher could both be turning in their graves.</p>

]]></content:encoded>
			<wfw:commentRss>http://cdbu.org.uk/the-haldane-principle-remembering-fisher-and-getting-that-definition-right/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Reflections on the REF and the need for change</title>
		<link>http://cdbu.org.uk/reflections-on-the-ref-and-the-need-for-change/</link>
		<comments>http://cdbu.org.uk/reflections-on-the-ref-and-the-need-for-change/#comments</comments>
		<pubDate>Wed, 07 Jan 2015 14:24:54 +0000</pubDate>
		<dc:creator><![CDATA[CDBU Admin]]></dc:creator>
				<category><![CDATA[CDBU Updates]]></category>
		<category><![CDATA[funding formula]]></category>
		<category><![CDATA[grade inflation]]></category>
		<category><![CDATA[HEFCE]]></category>
		<category><![CDATA[league tables]]></category>
		<category><![CDATA[metrics]]></category>
		<category><![CDATA[peer review]]></category>
		<category><![CDATA[REF]]></category>
		<category><![CDATA[REF2014]]></category>
		<category><![CDATA[university ratings]]></category>

		<guid isPermaLink="false">http://cdbu.org.uk/?p=1759</guid>
		<description><![CDATA[Discussion piece by the CDBU Steering Group Results from the research excellence framework (REF) were publicly announced on 18th December, followed by a spate of triumphalist messages from University PR departments. Deeper analysis followed, both in the pages of the &#8230; <a href="http://cdbu.org.uk/reflections-on-the-ref-and-the-need-for-change/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><i>Discussion piece by the CDBU Steering Group</i></p>

<p style="text-align: center;"><i style="color: #333333; line-height: 24.375px;"><a style="text-decoration: underline;" href="http://cdbu.org.uk/wp-content/uploads/2015/01/possible-picture-for-header.jpg"><img class="aligncenter  wp-image-1763" style="margin-top: 0.4em; border: 0px; background: #eeeeee;" alt="possible picture for header" src="http://cdbu.org.uk/wp-content/uploads/2015/01/possible-picture-for-header.jpg" width="4000" height="2250" /></a></i></p>

<p>Results from the research excellence framework (REF) were publicly announced on 18th December, followed by <a href="http://www.chris-hackley.com/2014/12/when-17th-really-means-51st-and-leading.html?m=1" target="_blank">a spate of triumphalist messages</a> from University PR departments. Deeper analysis followed, both in the pages of the Times Higher Education, and in the media and on blogs.</p>

<p>CDBU has from the outset expressed concern about the REF, much of it consistent with the criticism that has been expressed elsewhere. In particular, we note:</p>

<p><strong>Inefficiency:</strong> As <a href="http://www.timeshighereducation.co.uk/features/one-scholars-crusade-against-the-ref/2017405.fullarticle" target="_blank">Derek Sayer has noted</a>, the REF has absorbed a great deal of time and money that might have been better spent elsewhere. The precise cost has yet to be reported, but it is likely to be greater than the £60m official figure, and that is before taking into account the <a href="http://www.theguardian.com/higher-education-network/2014/dec/15/research-excellence-framework-five-reasons-not-fit-for-purpose" target="_blank">cost in terms of the time of academic staff</a>. Universities have taken on new staff to do the laborious work of compiling data and writing impact statements, but this has diverted funds from front-line academia and increased administrative bloat.</p>

<p><strong>Questionable validity</strong>: <a href="http://cdbu.org.uk/problems-with-peer-review-for-the-ref/" target="_blank">Derek Sayer</a> has cogently argued that the peer review element of the REF is open to bias from subjective, idiosyncratic and inexpert opinions. It is also unaccountable, in the sense that the ratings of individual outputs are destroyed. One can see why this is done: otherwise HEFCE could be inundated with requests for information and appeals. But withholding the raw data does not inspire confidence in the process, especially when there are widespread accusations of <a href="http://www.timeshighereducation.co.uk/story.aspx?storyCode=2017670" target="_blank">games-playing</a> and <a href="http://www.wonkhe.com/blogs/ref-results-marred-by-fears-over-grade-inflation/" target="_blank">grade inflation</a>.</p>

<p><strong>Concentration of funding in a few institutions: </strong>We are told that the goal is to award quality-related funding, but as currently implemented, this leads inevitably to a process whereby <a href="http://deevybee.blogspot.co.uk/2013/10/the-matthew-effect-and-ref2014.html" target="_blank">the rich get richer and the poor get poorer</a>, with the bulk of funds concentrated in a few institutions. We suspect that the intention of including &#8216;impact&#8217; in the REF was to reduce the disparity between the Golden Triangle (Oxford, Cambridge and London) and other institutions which might be doing excellent applied work, but if anything the opposite has happened. We do not yet know what the funding formula will be, but if it is, as widely predicted, heavily biased in favour of 4* research, we could move to a situation where <a href="http://deevybee.blogspot.co.uk/2014/12/dividing-up-pie-in-psychology-in.html" target="_blank">only the large institutions will survive to be research-active</a>. There has been no discussion of whether such an outcome is desirable.</p>

<p><strong>Shifting the balance of funding across disciplines:</strong> A recent <a href="http://www.timeshighereducation.co.uk/news/funding-plea-for-humanities-as-life-sciences-crowned-ref-2014-champion/2017667.article#.VKaCV8xpSNc.twitter" target="_blank">article in the Times Higher Education</a> noted another issue: the tendency for those in the Sciences to obtain higher scores on the REF than those in the Humanities. Quotes from HEFCE officials in the article offered no reassurance to those who were concerned this could mean a cut in funding for humanities. Such a move, if accompanied by <a href="http://civitas.org.uk/newblog/2014/04/give-vocational-courses-priority-and-make-them-cheaper/" target="_blank">changes to student funding to advantage those in STEM subjects</a>, could dramatically reduce the strength of Humanities in the UK.</p>

<p><span id="more-1759"></span></p>

<p><strong>Unaccountable flexibility in the funding formula:</strong> There are <a href="http://www.wonkhe.com/blogs/rankings-data-tables-and-spin/" target="_blank">many different ways of achieving ratings</a>: for instance, whether or not the ratings include <a href="http://www.wonkhe.com/blogs/ref-2014-sector-results-2/" target="_blank">&#8216;intensity&#8217; (the number of returnable staff entered) can dramatically alter rank orderings</a>. Or consider a suggestion by <a href="http://www.wonkhe.com/blogs/the-bang-for-buck-heroes-of-uk-research/" target="_blank">Graeme Wise</a> that a &#8216;bang for your buck&#8217; metric assessing outputs in relation to grant income would be most appropriate. Even more radical is a suggestion by <a href="http://researchrandomness.blogspot.co.uk/2014/12/bang-for-buck-in-ref-2014.html" target="_blank">Dermot Lynott</a> that we should give the greatest rewards to those whose outputs were impressive in relation to their scores on environment. Needless to say, a very different profile of winners and losers emerged from such an analysis. It will ultimately be a political decision how to translate REF scores into funding. We have to ask whether it is worth going through this entire long-winded exercise if, simply by changing the funding formula, one can make a dramatic difference to an institution&#8217;s funding and achieve a politically expedient outcome.</p>

<p><strong>Damage inflicted on careers and morale:</strong> The criteria for entering staff for the REF could appear quite cavalier; for instance, the requirement for a numerical ratio between number of staff entered and number of case studies meant that some departments with few case studies were unable to enter all plausible staff. <a href="http://www.timeshighereducation.co.uk/news/lancaster-historian-appeals-against-his-inclusion-in-ref/2008570.article" target="_blank">Derek Sayer</a> has described instances of decisions to enter staff being made on what appeared to be flimsy evidence based on ad hoc internal evaluations. Yet being identified as &#8216;non-REFable&#8217; is not only damaging to morale, but could have real impacts on prospects for promotion and job security.</p>

<p><strong>Focus on competition rather than collaboration: </strong>The REF exercise creates rank orderings, and everyone is desperately trying to nudge ahead of the others. In fact, there are so many different ways of doing the ranking that almost everyone can be satisfied that they are &#8216;among the top&#8217; on some index or other. Those who crowed loudest about their success tried to temper this by arguing that they were celebrating a broader &#8216;British&#8217; success, but this seems perverse. Why should concentrating ever more of the excellent research in an ever smaller number of institutions be regarded as a national success story? It is, of course, widely believed that competition is a force for good, stimulating people to do better than they otherwise might. However, many in academia take the view that they don&#8217;t need to be incentivised by competition to work hard: they are in the job for the love of it, and would like their efforts to be appreciated for what they are, not because they help push the institution up a league table. Competition can also damage relationships between departments within a university, if disparities in REF performance lead to bickering about who is more valuable.</p>

<p><strong>Perverse incentives that damage research: </strong>These may play out differently in different subject areas, but overall, many academics feel that they are not able to do the research they want in the way they would like. In science, there are <a href="https://theconversation.com/the-dark-side-of-research-when-chasing-prestige-becomes-the-prize-35001" target="_blank">intense pressures to publish in high-impact</a> journals and bring in grant income. Some institutions are notorious for threatening redundancy to scientific staff who do not meet some agreed quota of research income, creating incentives to do ever more expensive research (see for instance cases at <a href="http://deevybee.blogspot.co.uk/2014/06/the-university-as-big-business.html" target="_blank">Kings College London</a>, <a href="http://www.timeshighereducation.co.uk/news/imperial-college-professor-stefan-grimm-was-given-grant-income-target/2017369.article" target="_blank">Imperial College</a>, and <a href="http://www.timeshighereducation.co.uk/news/simplistic-redundancy-metrics-criticised/2016357.article" target="_blank">Warwick University Medical School</a>).  In humanities, the pressure to produce a steady stream of research articles and monographs has led to the sense of an enforced move to over-specialisation, with academics increasingly incapable of explaining or demonstrating the broader significance of their work. Younger generations of academics find that their direction of research is being wholly driven by &#8216;REF-ability’, and that journal publications automatically trump those in volumes of collected essays, even when the latter may be more important for the field.</p>

<p><strong>Perverse incentives on hiring practices: </strong>This is another consequence of the intensely competitive culture that is induced by the REF. We have, particularly around the time of the REF, a market in research &#8216;super-stars&#8217;, who can attract impressive transfer fees. <a href="http://crookedtimber.org/2014/12/18/research-excellence-framework-the-denouement/" target="_blank">People from other institutions who are employed at only 20 per cent part-time</a> suddenly appear on the books, boosting the institution&#8217;s return on funding and outputs.</p>

<p><strong>Devaluation of non-research activity</strong>: Academics whose positions require them to teach and do research have felt pressured to focus principally on research, and teaching has consequently been devalued. It is sometimes suggested that the Impact agenda of the REF also encourages academics to spend time on public engagement, but in fact it has the opposite effect. Public engagement does not count as &#8216;impact&#8217; for REF  purposes: to demonstrate REF impact, one must provide concrete documentation of how a specific piece of research has influenced non-academic users, such as policy-makers, health professionals, museums, etc.</p>

<p><strong>How have we come to this?</strong></p>

<p>Given that<a href="http://www.theguardian.com/higher-education-network/2014/dec/16/research-excellence-framework-2014-the-postmortem-live-chat?CMP=share_btn_tw" target="_blank"> many of these points were made in the run-up to the announcement of REF results</a>, we have to ask how it is that we find ourselves trapped in such an undesirable system. It is noteworthy that the REF is popular with many vice-chancellors and administrative teams. It makes it easier to manage staff, with objective criteria for hiring and firing, and provides league tables to measure progress by. For those already attached to the vision of a university as big business, it seems the natural next step to have objective rules for defining winners and losers so that one can directly measure an individual&#8217;s likely ability to bring money into the university without having to make difficult judgements about the intrinsic quality of their work.</p>

<p>It is a moot point whether the collapse of the circle of winners to Oxbridge and London was the result of deliberate planning, or an unintended consequence of how the system operates. Be this as it may, one concern is that this could provide further pressure for the British university system to reconfigure itself so that it can compete with the American private elite. There would be increasing reluctance to use general taxation to concentrate even more educational resources within close proximity to London, and instead we could see a shift to private funding, increasing the extent to which access to the best institutions is distributed according to wealth rather than ability. In a few short years we could see a genuinely national system of higher education, publicly funded because it is designed to serve everyone, destroyed and replaced by a privately funded system that is world-class for the few who can afford to access it, but a disaster for the country as a whole. As in the US, educational opportunity would be concentrated overwhelmingly in places where it can be accessed primarily by the wealthy, privileged and well-connected.</p>

<p>At the time of the announcement of REF results, there was a sense that anyone who criticised the celebrations was either a bad loser, or – if they came from an institution who did well – a traitor for not celebrating British success. At CDBU we are proud of UK Universities and their research reputation, but our loyalty is to our discipline, our profession, our vocation and our sense of their place in the wider scheme of things: we fear that the assessment process embodied in REF will in the longer term damage these.</p>

<p><strong>Where next?</strong></p>

<p>It is, of course, all very well to criticise and paint visions of a dystopian future. If we wish to replace the current system, we must look at alternative ways forward. <a href="http://www.thebookseller.com/news/sage-debates-value-research-excellence-framework" target="_blank">At a debate about the REF</a>, organised by Sage Publishers on 8<sup>th</sup> December, David Willetts MP, who until July 2014 was Minister of State for Universities and Science, took the view that some of those who disapproved of REF were just dinosaurs who wanted to go back to the 1970s, when funds were allocated to institutions by a group of the great and the good making judgements over dinner at the Athenaeum. We would dispute that this is the only alternative to the current system. But Willetts was right on another count: he pointed out that the current REF system was not imposed by government, nor by HEFCE. Indeed, they had been actively pursuing the idea of using a simpler metrics-based system for the REF, but it was resoundingly rejected by the academic community. Government, according to Willetts, would listen to any reasonable proposal for a new system. Clearly, it is now up to the academics themselves to propose a viable alternative.</p>

<p>At the same meeting, <a href="http://www.researchresearch.com/index.php?option=com_news&amp;template=rr_2col&amp;view=article&amp;articleId=1348861" target="_blank">David Sweeney</a>, Director of HEFCE responsible for Education and Knowledge Exchange, implied that critics of the REF just wanted to be handed public money without any accountability. He emphasised that the government and taxpayer put money into university research and had a right to know about the outcomes from the investment they had made. We totally agree. But we take issue with those who, like <a href="http://www.wonkhe.com/blogs/the-research-excellence-framework-fascinating-flawed-and-essential/" target="_blank">Mark Leach, Director and Editor-in-Chief of Wonkhe</a>, think that the REF, for all its limitations, provides a good solution. Our position is that, for all the reasons given above, the REF is a seriously flawed system for deciding on the disbursement of research funds, which in the long run will do the UK university system more harm than good.</p>

<p><strong>What alternatives are there?</strong></p>

<ol>
<li><p>One possibility that has been discussed is to remove the QR component of funding altogether and give all funds to the research councils. The problem with this solution is that the research councils would have to grow enormously in size, the load on reviewers, already seen by many as unsustainable, would increase yet further, and the pressure on academics to bring in research grants would be even more intense. <a href="http://www.theguardian.com/education/2014/dec/21/university-funding-reform-brain-drain-london?CMP=share_btn_tw&amp;utm_content=bufferdf118&amp;utm_medium=social&amp;utm_source=twitter.com&amp;utm_campaign=buffer" target="_blank">It has also been argued</a> that it would further increase disparities between the Golden Triangle (Oxbridge and London) and the rest, and would disfavour non-STEM disciplines.</p></li>
<li><p><a href="http://www.hefce.ac.uk/whatwedo/rsrch/howfundr/metrics/" target="_blank">HEFCE has been looking at various publication-based</a> metrics that might substitute for the REF, and plans to do some empirical studies comparing metric-based evaluation with REF results. However, metrics have been vigorously opposed by many in the academic community, especially in the humanities, where metrics agree far less well with expert opinion than in the sciences. We do at least now have hard data from the REF that can be considered when evaluating how metrics would perform, but there&#8217;s a real risk that the introduction of any metric will further distort incentives, so that the measure becomes the goal.</p></li>
<li><p>At the Sage meeting, Derek Sayer put forward another interesting alternative, which was that funding should be based just on the &#8216;research environment&#8217; component of the REF, which focuses more on inputs than outputs.</p></li>
<li><p>An even simpler option that would retain the dual support system but remove quality-related funding would involve disbursing funds purely on the basis of the number of active researchers in a department. This could be criticised for leading to a &#8216;prairie farming&#8217; model whereby departments would band together to create enormous conglomerates that would benefit from economies of scale. One could, however, put a limit on the size of unit entered. At the Sage meeting, David Sweeney expressed himself as strongly opposed to this solution, even though <a href="http://deevybee.blogspot.co.uk/2014/10/some-thoughts-on-use-of-metrics-in.html" target="_blank">in the last round it gave a funding result that was highly correlated with actual funding outcomes from the RAE</a>, in both science and humanities. He is right to note that, despite the high correlation, there would nevertheless be winners and losers who would see substantial gains or losses in real terms, relative to the RAE result, but the question is whether this would involve unfairness. It&#8217;s hard to say, given that we have no gold standard. You could of course further argue that some measure of quality is needed to incentivise people to do better, as well as to guard against freeloaders, like Laurie Taylor&#8217;s <a href="http://www.timeshighereducation.co.uk/comment/the-poppletonian/laurie-taylor-column/156378.article" target="_blank">Dr Piercemuller</a>. Yet, as we have noted, for most academics, exhortations from managers to &#8216;do better&#8217; don’t achieve much and may indeed be counterproductive. We need to be accountable for the public money spent on university research, but subjecting every apple in the barrel to an exhaustive x-ray examination may not be the best way to identify the rotten ones.</p></li>
</ol>

<p>We do not have a single solution, but we think that academics must take control of this process and not leave it in the hands of HEFCE and the government. <a href="https://www.academia.edu/9923660/The_academic_manifesto_From_an_occupied_to_a_public_unversity" target="_blank">This article</a> about universities in the Netherlands has strong parallels with the UK situation: the author argues that academics have been too passive in accepting an emphasis on competition and the use of evaluation systems as a means of control. There is unlikely to be an ideal solution and we may have to live with the &#8216;least bad&#8217; option. But let us consider all options in terms of how far they are likely to exacerbate or resolve the problems outlined above, or we may find ourselves saddled with something even worse than REF2014.</p>

<p>We hope this article opens up the discussion on this topic. Please do add your comments. We are moderating comments to exclude spam, but non-anonymous, on-topic comments will be published unless they contravene the usual rules.</p>

<p>Finally, if you agree with the broad concerns expressed here, please do consider <a href="http://cdbu.org.uk/participate/become-a-member/">joining the CDBU</a> to help us campaign more effectively for change.</p>
]]></content:encoded>
			<wfw:commentRss>http://cdbu.org.uk/reflections-on-the-ref-and-the-need-for-change/feed/</wfw:commentRss>
		<slash:comments>22</slash:comments>
		</item>
		<item>
		<title>Problems with Peer Review for the REF</title>
		<link>http://cdbu.org.uk/problems-with-peer-review-for-the-ref/</link>
		<comments>http://cdbu.org.uk/problems-with-peer-review-for-the-ref/#comments</comments>
		<pubDate>Fri, 21 Nov 2014 12:32:34 +0000</pubDate>
		<dc:creator><![CDATA[CDBU Admin]]></dc:creator>
				<category><![CDATA[Opinion]]></category>
		<category><![CDATA[bibliometrics]]></category>
		<category><![CDATA[peer review]]></category>
		<category><![CDATA[RAE]]></category>
		<category><![CDATA[REF]]></category>
		<category><![CDATA[research assessment]]></category>
		<category><![CDATA[research metrics]]></category>

		<guid isPermaLink="false">http://cdbu.org.uk/?p=1681</guid>
		<description><![CDATA[Opinion Piece by Derek Sayer*  At the behest of universities minister David Willetts, HEFCE established an Independent Review of the Role of Metrics in Research Assessment in April 2014 chaired by James Wilsdon. This followed consultations in 2008-9 that played &#8230; <a href="http://cdbu.org.uk/problems-with-peer-review-for-the-ref/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><em>Opinion Piece by Derek Sayer* </em></p>

<p>At the behest of universities minister David Willetts, HEFCE established an Independent Review of the Role of Metrics in Research Assessment in April 2014, chaired by James Wilsdon. This followed consultations in 2008-9 that played a decisive role in persuading the government to back down on previous plans to replace the RAE with a metrics-based system of research assessment. Wilsdon&#8217;s call for evidence, which was open from 1 May to 30 June 2014, received 153 responses &#8216;reflecting a high level of interest and engagement from across the sector&#8217; (<a href="http://www.hefce.ac.uk/whatwedo/rsrch/howfundr/metrics/" target="_blank">Letter to Rt. Hon. Greg Clark MP</a>). Sixty-seven of these were from HEIs, 27 from learned societies and three from mission groups. As in 2008-9, the British academic establishment (including the Russell Group, RCUK, the Royal Society, the British Academy, and the Wellcome Trust) made its voice heard. Predictably, &#8216;57 per cent of the responses expressed overall scepticism about the further introduction of metrics into research assessment,&#8217; while &#8216;a common theme that emerged was that peer review should be retained as the primary mechanism for evaluating research quality. Both sceptical and supportive responses argued that metrics must not be seen as a substitute for peer review &#8230; which should continue to be the &#8220;gold standard&#8221; for research assessment&#8217; (Wilsdon review, <a href="http://www.hefce.ac.uk/whatwedo/rsrch/howfundr/metrics/" target="_blank">Summary of responses submitted to the call for evidence</a>).</p>

<p>The stock arguments against the use of metrics in research assessment were widely reiterated: journal impact factors cannot be a proxy for quality because ‘high-quality’ journals may still publish poor-quality articles; using citations as a metric ignores negative citation and self-citation; in some humanities and social science disciplines it is more common to produce books than articles, which will significantly reduce their citation counts, and so on. Much of this criticism, I would argue, is a red herring. Most of these points could easily be addressed by anybody who seriously wished to consider how bibliometrics might sensibly inform a research assessment exercise rather than kill any such suggestion at birth (don&#8217;t use JIFs, exclude self-citations, use indices like Publish or Perish that include monographs as well as articles and control for disciplinary variations). What is remarkable, however, is that while these faults are often presented as sufficient reason to reject the use of metrics in research assessment out of hand, the virtues of &#8216;peer review&#8217; are simply assumed by most contributors to this discussion rather than scrutinized or evidenced. This matters because whatever the merits of peer review in the abstract—and there is room for debate on what is by its very nature a subjective process—the evaluation procedures used in REF 2014 (and previous RAEs) not only fail to meet HEFCE&#8217;s own claims to provide &#8216;expert review of the outputs&#8217; but fall far short of internationally accepted norms of peer review.</p>

<p><span id="more-1681"></span></p>

<p>In sharp contrast with the evaluative procedures used in a range of other academic contexts in the UK and internationally, the REF relies entirely on in-house assessment by panels of (so-called) experts. On some panels, like history, just one assessor may read each output, something unheard of in peer review for journal and book submissions, research grant competitions, and tenure and promotion proceedings. Additionally, almost all REF panelists are drawn from British universities alone. David Eastwood admitted back in 2007 that ‘international benchmarking of quality’ was ‘one thing that the RAE has not been able to do’—which is cute, considering that REF panels award their stars on the basis of whether outputs are &#8216;world leading&#8217;, &#8216;internationally excellent,&#8217; or merely &#8216;recognized internationally.&#8217; Eastwood then still hoped to solve the problem with ‘bibliometrics, used with sensitivity and sophistication’ (‘Goodbye to the RAE … and hello to the REF’, <em>Times Higher Education</em>, 30 November 2007). HEFCE&#8217;s prohibitions on using journal impact factors, rankings, or the perceived standing of publishers (and humanities and most social science panels’ refusal to use any bibliometric data, including citations) reinforce the total dependency of REF evaluations on these panelists’ subjective judgments. Meantime the abandonment of RAE 2008’s use of external advisors where a panel felt it lacked specialist expertise and an overall cut in the number of panels from 67 in RAE 2008 to 36 in REF 2014 further reduced the pool of expertise available to panels, which were now also only ‘exceptionally’ allowed to cross-refer outputs to other panels.
If we could be confident that REF panels nevertheless ‘provide sufficient breadth and depth of expertise to undertake the assessment across the subpanel’s remit (including as appropriate expertise in interdisciplinary research and expertise in the wider use or benefits of research)’ (HEFCE, <em>REF 2014: Units of Assessment and Recruitment of Expert Panels</em>, 2010, para 55) this might not be a problem. But we cannot. Nobody on the REF history panel, for example, has specialist knowledge of China, Japan, the Middle East, Latin America, or many countries in Europe (including once-great powers like Austria-Hungary, Spain and Turkey), though work on all these areas has likely been submitted to the REF. Whatever their general eminence in the historical profession, these &#8216;experts&#8217; do not know the relevant languages, archives, or literatures. How, then, can they possibly judge the &#8216;originality&#8217; of an output or its &#8216;significance&#8217; in any of these fields? And on what conceivable basis can they be entrusted to determine whether it is ‘internationally excellent’ or merely ‘internationally recognized’—the critical borderline between 3* research that will attract QR funding and 2* research that will not?</p>

<p>REF panelists are unlikely to have the time to do a proper assessment anyway. In all, around 1000 evaluators will have graded all 191,232 outputs for REF 2014 in under a year—roughly the same number of evaluators in total as the US National Endowment for the Humanities uses to assess just 5700 applications across its 40 grant programs! Peter Coles calculates that each member of the physics panel must read 640 research papers, i.e., about two a day. &#8216;It is &#8230; blindingly obvious,&#8217; he concludes, &#8216;that whatever the panel does do will not be a thorough peer review of each paper, equivalent to refereeing it for publication in a journal&#8217; (&#8216;The apparatus of research assessment&#8217;, <em>LSE Impact Blogs</em>, 14 May 2014). One RAE 2008 panelist told <em>Times Higher Education</em> that it would require &#8216;two years’ full-time work, while doing nothing else&#8217; to read properly the 1200 journal articles he had been allocated (‘Burning questions for the RAE panels’, 24 April 2008). Another admitted: ‘You read them sufficiently to form a judgment, to get a feeling … you don’t have to read to the last full stop’ (‘Assessors face “drowning” as they endeavour to read 2,363 submissions’, 17 April 2008).</p>
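<p>As a quick sanity check, the workload arithmetic behind these figures can be sketched in a few lines. This is a minimal illustration: the totals are those cited above, while the three-hours-per-article reading estimate and an 1800-hour working year are assumptions made here for illustration.</p>

```python
# Sanity check on the REF assessor workload figures quoted above.
# Totals come from the text; hours-per-article is an illustrative assumption.

TOTAL_OUTPUTS = 191_232     # outputs submitted to REF 2014
EVALUATORS = 1000           # approximate total number of panel evaluators
PHYSICS_LOAD = 640          # papers per physics panelist (Coles's figure)
RAE_LOAD = 1200             # articles allocated to one RAE 2008 panelist

avg_per_evaluator = TOTAL_OUTPUTS / EVALUATORS   # average outputs each
papers_per_day = PHYSICS_LOAD / 365              # daily load over one year

# Assume 3 hours of careful reading per article, ~1800 working hours/year.
years_to_read = RAE_LOAD * 3 / 1800

print(f"average outputs per evaluator: {avg_per_evaluator:.0f}")      # ~191
print(f"physics panelist's papers per day: {papers_per_day:.2f}")     # ~1.75
print(f"years to read 1200 articles properly: {years_to_read:.0f}")   # ~2
```

<p>Under these (charitable) assumptions the arithmetic reproduces both Coles&#8217;s &#8216;about two a day&#8217; and the panelist&#8217;s &#8216;two years&#8217; full-time work&#8217; estimate.</p>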

<p>For major academic journals the process of review is often double blind. Though university presses and other academic book publishers divulge authors’ identities to reviewers, they will often also first consult with the author on appropriate reviewers. Reviewers are required to provide substantial comments in either case. The REF, by contrast, makes no attempt to protect authors&#8217; anonymity—something we might think especially important when judgments may lie in the hands of a single assessor. And far from providing comments justifying their grades, RAE 2008 subpanels shredded all documents showing how they reached their conclusions and ordered members to destroy personal notes in order to avoid having to reveal them under Freedom of Information Act requests. It is difficult to think of a procedure that would make it easier for evaluators to further ideological agendas or settle personal scores, should they be so inclined.</p>

<p>Metrics may have problems. But a process that willfully ignores whether an output has gone through any peer review before publication, where it has been published, and how often it has been cited in favor of the subjective opinions of evaluators who may have no specialist expertise in the field and then systematically erases all records does not strike me as a very defensible alternative. It also gives extraordinary gatekeeping power to the individuals who sit on REF panels. This is worrying because the mechanisms through which panelists are recruited are tailor-made for the sponsored replication of disciplinary elites. All applicants for panel chairs have to be endorsed by learned societies, chairs in turn &#8216;advise&#8217; on the appointment of panel members, and at least one third of panelists have to have served in a previous RAE. Apart from being disproportionately older, white, and male compared with the UK academic profession in general, these may not always be the scholars best placed to identify cutting-edge research, especially where such research crosses or challenges disciplinary boundaries. NEH, we might note, prohibits its evaluators from serving in successive competitions to reduce this risk.</p>

<p>Were Wilsdon&#8217;s committee to assess what might be achieved employing appropriately sophisticated metrics against what is actually done in the REF, rather than comparing a crude caricature of metrics with an idealized chimera of peer review, I think it would have to take a different view of the merits of the two systems than that put forward in most responses to the call for evidence. For its REF process to be comparable to what is understood elsewhere as peer review, HEFCE would have to use subject matter-specific experts from an international pool, commission a minimum of two reviews of each output, and not overload reviewers with more outputs than they can read properly in the timeframe available. This would be even costlier in public money and academics&#8217; time than the present REF. To replace the REF with metrics, on the other hand, would yield a process that is (in Dorothy Bishop&#8217;s words) &#8216;transparent and objective, it would not require departments to decide who they do and don’t enter for the assessment, and most importantly, it wins hands down on cost-effectiveness&#8217; (‘An alternative to REF2014?’ <a href="http://deevybee.blogspot.co.uk/2013/01/an-alternative-to-ref2014.html" target="_blank">Bishopblog</a>, 26 January 2013).</p>

<p>Metrics have in fact proven to be highly reliable predictors of RAE performance, irrespective of whether or not they provide valid measures of research quality. It is not without irony that there is considerable overlap between RAE scores and major commercial university rankings, even though the research component in the latter relies primarily on bibliometrics. The <em>Times Higher Education</em> bases 30% of its <a href="http://www.timeshighereducation.co.uk/world-university-rankings/2013-14/world-ranking/methodology" target="_blank">World University Rankings</a> on citations. Eleven British universities made the top 100 in its 2013–2014 rankings. Of these, eight were also in the top 10 in RAE 2008 and the other three in the top 20. Knowing this, given the choice between the intellectual charade of REF &#8216;expert peer review&#8217; and appropriate metrics (the Web of Science is of little use in history) I would unhesitatingly choose the latter. It is infinitely cheaper, much less time consuming, and does not have the negative consequences for collegiality and staff morale of the present system. My own department must be one of many in which some colleagues are now no longer talking to one another because of a breakdown in trust over staff selection for REF 2014—hardly a framework for research excellence.</p>

<p>The conclusion I would rather draw, however, is that peer review vs. metrics is in many ways not the issue. Neither is capable of measuring research quality as such—whatever that may be. Peer review measures conformity to disciplinary expectations and bibliometrics measure how much a given output has registered on other academics&#8217; horizons, either of which might be an indicator of quality but neither of which has to be. It seems rather silly to base 65% of the REF ranking on something that we cannot measure and that may be inherently unmeasurable because it is a subjective judgment. Perhaps we should instead be asking which features of the research environment (which currently counts for a mere 15% of the REF assessment) are most conducive to a vibrant research culture and focus funding accordingly. Library and laboratory resources, research income, faculty members&#8217; involvement in conferences, journal or series editing, and professional associations, PhD student numbers and the intellectual life of a department as reflected in research seminars and public lectures are all good indicators of research vitality. They are also measurable.</p>

<p>&nbsp;</p>

<p><strong>Derek Sayer</strong> is Professor of Cultural History at Lancaster University and Professor Emeritus (Canada Research Chair in Social Theory and Cultural Studies) at the University of Alberta.  His new book <em>Rank Hypocrisies: The Insult of the REF</em> is to be published by Sage on December 3.</p>

<p><em>*This piece originally appeared on the <a href="http://blogs.lse.ac.uk/impactofsocialsciences/2014/11/19/peer-review-metrics-ref-rank-hypocrisies-sayer/" target="_blank">LSE&#8217;s Impact of Social Sciences blog</a>, under the title &#8216;Time to abandon the gold standard? Peer review for the REF falls far short of internationally accepted standards&#8217;, and is reposted with permission.</em></p>

<p><em><strong>Note: </strong>This article gives the views of the author, and not the position of the Council for the Defence of British Universities. </em><i>This work is licensed under a <a href="http://creativecommons.org/licenses/by/3.0/deed.en_GB" target="_blank">Creative Commons Attribution 3.0 Unported License</a> unless otherwise stated.</i></p>
]]></content:encoded>
			<wfw:commentRss>http://cdbu.org.uk/problems-with-peer-review-for-the-ref/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
