<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>CDBU &#187; league tables</title>
	<atom:link href="http://cdbu.org.uk/tag/league-tables/feed/" rel="self" type="application/rss+xml" />
	<link>http://cdbu.org.uk</link>
	<description>Council for the Defence of British Universities</description>
	<lastBuildDate>Fri, 23 Jun 2017 21:43:20 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=4.1.18</generator>
	<item>
		<title>A whole lotta cheatin&#8217; going on?  REF stats revisited</title>
		<link>http://cdbu.org.uk/ref-stats-revisited/</link>
		<comments>http://cdbu.org.uk/ref-stats-revisited/#comments</comments>
		<pubDate>Sun, 01 Feb 2015 20:36:07 +0000</pubDate>
		<dc:creator><![CDATA[CDBU Admin]]></dc:creator>
				<category><![CDATA[Opinion]]></category>
		<category><![CDATA[Higher Education]]></category>
		<category><![CDATA[league tables]]></category>
		<category><![CDATA[metrics]]></category>
		<category><![CDATA[REF]]></category>
		<category><![CDATA[REF 2014]]></category>

		<guid isPermaLink="false">http://cdbu.org.uk/?p=1824</guid>
		<description><![CDATA[Opinion Piece by Derek Sayer* Editorial Note: In our previous blogpost, we noted that while there was agreement that REF2014 was problematic, there was less agreement about alternatives. To make progress, we need more debate. We hope that this piece &#8230; <a href="http://cdbu.org.uk/ref-stats-revisited/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><em>Opinion Piece by Derek Sayer*</em></p>

<p><em><strong>Editorial Note:</strong> In our <a href="http://cdbu.org.uk/reflections-on-the-ref-and-the-need-for-change/">previous blogpost</a>, we noted that while there was agreement that REF2014 was problematic, there was less agreement about alternatives. To make progress, we need more debate. We hope that this piece by Derek Sayer will stimulate this, and we welcome comments. Please note that comments are moderated and will not be published immediately.</em></p>

<p>1.</p>

<p>The rankings produced by <em>Times Higher Education</em> and others on the basis of the UK&#8217;s Research Assessment Exercises (RAEs) have always been contentious, but accusations of universities&#8217; gaming submissions and spinning results have been more widespread in REF2014 than in any earlier RAE. Laurie Taylor&#8217;s jibe in <em>The Poppletonian</em> that &#8216;a grand total of 32 vice-chancellors have reportedly boasted in internal emails that their university has become a top 10 UK university based on the recent results of the REF&#8217;<a href="#_edn1" name="_ednref1">[i]</a> rings true in a world in which Cardiff University can <em>truthfully</em><a href="#_edn2" name="_ednref2">[ii]</a> claim that it &#8216;has leapt to 5th in the Research Excellence Framework (REF) based on the quality of our research, a meteoric rise&#8217; from 22nd in RAE2008. Cardiff ranks 5th among universities in the REF2014 &#8216;Table of Excellence,&#8217; which is based on the GPA of the scores assigned by the REF&#8217;s &#8216;expert panels&#8217; to the three elements in each university&#8217;s submission (outputs 65%, impact 20%, environment 15%)—just behind Imperial, LSE, Oxford and Cambridge. Whether this &#8216;confirms [Cardiff&#8217;s] place as a world-leading university,&#8217; as its website claims, is more questionable.<a href="#_edn3" name="_ednref3">[iii]</a>  These figures are a minefield.</p>

<p>Although HEFCE encouraged universities to be &#8216;inclusive&#8217; in entering their staff in REF2014, they were not obliged to return all eligible staff and there were good reasons for those with aspirations to climb the league tables to be more &#8216;strategic&#8217; in staff selection than in previous RAEs. Prominent among these were (1) HEFCE&#8217;s defunding of 2* outputs from 2011, which meant outputs scoring below 3* would now negatively affect a university&#8217;s rank order without any compensating gain in QR income, and (2) HEFCE&#8217;s pegging the number of impact case studies required to the number of staff members entered per unit of assessment, which created a perverse incentive to exclude research-active staff if this would avoid having to submit a weak impact case study.<a href="#_edn4" name="_ednref4">[iv]</a> Though the wholesale exclusions feared by some did not materialize across the sector, it is clear that some institutions were far more selective in REF2014 than in RAE2008.</p>

<p>Unfortunately, data that would have permitted direct comparisons with numbers of staff entered by individual universities in RAE2008 were never published, but Higher Education Statistics Agency (HESA) figures for FTE staff eligible to be submitted allow broad comparisons across universities in REF2014. It is evident from these that selectivity, rather than an improvement in research quality per se, played a large part in Cardiff&#8217;s &#8216;meteoric rise&#8217; in the rankings. The same may be true for some other schools that significantly improved their positions, among them Kings (up to 7th in 2014 from 22= in 2008), Bath (14= from 20=), Swansea (22= from 56=), Cranfield (31= from 49), Heriot-Watt (33 from 45), and Aston (35= from 52=).  All of these universities except Kings entered fewer than 75% of their eligible staff members, and Kings has the lowest percentage (80%) of any university in the REF top 10 other than Cardiff itself.</p>

<p><span id="more-1824"></span></p>

<p>Cardiff achieved its improbable rank of 5th on the basis of a submission that included only 62% of eligible staff. This is the second-lowest percentage of any of the 28 British universities that are listed in the top 200 in the 2014-15 <em>Times Higher Education</em> World University Rankings (of these schools only Aberdeen entered fewer staff, submitting 52%). No other university in this cohort submitted less than 70% of eligible staff, and half (14 universities) submitted over 80%. Among the top schools, Cambridge entered 95% of eligible staff, Imperial 92%, UCL 91% and Oxford 87%.</p>

<p>Many have suggested that &#8216;research power&#8217; (which is calculated by multiplying the institution’s overall rounded GPA by the total number of full-time equivalent staff it submitted to the REF) gives a fairer indication of a university&#8217;s place in the national research hierarchy than GPA rankings alone. By this measure, Cardiff falls to a more credible but still respectable 18th. But when measured by &#8216;research intensity&#8217; (that is, GPA multiplied by the percentage of eligible staff entered), its rank plummets from 5th to 50th. To say this provides a more accurate indication of its true standing might be overstating the case, but it certainly underlines why Cardiff does not belong among &#8216;world-leading&#8217; universities. Cardiff doubtless produces some excellent research, but its overall (and per capita) performance does not remotely justify comparisons with Oxford, Cambridge, or Imperial—let alone Caltech, Harvard, Stanford, Princeton, MIT, UC-Berkeley and Yale (the other universities in the THE World University Rankings top 10). In this sense the GPA Table of Excellence can be profoundly misleading.</p>
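As a minimal sketch of the arithmetic behind these two adjusted measures (the GPA and FTE values below are hypothetical placeholders; only the 62% submission share is the Cardiff figure quoted above):

```python
# Sketch of the two adjusted REF measures described above.
# The GPA and FTE values are hypothetical, for illustration only;
# the 0.62 submission share is the Cardiff figure cited in the text.

def research_power(gpa, fte_submitted):
    """'Research power': overall GPA multiplied by FTE staff submitted."""
    return gpa * fte_submitted

def research_intensity(gpa, share_submitted):
    """'Research intensity': GPA multiplied by the proportion of eligible staff entered."""
    return gpa * share_submitted

gpa = 3.10      # hypothetical grade point average
fte = 700.0     # hypothetical FTE staff submitted
share = 0.62    # 62% of eligible staff entered (figure cited above)

print(round(research_power(gpa, fte), 1))        # power rewards sheer size
print(round(research_intensity(gpa, share), 3))  # intensity rewards inclusiveness
```

Entering fewer staff raises a university's GPA rank but lowers its intensity score, which is exactly the trade-off the rankings discussed here expose.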

<p>&#8216;To their critics,&#8217; writes Paul Jump in <em>Times Higher Education</em>, &#8216;such institutions are in essence cheating because in reality their quality score reflects the work produced by only a small proportion of their staff.&#8217;<a href="#_edn5" name="_ednref5">[v]</a> I am not sure the accusation of cheating is warranted, because nobody is doing anything here that is outside HEFCE&#8217;s rules. The problem is rather that the current REF system rewards—and thereby encourages—bad behavior, while doing nothing to penalize the most egregious offenders like Cardiff.</p>

<p>The VCs at Bristol (11= in the REF2014 GPA table) and Southampton (18=, down from 14= in 2008) might be forgiven for ruefully reflecting that they, too, might now be boasting that they are &#8216;a top ten research university&#8217; had they not chosen to submit 91% and 90% of their eligible faculty respectively—a submission rate that on any reasonable criteria (as distinct from HEFCE&#8217;s rules) might itself be seen as an indicator of research excellence.  Measured by research intensity Bristol comes in at 5= (jointly with Oxford) and Southampton at 8= (jointly with Queen&#8217;s University Belfast, which submitted 95% of its staff and is ranked 42= on GPA).  Meantime the VCs at St Andrews (down from 14= to 21=, 82% of eligible staff submitted), Essex (11th to 35=, 82% submitted), Loughborough (28= to 49=, 88% submitted) and Kent (31= to 49=, 85% submitted) may by now have concluded that—assuming they hold onto their jobs—they will have no alternative other than to be much more ruthless in culling staff for any future REF.</p>

<p>2.</p>

<p>The latest <em>Times Higher Education</em> World University Rankings puts Cardiff just outside the top 200, in the 201-225 group—which places it 29= among UK universities, along with Dundee, Newcastle, and Reading. Taking GPA, research power and research intensity into account—as we surely should, in recognition that not only the <em>quality</em> of research outputs but the <em>number</em> and <em>proportion</em> of academic staff who are producing them are also necessary elements in evaluating any university&#8217;s overall contribution to the UK&#8217;s research landscape—such a ranking feels intuitively to be just about right.</p>

<p>I have shown elsewhere<a href="#_edn6" name="_ednref6">[vi]</a> that there was, in fact, a striking degree of overall agreement between the RAE2008 rankings and the <em>Times Higher Education</em> World University Rankings. Repeating the comparison for UK universities ranked in the top 200 in the THE World University Rankings for 2014-15 and the REF2014 GPA-based &#8216;Table of Excellence&#8217; yields similar findings. These data are summarized in <em>Table 1</em>.</p>

<p><strong><em>Table 1</em>:  REF2014 performance of universities ranked in the top 200 in Times Higher Education World University Rankings 2014-15   </strong></p>

<p><a href="http://cdbu.org.uk/wp-content/uploads/2015/02/Table1.png"><img class="  wp-image-1832 alignleft" src="http://cdbu.org.uk/wp-content/uploads/2015/02/Table1.png" alt="Table1" width="563" height="710" /></a></p>

<p style="text-align: left;">Seven UK universities make the top 50 in the 2014-15 THE World University Rankings: Oxford, Cambridge, Imperial, UCL, LSE, Edinburgh, and Kings.  Six of these are also in the REF2014 top 10, while the other (Edinburgh) is only just outside it at 11=.  Four of the leading five institutions are the same in both rankings (the exception being UCL, which is 8= in REF 2014), though not in the same rank order. Of the 11 UK universities in the THE top 100, only one (Glasgow, at 24th) is outside the REF top 20.  Of the 22 UK universities in the THE top 150, only two are outside the REF top 30 (Birmingham, 31 in REF, and Sussex, 40 in REF).  Of the 28 UK universities in the THE top 200, only two are outside the REF top 40 (Aberdeen at 46= and Leicester at 53).</p>

<p>Conversely, only two universities in the REF2014 top 20, Cardiff at 6 and Bath at 14=, do not make it into the THE top 200 (their respective ranks are 201-225 and 301-350). Other universities that are ranked in the top 40 in REF2014 but remain outside the THE top 200 are Newcastle (26=), Swansea (26=), Cranfield (31), Heriot-Watt (33), Essex (35=), Aston (35=), Strathclyde (37), Dundee (38=) and Reading (38=).</p>

<p><em>Table 2</em> provides data on the performance of selected UK universities that submitted to REF2014 but are currently ranked outside the THE world top 200.</p>

<p><strong>Table 2.  REF2014 performance of selected UK universities outside top 200 in Times Higher Education World University Rankings 2014-15</strong></p>

<p><a href="http://cdbu.org.uk/wp-content/uploads/2015/02/Table2.png"><img class="aligncenter size-full wp-image-1835" src="http://cdbu.org.uk/wp-content/uploads/2015/02/Table2.png" alt="Table2" width="573" height="503" /></a></p>

<p>Dundee, Newcastle and Reading only just miss the THE cut (they are all in the 201-225 bracket). All three outscored Aberdeen and Leicester in the REF, even though both sit above them in the THE rankings (in Leicester&#8217;s case, at 199, only very marginally so); of the three, only Newcastle does substantially worse in the THE rankings than in the REF. It is ranked 26= in the REF with Nottingham and Royal Holloway, ahead of Leicester (53), Aberdeen (46), Sussex (40), Liverpool (33), Birmingham (31) and Exeter (30)—all of which are in the top 200 in the THE World Rankings. While there was a yawning gulf between Essex&#8217;s RAE2008 ranking of 11th and its THE ranking in the 301-350 group, the latter does seem to have presaged its precipitous REF2014 fall from grace to 35=.  Conversely, the THE inclusion of Plymouth in the 276-300 group of universities places it considerably higher than its RAE rank of 66= would lead us to expect. This is not the case with most of the UK universities listed in the lower half of the THE top 400. Birkbeck, Bangor, Aberystwyth and Portsmouth also all found themselves outside the top 40 in REF 2014.</p>

<p>The greatest discrepancies between REF2014 and THE World Rankings come with Cardiff (6 in REF, 201-225 in THE), Bath (14= in REF, 301-350 in THE), Swansea (26= in REF, not among THE top 400), Aston (35= in REF, 350-400 in THE), Cranfield, Heriot-Watt and Strathclyde (31=, 33 and 37 respectively in REF, yet not among THE top 400). On the face of it, these cases flatly contradict any claim that THE (or other similar) rankings are remotely accurate predictors of REF performance. I would argue, on the contrary, that these are the exceptions that prove the rule. With the exception of Strathclyde (18 in research intensity with 84% of eligible staff submitted and the worst-performing member of this group in REF GPA), <em>all these schools were prominent among universities who inflated their GPA by submitting smaller percentages of their eligible staff in REF2014.</em>  Were we to adjust raw GPA figures by research intensity, we would get a much closer match, as <em>Table 3</em> shows.</p>

<p><strong><em>Table 3.</em>  Comparison of selected universities performance in THE World University Rankings 2014-15 and REF2014 by GPA and research intensity.</strong></p>

<p><a href="http://cdbu.org.uk/wp-content/uploads/2015/02/Table3.png"><img class="aligncenter size-full wp-image-1836" src="http://cdbu.org.uk/wp-content/uploads/2015/02/Table3.png" alt="Table3" width="473" height="161" /></a></p>

<p>The most important general conclusion to emerge from this discussion is that despite some outliers there is a remarkable degree of agreement between the top 40 in REF2014 and top 200 in the THE 2014-15 World University Rankings, and the correlation increases the higher we go in the tables.  Where there are major discrepancies, these are usually explained by selective staff submission policies.</p>

<p>One other correlation is worth noting at this point.  All 11 of the British universities in the THE top 100 are members of the Russell Group, as are 10 of the 17 British universities ranked between 100 and 200. The other six universities in this latter cohort (St Andrews, Sussex, Royal Holloway, Lancaster, UEA, Leicester) were all members of the now-defunct 1994 Group. Only one British university in the THE top 200 (Aberdeen) belonged to neither the Russell Group nor the 1994 Group. Conversely, only two Russell Group universities, Newcastle and Queen&#8217;s University Belfast, did not make the top 200 in the THE rankings.<a href="#_edn7" name="_ednref7">[vii]</a>  In 2013-14 Russell Group and former 1994 Group universities between them received almost 85% of QR funding.  Here, too, an enormous amount of money, time, and acrimony seems to have been expended on a laborious REF exercise that merely confirms what THE rankings have already shown.</p>

<p>3.</p>

<p>The most interesting thing about this comparative exercise is that the <em>Times Higher Education </em>World University Rankings not only make no use of RAE/REF data, but rely on quantitative methodologies that have repeatedly been rejected by the British academic establishment in favor of the &#8216;expert peer review&#8217; that is supposedly offered by REF panels. THE gives 30% of the overall score for the learning environment, 7.5% for international outlook, and 2.5% for industry income. The remaining 60% is based entirely on research-related measures, of which &#8216;the single most influential of the 13 indicators,&#8217; counting for 30% of the overall THE score, is &#8216;the number of times a university’s published work is cited by scholars globally&#8217; as measured by the Web of Science. The rest of the research score is derived from research income (6%), ‘research output scaled against staff numbers’ (6%, also established through the Web of Science), and ‘a university’s reputation for research excellence among its peers, based on the 10,000-plus responses to our annual academic reputation survey’ (18%).</p>
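Read as a simple weighted sum (and ignoring THE's score-standardisation steps), the weighting just described can be sketched as follows; the component scores in the example are hypothetical, and only the percentage weights come from the text:

```python
# THE World University Rankings weights as described above,
# modelled as a plain weighted sum. Component scores are hypothetical.

WEIGHTS = {
    "teaching_environment": 0.30,   # learning environment
    "international_outlook": 0.075,
    "industry_income": 0.025,
    "citations": 0.30,              # the single most influential indicator
    "research_income": 0.06,
    "research_output": 0.06,        # output scaled against staff numbers
    "research_reputation": 0.18,    # academic reputation survey
}

def overall_score(scores):
    """Overall score as the weighted sum of 0-100 component scores."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)
```

Note that research-related components (citations, income, output, reputation) account for 0.30 + 0.06 + 0.06 + 0.18 = 0.60 of the total, matching the 60% figure in the text.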

<p>The comparisons undertaken here strongly suggest that such metrics-based measures have proved highly reliable predictors of performance in REF2014—just as they did in previous RAEs. To be sure, there are differences in the details of the order of ranking of institutions between the THE and REF, but in such cases can we be confident that it is the REF panels&#8217; highly subjective judgments of quality that are the more accurate? To suggest there is no margin for error in tables where the difference in GPA between 11th (Edinburgh, 3.18) and 30th (Exeter, 3.08) is a mere 0.1 points would be ridiculous.  I have elsewhere suggested that there are in fact many reasons why such confidence would be totally misplaced, including lack of specialist expertise among panel members and lack of time for reading outputs in the depth required.<a href="#_edn8" name="_ednref8">[viii]</a>  But my main point is this.</p>

<p>If metrics-based measures can produce similar results to those arrived at through the REF&#8217;s infinitely more costly, laborious and time-consuming process of ‘expert review’ of individual outputs, there is a compelling reason to go with the metrics: not because they are necessarily a <em>valid</em> measure of anything, but because they are as <em>reliable</em> as the alternative (whose validity is no less dubious, for different reasons) and a good deal more cost-efficient. The benefits for collegiality and staff morale of universities not having to decide whom to enter or exclude from the REF might be seen as an additional reason for favoring metrics. I am sure that if HEFCE put their minds to it they could come up with a more sophisticated basket of metrics than <em>Times Higher Education</em>, which would be capable of meeting many of the standard objections to quantification.  I hope James Wilsdon&#8217;s committee might come up with some useful suggestions for ways forward.</p>

<p>&nbsp;</p>

<p><a href="#_ednref1" name="_edn1">[i]</a> Laurie Taylor, <a href="http://www.timeshighereducation.co.uk/comment/the-poppletonian/we-have-bragging-rights/2017834.article" target="_blank">&#8216;We have bragging rights!&#8217;</a> in The Poppletonian, <em>Times Higher Education</em>, 8 January 2015.</p>

<p><a href="#_ednref2" name="_edn2">[ii]</a> Well, not quite. Cardiff is actually ranked 6th in the REF2014 &#8216;Table of Excellence,&#8217; which is constructed by <em>Times Higher Education</em> on the basis of the grade point average (GPA) of the marks awarded by REF panels, but the #1 spot is held not by a university but by the Institute of Cancer Research (which submitted only two UoAs).  This table and others drawn upon here for &#8216;research power&#8217; and &#8216;research intensity&#8217; can be downloaded <a href="http://www.timeshighereducation.co.uk/story.aspx?storyCode=2017590" target="_blank">here</a> and <a href="http://www.timeshighereducation.co.uk/features/ref-2014-rerun-who-are-the-game-players/2017670.article" target="_blank">here</a>.</p>

<p><a href="#_ednref3" name="_edn3">[iii]</a> &#8216;REF 2014,&#8217; <a href="http://www.cardiff.ac.uk/research/impact-and-innovation/quality-and-performance/ref-2014" target="_blank">Cardiff University website</a>.</p>

<p><a href="#_ednref4" name="_edn4">[iv]</a> Paul Jump, <a href="http://www.timeshighereducation.co.uk/news/careers-at-risk-after-case-studies-game-playing-ref-study-suggests/2018086.article" target="_blank">&#8216;Careers at risk after case studies ‘game playing’, REF study suggests.&#8217;</a> <em>Times Higher Education</em>, 22 January 2015.</p>

<p><a href="#_ednref5" name="_edn5">[v]</a> Paul Jump, <a href="http://www.timeshighereducation.co.uk/features/ref-2014-rerun-who-are-the-game-players/2017670.article" target="_blank">&#8216;REF 2014 rerun: who are the &#8216;game players&#8217;?&#8217;</a>  <em>Times Higher Education</em>, 1 January 2015.</p>

<p><a href="#_ednref6" name="_edn6">[vi]</a> See Derek Sayer, <em>Rank Hypocrisies: The Insult of the REF</em>.  London: Sage, 2014.</p>

<p><a href="#_ednref7">[vii]</a> I have discussed Newcastle already.  Queen&#8217;s came in just outside the REF top 40 (42=) but with an excellent intensity rating (8=, 95% of eligible staff submitted).</p>

<p><a href="#_ednref8">[viii]</a> See, apart from <em>Rank Hypocrisies</em>, &#8216;One scholar&#8217;s crusade against the REF,&#8217; <em>Times Higher Education</em>, 11 December, 34-6; &#8216;Time to abandon the gold standard?  Peer Review for the REF Falls Far Short of Internationally Acceptable Standards,&#8217; LSE <em>Impact of Social Sciences</em> blog, 19 November (reprinted as &#8216;Problems with peer review for the REF,&#8217; CDBU blog, 21 November).</p>

<p>*This is a revised version of an article first posted on <a href="http://coastsofbohemia.com/2015/01/27/a-whole-lotta-cheatin-going-on-ref-stats-revisited/" target="_blank">Sayer&#8217;s blog <em>coastsofbohemia</em></a> on 27 January 2015.</p>

<p><em><strong>Note: </strong>This article gives the views of the author, and not the position of the Council for the Defence of British Universities. </em><i>This work is licensed under a <a href="http://creativecommons.org/licenses/by/3.0/deed.en_GB" target="_blank">Creative Commons Attribution 3.0 Unported License</a> unless otherwise stated.</i></p>
]]></content:encoded>
			<wfw:commentRss>http://cdbu.org.uk/ref-stats-revisited/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Reflections on the REF and the need for change</title>
		<link>http://cdbu.org.uk/reflections-on-the-ref-and-the-need-for-change/</link>
		<comments>http://cdbu.org.uk/reflections-on-the-ref-and-the-need-for-change/#comments</comments>
		<pubDate>Wed, 07 Jan 2015 14:24:54 +0000</pubDate>
		<dc:creator><![CDATA[CDBU Admin]]></dc:creator>
				<category><![CDATA[CDBU Updates]]></category>
		<category><![CDATA[funding formula]]></category>
		<category><![CDATA[grade inflation]]></category>
		<category><![CDATA[HEFCE]]></category>
		<category><![CDATA[league tables]]></category>
		<category><![CDATA[metrics]]></category>
		<category><![CDATA[peer review]]></category>
		<category><![CDATA[REF]]></category>
		<category><![CDATA[REF2014]]></category>
		<category><![CDATA[university ratings]]></category>

		<guid isPermaLink="false">http://cdbu.org.uk/?p=1759</guid>
		<description><![CDATA[Discussion piece by the CDBU Steering Group Results from the research excellence framework (REF) were publicly announced on 18th December, followed by a spate of triumphalist messages from University PR departments. Deeper analysis followed, both in the pages of the &#8230; <a href="http://cdbu.org.uk/reflections-on-the-ref-and-the-need-for-change/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><i>Discussion piece by the CDBU Steering Group</i></p>

<p style="text-align: center;"><i style="color: #333333; line-height: 24.375px;"><a style="text-decoration: underline;" href="http://cdbu.org.uk/wp-content/uploads/2015/01/possible-picture-for-header.jpg"><img class="aligncenter  wp-image-1763" style="margin-top: 0.4em; border: 0px; background: #eeeeee;" alt="possible picture for header" src="http://cdbu.org.uk/wp-content/uploads/2015/01/possible-picture-for-header.jpg" width="4000" height="2250" /></a></i></p>

<p>Results from the research excellence framework (REF) were publicly announced on 18th December, followed by <a href="http://www.chris-hackley.com/2014/12/when-17th-really-means-51st-and-leading.html?m=1" target="_blank">a spate of triumphalist messages</a> from University PR departments. Deeper analysis followed, both in the pages of the Times Higher Education, and in the media and on blogs.</p>

<p>CDBU has from the outset expressed concern about the REF, much of it consistent with the criticism that has been expressed elsewhere. In particular, we note:</p>

<p><strong>Inefficiency:</strong> As <a href="http://www.timeshighereducation.co.uk/features/one-scholars-crusade-against-the-ref/2017405.fullarticle" target="_blank">Derek Sayer has noted</a>, the REF has absorbed a great deal of time and money that might have been better spent elsewhere. The precise cost has yet to be reported, but it is likely to exceed the official figure of £60m, even before taking into account the <a href="http://www.theguardian.com/higher-education-network/2014/dec/15/research-excellence-framework-five-reasons-not-fit-for-purpose" target="_blank">cost in terms of the time of academic staff</a>. Universities have taken on new staff to do the laborious work of compiling data and writing impact statements, but this has diverted funds from front-line academia and increased administrative bloat.</p>

<p><strong>Questionable validity</strong>: <a href="http://cdbu.org.uk/problems-with-peer-review-for-the-ref/" target="_blank">Derek Sayer</a> has cogently argued that the peer review element of the REF is open to bias from subjective, idiosyncratic and inexpert opinions. It is also unaccountable, in the sense that the ratings made of individual outputs are destroyed. One can see why this is done: otherwise HEFCE could be inundated with requests for information and appeals. But the unavailability of the raw data does not inspire confidence in the process, especially when there are widespread accusations of <a href="http://www.timeshighereducation.co.uk/story.aspx?storyCode=2017670" target="_blank">games-playing</a> and <a href="http://www.wonkhe.com/blogs/ref-results-marred-by-fears-over-grade-inflation/" target="_blank">grade inflation</a>.</p>

<p><strong>Concentration of funding in a few institutions: </strong>We are told that the goal is to award quality-related funding, but as currently implemented, this leads inevitably to a process whereby <a href="http://deevybee.blogspot.co.uk/2013/10/the-matthew-effect-and-ref2014.html" target="_blank">the rich get richer and the poor get poorer</a>, with the bulk of funds concentrated in a few institutions. We suspect that the intention of including &#8216;impact&#8217; in the REF was to reduce the disparity between the Golden Triangle (Oxford, Cambridge and London) and other institutions which might be doing excellent applied work, but if anything the opposite has happened. We do not yet know what the funding formula will be, but if it is, as widely predicted, heavily biased in favour of 4* research, we could move to a situation where <a href="http://deevybee.blogspot.co.uk/2014/12/dividing-up-pie-in-psychology-in.html" target="_blank">only the large institutions will survive to be research-active</a>. There has been no discussion of whether such an outcome is desirable.</p>

<p><strong>Shifting the balance of funding across disciplines:</strong> A recent <a href="http://www.timeshighereducation.co.uk/news/funding-plea-for-humanities-as-life-sciences-crowned-ref-2014-champion/2017667.article#.VKaCV8xpSNc.twitter" target="_blank">article in the Times Higher Education</a> noted another issue: the tendency for those in the Sciences to obtain higher scores on the REF than those in the Humanities. Quotes from HEFCE officials in the article offered no reassurance to those who were concerned this could mean a cut in funding for humanities. Such a move, if accompanied by <a href="http://civitas.org.uk/newblog/2014/04/give-vocational-courses-priority-and-make-them-cheaper/" target="_blank">changes to student funding to advantage those in STEM subjects</a>, could dramatically reduce the strength of Humanities in the UK.</p>

<p><span id="more-1759"></span></p>

<p><strong>Unaccountable flexibility in the funding formula:</strong> There are <a href="http://www.wonkhe.com/blogs/rankings-data-tables-and-spin/" target="_blank">many different ways of achieving ratings</a>: for instance, whether or not the ratings include <a href="http://www.wonkhe.com/blogs/ref-2014-sector-results-2/" target="_blank">&#8216;intensity&#8217; (the number of returnable staff who were entered) can dramatically alter rank orderings</a>. Or we could adopt the suggestion by <a href="http://www.wonkhe.com/blogs/the-bang-for-buck-heroes-of-uk-research/" target="_blank">Graeme Wise</a> that a &#8216;bang for your buck&#8217; metric assessing outputs in relation to grant income would be most appropriate. Even more radical is a suggestion by <a href="http://researchrandomness.blogspot.co.uk/2014/12/bang-for-buck-in-ref-2014.html" target="_blank">Dermot Lynott</a> that we should give the most rewards to those whose outputs were impressive in relation to their scores on environment. Needless to say, a very different profile of winners and losers emerged from such an analysis.  It will ultimately be a political decision how to translate the REF scores into funding. We have to ask whether it is worth going through this entire long-winded exercise if, simply by changing the funding formula, one can make a dramatic difference to an institution&#8217;s funding to achieve a politically expedient outcome.</p>

<p><strong>Damage inflicted on careers and morale:</strong> The criteria for entering staff for the REF could appear quite cavalier: for instance, the requirement for a numerical ratio between the number of staff entered and the number of case studies meant that some departments with few case studies were unable to enter all plausible staff. <a href="http://www.timeshighereducation.co.uk/news/lancaster-historian-appeals-against-his-inclusion-in-ref/2008570.article" target="_blank">Derek Sayer</a> has described instances of decisions about which staff to enter being made on what appeared to be flimsy evidence from ad hoc internal evaluations. Yet being identified as &#8216;non-REFable&#8217; is not only damaging to morale, but can have real impacts on prospects for promotion and job security.</p>

<p><strong>Focus on competition rather than collaboration: </strong>The REF exercise creates rank orderings, and everyone is desperately trying to nudge ahead of the others. In fact, there are so many different ways of doing the ranking that almost everyone can be satisfied that they are &#8216;among the top&#8217; on some index or other. Those who crowed loudest about their success tried to temper this by arguing that they were celebrating a broader &#8216;British&#8217; success, but this seems perverse. Why should concentrating ever more of the excellent research in an ever smaller number of institutions be regarded as a national success story? It is, of course, widely believed that competition is a force for good, stimulating people to do better than they otherwise might. However, many in academia take the view that they don&#8217;t need to be incentivised by competition to work hard: they are in the job for the love of it, and would like their efforts to be appreciated for what they are, not because they help push the institution up a league table. Competition can also damage relationships between different departments within a university, if disparities in REF performance lead to bickering about who is more valuable.</p>

<p><strong>Perverse incentives that damage research: </strong>These may play out differently in different subject areas, but overall, many academics feel that they are not able to do the research they want in the way they would like. In science, there are <a href="https://theconversation.com/the-dark-side-of-research-when-chasing-prestige-becomes-the-prize-35001" target="_blank">intense pressures to publish in high-impact</a> journals and bring in grant income. Some institutions are notorious for threatening redundancy to scientific staff who do not meet an agreed quota of research income, creating incentives to do ever more expensive research (see for instance cases at <a href="http://deevybee.blogspot.co.uk/2014/06/the-university-as-big-business.html" target="_blank">King&#8217;s College London</a>, <a href="http://www.timeshighereducation.co.uk/news/imperial-college-professor-stefan-grimm-was-given-grant-income-target/2017369.article" target="_blank">Imperial College</a>, and <a href="http://www.timeshighereducation.co.uk/news/simplistic-redundancy-metrics-criticised/2016357.article" target="_blank">Warwick University Medical School</a>). In the humanities, the pressure to produce a steady stream of research articles and monographs has created a sense of enforced over-specialisation, with academics increasingly unable to explain or demonstrate the broader significance of their work. Younger generations of academics find that their direction of research is wholly driven by &#8216;REF-ability&#8217;, and that journal publications automatically trump those in volumes of collected essays, even when the latter may be more important for the field.</p>

<p><strong>Perverse incentives on hiring practices: </strong>This is another consequence of the intensely competitive culture that is induced by the REF. We have, particularly around the time of the REF, a market in research &#8216;super-stars&#8217;, who can attract impressive transfer fees. <a href="http://crookedtimber.org/2014/12/18/research-excellence-framework-the-denouement/" target="_blank">People from other institutions who are employed at only 20 per cent part-time</a> suddenly appear on the books, boosting the institution&#8217;s return on funding and outputs.</p>

<p><strong>Devaluation of non-research activity</strong>: Academics whose positions require them to teach and do research have felt pressured to focus principally on research, and teaching has consequently been devalued. It is sometimes suggested that the Impact agenda of the REF also encourages academics to spend time on public engagement, but in fact it has the opposite effect. Public engagement does not count as &#8216;impact&#8217; for REF purposes: to demonstrate REF impact, one must provide concrete documentation of how a specific piece of research has influenced non-academic users, such as policy-makers, health professionals, museums, etc.</p>

<p><strong>How have we come to this?</strong></p>

<p>Given that <a href="http://www.theguardian.com/higher-education-network/2014/dec/16/research-excellence-framework-2014-the-postmortem-live-chat?CMP=share_btn_tw" target="_blank">many of these points were made in the run-up to the announcement of REF results</a>, we have to ask how it is that we find ourselves trapped in such an undesirable system. It is noteworthy that the REF is popular with many vice-chancellors and administrative teams. It makes it easier to manage staff, with objective criteria for hiring and firing, and provides league tables by which to measure progress. For those already attached to the vision of a university as big business, it seems the natural next step to have objective rules for defining winners and losers, so that one can directly measure an individual&#8217;s likely ability to bring money into the university without having to make difficult judgements about the intrinsic quality of their work.</p>

<p>It is a moot point whether the collapse of the circle of winners to Oxbridge and London was the result of deliberate planning, or an unintended consequence of how the system operates. Be this as it may, one concern is that this could provide further pressure for the British university system to reconfigure itself so that it can compete with the American private elite. There would be increasing reluctance to use general taxation to concentrate even more educational resources in close proximity to London; instead we could see a shift to private funding, increasing the extent to which access to the best institutions is distributed according to wealth rather than ability. In a few short years we could see the transformation of a genuinely national system of higher education, publicly funded because it is designed to serve everyone, into a privately funded system that is world-class for the few who can afford to access it, but a disaster for the country as a whole. As in the US, educational opportunity would be concentrated overwhelmingly in places where it can be accessed primarily by the wealthy, privileged and well-connected.</p>

<p>At the time of the announcement of REF results, there was a sense that anyone who criticised the celebrations was either a bad loser, or – if they came from an institution that did well – a traitor for not celebrating British success. At CDBU we are proud of UK universities and their research reputation, but our loyalty is to our discipline, our profession, our vocation and our sense of their place in the wider scheme of things: we fear that the assessment process embodied in the REF will damage these in the longer term.</p>

<p><strong>Where next?</strong></p>

<p>It is, of course, all very well to criticise and paint visions of a dystopian future. If we wish to replace the current system, we must look at alternative ways forward. <a href="http://www.thebookseller.com/news/sage-debates-value-research-excellence-framework" target="_blank">At a debate about the REF</a>, organised by Sage Publishers on 8<sup>th</sup> December, David Willetts MP, who until July 2014 was Minister of State for Universities and Science, took the view that some of those who disapproved of REF were just dinosaurs who wanted to go back to the 1970s, when funds were allocated to institutions by a group of the great and the good making judgements over dinner at the Athenaeum. We would dispute that this is the only alternative to the current system. But Willetts was right on another count: he pointed out that the current REF system was not imposed by government, nor by HEFCE. Indeed, they had been actively pursuing the idea of using a simpler metrics-based system for the REF, but it was resoundingly rejected by the academic community. Government, according to Willetts, would listen to any reasonable proposal for a new system. Clearly, it is now up to the academics themselves to propose a viable alternative.</p>

<p>At the same meeting, <a href="http://www.researchresearch.com/index.php?option=com_news&amp;template=rr_2col&amp;view=article&amp;articleId=1348861" target="_blank">David Sweeney</a>, Director of HEFCE responsible for Education and Knowledge Exchange, implied that critics of the REF just wanted to be handed public money without any accountability. He emphasised that the government and taxpayer put money into university research and have a right to know about the outcomes of the investment they have made. We totally agree. But we take issue with those who, like <a href="http://www.wonkhe.com/blogs/the-research-excellence-framework-fascinating-flawed-and-essential/" target="_blank">Mark Leach, Director and Editor-in-Chief of Wonkhe</a>, think that the REF, for all its limitations, provides a good solution. Our position is that, for all the reasons given above, the REF is a seriously flawed system for deciding on the disbursement of research funds, one which in the long run will do the UK university system more harm than good.</p>

<p><strong>What alternatives are there?</strong></p>

<ol>
<li><p>One possibility that has been discussed is to remove the QR component of funding altogether, and give all funds to the research councils. The problem with this solution is that the research councils would have to grow enormously in size, the load on reviewers, already seen by many as unsustainable, would increase yet further, and pressures on academics to bring in research grants would become even more intense. <a href="http://www.theguardian.com/education/2014/dec/21/university-funding-reform-brain-drain-london?CMP=share_btn_tw&amp;utm_content=bufferdf118&amp;utm_medium=social&amp;utm_source=twitter.com&amp;utm_campaign=buffer" target="_blank">It has also been argued</a> that it would further increase disparities between the Golden Triangle (Oxbridge and London) and the rest, and would disfavour non-STEM disciplines.</p></li>
<li><p><a href="http://www.hefce.ac.uk/whatwedo/rsrch/howfundr/metrics/" target="_blank">HEFCE has been looking at various publication-based</a> metrics that might substitute for the REF, and plans to do some empirical studies comparing metric-based evaluation with REF results. However, metrics have been vigorously opposed by many in the academic community, especially in the humanities, where they show far poorer agreement with expert opinion than in the sciences. We do at least now have hard data from the REF that can be used to evaluate how metrics would perform, but there&#8217;s a real risk that the introduction of any metric will further distort incentives, so that the measure becomes the goal.</p></li>
<li><p>At the Sage meeting, Derek Sayer put forward another interesting alternative, which was that funding should be based just on the &#8216;research environment&#8217; component of the REF, which focuses more on inputs than outputs.</p></li>
<li><p>An even simpler option that would retain the dual support system but remove quality-related funding would be to disburse funds purely on the basis of the number of active researchers in a department. This could be criticised for leading to a &#8216;prairie farming&#8217; model, whereby departments band together to create enormous conglomerates that benefit from economies of scale. One could, however, put a limit on the size of unit entered. At the Sage meeting, David Sweeney expressed himself as strongly opposed to this solution, even though <a href="http://deevybee.blogspot.co.uk/2014/10/some-thoughts-on-use-of-metrics-in.html" target="_blank">in the last round it gave a funding result that was highly correlated with actual funding outcomes from the RAE</a>, in both science and humanities. He is right to note that, despite the high correlation, there would nevertheless be winners and losers who would see substantial gains and losses in real terms relative to the RAE result; the question is whether this would involve unfairness. It&#8217;s hard to say, given that we have no gold standard. You could of course further argue that some measure of quality is needed to incentivise people to do better, as well as to guard against freeloaders like Laurie Taylor&#8217;s <a href="http://www.timeshighereducation.co.uk/comment/the-poppletonian/laurie-taylor-column/156378.article" target="_blank">Dr Piercemuller</a>. Yet, as we have noted, for most academics, exhortations from managers to &#8216;do better&#8217; don&#8217;t achieve much and may indeed be counterproductive. We need to be accountable for the public money spent on university research, but subjecting every apple in the barrel to an exhaustive X-ray examination may not be the best way to identify the rotten ones.</p></li>
</ol>

<p>We do not have a single solution, but we think that academics must take control of this process and not leave it in the hands of HEFCE and the government. <a href="https://www.academia.edu/9923660/The_academic_manifesto_From_an_occupied_to_a_public_unversity" target="_blank">This article</a> about Dutch universities has strong parallels with the UK situation: the author argues that academics have been too passive in accepting an emphasis on competition, and the use of evaluation systems as a means of control. There is unlikely to be an ideal solution, and we may have to live with the &#8216;least bad&#8217; option. But let us consider all options in terms of how far they are likely to exacerbate or resolve the problems outlined above, or we may find ourselves saddled with something even worse than REF2014.</p>

<p>We hope this article opens up the discussion on this topic. Please do add your comments. We are moderating comments to exclude spam, but non-anonymous, on-topic comments will be published unless they contravene the usual rules.</p>

<p>Finally, if you agree with the broad concerns expressed here, please do consider <a href="http://cdbu.org.uk/participate/become-a-member/">joining the CDBU</a> to help us campaign more effectively for change.</p>
]]></content:encoded>
			<wfw:commentRss>http://cdbu.org.uk/reflections-on-the-ref-and-the-need-for-change/feed/</wfw:commentRss>
		<slash:comments>22</slash:comments>
		</item>
		<item>
		<title>Staff satisfaction is as important as student satisfaction</title>
		<link>http://cdbu.org.uk/staff-satisfaction-is-as-important/</link>
		<comments>http://cdbu.org.uk/staff-satisfaction-is-as-important/#comments</comments>
		<pubDate>Thu, 13 Nov 2014 12:49:39 +0000</pubDate>
		<dc:creator><![CDATA[CDBU Admin]]></dc:creator>
				<category><![CDATA[Opinion]]></category>
		<category><![CDATA[academic staff]]></category>
		<category><![CDATA[league tables]]></category>
		<category><![CDATA[staff satisfaction]]></category>
		<category><![CDATA[student satisfaction]]></category>
		<category><![CDATA[student survey]]></category>

		<guid isPermaLink="false">http://cdbu.org.uk/?p=1672</guid>
		<description><![CDATA[Opinion piece by Dorothy Bishop, 13 November 2014 Universities have become obsessed with competition: it is no longer enough to do well; you have to demonstrate you are better than the rest. And to do that, you need some kind &#8230; <a href="http://cdbu.org.uk/staff-satisfaction-is-as-important/">Continue reading <span class="meta-nav">&#8594;</span></a>]]></description>
				<content:encoded><![CDATA[<p><em>Opinion piece by <a href="http://en.wikipedia.org/wiki/Dorothy_Bishop_%28psychologist%29" target="_blank">Dorothy Bishop</a>, 13 November 2014</em></p>

<p>Universities have become obsessed with competition: it is no longer enough to do well; you have to demonstrate you are better than the rest. And to do that, you need some kind of metric. Organisations have grown up to meet this need, and to produce league tables that compare institutions on a range of characteristics, including research excellence, reputation and teaching.</p>

<p>The <a href="https://www.ipsos-mori.com/researchspecialisms/socialresearch/specareas/highereducation/nss.aspx" target="_blank">National Student Survey</a> has become established as a major component of this process. It has run annually across all publicly funded Higher Education Institutions (HEIs) in the UK. It features prominently in student guides to the best universities, such as <a href="http://www.theguardian.com/education/2013/jun/04/how-to-use-the-guardian-university-guide" target="_blank">this one by the Guardian</a>. There is no doubt that the survey has made universities more responsive to student views, and it is to be welcomed that <a href="http://www.timeshighereducation.co.uk/news/national-student-survey-2014-results-show-record-levels-of-satisfaction/2015108.article" target="_blank">reported student satisfaction levels have increased</a> since the survey was introduced. Nevertheless, some, like <a href="http://www.theguardian.com/higher-education-network/blog/2014/aug/12/do-we-still-need-national-student-survey-university" target="_blank">Arti Agrawal</a> have expressed concerns about universities introducing quick fixes that may produce higher ratings in the short term, but lower academic quality overall: &#8216;With increased tuition fees, students are seen as customers who must be kept happy, and the NSS is now a customer satisfaction survey&#8217;. We even have evidence that within some universities, <a href="http://www.timeshighereducation.co.uk/comment/letters/staff-at-the-mercy-of-satisfaction-scores/2013880.article" target="_blank">student satisfaction is used as an index of the quality of the teaching staff</a>.</p>

<p>It is perhaps not surprising, then, that at the same time as we are told that students are getting happier and happier, academic staff seem to be growing ever more miserable. Now this could, of course, just be down to the fact that everyone likes a good moan<sup>1</sup>. But the impression one gets from reading the Times Higher Education and looking at stories anonymously contributed to CDBU’s <a href="http://cdbu.org.uk/participate/your-stories/record-the-rot/rot-log/" target="_blank">Record the Rot</a> archive is that there is more to it than that. The very same pressures that lead managers to treat students as consumers have led them to treat academic staff as dispensable ‘human resources’. The view of universities as institutions in constant competition with one another and the rest of the world has trickled down to the departmental level, destroying any sense of collegiality. In the long run, if teaching is done by a body of demoralised and ever-changing academics, this can only be bad for staff and students alike.</p>

<p>But this is only anecdote, and it would be good to have some data. The Times Higher Education started a <a href="http://www.timeshighereducation.co.uk/features/the-best-university-workplace-survey-2014-the-results/2010792.fullarticle" target="_blank">Best Workplace Survey</a> last year, which has the potential to provide just that. However, the sample was relatively small and self-selected. Findings such as that 39 per cent of academics felt their health was negatively affected by their work, and that one third felt their job was not secure, are hard to interpret given the vagaries of sampling. Is this typical, or was it the most disaffected who replied? Concerns about the low response rate and potential for bias meant that the THE decided not to report results by institution. My guess is that if we had proper survey data, and if staff satisfaction were incorporated into ‘best university’ rankings, then rank orderings might change quite dramatically. Furthermore, institutions that <a href="http://deevybee.blogspot.co.uk/2014/06/the-university-as-big-business.html" target="_blank">sacked staff to improve rankings</a> might find their strategy backfiring.</p>

<p>The THE’s <a href="http://www.timeshighereducation.co.uk/news/thes-best-university-workplace-survey-2015-take-part/2015788.article" target="_blank">workplace survey for 2015</a> is now live. I would encourage everyone working in higher education to take part, whether or not you have something you want to moan about. We need to get an adequate database on this topic so that we can have a solid basis for identifying those institutions that are genuinely at the top of the league, in terms of their treatment of staff, versus those who achieve a high status on other indicators while presiding over an anxious and demoralised staff.</p>

<p><sup>1</sup> Especially the English. I can thoroughly recommend this book for an amusing and informative account: Fox, K. (2005). <em>Watching the English: The Hidden Rules of English Behaviour</em>. London: Hodder &amp; Stoughton.</p>
]]></content:encoded>
			<wfw:commentRss>http://cdbu.org.uk/staff-satisfaction-is-as-important/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
