This is a very useful summary of recent debates on Twitter and elsewhere about the UK’s RAE/REF research assessment exercises.  I’m reposting it for information.  Thank you, Ian Pace!

Desiring Progress

I am writing this piece at what looks like the final phase of the USS strike involving academics from pre-1992 UK universities. A good deal of solidarity has been generated through the course of the dispute, with many academics manning picket lines together, discovering common purpose and shared issues, and often noting how the structures and even physical spaces of modern higher education discourage such interactions when working. Furthermore, many of us have interacted regularly using Twitter, enabling the sharing of experiences, perspectives, vital data (not least concerning the assumptions and calculations employed for the USS future pensions model), and much else about modern academic life. As noted by George Letsas in the Times Higher Education Supplement (THES), Becky Gardiner in The Guardian, Nicole Kobie in Wired, and various others, the strike and other associated industrial action have embodied a wider range of frustrations amongst UK-based…

View original post 9,810 more words

signs of passage

Geoffrey James, End of the Fence, looking West, Otay Mesa, from the series Running Fence, 1997, gelatin silver print, 76.3 x 84 cm; image: 46.1 x 57.9 cm, CMCP Collection, National Gallery of Canada, Ottawa. © Geoffrey James. Photo: NGC

“Frontera: Views of the U.S.-Mexico Border brings together a roster of national and international artists, whose works question the very notion of borders, attempt to define their edges, and explore their representation. The exhibition, organized by Luce Lebart in collaboration with the FotoMexico festival, is on view in the Canadian Photography Institute Galleries of the National Gallery of Canada.

The exhibition takes its title from Frontera, a series of photographs by Mexican photographer Pablo López Luz. Shot from a helicopter in 2014 and 2015, these aerial images reveal the meandering course of the dividing line between the two neighbouring countries. The border, easily identifiable in many of the images, is invisible in others. Along the base of mountain ranges the frontier seems a trail of lacerations in the landscape, while in desolate terrains it merges and finally disappears into a network of lines. In places the border takes the form of different kinds of fencing, while elsewhere it is embodied in architectural structures that are both imposing and dissuasive. Along its entire length, the border is one of harsh landscape that deters crossings.

‘Is this Mexico, or is it the United States?’ comments Lebart. ‘It is often impossible to distinguish one side from the other. But Pablo López Luz’s images systematically reveal a key identifying feature: the presence of a road running along the border, used by the US Border Patrol for surveillance.’”

 


 

an embarrassment at Oxford

 

[Image: Oxford woman cleaner]

The Daily Telegraph reports that “The University of Oxford has apologised after an image of a female cleaner being made to clear a message reading ‘Happy International Women’s Day’ was shared on Twitter by a professor.

“Oxford Associate Professor of Political Theory Dr Sophie Smith tweeted the photograph, writing: ‘Oxford security makes a woman cleaner scrub out “Happy International Women’s Day” on the Clarendon steps. What an image for #IWD, @UniofOxford.’”

Sometimes a picture is worth a thousand words …


 

South London comes to the Big Apple

[Image: Nubya Garcia]

New York was recently invaded by South London, and a mostly not-white South London at that.  “The concert – a showcase of British jazz held at downtown club Le Poisson Rouge – was America’s introduction to a small but mighty group of young musicians who during the past three years have helped turn South London into a new jazz epicenter,” reports Rolling Stone.  “There was tenor saxophonist Shabaka Hutchings, at 33 the scene’s elder statesman … Also on tenor was Nubya Garcia, whose quartet embraced classic postbop, but with a fiery group interplay that transcended rote chorus-solos-chorus structures …

“It’s a strange word, ‘jazz,'” Hutchings tells Rolling Stone two days after the showcase, when asked if he’d describe his own music that way. Born in London but raised in his parents’ native Barbados, Hutchings picked up the clarinet at nine, practicing it by mimicking the flows of Nas, Biggie and Tupac verses he was hearing on American radio, and the hyper rhythms of the local Carnival, before returning to England to receive a classical-music degree on the instrument … “The people I revere as master jazz musicians have said they don’t want the word,” he continues. “It’s limiting. It tells them more what they can’t be than what they can. So – do I consider myself a musician who is limited?”

“Like Hutchings, his younger colleagues – first- and second-generation Afro-Caribbean immigrants, multi-hyphenated in their cultural backgrounds and in their music – uniformly reject a narrow definition of their chosen style. London’s sound is less a riff on classic African-American jazz than a polyglot party music of the city’s minorities – with calypso and dub, grime and Afrobeat as much its building blocks as Coltrane’s ‘Giant Steps.’”  Long live multiculturalism.


 

an embarrassment at Cambridge

Oxford’s faux pas brought to mind slightly older news from Cambridge, which I didn’t post here at the time because other things crowded it out that week.

Commenting on the Oxfam Haiti scandal, Cambridge Professor of Classics and well-known media personality about town Mary Beard caused uproar when she tweeted: “I do wonder how hard it must be to sustain ‘civilised’ values in a disaster zone.”  She made things worse in a follow-up post on her regular TLS blog “A Don’s Life,” where she drew an unfortunate analogy between aid workers in Haiti and the boys abandoned on a desert island in William Golding’s Lord of the Flies.  She later tweeted an image of herself in tears, saying “I am really not the nasty colonialist you say I am” …

In a public response, Cambridge English lecturer Priyamvada Gopal urged Beard “to rethink the problematic concept of a ‘disaster zone’ (Trump was more upfront — he called them ‘shitholes’) and what that really means in geopolitical terms, in terms of who does what and who is responsible for their appearance as spaces of catastrophe. Still more troubling,” she continued, “is your notion that moral bearings (‘civilised values’!) understandably disappear in spaces where people struggle with the worst things that can happen to human beings.”  She described Beard’s tweet as symptomatic of the culture at Cambridge “where there is little direct abuse but plenty of genteel and patrician casual racism passing as frank and well-meaning observations …”

Gopal got a lot of flak for daring to call out “a national treasure,” including a dressing-down from Times columnist David Aaronovitch, who accused her of being “a privileged Oxbridge academic shivving a colleague.”

 


 

sign the brexit papers!

[Image: Nottingham Trent University]
Photograph: Fabio De Paola

Rufaro Chisango, a student at Nottingham Trent University, posted a video on Wednesday in which a group of men can be heard chanting “we hate the blacks” and “sign the Brexit papers” outside her student dorm room, reports the Guardian.

“Words cannot describe how sad this makes me feel, in this, 2018, people think this is still acceptable,” she wrote on Twitter …

In the footage, a group of men can be heard chanting “ooh-aah, fuck the blacks”, “we hate the blacks” and “sign the Brexit papers”.  Chisango said the video did not catch other phrases the men shouted, such as “blacks would go back to picking cotton”. She wrote on Twitter: “I’m the only black person on my floor and they were chanting this outside my door, so don’t be surprised to why I didn’t leave my room.”

Nottingham Trent was named University of the Year in the 2017 Times Higher Education awards, and Modern University of the Year in the 2018 Times and Sunday Times awards.


 

The English surrealist and documentary filmmaker Humphrey Jennings explained that the intellectual project of his book Pandaemonium was to “present, not describe or analyse” the “imaginative history of the Industrial Revolution … by means of what I call Images.  These are quotations from writings of the period in question … which either in the writing or in the nature of the matter itself or both have revolutionary and symbolic and illuminatory quality.  I mean that they contain in little a whole world—they are the knots in a great net of tangled time and space—the moments at which the situation of humanity is clear—even if only for the flash time of the photographer or the lightning.”

These “snippets” are intended to function in the same way.  Click on the headings to go to the original articles, which are mostly from the mainstream aka fake news media.

[Cover image: Neden Avrupa Tarihi]

One of my earliest attempts to formulate the argument of my Prague Trilogy was in a keynote lecture I wrote for the conference New Directions in Writing European History at the Middle East Technical University, Ankara, Turkey, on October 25-6, 1994.  I was one of three keynote speakers, along with Paul Langford and John Hall.  My lecture was titled “Prague as a Vantage Point on Modern European History.”

The conference proceedings, including the three keynote lectures, responses by Turkish scholars, and a transcript of audience questions and panel discussions, were published in English in METU Studies in Development, vol. 22, no. 3, 1995.

I am pleased to belatedly discover that my lecture, along with those of John Hall and Paul Langford, has appeared in Turkish translation in Huri Islamoglu (ed.), Neden Avrupa Tarihi (Istanbul: Iletisim Yayincilik, 2nd ed, 2014).  I like the cover too.

The title means “Why European history?”—a good question.   I began my contribution to the discussion of John Hall’s paper (which was titled “The Rise of the West”) as follows:

I found the presentation very compelling and I was suspicious precisely because of that.  It was the clarity, the simplicity, the elegance of it that came across so strongly, but I wonder can you do that when you are talking about 2000 years of European history and contrasting it with the rest of the world?  Can you compass that complexity within so simple an argumentative framework, within a single theory?  I want to try to pin you down by asking three simple questions …

The simple questions are: first, what is Europe?  Second, where is the West?  And third, when was modernity? 


 

[Cover image: Toplum]

 

My 1986 book with David Frisby, Society, has coincidentally also just appeared in Turkish under the title Toplum.  The same publisher previously did a Turkish edition of The Violence of Abstraction.

Given the appalling repression going on in Turkish universities since the failed coup in 2016, it is heartening that such texts are still being published.

Not by Derek Sayer.  Another excellent piece from Liz Morrish, comparing the scandalous workplace regime at Amazon exposed in today’s New York Times with what is rapidly becoming the new normal in UK universities (and likely to only get worse with the introduction of the TEF to complement the REF).

Academic Irregularities

The photo above made me start contemplating the intrusion of a repressive disciplinary culture into UK universities. Disciplinary action for tailgating? Whatever happened to having a quiet word with somebody? Just a few years ago, campus security was left in the capable hands of a few retirees from the services and the police. They knew academics and students by name, and exerted a calm authority refined through years of dealing with minor infractions. Now, a mere parking violation incurs a meeting with HR.

Many of us will be aware of new university policies on disciplinary procedures. If we have read them, we will be aware that the policies themselves are often not in the least repressive or out of kilter with professional expectations. It is when these policies intersect with over-zealous performance management procedures that things get troublesome – I have previously blogged about so-called under-performing professors 

So when I…

View original post 898 more words

“So get out and stay out of academia would be my advice. It is unlikely to get better.” Another great and gutsy piece from Liz Morrish.

Academic Irregularities

Liz Morrish replies to a feminist colleague’s letter of resignation. 

I was very sorry to read your letter of resignation. I was, though, delighted that you decided to circulate it among colleagues at NeoLiberal U, along with an article, rapidly becoming a classic, if my Twitter feed is any predictor, by Mountz et al in the Great Lakes Feminist Geography Collective, offering a manifesto for a slower pace of academic life. This is what you have not found at NLU, and you weren’t prepared to go on sacrificing the possibility of intellectual creativity, family life and personal space forever. Sometimes principles have to be lived by, because that’s the right thing to do. NLU doesn’t seem to have any other principle than to ‘maximize the staffing resource and leverage the maximum from the academic contract’ (I paraphrase).

It has been a long time since we sat down and discussed all…

View original post 969 more words

Be warned. Thank you Liz for an excellent post on the proposed “Teaching REF.”

Academic Irregularities

Liz Morrish discusses some new ways the Conservative government will seek to assess and rank universities. ‘Learning gain’ is about to be ‘a thing’.


It is just over two weeks after the General Election, and our thoughts turn to the prospect of more cuts in public spending, a new leader for the Labour Party, some uncertainty over Brexit and the referendum on EU membership, and, post UKIP, a somewhat muted dialogue over immigration. But what lies in the future for higher education? Have you been paying selective attention over the months leading up to the election? A tuition fee cut may have lodged in your memory, but that was Labour Party policy, and we can forget that now. What does a Conservative government have planned for universities? We know that abolition of the cap on student numbers was already in the offing, as was a national postgraduate loan system for…

View original post 793 more words

HEFCE letter

And the tapestry of lies, damned lies and statistics that is REF2014 keeps on unraveling.

I learn from this morning’s Twitter feed that Dr Dan Lockton, of the Royal College of Art, and Professor Melissa Terras, Professor of Digital Humanities and Director of the UCL Centre for Digital Humanities, have received identical letters in response to requests under the Freedom of Information Act for HEFCE to disclose information held on them in connection with the REF.

Dr Lockton had asked to see “data held by HEFCE concerning myself or my work, as part of the REF, including any comments, assessments or other material.”

HEFCE responded that they did not hold the information he was seeking and referred him to an FAQ on the REF website:

Can you provide the scores for my outputs that were submitted to the REF?

Individual outputs were assessed in order to produce the output sub-profiles for each submission. Once the sub-profiles were complete, the scores for individual outputs were no longer required and have been destroyed. In accordance with data protection principles, we no longer hold the scores for individual outputs as they constitute personal data, which should not be held for longer than required to fulfil their purpose.

When it first emerged that RAE2008 was making the same use of such Orwellian memory holes, an (anonymous) panelist explained to Times Higher Education that “It is for our own good. The process could become an absolute nightmare if departmental heads or institutions chose to challenge the panels and this information was available.”[1]

HEFCE’s letter to Dr Lockton goes on to emphasize that:

The purpose of the REF is to assess the quality of research and produce outcomes for each submission in the form of sub-profiles and an overall quality profile. These outcomes are then used to inform funding, provide accountability for public investment and provide benchmarking information. The purpose of the REF is not to provide a fine-grained assessment of each individual’s contribution to a submission and the process is not designed to deliver this.

Yes, but. At the risk of appearing obtuse, I would have thought that when 65% of the overall quality profile rests on REF subpanels’ assessment of the quality of individuals’ outputs, we would expect “fine-grained assessment” of those outputs. Is this not why we have this cumbersome, time-consuming, expensive process of panel evaluation—as distinct, for instance, from using metrics—to begin with?

If it’s not fine-grained assessment, what sort of assessment is it?  And how can we trust it to provide a reliable basis for funding decisions, accountability for public investment, or benchmarking?

In the immortal words of Amy Winehouse, what kind of fuckery is this?

[1] Zoe Corbyn, ‘Panels ordered to shred all RAE records’. Times Higher Education, 17 April 2008.

 

1.

The rankings produced by Times Higher Education and others on the basis of the UK’s Research Assessment Exercises (RAEs) have always been contentious, but accusations of universities’ gaming submissions and spinning results have been more widespread in REF2014 than any earlier RAE. Laurie Taylor’s jibe in The Poppletonian that “a grand total of 32 vice-chancellors have reportedly boasted in internal emails that their university has become a top 10 UK university based on the recent results of the REF”[1] rings true in a world in which Cardiff University can truthfully[2] claim that it “has leapt to 5th in the Research Excellence Framework (REF) based on the quality of our research, a meteoric rise” from 22nd in RAE2008. Cardiff ranks 5th among universities in the REF2014 “Table of Excellence,” which is based on the GPA of the scores assigned by the REF’s “expert panels” to the three elements in each university’s submission (outputs 65%, impact 20%, environment 15%)—just behind Imperial, LSE, Oxford and Cambridge. Whether this “confirms [Cardiff’s] place as a world-leading university,” as its website claims, is more questionable.[3] These figures are a minefield.
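Since so much of what follows turns on that headline number, a rough sketch of the arithmetic may help. In the Python below, the element profiles (the percentages at each star level) are invented purely for illustration; only the 65/20/15 element weights come from the REF rules described above, and the GPA is simply the weighted average star rating of the resulting overall profile.

```python
# Minimal sketch of how a REF2014 overall quality profile and its grade point
# average (GPA) are put together. Each element of a submission receives a
# quality profile: the percentage of activity judged 4*, 3*, 2*, 1* or
# unclassified (0). The overall profile weights the elements as outputs 65%,
# impact 20%, environment 15%. All profile percentages below are hypothetical.

ELEMENT_WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

profiles = {  # illustrative profiles only, not real REF data
    "outputs":     {4: 30, 3: 50, 2: 15, 1: 5, 0: 0},
    "impact":      {4: 40, 3: 40, 2: 20, 1: 0, 0: 0},
    "environment": {4: 50, 3: 38, 2: 12, 1: 0, 0: 0},
}

overall = {star: sum(ELEMENT_WEIGHTS[e] * profiles[e][star] for e in ELEMENT_WEIGHTS)
           for star in (4, 3, 2, 1, 0)}

gpa = sum(star * share for star, share in overall.items()) / 100

print(overall)        # weighted overall quality profile (percentages)
print(round(gpa, 2))  # the GPA used in the "Table of Excellence"
```

On these made-up numbers the GPA comes out around 3.13, which gives a sense of how compressed the scale is: most of the universities discussed below sit within a few tenths of a point of one another.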

Although HEFCE encouraged universities to be “inclusive” in entering their staff in REF2014, they were not obliged to return all eligible staff and there were good reasons for those with aspirations to climb the league tables to be more “strategic” in staff selection than in previous RAEs. Prominent among these were (1) HEFCE’s defunding of 2* outputs from 2011, which meant outputs scoring below 3* would now negatively affect a university’s rank order without any compensating gain in QR income, and (2) HEFCE’s pegging the number of impact case studies required to the number of staff members entered per unit of assessment, which created a perverse incentive to exclude research-active staff if this would avoid having to submit a weak impact case study.[4] Though the wholesale exclusions feared by some did not materialize across the sector, it is clear that some institutions were far more selective in REF2014 than in RAE2008.

Unfortunately data that would have permitted direct comparisons with numbers of staff entered by individual universities in RAE2008 were never published, but Higher Education Statistics Agency (HESA) figures for FTE staff eligible to be submitted allow broad comparisons across universities in REF2014. It is evident from these that selectivity, rather than an improvement in research quality per se, played a large part in Cardiff’s “meteoric rise” in the rankings. The same may be true for some other schools that significantly improved their positions, among them Kings (up to 7th in 2014 from 22= in 2008), Bath (14= from 20=), Swansea (26= from 52=), Cranfield (31= from 49), Heriot-Watt (33 from 45), and Aston (35= from 52=). All of these universities except Kings entered fewer than 75% of their eligible staff members, and Kings has the lowest percentage (80%) of any university in the REF top 10 other than Cardiff itself.

Cardiff achieved its improbable rank of 5th on the basis of a submission that included only 62% of eligible staff. This is the second-lowest percentage of any of the 28 British universities that are listed in the top 200 in the 2014-15 Times Higher Education World University Rankings (of these schools only Aberdeen entered fewer staff, submitting 52%). No other university in this cohort submitted less than 70% of eligible staff, and half (14 universities) submitted over 80%. Among the top schools, Cambridge entered 95% of eligible staff, Imperial 92%, UCL 91% and Oxford 87%.

Many have suggested that “research power” (which is calculated by multiplying the institution’s overall rounded GPA by the total number of full-time equivalent staff it submitted to the REF) gives a fairer indication of a university’s place in the national research hierarchy than GPA rankings alone. By this measure, Cardiff falls to a more credible but still respectable 18th. But when measured by “research intensity” (that is, GPA multiplied by the percentage of eligible staff entered), its rank plummets from 5th to 50th. To say that this provides a more accurate indication of its true standing might be overstating the case, but it certainly underlines why Cardiff does not belong among “world-leading” universities.  Cardiff doubtless produces some excellent research, but its overall (and per capita) performance does not remotely justify comparisons with Oxford, Cambridge, or Imperial—let alone Caltech, Harvard, Stanford, Princeton, MIT, UC-Berkeley and Yale (the other universities in the THE World University Rankings top 10).  In this sense the GPA Table of Excellence can be profoundly misleading.
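Because “research power” and “research intensity” carry much of the argument here, a minimal sketch of both measures may be useful. The figures below are placeholders rather than actual REF2014 data, and the function names are mine, chosen only for illustration.

```python
# Sketch of the two alternative league-table measures discussed above.
# "Research power"     = overall GPA x total FTE staff submitted to the REF.
# "Research intensity" = overall GPA x proportion of eligible staff submitted.
# All numbers below are invented for illustration, not actual REF2014 data.

def research_power(gpa: float, fte_submitted: float) -> float:
    return gpa * fte_submitted

def research_intensity(gpa: float, eligible_fte: float, fte_submitted: float) -> float:
    return gpa * (fte_submitted / eligible_fte)  # proportion, not percentage

gpa = 3.10  # placeholder GPA shared by both hypothetical universities

# Same GPA, very different submission rates: the GPA column looks identical,
# but the intensity measure immediately exposes the selectivity.
selective = research_intensity(gpa, eligible_fte=1000, fte_submitted=620)  # ~62% entered
inclusive = research_intensity(gpa, eligible_fte=1000, fte_submitted=910)  # ~91% entered

print(round(research_power(gpa, 620), 1))        # power rewards sheer volume submitted
print(round(selective, 2), round(inclusive, 2))  # 1.92 vs 2.82
```

The point of the toy comparison is simply that GPA alone is blind to how many eligible staff were left out of a submission, whereas intensity penalises exclusion directly, which is why the two measures reorder the table so dramatically in Cardiff’s case.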

“To their critics,” writes Paul Jump in Times Higher Education, “such institutions are in essence cheating because in reality their quality score reflects the work produced by only a small proportion of their staff.”[5] I am not sure the accusation of cheating is warranted, because nobody is doing anything here that is outside HEFCE’s rules. The problem is rather that the current REF system rewards—and thereby encourages—bad behavior, while doing nothing to penalize the most egregious offenders like Cardiff.

The VCs at Bristol (11= in the REF2014 GPA table) and Southampton (18=, down from 14= in 2008) might be forgiven for ruefully reflecting that they, too, might now be boasting that they are “a top ten research university” had they not chosen to submit 91% and 90% of their eligible faculty respectively—a submission rate that on any reasonable criteria (as distinct from HEFCE’s rules) should itself be regarded as a mark of research excellence. Measured by research intensity Bristol comes in at 5= (jointly with Oxford) and Southampton 8= (jointly with Queen’s University Belfast, which submitted 95% of its staff and is ranked 42= on GPA). Meantime the VCs at St Andrews (down from 14= to 21=, 82% of eligible staff submitted), Essex (11th to 35=, 82% submitted), Loughborough (28= to 49=, 88% submitted) and Kent (31= to 49=, 85% submitted) may by now have concluded that—assuming they hold onto their jobs—they will have no alternative other than to be much more ruthless in culling staff for any future REF.

2.

The latest Times Higher Education World University Rankings puts Cardiff just outside the top 200, in the 201-225 group—which places it 29= among UK universities, along with Dundee, Newcastle, and Reading. Taking GPA, research power and research intensity into account—as we surely should, in recognition that not only the quality of research outputs but the number and proportion of academic staff who are producing them are also necessary elements in evaluating any university’s overall contribution to the UK’s research landscape—such a ranking seems intuitively to be just about right.

I have shown elsewhere[6] that there was, in fact, a striking degree of overall agreement between the RAE2008 rankings and the Times Higher Education World University Rankings. Repeating the comparison for UK universities ranked in the top 200 in the THE World University Rankings for 2014-15 and the REF2014 GPA-based “Table of Excellence” yields similar findings. The data are summarized in Table 1.

 

Table 1: REF2014 performance of universities ranked in the top 200 in Times Higher Education World University Rankings 2014-15

University | THE World University Rankings 2014-15 | REF2014 rank by GPA (RAE2008 rank in brackets) | REF2014 rank by research power | REF2014 rank by research intensity | % of eligible staff submitted
THE 1-50:
Oxford | 3 | 4 (4=) | 2 | 5= | 87
Cambridge | 5 | 5 (2) | 3 | 2 | 95
Imperial | 9 | 2 (6) | 8 | 3 | 92
UCL | 22 | 8= (7) | 1 | 4 | 91
LSE | 34 | 3 (4=) | 28 | 7 | 85
Edinburgh | 36 | 11= (12) | 4 | 12= | 83
Kings | 40 | 7 (22=) | 6 | 17 | 80
THE 50-100:
Manchester | 52 | 17 (8) | 5 | 26= | 78
Bristol | 74 | 11= (14) | 9 | 5= | 91
Durham | 83 | 20 (14=) | 20 | 24= | 79
Glasgow | 94 | 24 (33=) | 12 | 15 | 84
THE 100-150:
Warwick | 103 | 8= (9) | 15 | 11 | 83
QMUL | 107 | 11= (13) | 22 | 34= | 74
St Andrews | 111= | 21= (14=) | 22 | 16 | 82
Sussex | 111= | 40 (30) | 34 | 42= | 73
York | 113 | 14= (10) | 23 | 32 | 75
Royal Holloway | 118 | 26= (24=) | 40 | 31 | 77
Sheffield | 121 | 14= (14=) | 13 | 33 | 74
Lancaster | 131 | 18= (20=) | 26 | 29 | 77
Southampton | 132 | 18= (14=) | 11 | 8= | 90
Leeds | 146 | 21= (14=) | 10 | 34= | 75
Birmingham | 148 | 31 (26) | 14 | 23 | 81
THE 150-200:
Exeter | 154 | 30 (28=) | 21 | 19= | 82
Liverpool | 157 | 33 (40) | 19 | 46= | 70
Nottingham | 171 | 26= (24=) | 7 | 28 | 79
Aberdeen | 178 | 46= (38) | 29 | 57 | 52
UEA | 198 | 23 (35=) | 36 | 37 | 75
Leicester | 199 | 53 (51) | 24 | 39 | 78

Seven UK universities make the top 50 in the 2014-15 THE World University Rankings: Oxford, Cambridge, Imperial, UCL, LSE, Edinburgh, and Kings. Six of these are also in the REF2014 top 10, while the other (Edinburgh) is only just outside it at 11=. Four of the leading five institutions are the same in both rankings (the exception being UCL, which is 8= in REF2014), though not in the same rank order. Of 11 UK universities in the THE top 100, only one is outside the REF top 20 (Glasgow, at 24th). Of 22 UK universities in the THE top 150, only two (Birmingham, 31 in REF, and Sussex, 40 in REF) are outside the REF top 30. Of the 28 UK universities in the THE top 200, only two (Aberdeen at 46= and Leicester at 53) rank outside the REF top 40.

Conversely, only two universities in the REF2014 top 20, Cardiff at 6 and Bath at 14=, do not make it into the THE top 200 (their respective ranks are 201-225 and 301-350). Other universities that are ranked in the top 40 in REF2014 but remain outside the THE top 200 are Newcastle (26=), Swansea (26=), Cranfield (31=), Heriot-Watt (33), Essex (35=), Aston (35=), Strathclyde (37), Dundee (38=) and Reading (38=).

Table 2 provides data on the performance of selected UK universities that submitted to REF2014 but are currently ranked outside the THE world top 200.

Table 2. REF2014 performance of selected UK universities outside top 200 in Times Higher Education World University Rankings 2014-15

University | THE World University Rankings 2014-15 | REF2014 rank by GPA (RAE2008 rank in brackets) | REF2014 rank by research power | REF2014 rank by research intensity | % of eligible staff submitted
Cardiff | 201-225 | 6 (22=) | 18 | 50 | 62
Dundee | 201-225 | 38= (40=) | 39 | 49 | 68
Newcastle | 201-225 | 26= (27) | 16 | 26= | 80
Reading | 201-225 | 38= (42) | 27 | 19= | 83
Birkbeck | 226-250 | 46= (33=) | 48 | 30 | 81
Plymouth | 276-300 | 66= (75=) | 47 | 59 | 50
Bath | 301-350 | 14= (20=) | 35 | 34= | 74
Bangor | 301-350 | 42= (52=) | 59 | 51 | 63
Essex | 301-350 | 35= (11) | 45 | 22 | 82
Aberystwyth | 350-400 | 58= (45=) | 51 | 46= | 76
Aston | 350-400 | 35= (52=) | 69 | 60 | 43
Portsmouth | 350-400 | 65 (68=) | 55 | 80 | 27
Swansea | not in top 400 | 26= (52=) | 42 | 42= | 71
Cranfield | not in top 400 | 31= (49) | 61 | 64 | 37
Heriot-Watt | not in top 400 | 33 (45) | 44 | 53 | 57

Dundee, Newcastle and Reading only just miss the THE cut (all three are in the 201-225 bracket). All three outscored Aberdeen and Leicester in the REF, even though both sit above them in the THE rankings (in Leicester’s case, at 199, only very marginally so); of the three, only Newcastle does substantially worse in the THE rankings than in the REF. It is ranked 26= in the REF alongside Nottingham and Royal Holloway, ahead of Leicester (53), Aberdeen (46=), Sussex (40), Liverpool (33), Birmingham (31) and Exeter (30)—all of which are in the top 200 in the THE World Rankings. While there was a yawning gulf between Essex’s RAE2008 ranking of 11th and its THE ranking in the 301-350 group, the latter does seem to have presaged its precipitous REF2014 fall from grace to 35=. Conversely, the THE’s inclusion of Plymouth in the 276-300 group of universities places it considerably higher than its REF2014 rank of 66= would lead us to expect. This is not the case with most of the UK universities listed in the lower half of the THE top 400: Birkbeck, Bangor, Aberystwyth and Portsmouth all found themselves outside the top 40 in REF2014.

The greatest discrepancies between REF2014 and the THE World Rankings come with Cardiff (6 in REF, 201-225 in THE), Bath (14= in REF, 301-350 in THE), Swansea (26= in REF, not among the THE top 400), Aston (35= in REF, 350-400 in THE), and Cranfield and Heriot-Watt (31= and 33 respectively in REF, yet not among the THE top 400). On the face of it, these cases flatly contradict any claim that THE (or other similar) rankings are remotely accurate predictors of REF performance. I would argue, on the contrary, that these are the exceptions that prove the rule. All of these schools were prominent among the universities identified above that inflated their GPA by submitting smaller percentages of their eligible staff in REF2014. Were we to adjust raw GPA figures by research intensity, we would get a much closer match, as Table 3 shows.

Table 3. Comparison of selected universities’ performance in THE World University Rankings 2014-15 and REF2014, by GPA and research intensity.

University | THE World University Rankings 2014-15 | REF2014 rank by research intensity | REF2014 rank by GPA
Cardiff | 201-225 | 50 | 6
Bath | 301-350 | 34= | 14=
Swansea | not in top 400 | 42= | 26=
Aston | 350-400 | 60 | 35=
Cranfield | not in top 400 | 64 | 31=
Heriot-Watt | not in top 400 | 53 | 33

The most important general conclusion to emerge from this discussion is that despite some outliers there is a remarkable degree of agreement between the top 40 in REF2014 and the top 200 in the THE 2014-15 World University Rankings, and the correlation increases the higher we go in the tables. Where there are major discrepancies, these are usually explained by selective staff submission policies.

One other correlation is worth noting at this point. All 11 of the British universities in the THE top 100 are members of the Russell Group, as are 10 of the 17 British universities ranked between 100 and 200. The other six universities in this latter cohort (St Andrews, Sussex, Royal Holloway, Lancaster, UEA, Leicester) were all members of the now-defunct 1994 Group. Only one British university in the THE top 200 (Aberdeen) belonged to neither the Russell Group nor the 1994 Group. Conversely, only three Russell Group universities, Cardiff, Newcastle and Queen’s University Belfast, did not make the top 200 in the THE rankings.[7] In 2013-14 Russell Group and former 1994 Group universities between them received almost 85% of QR funding. Here, too, an enormous amount of money, time, and acrimony seems to have been expended on a laborious REF exercise that merely confirms what THE rankings have already shown.

3.

The most interesting thing about this comparative exercise is that the Times Higher Education World University Rankings not only make no use of RAE/REF data, but rely on quantitative methodologies that have repeatedly been rejected by the British academic establishment in favor of the “expert peer review” that is supposedly offered by REF panels. THE gives 30% of the overall score for the learning environment, 7.5% for international outlook, and 2.5% for industry income. The remaining 60% is based entirely on research-related measures, of which “the single most influential of the 13 indicators,” counting for 30% of the overall THE score, is “the number of times a university’s published work is cited by scholars globally” as measured by the Web of Science. The rest of the research score is derived from research income (6%), ‘research output scaled against staff numbers’ (6%, also established through the Web of Science), and ‘a university’s reputation for research excellence among its peers, based on the 10,000-plus responses to our annual academic reputation survey’ (18%).
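For concreteness, here is a rough sketch of how those weights combine into a single THE score. The component scores below are invented, and the real methodology standardises each indicator before weighting it; only the weights themselves are taken from the THE scheme as summarised above.

```python
# Sketch of the THE World University Rankings 2014-15 weighting described in
# the text. Component scores (0-100) are hypothetical; only the weights are
# drawn from the methodology as summarised above.

THE_WEIGHTS = {
    "teaching_environment": 0.30,   # the learning environment
    "international_outlook": 0.075,
    "industry_income": 0.025,
    "citations": 0.30,              # citation impact via the Web of Science
    "research_reputation": 0.18,    # academic reputation survey
    "research_income": 0.06,
    "research_productivity": 0.06,  # research output scaled against staff numbers
}
assert abs(sum(THE_WEIGHTS.values()) - 1.0) < 1e-9  # weights sum to 100%

scores = {  # hypothetical component scores for one university
    "teaching_environment": 75, "international_outlook": 90, "industry_income": 40,
    "citations": 85, "research_reputation": 60, "research_income": 55,
    "research_productivity": 70,
}

overall = sum(THE_WEIGHTS[k] * scores[k] for k in THE_WEIGHTS)
print(round(overall, 1))  # weighted overall score used to rank institutions
```

The structural point is simply that 60% of the weight sits on research-related indicators, and half of that on citations alone, which is why the THE table can be read as a largely metrics-driven proxy for research standing.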

The comparisons undertaken here strongly suggest that such metrics-based measures have proved highly reliable predictors of performance in REF2014—just as they did in previous RAEs. To be sure, there are differences in the fine details of the order of ranking of institutions between the THE and REF, but in such cases can we be confident that it is the REF panels’ highly subjective judgments of quality that are the more accurate? To suggest there is no margin for error in tables where the difference in GPA between 11th (Edinburgh, 3.18) and 30th (Exeter, 3.08) is a mere 0.1 points would be ridiculous. I have elsewhere suggested (here and here) that there are in fact many reasons why such confidence would be totally misplaced, including lack of specialist expertise among panel members and lack of time for reading outputs in the depth required.[8] But my main point here is this.

If metrics-based measures can produce similar results to those arrived at through the REF’s infinitely more costly, laborious and time-consuming process of ‘expert review’ of individual outputs, there is a compelling reason to go with the metrics: not because they are necessarily a valid measure of anything, but because they are as reliable as the alternative (whose validity is no less dubious, for different reasons) and a good deal more cost-efficient. The benefits for collegiality and staff morale of universities not having to decide whom to enter or exclude from the REF might be seen as an additional reason for favoring metrics. I am sure that if HEFCE put their minds to it they could come up with a more sophisticated basket of metrics than Times Higher Education, one capable of meeting many of the standard objections to quantification.  Supplementing the Web of Science with Publish or Perish or other citation indices that capture books as well as articles might be a start.  I hope James Wilsdon’s committee will come up with some useful suggestions for ways forward.

[1] Laurie Taylor, “We have bragging rights!” in The Poppletonian, Times Higher Education, 8 January 2015.

[2] Well, not quite. Cardiff is actually ranked 6th in the REF2014 “Table of Excellence,” which is constructed by Times Higher Education on the basis of the grade point average (GPA) of the marks awarded by REF panels, but the #1 spot is held not by a university but by the Institute of Cancer Research (which submitted only two UoAs). This table and others drawn upon here for “research power” and “research intensity” are all compiled by Times Higher Education.

[3] “REF 2014,” Cardiff University website at http://www.cardiff.ac.uk/research/impact-and-innovation/quality-and-performance/ref-2014

[4] Paul Jump, “Careers at risk after case studies ‘game playing’, REF study suggests.” Times Higher Education, 22 January 2015.

[5] Paul Jump, “REF 2014 rerun: who are the ‘game players’?” Times Higher Education, 1 January 2015.

[6] See Derek Sayer, Rank Hypocrisies: The Insult of the REF. London: Sage, 2014.

[7] I have discussed Newcastle already. Queen’s came in just outside the REF top 40 (42=) but with an excellent intensity rating (8=, 95% of eligible staff submitted).

[8] See, apart from Rank Hypocrisies, my articles “One scholar’s crusade against the REF,” Times Higher Education, 11 December, 34-6; “Time to abandon the gold standard? Peer Review for the REF Falls Far Short of Internationally Acceptable Standards,” LSE Impact of Social Sciences blog, 19 November (reprinted as “Problems with peer review for the REF,” CDBU blog, 21 November).

[Cover image: Rank Hypocrisies: The Insult of the REF]

UPDATE.  A (much cheaper) Kindle edition of this book is coming soon, but it is not currently listed on Amazon or other sites.

Seems oddly appropriate that I should be celebrating my 64th birthday with a new book that attacks a centerpiece of the unholy alliance of neoliberalism and Old Corruption that has been running and ruining British universities for the last thirty years.  Published December 3, 2015.  Let’s hope it has “impact”! For more details see here.

Time to abandon the gold standard? Peer review for the REF falls far short of internationally accepted standards.

The REF2014 results are set to be published next month. Alongside ongoing reviews of research assessment, Derek Sayer points to the many contradictions of the REF. Metrics may have problems, but a process that gives such extraordinary gatekeeping power to individual panel members is far worse. Ultimately, measuring research quality is fraught with difficulty. Perhaps we should instead be asking which features of the research environment (a mere 15% of the assessment) are most conducive to a vibrant research culture and focus funding accordingly.  [LSE Impact Blog, 19 November 2014]