Guardian League Table 2015: Well I never

New Guardian League Table for 2015 entry (and yes, Cambridge is top)

Top 25 of the full list (available here) is as follows (last year’s position in brackets):

1 (1) Cambridge
2 (2) Oxford
3 (4) St Andrews
4 (7) Bath
5 (9) Imperial
6 (8) Surrey
7 (3) LSE
8 (6) Durham
9 (10) Warwick
10 (11) Lancaster
11 (5) UCL
12 (12) Exeter
13 (18) Heriot-Watt
14 (17) UEA
15 (14) Loughborough
16 (16) York
17 (15) Birmingham
18 (19) Edinburgh
19 (24) Southampton
20 (20) Kent
21 (13) Leicester
22 (30) Aston
23 (28) Nottingham
24 (22) SOAS
25 (21) Glasgow

Exciting commentary on the new ranking is included:

The Universities of Bath, Surrey and Lancaster have charged ahead of their Russell Group peers in the Guardian league table of universities, knocking University College London (UCL) out of its traditional place in the top 10.

The London School of Economics (LSE), UCL’s neighbour and fellow member of the elite Russell Group of research-focused universities, has also suffered, losing its place in the top three to St Andrews.

The University of Cambridge underlines its dominance by coming top of the table for the fourth year in a row and increasing the gap between itself and its ancient rival Oxford, which remains in second place. St Andrews, in third, is followed by Bath, Imperial College, Surrey, LSE, Durham, Warwick and Lancaster.

Other climbers include the Universities of Glyndŵr (from 108 to 64), Derby (from 79 to 50) and Falmouth (from 76 to 53).

Anglia Ruskin has seen the biggest drop (from 67 to 105), caused in part by a rising student/staff ratio. In civil engineering there are now 22.7 students per member of academic staff, where previously there had been just 14.3.

Birmingham City also fell (from 61 to 88), as did Bournemouth (52 to 71), Aberystwyth (88 to 106), Greenwich (70 to 87), Chester (46 to 61) and Bristol (23 to 34).

It’s pretty difficult to understand what is actually going on here but there is some helpful commentary:

The chief factor causing UCL’s drop out of the top ten was a fall in the number of leavers getting graduate-level jobs, says Matt Hiely-Rayner, from Intelligent Metrix, the independent consultancy that compiles the tables.

“There have been big falls in the numbers going on to graduate-level jobs or further study, particularly in psychology and chemistry. Another contributor is a drop in overall student satisfaction.”

As for the LSE, a drop in its employability score and spending per student caused it to lose its place in the top three.

Universities hoping to climb the league tables, Hiely-Rayner advises, should “identify their areas of weakness and concentrate on improving those. They should do that according to their own internal analysis, not just what the league tables say.”

Outstandingly useful tip that.


Guardian League Table 2014: One or two changes

New Guardian League Table for 2014

Top 20 of the full list (available here) is as follows (last year’s position in brackets):

1 (1) Cambridge
2 (2) Oxford
3 (3) LSE
4 (4) St Andrews
5 (6) UCL
6 (7) Durham
7 (9) Bath
8 (12) Surrey
9 (13) Imperial
10 (5) Warwick
11 (7) Lancaster
12 (10) Exeter
13 (19) Leicester
14 (11) Loughborough
15 (30) Birmingham
16 (17) York
17 (24) UEA
18 (20) Heriot-Watt
19 (15) Edinburgh
20 (22) Kent

The full story on the extraordinary news that Cambridge has held on to the top slot for the third year running can be found here. The top 20 is largely unchanged, although Birmingham, UEA and Kent are all new entries.

A couple of other comments in the piece are worth noting, if only because of the dramatic and bizarre consequences of the methodology for some institutions’ placings:

Lower down the table but still remarkable is the rise of Northampton, which climbs 39 places to 47 (from 86), largely thanks to improved job prospects and the entry standards of its students. And Portsmouth jumps from 78 to 48 this year. The main contributory factor here is a sharp increase in the number of students achieving a first or a 2:1.

It’s less good news at Sussex, which falls from 27th to 50th place as graduates find it hard to secure a job, particularly in philosophy and anthropology. But it’s not all bad news – on the back of extremely high student satisfaction and entry standards, Sussex has climbed to the top of the table for social work.

The biggest fall of all is by Cardiff Met, from 66th to 105th place. This is because of a sharp fall in student satisfaction. The ratios of expenditure and staffing per student also deteriorated.

Unistats and KIS – just too much information?


Unistats – now with added KIS – has launched

The all-new Unistats site has launched:

Unistats is the official site that allows you to search for and compare data and information on university and college courses from across the UK. The site draws together comparable information on those areas that students have identified as important in making decisions about what and where to study. The items that students thought were most useful have been included in a Key Information Set (KIS), which can be found on the Overview tab for each course.

The site draws on the following official data on higher education courses:

  • Student satisfaction from the National Student Survey
  • Student destinations on finishing their course from the Destinations of Leavers from Higher Education survey
  • How the course is taught and study patterns
  • How the course is assessed
  • Course accreditation
  • Course costs (such as tuition fees and accommodation)

There is a mass of information here and data is presented in a handy tabular form.
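
To give a sense of how much is packed into each course entry, here is a rough sketch of the kind of record a KIS amounts to. The field names and figures are my own illustrative shorthand, not the official Unistats schema:

```python
from dataclasses import dataclass

@dataclass
class CourseKIS:
    """Illustrative shorthand for a Unistats/KIS course entry (not the official schema)."""
    institution: str
    course: str
    overall_satisfaction_pct: int   # from the National Student Survey
    employed_or_studying_pct: int   # from the Destinations of Leavers survey
    scheduled_teaching_pct: int     # share of study time in timetabled sessions
    coursework_pct: int             # share of assessment by coursework rather than exams
    accredited: bool
    annual_fee_gbp: int

# Comparing even two similar courses means weighing half a dozen figures each
courses = [
    CourseKIS("University A", "History BA", 88, 72, 25, 80, False, 9000),
    CourseKIS("University B", "History BA", 84, 78, 30, 60, False, 8500),
]
for course in courses:
    print(course)
```

Multiply that by tens of thousands of courses and the problem becomes obvious.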

However, we do have a problem. As previous posts have noted, there really is just too much data here and across the various university, HE sector information and league table websites. The recent launch of the new Which? University site (about which I posted here) added to the mess, and the Unistats upgrade just serves to make the picture even more complicated for applicants.

There is no information deficit in HE. We do not need more and better course comparison websites. What we do need is fewer new websites and more and better guidance for prospective students.

Guardian League Table 2013: Ups and downs

New Guardian League Table for 2013

Top 20 of the full list (available here) is as follows (last year’s position in brackets):

1 (1) Cambridge
2 (2) Oxford
3 (4) LSE
4 (3) St Andrews
5 (6) Warwick
6 (5) UCL
7= (8) Durham
7= (7) Lancaster
9 (14) Bath
10 (11) Exeter
11 (9) Loughborough
12 (19) Surrey
13 (10) Imperial
14 (21) Glasgow
15 (16) Edinburgh
16 (na) Buckingham
17 (15) York
18 (25) Bristol
19 (17) Leicester
20 (27) Heriot-Watt

The full story on the (not terribly surprising) news that Cambridge has held on to the top slot for the second year running can be found here. A couple of comments in the piece are worth noting:

Most of the shifts in this year’s league table are due to changing levels of student satisfaction. Sussex dropped to 27th place from 11th after students in English and geography became significantly less happy with their departments. Stirling dropped from 44th to 67th after value-added scores in business and law declined.

Aberystwyth fell in six subjects, with declines in all performance measures. It drops from 50th place to 81st.

Among the climbers is Brunel, up from 82nd to 44th, taking the top spot for social work. Chester went from 80th to 52nd, with student satisfaction results driving improved ratings in biosciences, history, law and psychology. The career prospects of its biosciences graduates also improved. Coventry rose from 63rd to 46th, with student survey results a major factor.

Overall, there is plenty of swapping of places inside the top 10 and some more dramatic movements inside and beyond the top 20. Given the emphasis placed on NSS scores, this is perhaps not much of a shock. But the high level of volatility in the table does keep things fresh every year. Irritating that the University of Nottingham slips out of the top 20 though.

Killing the myths in higher education

Misunderstandings and myths

An interesting new pamphlet has just been published by HEPI. Misunderstanding Modern Higher Education: Eight “category mistakes” is a brief and snappy read and is available from the HEPI website:

In this HEPI occasional report, Professor Sir David Watson discusses eight myths – category mistakes – concerning higher education that are widely believed, and argues that these need to be exploded if higher education is to maintain its current comparatively healthy state. This report is based on his presentation to a joint HEPI/HEA seminar at the House of Commons on 26 January 2012.

It’s quite a challenging set of propositions. Here a category mistake is defined as a “sentence that says one thing in one category that can only intelligibly be said of something of another”, eg “what does blue smell like?” Watson suggests there are at least eight category mistakes in higher education discourse at present. Some of these I’d agree with, but others I think are less convincing.

1. “University” performance

Watson argues that it is the sector or the subject, rather than the institution, which is the more meaningful unit of analysis. This is certainly true in some areas, eg the NSS, as suggested here. BUT the institution is the key organisational unit, indeed the primary one. While it can reasonably be argued that the university is no more than the sum of its (academic) parts, and that the staff in those units identify with them more strongly than with the university itself, it is surely wrong to imagine that the subject/department can be regarded as an entirely independent unit. There is a mutual dependency here.

3. HE “Sector”

We should be talking about tertiary, ie post-secondary, education rather than exclusively about higher ed. I’m not sure I agree, nor does it seem to me that this is a category error. “Higher” education is a sub-category of tertiary education. It is funded differently and has a different set of traditions and regulatory frameworks to other tertiary provision. We might want to take a more rounded view of tertiary education and, indeed, it would be short-sighted not to. But do we gain much by preventing sub-divisions within the very wide range of activity that is tertiary education?

4. Research “selectivity”

Research concentration, which the system encourages, is running counter to the national need and the general trend towards inter-institutional collaboration. In the long run, concentration of research will be counter-productive and isolated work will wither. Two tiers won’t work, therefore. But surely this is just an argument for a different kind of selectivity, one based on different criteria to those generated through the RAE/REF? For example, significant collaboration could be the primary criterion. With limited resources to go round, though, there is always going to be some selectivity.

Probably mythical


5. World-classness

Watson highlights the madness of the international league tables and notes that what everyone says they want is not reflected in what the league tables measure. The international tables, which are the determinant of ‘world-classness’, are fundamentally related to research. Again, therefore, this is about the criteria selected.

7. Informed choice

The paper rightly notes that student choices over time have moulded our system. The idea that students need more information which will then persuade the market to do what government wants is, Watson argues, fundamentally misguided. Additional information is simply not going to get students to do government’s bidding.

8. Reputation and quality: the confusion between the two

Clearly there is some form of relationship between reputation and quality, but Watson argues that the gap in reality is much smaller than it often appears. Good quality can clearly exist independently of reputation. Watson also rightly notes the perception of student instrumentalism and its dominance in the discourse.

(I’ve ignored number 2, Access, and number 6, The public/private divide, here.)

And finally…

Finally, Watson asks “What is to be done?”

Rather than Leninist solutions, though, he offers three particular suggestions. First, the system will need to be messier, more flexible and co-operative. Secondly, we should not chase the Harvard model but rather aim to develop a system more like the California Master Plan – this is really about the national direction of tertiary education. Thirdly, he argues that a proper credit accumulation and transfer framework is needed: “we fail to use these systems for reasons of conservatism, snobbery and lack of imagination”. (Actually, I’d suggest it is much more about a desire to protect institutional autonomy.)

Watson concludes by arguing that we should start by tackling these category mistakes and then learn to live with “flux and contingency”. I’m not sure we would want to spend a huge amount of time on the former or that we have any choice about the latter. It’s the nature of the world we operate in. Do read the piece though.

Code of practice needed to “halt degree course mis-selling”

Should universities stop using NSS data to promote courses?

An interesting article by John Holmwood on the questionable validity and reliability of the National Student Survey – he argues that universities and others should not therefore use the outcomes of the NSS in league tables or promotional material. Furthermore, he argues that a code of practice is needed to stop what he says is degree course mis-selling:

It is a clear public interest that there be proper standards in the presentation of information to prospective students. The changes to higher education funding are of such far-reaching importance that the presentation of information should be subject to scrutiny by the UK Statistics Authority. A first step might be for Universities UK – and the separate University Mission Groups, such as Russell Group, 1994 Group, and Million+ – to agree a Code of Practice among its members not to use statements of rank order position in their claims about their own institution and courses. It is a matter of shame for universities that this is necessary in the presentation of evidence, appropriate standards for which are intrinsic to their raison d’être.

It’s a well-argued case. But in an environment where every institution will be competing even more fiercely for applicants, where they will be required by government to publish a particular set of information, and where there is a plethora of league tables drawing on NSS data, it would be surprising if any university or mission group would sign up for such a code. Of course the National Student Survey and league tables have flaws, and there aren’t any league tables which stand up to serious academic scrutiny. But they aren’t going to go away, and universities aren’t going to stop using the outputs where they believe it is in their interest to do so. And as for involving the UK Statistics Authority, do we really want even more regulation and intervention in universities’ business than we already have?
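
The reliability point, at least, is easy to illustrate. At departmental cohort sizes the sampling error on a satisfaction percentage can dwarf the differences that league table positions turn on. A minimal sketch of my own (not from Holmwood’s article), using a simple normal approximation for the confidence interval:

```python
import math

def satisfaction_ci(p_hat: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation 95% confidence interval for a satisfaction proportion."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# The same 82% satisfaction score from a department-sized and an
# institution-sized pool of respondents (numbers invented for illustration)
for n in (40, 2000):
    low, high = satisfaction_ci(0.82, n)
    print(f"n={n}: 82% satisfied, 95% CI roughly {low:.0%} to {high:.0%}")
```

On 40 respondents the interval runs from about 70% to 94%; on 2,000 it narrows to roughly 80–84%. Rank orders built on one or two percentage points at subject level are, on this arithmetic alone, hard to defend.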

NSS: can things get any worse in universities?

Press stories on the latest NSS results seem to be largely of the glass-one-fifth-empty variety

Indeed, you could be forgiven for thinking the sector was already in meltdown if you read the Independent, which says “one-third of university students unhappy with lecturers’ performance”:

Thousands of university students still find their lecturers too remote despite pledges that standards of service would improve with the introduction of top-up fees of up to £3,225 a year. A national survey by the Higher Education Funding Council for England showing the level of student satisfaction with their courses reveals there has been no improvement in three years. Overall, 82 per cent are satisfied with their course – but the figure dips to 67 per cent when it comes to assessment of their work and the feedback they get from lecturers.

The BBC has a similar line:

UK students’ satisfaction with their undergraduate courses has stalled, the National Student Survey has found. Overall, 82% of finalists at UK universities in 2010 were satisfied with the quality of their course, the same percentage as last year. Universities warn satisfaction ratings could deteriorate as funding cuts bite. The NSS, in which 252,000 students took part, is published by the Higher Education Funding Council for England (Hefce) to help maintain standards.

But really. OK, there remains plenty of scope for improvement, particularly in the area of feedback to students on their work, but to deliver an overall satisfaction rating of more than 80% over such a large number of students is surely hugely positive? So why are universities getting a kicking for this? Presumably even an average satisfaction rating of 90% plus would be inadequate.

2010 Independent League Table

Latest Independent league table

The first of the new season’s UK tables has just been published by the Independent.

Full details of the institutional and subject rankings are provided by the Complete University Guide, which can be found here. There isn’t much change at the top, but the most striking thing is the inclusion for the first time in a UK league table (I think) of the University of Buckingham, the UK’s first private university.

Rankings (2009 rank in brackets)

    1 (1) Oxford
    2 (2) Cambridge
    3 (3) Imperial College London
    4 (5) Durham
    5 (4) London School of Economics
    6 (7) St Andrews
    7 (6) Warwick
    8 (12) Lancaster
    9 (8) University College London
    10 (10) York
    11 (11) Edinburgh
    12 (9) Bath
    13 (17) King’s College London
    14 (13) Southampton
    15 (15) SOAS
    16 (16) Bristol
    17 (14) Aston
    18 (19) Nottingham
    19 (25) Sussex
    20 (–) Buckingham

Also, the Complete University Guide people let you play with the weightings for each of the criteria, so it is possible to bump Oxbridge from the top slots if you really try.
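
For anyone wondering how that works: a league table score is just a weighted sum of criterion scores, so changing the weights changes the order. A toy sketch, with invented criteria and scores rather than the guide’s real data:

```python
# Invented scores (0-100) on two illustrative criteria; not the guide's actual data
scores = {
    "Oxford":     {"research": 98, "satisfaction": 88},
    "Cambridge":  {"research": 97, "satisfaction": 89},
    "Buckingham": {"research": 40, "satisfaction": 96},
}

def rank(weights: dict[str, float]) -> list[str]:
    """Order institutions by the weighted sum of their criterion scores."""
    def total(uni: str) -> float:
        return sum(w * scores[uni][c] for c, w in weights.items())
    return sorted(scores, key=total, reverse=True)

print(rank({"research": 0.7, "satisfaction": 0.3}))  # research-heavy: Oxbridge on top
print(rank({"research": 0.0, "satisfaction": 1.0}))  # satisfaction-only: Buckingham wins
```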

Universities accused of pressuring students on NSS

“Pressuring” students on NSS

According to the Telegraph, some institutions have been trying to persuade students to give them decent scores in the NSS:

Eight British universities were reported to the higher education funding body recently over allegations that they had encouraged students to respond positively to the annual National Student Survey. Documents released under freedom of information laws showed that the institutions had tried to persuade students to give their universities high scores in the 22-question “student satisfaction survey”.

Included in the complaints were accusations by students that lecturers and heads of department had told them to give high scores when answering the questions in order to improve the value of their degree. One lecturer was even accused of telling students that they would not get a good job if they gave their university and course a low mark.

Should we be surprised? Given the significance of NSS scores in UK league tables it would be pretty extraordinary if institutions weren’t looking for ways to improve their scores. However, the fact that there seem to have been only a handful of formal complaints suggests that most universities are seeking to improve their ratings through more appropriate means. Like actually responding to what students tell them. Maybe.

Higher ambitions…

New HE Framework

Follow-up to an earlier post on HE as food-labelling:

Lord Mandelson has launched Higher Ambitions. There’s a lot in here, much of it yet to be fully fleshed out. And the much-trailed element on improved consumer information still requires some work:


All universities should publish a standard set of information setting out what students can expect in terms of the nature and quality of their programme.

This should set out how and what students will learn, what that knowledge will qualify them to do, whether they will have access to external expertise or experience, how much direct contact there will be with academic staff, what their own study responsibilities will be, what facilities they will have access to, and any opportunities for international experience. It should also offer information about what students on individual courses have done after graduation. The Unistats website will continue to bring together information in a comparable way so that students can make well-informed informed [sic] choices, based on an understanding of the nature of the teaching programme they can expect, and the long-term employment prospects it offers. We will invite HEFCE, the Quality Assurance Agency for Higher Education (QAA) and UKCES to work with the sector and advise on how these goals should be achieved.

Hmmm. Should be an interesting consultation.

Sunday Times League Table

Sunday Times League Table is now out

The 2010 Sunday Times Good University Guide. Change at the top but not really “a year of upheaval” as billed:

1. Oxford (2)
2. Cambridge (1)
3. Imperial (3)
4. UCL (6)
5. St Andrews (5)
6. Warwick (7)
7. Durham (8)
8. York (9)
9. LSE (4)
10. Bristol (16)
11. Bath (10)
12. Southampton (12)
13. King’s College London (17)
14. Nottingham (13)
15= Edinburgh (15)
15= Loughborough (11)
17. Exeter (14)
18. Sheffield (19)
19. Lancaster (20)
20= Leicester (18)
20= Birmingham


The University of Oxford is on something of a winning streak. After a second successive victory over Cambridge in the boat race this year, the university has now knocked its light-blue rival off the top of The Sunday Times university league table for the first time.

This feat, after 11 years in second place, earns Oxford The Sunday Times University of the Year award. It edged narrowly ahead of its principal British rival in a year of upheaval in our league table, prompted by the first research assessments in seven years and the move to measuring teaching quality primarily by levels of student satisfaction expressed through the annual national student survey (NSS).

Not really a huge change to the table since last year, apart from the diversion of a bit of a boat race at the top. Although new NSS scores and the 2008 RAE results do figure, they don’t seem to have made a big difference. The numbers involved in the survey of heads and peers, which produces one of the indicators, aren’t clearly identified.

NSS results – just about the same as last year

Good news or bad news?

Not a lot to write home about, with very little change, but the BBC reports that the satisfaction rate ‘slips’:

This year’s final year students in England were marginally less happy with their university experience than last year’s leavers, an annual survey shows. The National Student Survey shows 81% were mostly or definitely satisfied with the quality of their course, against 82% last year. In Wales the rating was unchanged, 83%, and in Northern Ireland up one at 84%. Twelve Scottish institutions also took part, achieving the highest overall score of 86%, the same as in 2008.

Pretty positive stuff, you’d think, but the NUS has a different perspective:

NUS president Wes Streeting said: “Tuition fees in England were trebled in 2006, but students have not seen a demonstrable improvement in the quality of their experience. “Universities have a responsibility to deliver substantial improvements in return for the huge increase in income they are receiving from fees.”


And the Guardian also focuses on the negative:

Almost a fifth – 19% – of final-year students told the National Student Survey they were dissatisfied with or ambivalent about their courses – a rise of 1% on last year.

HEFCE, though, offers a more positive interpretation and the full details of the results.

But overall this is surely a good news story, albeit one that is pretty much the same as in 2008.

Students “more satisfied than ever before”

According to the Times Higher Education analysis of the latest NSS data, students are more satisfied than they’ve ever been.

For universities in England, students’ overall satisfaction rate rose slightly from 81 per cent last year to 82 per cent, while satisfaction scores in six specific areas, including teaching, assessment and academic support, also all increased. Students are most satisfied with the teaching they receive, with 83 per cent reporting general satisfaction. But satisfaction with “assessment and feedback” remained lower than in other areas, at 64 per cent. Minister for Students Delyth Morgan said: “The continued high level of satisfaction is a welcome testament to the quality of the teaching and learning experience in this country.”

But is this really telling us very much about the real quality of the student experience? Especially when you note the following:

The top UK satisfaction score of 96 per cent went to the University of Buckingham, a private institution. Vice-chancellor Terence Kealey said: “This is the third year that we’ve come top because we are the only university in Britain that focuses on the student rather than on government or regulatory targets. Every other university should copy us and become independent.”

I’m sure students at Buckingham have a distinctive experience but the reasons for this result are perhaps a bit more complicated than suggested here. Still, the NSS does at least provide much-needed fodder (or core data on the quality of the student experience) for the league table compilers.

The full data is available from HEFCE. The THE rankings are as follows:

Most-satisfied students (scores for 2005–2008 where available; the final figure in each row is the 2008 score):

University of Buckingham: 94, 93, 96
Royal Academy of Music: 95, 81, 90, 94
The Open University: 95, 95, 95, 94
University of St Andrews: 92, 94, 93
Courtauld Institute of Art: 100, 81, 74, 93
University of Cambridge: 93
University of Oxford: 92, 92
University of East Anglia: 88, 89, 89, 92
Birkbeck, University of London: 90, 91, 92, 92
Bishop Grosseteste University College Lincoln: 88, 89, 87, 92
University of Leicester: 89, 89, 90, 92
University of Exeter: 86, 85, 91, 91
University of Aberdeen: 88, 91
Loughborough University: 88, 88, 89, 91
Harper Adams University College: 90, 86, 91, 90
Aberystwyth University: 87, 90, 90, 90
St George’s Hospital Medical School: 86, 80, 87, 90
Institute of Education: 83, 80, 90
University of Kent: 86, 86, 88, 90
University of Sheffield: 86, 84, 87, 89

The figures show the percentage of students, full and part time, who “definitely” or “mostly” agreed with the statement: “Overall, I am satisfied with the quality of my course.” Institutions with fewer than four figures do not have published scores for every year.

Taking “enhancing the student experience” too far?

Interesting piece in the Chronicle about the student experience at High Point University (where “every student receives an extraordinary education in a fun environment with caring people”). The features apparently include:

  • valet parking
  • a hot tub in the middle of the campus
  • an ice-cream truck that circles the campus giving out free ice cream, etc.
  • live music in the cafeteria
  • snack kiosks around the campus offering free bananas, pretzels and drinks
  • gifts left for students in their halls for when they return from breaks

Perhaps most scarily:

Birthdays are big events at High Point. Each undergraduate — and there are 2,000 — receives a birthday card from the university, signed by the president, with a Starbucks gift card tucked inside. Plus balloons. What’s more, when birthday boys and girls visit the cafeteria, their ID cards electronically alert the kitchen staff. The staff then fixes a slice of cake, and the featured musicians sing “Happy Birthday.”

All is overseen by “a director of WOW!”. How long before we have one of those at a UK university? And might free ice cream help those NSS scores?

New Conservative position on fees?

Which seems to be: we should neither raise nor lower the cap

According to a Guardian report on a recent speech:

The Conservatives today called for the review of tuition fees planned for 2009 to start now to allow for enough preparation time. The government has promised a review of the increased tuition fees regime in two years. But, in a speech at Sheffield University, shadow universities secretary David Willetts said: “A proper review takes time. We do not need to make a decision any sooner than the government suggests – but why waste this two years which could be spent collecting data, talking to people, or analysing what is happening?

“We are not calling for the cap to be lifted and we are not calling for it to be lowered. Nobody knows enough about tuition fees and their impact to make any decisions at all on this issue,” he said.

Suggesting that a review be brought forward a bit does not appear to represent a bold new position.

Moreover, we need more information:

Mr Willetts also urged universities to give students and their parents more information about contact hours, class sizes and employability before they start courses. “Students and their parents are not simply concerned about the cost of higher education. They care about quality. Students now regard themselves as customers, and they want to know that they are investing in the right student experience.” He claimed the national student survey was being manipulated by universities and called for a national student experience website to pull together information on research ratings, drop-out rates, library facilities and university estates, as well as contact hours, class sizes and employability.

Sounds a bit like a combination of the Sunday Times League Table and the data recently produced by HEPI – see earlier post on this topic. And isn’t this what the new (as yet unlaunched) Unistats site is largely intended to address?