Sunday Times 2012 University League Table


The full table has now been published and is available (paid access) on the Sunday Times website. There are some interesting changes within the top 20, largely, I suspect, down to changes in the way NSS data has been used. Birmingham, Southampton and Edinburgh all drop out of the top 20.

(last year’s position in brackets)

1 (2) University of Cambridge
2 (1) University of Oxford
3 (6) Durham University
4 (5) LSE
5 (9) University of Bath
6 (7) University of St Andrews
7 (4) University College London
8 (8) University of Warwick
9 (17) University of Exeter
10 (11) University of Bristol
11 (16) Loughborough University
12 (20) Newcastle University
13 (15) University of Sheffield
14 (3) Imperial College London
15= (12) University of Nottingham
15= (13) University of York
17 (10) King’s College London
18 (21=) Lancaster University
19 (21=) University of Sussex
20 (26=) University of Glasgow

Bath is University of the Year.

The criteria used are broadly similar to previous years although the selective approach to use of NSS data seems a bit odd:

Student satisfaction
The academics’ verdicts
Research quality
Ucas tariff points
Graduate-level jobs
Graduate unemployment
Firsts/2:1s awarded
Student/staff ratio
Dropout rate

The effect of these NSS changes has been rather severe on some institutions – Imperial has dropped from 3rd to 14th, Edinburgh from 14th to 27th and Manchester from 25th to 37th.

There is lots of interesting stuff on the website, including interactive subject tables and separate rankings by indicator.


15 thoughts on “Sunday Times 2012 University League Table”

  1. Pingback: Ninth Level Ireland » Blog Archive » Sunday Times 2012 University League Table

  2. Go on Newcastle University – number 12 on the table now! All thanks to keen, dedicated and exceedingly supportive and understanding research supervisors like Dr. Alton Horsfall.

  3. Paul, I disagree with you on one thing. Thinking that the changes are only related to the use of NSS data sounds somewhat naive. Some universities are making big changes to several of their strategies (e.g. internationalisation, staff recruitment, others) in order to improve overall quality, and this is probably starting to be reflected in the results.

    • Possibly. But the NSS scores (and the subset used here) do carry a lot of weight in the rankings here. The other less tangible elements you identify, all of which are arguably as if not more important, may impact on reputation and also on some of these scores in the longer run but I still think a lot of the volatility here is down to NSS.

  4. Can you highlight exactly what the changes have been to the NSS scores? What was it in the earlier analysis of the NSS which was bolstering certain institutions’ positions?

    • Tom – the approach is a bit different and arguably not comparable to last year’s scores at all. There are now two separate indicators based on the NSS. Previously there was a single measure using all of the NSS; now there is one measure which uses just the sections on teaching, assessment & feedback and academic support, and a second measure which uses the difference between the actual and benchmark score for the overall satisfaction question.

      The following is extracted from the methodology description on the Sunday Times website:

      Teaching excellence (250 points): The results of questions 1 to 12 of the 2011 national student survey (NSS) are scored taking a theoretical minimum and maximum score of 50% and 90% respectively. This meant each percentage point gained above 50% was worth 6.25 points in the league table. Questions 1 to 12 relate to student satisfaction with teaching quality, academic support, assessment and feedback. Universities were awarded points for their overall institutional score and this is shown in the university profile and the teaching excellence table. The individual subject scores are displayed in descending order within each university profile page. This is a change to our previous methodology, which was based on all responses to questions 1 to 22. The data used are drawn from the NSS data for all institutions – level 2 subjects for full-time, first-degree students only, except Birkbeck College, London and the Open University, where part-time students’ responses were used. A mean score was awarded to Abertay Dundee, the only university in Britain not to participate in the survey. Source: NSS 2011.

      Student satisfaction (+50 to -55 points): The responses given to Question 22 of the National Student Survey: “Overall, I am satisfied with the quality of the course” were compared to a benchmark for the given institution, devised according to a formula based on the social and subject mix. Five bonus points were awarded or five penalty points deducted for every percentage point above or below the benchmark score the university’s actual score happened to be. A zero score was awarded to Abertay Dundee for this measure as it does not take part in the survey. Source: NSS 2011.
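      For anyone wanting to sanity-check the arithmetic, here is a rough sketch of how the two NSS-based indicators described above appear to be computed. The function names and the clamping to the stated ranges are my own reading of the methodology text, not the Sunday Times’ actual code; note that 250 points over the 50–90% range works out at exactly 6.25 points per percentage point.

      ```python
      def teaching_excellence_points(nss_q1_12_pct):
          # Q1-12 score is scaled between a theoretical minimum of 50%
          # and maximum of 90%; each percentage point above 50% is worth
          # 250 / 40 = 6.25 league-table points, capped at 0..250.
          return max(0.0, min(250.0, (nss_q1_12_pct - 50.0) * 6.25))

      def satisfaction_points(actual_pct, benchmark_pct):
          # Q22 overall satisfaction vs the institution's benchmark:
          # +/-5 points per percentage point of difference, within the
          # stated +50 to -55 range.
          return max(-55.0, min(50.0, 5.0 * (actual_pct - benchmark_pct)))

      # Illustrative (invented) scores:
      print(teaching_excellence_points(82.0))   # (82 - 50) * 6.25 = 200.0
      print(satisfaction_points(88.0, 85.0))    # 3 points above benchmark = 15.0
      print(satisfaction_points(60.0, 90.0))    # 30 below, clamped to -55.0
      ```

      On this reading the teaching excellence measure dominates (up to 250 points against a +50/-55 swing for satisfaction), which would help explain why institutions with weaker Q1-12 scores, such as Imperial, fell so sharply this year.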

  5. Pingback: The Times, Sunday Times, Guardian and Complete University Guide League Tables 2011-12 « Registrarism

  6. Pingback: Oh dear, it’s the top Registrarism posts of 2012 | Registrarism

  7. Pingback: By not very popular demand: it’s the top Registrarism posts of 2013 | Registrarism
