2010 Independent League Table

Latest Independent league table

The first of the new season’s UK tables has just been published by the Independent.

Full details of the institutional and subject rankings are provided by the Complete University Guide, which can be found here. There isn’t much change at the top, but the most striking thing is the inclusion, for the first time in a UK league table (I think), of the University of Buckingham, the UK’s first private university.

Rankings (2009 rank in brackets)

    1 (1) Oxford
    2 (2) Cambridge
    3 (3) Imperial College London
    4 (5) Durham
    5 (4) London School of Economics
    6 (7) St Andrews
    7 (6) Warwick
    8 (12) Lancaster
    9 (8) University College London
    10 (10) York
    11 (11) Edinburgh
    12 (9) Bath
    13 (17) King’s College London
    14 (13) Southampton
    15 (15) SOAS
    16 (16) Bristol
    17 (14) Aston
    18 (19) Nottingham
    19 (25) Sussex
    20 (–) Buckingham

Also, the Complete University Guide people let you play with the weightings for each of the criteria, so it is possible to bump Oxbridge from the top slots if you really try.
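The effect of re-weighting is easy to see in miniature. The sketch below shows how a weighted-average league-table score responds to changing the criterion weights; the criteria, scores and weights are invented for illustration and are not the Complete University Guide’s actual data or methodology.

```python
# Hypothetical sketch of how a league-table ranking responds to re-weighting.
# Criteria names, scores and weights are invented, not the Guide's real data.

def overall_score(scores, weights):
    """Weighted average of criterion scores (weights need not sum to 1)."""
    total = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total

universities = {
    "Alma U": {"entry": 95, "satisfaction": 78, "research": 90},
    "Beta U": {"entry": 80, "satisfaction": 92, "research": 70},
}

# A default weighting that favours research...
default = {"entry": 1.0, "satisfaction": 1.0, "research": 1.5}
# ...versus one that pushes student satisfaction hard, reordering the table.
satisfaction_heavy = {"entry": 0.5, "satisfaction": 3.0, "research": 0.5}

for weights in (default, satisfaction_heavy):
    ranked = sorted(universities,
                    key=lambda u: overall_score(universities[u], weights),
                    reverse=True)
    print(ranked)
```

With the default weights the research-strong institution tops the table; under the satisfaction-heavy weights the order flips, which is exactly the lever the interactive tool exposes.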

Sunday Times League Table

The Sunday Times league table is now out

The 2010 Sunday Times Good University Guide. Change at the top but not really “a year of upheaval” as billed:

1. Oxford (2)
2. Cambridge (1)
3. Imperial (3)
4. UCL (6)
5. St Andrews (5)
6. Warwick (7)
7. Durham (8)
8. York (9)
9. LSE (4)
10. Bristol (16)
11. Bath (10)
12. Southampton (12)
13. King’s College London (17)
14. Nottingham (13)
15= Edinburgh (15)
15= Loughborough (11)
17. Exeter (14)
18. Sheffield (19)
19. Lancaster (20)
20= Leicester (18)
20= Birmingham

University of Oxford

The University of Oxford is on something of a winning streak. After a second successive victory over Cambridge in the boat race this year, the university has now knocked its light-blue rival off the top of The Sunday Times university league table for the first time.

This feat, after 11 years in second place, earns Oxford The Sunday Times University of the Year award. It edged narrowly ahead of its principal British rival in a year of upheaval in our league table, prompted by the first research assessments in seven years and the move to measuring teaching quality primarily by levels of student satisfaction expressed through the annual national student survey (NSS).

Not really a huge change to the table since last year, apart from a bit of a boat race going on at the top. Although the new NSS scores and the 2008 RAE do figure, they don’t seem to have made a big difference. The numbers involved in the survey of heads and peers, which produces one of the indicators, aren’t clearly identified.

On ‘The Edgeless University’

The Edgeless University – Demos

Available for download: via Demos Publications

This is an interesting paper which identifies a range of significant technological challenges for higher education. It suggests that universities are on the brink of an electronic revolution, like the music industry in 1999, but are struggling to make sense of the opportunities or understand the strategic options:

The next stage of technological investment must be more strategic. The sector currently lacks a coherent narrative of how institutions will look in the future and the role of technology in the transition to a wider learning and research culture.

A reasonable enough proposition, although in the context of the report’s welcome for the far-sighted establishment of JANET it seems a little harsh – it is difficult to get much more strategic than setting up a successful, shared, sector-wide network like this.

Many of the specific points made in the report are quite pertinent (if not entirely novel):

    – openness in terms of publication of research and free access to IP remains a difficult agenda;

    – the importance of recognising the value of teaching in the context of potentially distorting RAE/REF demands is a challenge;

    – high quality e-learning, discrete or blended, is about much more than just providing new tools – it requires huge investment and support;

    – the value of face-to-face learning and teaching should not be discounted.

Fairly straightforward agenda there then.

However, some of the ideas in here are just plain wrong. In particular the idea that there is a deficit of flexible study pathways for credit-based learning and that somehow it is the role of government to take a specific policy lead in this area:

Government policy must help higher education institutions develop new ways of offering education seekers affiliation and accreditation. This might include shorter pick-and-mix courses and new forms of assessment.

Then there is the particularly misguided idea of seeking to reconcile “informal learning” with the formal system of higher education:

Informal learning is growing in popularity and significance, and attracting the attention of politicians, but there are problems in reconciling informal learning with formal frameworks, and managing the relationship between institutions of higher education and the kinds of learning that happen outside them. We have yet to find a model for collating learning from many different sources. Funding and the structure of learning in formal higher education tend to militate against this.

There is a good reason for this: recognising “informal learning” carries real costs and, in order to have currency, such recognition has to sit within an educational framework of some kind. More often than not, though, such learning will remain just “informal” – it is difficult to argue that mainstream HE provision should be skewed to cope with such marginal activity. Indeed, there remains significant adult and continuing education provision, parts of which are structured for exactly this purpose.

The overall conclusion though is pretty difficult to argue with:

In building the e-infrastructure for higher education we should not just build around the needs of institutions as they exist already. To pursue the possibilities of the ‘Edgeless University’, technology will have to be taken more seriously as a strategic asset. Technology is a driver for change. But we should harness it as a solution, a tool, for the way we want universities to support learning and research in the future.

So, the future is ‘edgeless’ it seems.

RAE Funding Results

RAE funding results out

These follow the RAE 2008 results published in December 2008.

Handy summaries of the research funding outcomes are published by Times Higher Education. Resources have been spread more thinly and there are some perhaps surprising recipients of significant growth in research income:

Biggest winners by cash increase (£ increase, % increase)

University of Nottingham, £9,685,797, 23.6%
University of Oxford, £8,769,293, 8.0%
Queen Mary, £7,282,125, 29.4%
University of Liverpool, £6,420,263, 19.5%
Loughborough University, £5,965,970, 36.9%
University of Bristol, £5,607,884, 12.6%
London School of Hygiene, £4,980,410, 46.5%
University of Plymouth, £4,868,489, 125.8%
Brunel University, £4,542,356, 54.5%
University of Kent, £3,779,827, 46.4%
Cranfield University, £3,621,707, 36.8%
University of Exeter, £3,550,318, 24.4%
City University London, £3,425,676, 50.3%
University of the West of England, £3,342,120, 121.6%
The Open University, £3,323,539, 44.9%
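Since each entry gives both the cash increase and the percentage growth, the implied prior-year allocation follows from simple arithmetic: if new = old × (1 + p), then old = increase / p. A quick sketch, using two figures from the list above:

```python
# Back out the implied prior-year funding from the published cash increase
# and percentage growth: new = old * (1 + p)  =>  old = increase / p.

def implied_baseline(increase, pct_growth):
    """Prior-year allocation implied by a cash increase and % growth."""
    return increase / (pct_growth / 100.0)

# Figures taken from the table above (increase in GBP, growth in %).
nottingham_prior = implied_baseline(9_685_797, 23.6)   # roughly GBP 41m before
plymouth_prior = implied_baseline(4_868_489, 125.8)    # roughly GBP 3.9m before

print(f"Nottingham prior-year funding: £{nottingham_prior:,.0f}")
print(f"Plymouth prior-year funding:   £{plymouth_prior:,.0f}")
```

This also explains the headline percentages: Plymouth’s 125.8% growth is striking precisely because it starts from a small base.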

Good news for some of us but some institutions have lost out.

RAE 2008: Results and rankings

RAE 2008 results are now out (effective 18 December 2008)

There are many, many ways to calculate rankings from the data, but arguably the most authoritative and convincing comes from Research Fortnight:

Research Fortnight Power Rankings 2008

1 Oxford
2 Cambridge
4 Manchester
5 Edinburgh
6 Imperial
7 Nottingham
8 Leeds
9 Sheffield
10 Bristol
11 King’s College
12 Birmingham
13 Southampton
14 Glasgow
15 Warwick
16 Cardiff
17 Newcastle
18 Liverpool
19 Durham
20 Queen Mary

The Times Higher rankings can be found here. They use a grade point average (ie no direct indication of volume). The Guardian’s calculations are here. These are not very different from the THE’s and, using GPA again, show excellent performance for institutions with slightly smaller but strong submissions, including Essex, Warwick and York. All of the tables show particularly strong improvement by Queen Mary, and also by Nottingham.
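The difference between a GPA ranking and a volume-sensitive “power” ranking is worth making concrete. In RAE 2008 each submission received a quality profile (the share of work judged 4*, 3*, 2*, 1* or unclassified); a sketch of the two scoring approaches follows, with the profiles and staff numbers invented for illustration rather than taken from any actual submission, and the power formula simplified from whatever Research Fortnight actually uses.

```python
# RAE 2008 quality profiles: percentage of work at 4*, 3*, 2*, 1*, unclassified.
# GPA averages the grades; a power-style score multiplies quality by staff
# volume, so large submissions climb. All figures below are invented.

GRADES = (4, 3, 2, 1, 0)

def gpa(profile):
    """Grade point average: sum of grade x share of work at that grade."""
    return sum(g * share / 100.0 for g, share in zip(GRADES, profile))

def power(profile, staff_fte):
    """Simplified volume-sensitive score: quality times submitted staff."""
    return gpa(profile) * staff_fte

# (4*, 3*, 2*, 1*, U) percentages, plus staff submitted (FTE).
small_but_strong = ((40, 40, 15, 5, 0), 60)    # high quality, small submission
big_and_good = ((25, 45, 25, 5, 0), 400)       # good quality, large submission

print(gpa(small_but_strong[0]), gpa(big_and_good[0]))
print(power(*small_but_strong), power(*big_and_good))
```

The small, strong submission wins on GPA while the large one wins on power, which is why the THE/Guardian tables and the Research Fortnight table reward different institutions.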

Other analysis is awaited…

RAE results predictor?

Predicting RAE outcomes before the submission


Interesting pre-results RAE commentary (13 months ahead of publication) in The Guardian

According to a league table based on research impact, PhD numbers and income – which was drawn up by Evidence for EducationGuardian.co.uk – the frontrunners will remain the big research players, known as the “golden diamond”: Imperial and University College London, Oxbridge and Manchester universities. All five do well in terms of the impact of the research papers their academics have published, the income they get from research and the numbers of PhDs who completed between 2002 and 2006. These are the so-called “metrics” that will be used to rank university research in the future.

(Aside: since when has it been a “golden diamond”? Isn’t it really a pentagon? And the inclusion of Manchester would actually require it to be an octagon according to the table below)

The league table is also published here, and institutions are ranked according to the average of their scores on each of the variables.

The variables are:

    – Impact and papers – how many times more than the world average (specific to each subject area) the papers have been cited between 2002 and 2006 in peer-reviewed journals

    – Research grant and contract income, 2002-06

    – Number of PhDs completed 2002-06
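The averaging step behind such a table could be sketched as follows. The institutions and figures are invented, and the normalisation used (scaling each variable against the best performer so that citations, income and PhD counts are comparable) is an assumption about the method, not the published one.

```python
# Rank institutions by the average of per-variable scores. Each variable is
# scaled against the best performer so that citation impact, research income
# and PhD completions sit on a comparable 0-1 scale. Data are invented, and
# the max-scaling rule is an assumed rather than published methodological detail.

metrics = {
    # (citation impact vs world average, research income in GBP millions,
    #  PhDs completed 2002-06)
    "Univ A": (1.8, 320.0, 2100),
    "Univ B": (2.1, 150.0, 900),
    "Univ C": (1.2, 400.0, 2500),
}

n_vars = 3
maxima = [max(vals[i] for vals in metrics.values()) for i in range(n_vars)]

def average_score(vals):
    """Mean of each variable expressed as a fraction of the best score."""
    return sum(v / m for v, m in zip(vals, maxima)) / n_vars

ranking = sorted(metrics, key=lambda u: average_score(metrics[u]), reverse=True)
print(ranking)
```

Note how the averaging rewards breadth: the institution that is merely good on all three variables outranks the one with the single best citation impact.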

Key research metrics for UK HEIs, 2002-06

The university rankings which emerge under this methodology (see here for detailed results):

    1 Oxford
    2 Cambridge
    3 Imperial College
    4 UCL
    5 Edinburgh
    6 King’s College London
    7 Birmingham
    8 Manchester
    9 Glasgow
    10 Bristol
    11 Southampton
    12 Sheffield
    13 Leeds
    14 Cardiff
    15 Nottingham
    16 Newcastle
    17 Liverpool
    18 Durham
    19 Queen Mary, London
    20 Leicester