2011 Shanghai Jiao Tong World Rankings: Top 10 and UK placings

The rankings have been published and are, I believe, available on the ARWU site, but there seem to be problems with access at the time of writing. I am therefore going with second-hand accounts of the positions (which I hope are accurate).

As last year, though, there are no surprises and very little movement in the top 10, with Harvard retaining the number 1 spot for the fifth successive year (last year’s position in brackets):

1 Harvard University (1)

2 Stanford University (3)

3 Massachusetts Institute of Technology (MIT) (4)

4 University of California, Berkeley (2)

5 University of Cambridge (5)

6 California Institute of Technology (6)

7 Princeton University (7)

8 Columbia University (8)

9 University of Chicago (9)

10 University of Oxford (10)

The Times Higher (which has clearly managed to access the ARWU site) lists the UK’s top performers (i.e. those in the Top 100) as follows (last year’s position in brackets):

5 Cambridge (5)

10 Oxford (10)

20 University College London (21)

24 Imperial College London (26)

38 University of Manchester (44)

53 University of Edinburgh (54)

68 King’s College London (63)

70 University of Bristol (66)

85 University of Nottingham (84)

97 University of Sheffield (88)

So, very little change at all to report apart from Birmingham dropping out of the top 100. Perhaps there will be more excitement with the Times Higher and QS tables.

Pride and Prejudices: Problems with National and International League Tables

Presentation from AUA Conference 2011

Thank you to all who attended this session on 19 April 2011

As promised, here is the presentation:

Ranking in Latin America

New Latin American league tables emerging

The Chronicle of Higher Education reports on league table developments in a number of Latin American nations:

The growing influence of university rankings has reached Latin America, with governments, news media, and private researchers drawing up domestic versions that they say are important for the institutions and students alike.

Brazil, Chile, Colombia, Mexico, and Peru each have at least one national ranking. Some were first published in recent months, and all use different approaches to evaluate their higher-education institutions.

A few, such as in Chile, are produced by news-media companies. Others, as in Colombia, were carried out by independent researchers. And some, like Brazil, are not so much rankings as government-sanctioned ratings.

Whatever their origin, they all serve a purpose that goes beyond boasting or one-upmanship, experts say. The rankings put pressure on lagging universities to up their game, and they give government officials, students, and parents a useful yardstick.

“Global rankings are very important. But there are close to 15,000 higher-education institutions in the world, and the global ranking deals with only 400, 500 of them,” says Kazimierz Bilanow, managing director of the Warsaw-based International Observatory on Academic Rankings and Excellence. “There are millions and millions of students who never think of going to Harvard. But they want to go to university and get an education, so they look at their own country. National rankings give them some guidance.”

The Brazilian government’s rankings are intended to result in failing institutions being closed. The Colombian ranking uses a narrow range of indicators focusing on graduate student numbers, journals and recognised research staff numbers. Chile seems to have a broader range of government-published indicators to draw on, including “courses most likely to lead to jobs, expected salaries on graduation, and space on campus per student”.

Whilst these national rankings seem to be having a local impact in some countries, it does seem that international developments are on the way with QS planning to introduce a new Latin American ranking. In time there will undoubtedly be more Latin American institutions in the global rankings too.

International world rankings – where do you stand? Going Global 2011 §2

International world rankings – where do you stand?

A belated note on one of the sessions at Going Global earlier this month. This session, on league tables, was for me the most enjoyable of the conference, but sadly there really was insufficient time for debate. The outline looked good:

As with the economic shifts we have seen over the last decade, changes in education are happening at breathtaking speed. The growing differentiation in the higher education sector in terms of universities’ missions, international strategies, capacities and resources, confronts traditional ways of ‘ranking’ institutions. Contributions from Phil Baty, Times Higher Education world rankings, and John Molony from QS, will present the global trends and explain changes in their ranking methodologies to justify the role of the need for rankings.

This session is designed to take the debate beyond the methodologies, to reflect on concerns on the potential impact of rankings, in such a highly competitive higher education market. Who are the audiences: how are they interpreting the information and for what useful purpose? How seriously are rankings taken by the institutions and personnel on which they pronounce judgement?

Responses from Prof Dzulkifli and Prof Malcolm Grant will debate the impact of rankings from both the perspective of internationally focused university leaders and from an academic community that may well feel disenfranchised from the adulation and denigration associated with fluctuating league tables. Giving an alternative perspective, Dr Kevin Downing, will cite the benefits that can be derived from a University’s world-class standing and success, as reflected in these ranking exercises.

Phil Baty, Deputy Editor of Times Higher Education, spoke passionately in defence of rankings. Whilst acknowledging that they are rather crude, have many faults, can never be truly objective and do not reflect the diversity of higher education across the globe, he argued that they are here to stay. Phil outlined the rationale for the shift from QS to Thomson Reuters for its data provision and the ways in which he believed THE had behaved responsibly in relation to rankings. It was a spirited defence which included the now customary declaration “I am a ranker and I am proud!”. Fuller details of Phil’s comments were published in a THE article (and he really does need some new puns).

Prof Dzulkifli Abdul Razak, Vice-Chancellor of the Universiti Sains Malaysia, also commented on the many faults of league tables, noting the problems with constructing the concept of quality, the risk of ignoring the complexity of the picture and the fact that rankings generally fail to take a holistic view of education. Moreover, they can lead to distortion of institutional priorities, fail to reflect the intangible benefits of HE and can leave the impression of a linear relationship between the data and the rankings.

John Molony, Vice-President, Strategic Planning and Marketing, Quacquarelli Symonds, joined Phil Baty (almost) in defending rankings, arguing that QS was already fulfilling a useful role with its focus on students, particularly those with a propensity to be mobile. Students want and need rankings, he argued, and, when they work and are used properly, they do provide helpful information. He argued that there will be 7m mobile students by 2020, all of whom would be making a massive investment and would need proper information to inform their decision-making. Nevertheless, rankings do need to be handled with care: they simplify and reduce, whilst being open and transparent for users. Finally, he argued that rankings require universities to be more open and can lead to innovation and new forms of evaluation.

Professor Malcolm Grant, President and Provost, University College London, sought to demolish league tables and succeeded, at least partially, identifying a number of major “fracture points” including:

  • failure to cope with the diversity of the system and to address atypical but excellent institutions such as the LSE
  • the difficulty of picking the indicators that make a university world class – many are intangible, and indicators can’t necessarily reflect the real values of an institution
  • we can’t measure many things directly and therefore have to use proxies
  • international league tables do have lots of data, but it can be distorting and misleading
  • comparisons compound the problem and can be of limited significance when higher education is so varied
  • there is a problem with the weighting of indicators and the preconceptions of what a university is that this implies.

Damning stuff. He added that we needed to retain academic rigour and should not abandon scepticism when dealing with rankings. We should not sleepwalk into accepting a commercial version of higher education.

Dr Kevin Downing, Senior Co-ordinator (Academic Planning and Quality Assurance), City University of Hong Kong, shared many of the reservations expressed by others, noting also that none of the tables took community roles into account, nor did teaching enjoy proper coverage. Arguments in favour of rankings did exist, including that they were better than the alternatives, such as simple subjective judgement. Pragmatically, rankings are inevitable and we need to get used to them.

The THE report is here, and this session, along with many others from the conference, can be seen on the Policy Review TV site.

Age matters

For university rankings, at least

The QS Intelligence Unit has a diverting posting on the influence of age on university “performance”.

As this post puts it, the world is changing fast, and HE is part of it:

In Saudi Arabia there are 28 universities, 22 of which were founded after the turn of the millennium. Economies worldwide are turning to the ever enticing notion of creating a “knowledge economy”. I read somewhere that we have generated more written content since 2003 than in the whole of human history until that point.

In that environment – whilst rankings such as ours may treat all institutions equally – the reality is that date of establishment clearly has a part to play in the current success profile of universities. In broad terms, universities over 100 years old, and perhaps those over 50, have already reached their “terminal velocity” – the combination of reputation, government funding, scale of operation, organisational culture, international mix and alumni profile have reached a degree of equilibrium which makes radical shifts in performance – as measured by rankings or otherwise – exceedingly difficult to impose.

It is undoubtedly the case that radical changes in rankings are going to be difficult to achieve, particularly in the international QS table. But should the tables adjust for age to allow rapidly improved performance to outweigh historical achievement? Should longevity be discounted to enable us to compare Al-Jouf University with Oxford? I’m not certain what that would prove. Age does matter.

One alternative approach might be this:

We have begun some work on developing an adjustment algorithm for our rankings tables which can potentially help identify universities that are ahead of where we might expect them to be for their given age.

It is still not quite clear how this would look, but it is an interesting idea nevertheless.
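
Purely as a thought experiment, and emphatically not QS’s actual algorithm, one simple way to flag universities that are “ahead of where we might expect them to be for their given age” would be to regress ranking score on institutional age and look at the residuals: anything sitting well above the fitted line is outperforming its age. All names and numbers below are hypothetical.

    from statistics import mean

    def ahead_of_expectation(institutions):
        """institutions: list of (name, age_in_years, overall_score) tuples.
        Fits a least-squares line of score against age and returns
        (name, residual) pairs, largest positive residual first."""
        ages = [age for _, age, _ in institutions]
        scores = [score for _, _, score in institutions]
        a_bar, s_bar = mean(ages), mean(scores)
        slope = (sum((a - a_bar) * (s - s_bar) for a, s in zip(ages, scores))
                 / sum((a - a_bar) ** 2 for a in ages))
        intercept = s_bar - slope * a_bar
        residuals = [(name, score - (intercept + slope * age))
                     for name, age, score in institutions]
        return sorted(residuals, key=lambda r: r[1], reverse=True)

    # Hypothetical data: (name, age in years, overall score out of 100).
    sample = [("Old U", 800, 92), ("Mid U", 150, 70), ("New U", 15, 55)]
    print(ahead_of_expectation(sample))  # "Mid U" sits furthest above the line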

Another World Ranking: High Impact Universities

High Impact Universities: “it’s all about research impact”

Following the rash of recent world league table publications, here is one based primarily on research. It ranks universities by their Research Performance Index (RPI). The table has been developed at the University of Western Australia and can be found here.

The Top 20 is:

1 Harvard University
2 Stanford University
3 MIT
4 University of California, Los Angeles
5 University of California, Berkeley
6 University of Michigan
7 University of Washington
8 University of Pennsylvania
9 Johns Hopkins University
10 University of California, San Diego
11 Columbia University
12 University of Minnesota
13 University of Cambridge
14 University of Toronto
15 University of Chicago
16 Cornell University
17 University of Oxford
18 University of Wisconsin, Madison
19 Yale University
20 Pennsylvania State University

The methodology is based on a “simple process” which delivers an RPI for each broad subject area/faculty:

Step 1. calculate the g-index (a numerical measure of the quality and consistency of publication or research output) for each faculty of the particular university
Step 2. divide or normalize the g-index for each faculty by that of the highest globally performing faculty
Step 3. average or sum the normalized faculty indices to arrive at a final RPI value for a particular university
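
A minimal sketch of these three steps in Python, assuming per-paper citation counts are available for each faculty and using the standard g-index definition (the largest g such that the g most-cited papers have at least g² citations between them). The faculty names, citation counts and “global best” values below are purely illustrative, not data from the RPI table.

    def g_index(citations):
        """Step 1: g-index for a list of per-paper citation counts."""
        counts = sorted(citations, reverse=True)
        total, g = 0, 0
        for i, c in enumerate(counts, start=1):
            total += c
            if total >= i * i:
                g = i
        return g

    def rpi(faculty_citations, global_best):
        """Steps 2 and 3: normalize each faculty g-index by that of the
        best-performing faculty globally, then average the normalized values."""
        normalized = [
            g_index(cites) / global_best[faculty]
            for faculty, cites in faculty_citations.items()
        ]
        return sum(normalized) / len(normalized)

    # Illustrative data: per-paper citations for two faculties of one university,
    # and a hypothetical best global g-index for each faculty.
    faculties = {"science": [120, 80, 45, 30, 12, 5], "arts": [40, 22, 10, 4]}
    best_in_world = {"science": 150, "arts": 60}
    print(round(rpi(faculties, best_in_world), 3))  # 0.053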

Comparisons with the recent Times Higher Education and QS tables show some major similarities (they are all US-dominated) but also some marked differences, particularly for Cambridge, Oxford and Yale.

QS World University Rankings Results 2010

The QS rankings have been published and join a crowded marketplace of world university league tables. There is much excitement, no doubt, about Cambridge and Harvard swapping places, but it is not clear what the real reasons for this might be. The remainder of the top 20 is roughly the same:

Rank 2010 (2009 in brackets)

1 (2) University of Cambridge

2 (1) Harvard University

3 (3) Yale University

4 (4) UCL

5 (9) Massachusetts Institute of Technology

6 (5=) University of Oxford

7 (5=) Imperial College London

8 (7) University of Chicago

9 (10) California Institute of Technology

10 (8) Princeton University

11 (11) Columbia University

12 (12) University of Pennsylvania

13 (16) Stanford University

14 (14) Duke University

15 (19) University of Michigan

16 (15) Cornell University

17 (13) Johns Hopkins University

18 (20=) ETH Zurich

19 (18) McGill University

20 (17) Australian National University

Full details at QS World University Rankings Results 2010 website.

New University League Table iPhone App

New League Table iPhone App

QS, compilers of world university league tables, have produced an iPhone app so rankings are never out of reach. My life is now complete.

Latest Asian University Rankings

QS have just published their latest rankings of universities in Asia. No huge surprises in the top 20, but a few ups and downs. Hong Kong, South Korean and Japanese institutions are dominant, but NUS seems to have made an impressive leap. It will be interesting to see how this plays out in the full QS world rankings.

Ranking 2010 (2009 ranking in brackets)

1 (1) University of Hong Kong, Hong Kong

2 (4) The Hong Kong University of Science and Technology, Hong Kong

3 (10) National University of Singapore (NUS), Singapore

4 (2) The Chinese University of Hong Kong, Hong Kong

5 (3) The University of Tokyo, Japan

6 (8) Seoul National University, Korea, South

7 (6) Osaka University, Japan

8 (5) Kyoto University, Japan

9 (13) Tohoku University, Japan

10 (12) Nagoya University, Japan

11 (9) Tokyo Institute of Technology, Japan

12 (10) Peking University, China

13 (7) KAIST – Korea Advanced Institute of Science and Technology, South Korea

14 (17) Pohang University of Science And Technology (POSTECH), South Korea

15 (18) City University of Hong Kong, Hong Kong

16 (15) Tsinghua University, China

17 (15) Kyushu University, Japan

18 (14) Nanyang Technological University (NTU), Singapore

19 (25) Yonsei University, South Korea

20 (19) University of Tsukuba, Japan

Full Asian university rankings available here.

On the QS World University Rankings Methodology

QS defends its ranking methodology

Following the split from THE, and a recent critique by that publication of its approach, QS has been setting out a robust defence of its methodology:

Following the end in 2009 of a six-year collaboration between the two organisations, Times Higher Education has launched a campaign of criticism of the QS World University Rankings. QS owns all the intellectual property of the World University Rankings results and methodology. It seems that THE believes the only way to legitimise producing its own new rankings is to pretend dissatisfaction with QS. Martin Ince was Rankings Editor at THE for six years, overseeing work with QS, and was previously deputy editor of THE. He says “I can honestly say that the rankings produced by QS, were regarded as an outstanding piece of work….accurate, insightful and absolutely fit for purpose.”

And they also find a number of supporters:

Professor Alan M. Kantrow, Former Editor of McKinsey Quarterly, adds “Global competition in higher education necessitates some basis for quality comparison across borders…..QS’s Academic Peer and Employer Reviews rightly place the basis for comparison in the hands of experts – academics and employers around the world.”

There is plenty of other interesting information on the QS methodology here too. Although, as with all league tables, there remain very good grounds for scepticism about elements of the approach, there is a decent degree of honesty here about the limitations of the methodology. Worth a look.

Ranking confessions

Ranking Confessions from THE’s Deputy Editor

Some time ago THE announced a bit of a change in its approach to rankings, and its previous league table partner joined up with US News and World Report.

Inside Higher Ed carries an interesting piece from Phil Baty in which he admits to the failings in THE’s previous league table methodology:

I have a confession. The rankings of the world’s top universities that my magazine has been publishing for the past six years, and which have attracted enormous global attention, are not good enough. In fact, the surveys of reputation, which made up 40 percent of scores and which Times Higher Education until recently defended, had serious weaknesses. And it’s clear that our research measures favored the sciences over the humanities. We always knew that rankings had their limitations. No ranking can be definitive. No list of the strongest universities can capture all the intangible, life-changing and paradigm-shifting work that universities undertake. In fact, no ranking can even fully capture some of the basics of university activity – there are no globally comparable measures of teaching quality, for example.

There’s lots of interesting critique in here, including the particular problem of the ‘reputation survey’ element of the QS ranking, which had a very high weighting but a very small number of participants:

The reputation survey carried out by our former ranking partner attracted only a tiny number of respondents. In 2009, about 3,500 people provided their responses – a fraction of the many millions of scholars throughout the world. The sample was simply too small, and the weighting too high. So we’ve started again.
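
To illustrate the arithmetic, here is a toy composite-score calculation. The 40 per cent reputation weighting comes from the article above; lumping everything else into a single 60 per cent “other” component is my own simplification, not the published breakdown of the remaining indicators.

    def composite_score(reputation, other, reputation_weight=0.40):
        """Weighted combination of a reputation-survey score and the remaining
        indicators, both on a 0-100 scale."""
        return reputation_weight * reputation + (1 - reputation_weight) * other

    # With only ~3,500 respondents, a handful of votes can move the reputation
    # score by several points, and 40 per cent of that swing feeds straight
    # into the overall score.
    print(composite_score(reputation=70, other=80))  # 76.0
    print(composite_score(reputation=75, other=80))  # 78.0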

Many of the problems identified with the QS method will be challenging to address. It will be interesting to see how the new THE table pans out, and whether its previous ranking partner prospers with a new publisher.

2007 World Rankings: UK improvements

The detail of the QS-THES table is due to be available online from 9 November, according to THES.

While we wait, the QS site will let you download the table.

And the BBC carries the story too.

19 UK universities appear in the top 100:

    2= Cambridge
    3= Oxford
    5 Imperial College
    9 University College London
    23 Edinburgh
    24 King’s London
    30 Manchester
    37 Bristol
    57 Warwick
    59 London School of Economics
    65 Birmingham
    68 Sheffield
    70 Nottingham
    74 York
    76 St Andrews
    80= Leeds
    80= Southampton
    83 Glasgow
    99 Cardiff

Almost all of these universities improved on last year’s position. So, either UK universities are getting a whole lot better against the competition, or there is an outside chance that the table has developed in our favour.