Capital spend spend spend

Changing patterns of capital spending in universities

 

HESA recently released details of HEIs’ capital spend in 2012-13 showing the total spend on buildings and equipment and the sources of the funds used:

[Chart: HEI capital expenditure by source of funds, 2012-13]

Times Higher Education has a brief piece on this and notes that, unsurprisingly, as external funding for capital expenditure has declined, universities have made up the shortfall from their own funds:

The proportion of capital investment that universities financed using internal funds has leaped 20 per cent over the past four years, according to data released by the Higher Education Statistics Agency.

UK universities spent nearly £2 billion from internal sources for capital projects in 2012-13, up from £1.5 billion in 2008-09.

The Finances of Higher Education Institutions 2012-13 report states that during the past academic year, universities’ capital expenditure was nearly £3.1 billion, 64 per cent of which was provided by internal sources. This compares with a total of almost £3.5 billion four years ago, of which 43 per cent was funded by internal sources.

Expenditure funded by loans remained relatively stable, at £408 million in 2008-09 and £326 million in 2012-13, according to the data published earlier this month. Meanwhile, capital projects financed by funding body grants fell by about half over the four years, from £765 million to £359 million.

So, despite the decline in funding body contributions, universities’ own spending on buildings and equipment has increased significantly, even though total capital expenditure has declined from its 2008-09 level. Will universities’ capital spend continue to grow despite reduced public funding? We can expect so, given the greater competition between institutions, the “arms race” of student facilities development and the need to invest ever more to support leading-edge research.
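As a quick back-of-envelope check (a rough sketch in Python, using the rounded totals and percentages as quoted rather than the underlying HESA data), the internally funded amounts follow directly from the totals and shares, and the “leap of 20 per cent” is really a rise of about 20 percentage points in the internal share:

```python
# Rough check of the capital spending figures quoted above (rounded values).
total_2008_09 = 3.5  # total capital expenditure 2008-09, in £bn (approx.)
total_2012_13 = 3.1  # total capital expenditure 2012-13, in £bn (approx.)

internal_share_2008_09 = 0.43  # proportion funded from internal sources
internal_share_2012_13 = 0.64

internal_2008_09 = total_2008_09 * internal_share_2008_09  # ~£1.5bn
internal_2012_13 = total_2012_13 * internal_share_2012_13  # ~£2.0bn

print(f"Internally funded capital 2008-09: £{internal_2008_09:.2f}bn")
print(f"Internally funded capital 2012-13: £{internal_2012_13:.2f}bn")
print(f"Rise in internal share: "
      f"{100 * (internal_share_2012_13 - internal_share_2008_09):.0f} percentage points")
```

Running this reproduces the £1.5 billion and nearly £2 billion figures quoted in the THE piece.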


Money, Money, Money

HE Income and Expenditure 2012/13

Perhaps not the most exciting publication of the year to date, but there is nevertheless some interesting information in the new Higher Education Statistics Agency report on the income and expenditure of HE institutions.

HE Finance Plus 2012/13 shows that the total income of higher education institutions (HEIs) in 2012/13 was £29.1 billion. Funding bodies provided £7.0 billion of this income, while tuition fees and education contracts contributed £11.7 billion.

This handy chart shows the proportions of total income of UK higher education institutions by source in 2012/13:
[Chart: proportions of total income of UK HEIs by source, 2012/13]

The total increase in income over 2011/12 was 4.5%.

And then there is also this helpful summary of total expenditure.
[Chart: breakdown of total expenditure of UK HEIs, 2012/13]

Unsurprisingly, the bulk of spend (just over 55%) is on staff. Total spend has increased by 4.7% over 2011/12 and expenditure on staff has risen by 4.1%. It will be interesting to see how this overall profile of total spend changes in subsequent years.

One thing is absolutely clear from this summary: with growth in expenditure (4.7%) outstripping growth in income (4.5%), the position is unsustainable. And it’s only going to get worse in terms of teaching funding. So institutions will either have to find new ways to raise more money or reduce expenditure. The future doesn’t look very bright. It’s a rich university’s world.
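To illustrate why even a small gap between those growth rates matters, here is a toy projection. It is purely illustrative: it assumes the 4.5% and 4.7% growth rates persist, and, since the expenditure total is not quoted above, it starts expenditure level with the £29.1 billion income figure, which is an assumption rather than the actual number.

```python
# Toy projection of sector income vs expenditure (illustrative assumptions only).
income = 29.1  # £bn, 2012/13 total income from the HESA report
spend = 29.1   # £bn, assumed equal to income at the start (not the actual figure)

income_growth = 1.045  # 4.5% annual income growth, as in 2012/13
spend_growth = 1.047   # 4.7% annual expenditure growth, as in 2012/13

for year in range(1, 11):
    income *= income_growth
    spend *= spend_growth
    print(f"Year {year:2d}: income £{income:.1f}bn, "
          f"spend £{spend:.1f}bn, balance £{income - spend:+.2f}bn")
```

On these hypothetical numbers the gap opens immediately and widens every year; starting from a surplus merely delays the crossover.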

Crime Data in the USA

University fined for misreporting crime data.

The Chronicle of Higher Education has a piece about the University of Texas at Arlington being fined for improperly classifying and reporting a number of crimes which took place on its campus:

For misclassifying crimes and underreporting disciplinary actions, the U.S. Department of Education has fined the University of Texas at Arlington $82,500, a penalty the institution is appealing.

The department imposed the fine last month under the federal campus-crime reporting law known as the Clery Act, each violation of which can cost an institution $27,500. According to a review by the department in 2011, the Arlington campus had improperly classified a forcible sex offense as an assault and an aggravated assault as an assault of a family member. Both crimes occurred in 2008.


Also that year, the department found, the university excluded 27 liquor, drug, and weapons violations—classified as “disciplinary actions”—from crime statistics that by law must be submitted to federal officials and distributed publicly each year. On that count, the department imposed a third $27,500 fine.

Similar fines have just been levied against Yale:

Yale failed to report a total of four forcible sex offenses in its campus crime statistics for 2001 and 2002, according to an April 19 letter from Mary E. Gust, director of administrative actions and appeals service group at the DOE. As a result, the department is fining the university $27,500 for each offense, the letter said. The Connecticut Ivy League university also received a $27,500 fine for failing to include seven required policy statements in its annual crime reports, and another $27,500 for not including crime statistics from Yale-New Haven Hospital in its annual campus crime data.

It always surprises me that there is such a strict federal requirement on crime reporting at US universities. Given the potentially negative consequences, though, it is perhaps hardly surprising that there are occasional errors in classification. And the crimes on US campuses do tend to be significantly worse than those here, especially given the availability of guns at some institutions (as noted in this previous post).

How long before HESA start collecting this data in the UK?

International Students in the USA (and Nottingham)

Interesting data on international students in the USA (and at the University of Nottingham)

The Institute of International Education has just released its ‘Open Doors’ report on international education in the USA. The press release gives the headlines:

The 2012 Open Doors Report on International Educational Exchange, released today, finds that the number of international students at colleges and universities in the United States increased by six percent to a record high of 764,495 in the 2011/12 academic year, while U.S. students studying abroad increased by one percent. This year, international exchanges in all 50 states contributed $22.7 billion to the U.S. economy. International education creates a positive economic and social impact for communities in the United States and around the world.

Open Doors is intended to provide helpful information on international education in the US:

Open Doors, supported by a grant from the Bureau of Educational and Cultural Affairs at the U.S. Department of State, is a comprehensive information resource on international students and scholars studying or teaching at higher education institutions in the United States, and U.S. students studying abroad for academic credit at their home colleges or universities.

The report lists the leading institutions in the USA in terms of international student numbers:

TOP INSTITUTIONS HOSTING INTERNATIONAL STUDENTS, 2011/12 
Rank Institution City State Int’l Total
1 University of Southern California Los Angeles CA 9,269
2 University of Illinois – Urbana-Champaign Champaign IL 8,997
3 New York University New York NY 8,660
4 Purdue University – Main Campus West Lafayette IN 8,563
5 Columbia University New York NY 8,024
6 University of California – Los Angeles Los Angeles CA 6,703
7 Northeastern University Boston MA 6,486
8 University of Michigan – Ann Arbor Ann Arbor MI 6,382
9 Michigan State University East Lansing MI 6,209
10 Ohio State University – Main Campus Columbus OH 6,142

What is most interesting about this data for me is that if the University of Nottingham UK (i.e. not including our campuses in Malaysia and China) were to be included in this table it would be at the top, with, by our reckoning, 9,662 non-UK students enrolled in 2011/12. My guess is that Manchester and UCL would have even more than this.

Similar data for the UK can be found on the UKCISA website (which reports official HESA data) but note that the latest figures are for 2010/11. The US seems to be able to publish a little faster than we can. And of course we may find the numbers of international students in the UK declining in future as the full consequences of the Government’s immigration policies come into play.

The Imperfect University: First for the chop

The Imperfect University: Some people really don’t think much of administrators

Last year I wrote a piece for Times Higher Education on the problem with the term “back office” and the often casual, unthinking use of it in order to identify a large group of staff who play a key role in the effective running of universities but who are the first to be identified for removal or outsourcing in financially challenging times. But what do we mean by the back office?

In a university context, it is generally taken to mean those staff who are neither engaged in teaching or research nor involved in face-to-face delivery of services to students. So they might be, for example, working in IT, human resources, finance or student records. Or they might be the people who maintain the grounds, administer research grants or edit the website.
Too often, their somewhat anonymous roles mean that they are treated as third-class citizens in the university context. Because they are out of sight and largely out of mind, most people really don’t know what they do; as a consequence, it becomes much easier for others to write them off and offer them up as the first to be sacrificed when cuts have to be made. Back-office staff do not have an obvious income line and can easily be regarded as expendable. The attitude is resonant of the Victorian view of those “below stairs”. This perception (or lack of perception) is unhelpful, and not terribly good for morale – particularly among those who are so casually dismissed as being “just back office”.

Two recent reports offer a striking example of this. The first is an Ernst & Young report on the “University of the Future” which has found that the current public university model in Australia will prove unviable “in all but a few cases”.

A story in The Australian quotes the report’s author:

“There’s not a single Australian university that can survive to 2025 with its current business model,” says report author Justin Bokor, executive director in Ernst & Young’s education division.

“We’ve seen fundamental structural changes to industries including media, retail and entertainment in recent years – higher education is next.”

The study compared ratios of support staff to academic staff across a selection of 15 institutions and found that 14 out of 15 had more support staff than academic staff. Four of the 15 universities have at least 50 per cent more support staff than academic staff, and more than half have at least 20 per cent more support staff than academic staff.

The report warned that this ratio “will have to change”.

The report, which can be found here, doesn’t give any details on the definition of “support staff”. However, I would guess that it is a sum of all staff who are not academics (the definition of academics can often be unclear too). I must admit, though, that I’m not surprised that there are more support staff than academics in most institutions simply because of the sheer scale of university operations. I suspect that the variations are largely down to how staff are counted and categorised and to differences in physical and organisational structures.

Despite this definitional imprecision, the report’s author is confident in asserting that universities need to cut:

Organisations in other knowledge-based industries, such as professional services firms, typically operate with ratios of support staff to front-line staff of 0.3 to 0.5. That is, 2-3 times as many front-line staff as support staff. Universities may not reach these ratios in 10-15 years, but given the ‘hot breath’ of market forces and declining government funding, education institutions are unlikely to survive with ratios of 1.3, 1.4, 1.5 and beyond.

Leaving aside the fact that many professional staff (for example, those involved in student recruitment, careers work, counselling, financial advice, academic support, security and library operations) are unequivocally front-line, the idea that the other staff who help the institution function and who support academic staff in their teaching and research are merely unnecessary overheads, ripe for cutting back, is just not credible.
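For what it is worth, the arithmetic behind the ratios quoted in the report is straightforward. Here is a short sketch, using only the ratios as quoted rather than any underlying staff data:

```python
# Convert the quoted support:front-line ratios into "front-line staff per
# support member", which is how the report phrases the comparison.
def front_line_per_support(ratio: float) -> float:
    """Invert a support-to-front-line ratio."""
    return 1 / ratio

# Professional services firms, per the report: 0.3-0.5 support staff per front-line member
for ratio in (0.3, 0.5):
    print(f"ratio {ratio}: {front_line_per_support(ratio):.1f} front-line staff per support member")

# Universities, per the report: 1.3-1.5 support staff per academic
for ratio in (1.3, 1.4, 1.5):
    print(f"ratio {ratio}: {front_line_per_support(ratio):.2f} academics per support member")
```

The 0.3-0.5 range is where the “2-3 times as many front-line staff as support staff” comes from; the university figures of 1.3-1.5 imply fewer than one academic per support member, which is the gap the report wants closed.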

Then from the US we have another report, quoted in the Chronicle. This report, produced by a pair of economists, has identified the ideal ratio of academic staff to administrators needed for universities to run most effectively. It is 3:1, which makes the Ernst & Young proposition look decidedly half-hearted. However, as the article acknowledges, the definitional problems are far from insignificant:

The numbers are fuzzy and inconsistent because universities report their own data. Different institutions categorize jobs differently, and the ways they choose to count positions that blend teaching and administrative duties further complicate the data. When researchers talk about “administrators,” they can never be sure exactly which employees they are including. Sometimes colleges count librarians, for example, as administrators, and sometimes they do not.

“Look! If I just cross all these people out then we can employ an extra professor!”

Even in the UK, where there is fairly robust collection of staff data by HESA, definitional problems remain. As this earlier post noted there is significant scope for misinterpreting staff data and overstating the growth of the number of managers versus the number of academics working in universities.

These matters are exacerbated in the US for the reasons above and the comments below the piece give an indication of some of the major holes in the economists’ proposition. Nevertheless, the Chronicle finds some willing to support the proposal for an ideal ratio:

Some advocates of increasing the proportion of faculty at universities say they support the researchers’ goal of setting a three-to-one ratio of faculty to administrators.
Benjamin Ginsberg, a professor of political science at the Johns Hopkins University and author of The Fall of the Faculty: The Rise of the All-Administrative University and Why It Matters (Oxford University Press, 2011), has argued that universities would be better off with fewer administrators, people he calls “deanlets.”
The three-to-one ratio “makes a lot of sense,” Mr. Ginsberg said, because it would shift the staff balance in universities. “If an administrator disappeared, no one would notice for a year or two,” he said. “They would assume they were all on retreat, whereas a missing professor is noticed right away.”
Richard Vedder, director of the Center for College Affordability and Productivity and a professor of economics at Ohio University, said shifting the balance back toward faculty is key to keeping universities’ missions focused on teaching, as opposed to becoming too focused on other activities, like business development or sustainability efforts.
“We need to get back to basics,” said Mr. Vedder. The basics are “teaching and research,” he said, “and we need to incentivize leaders of the universities to get rid of anything that’s outside of that.”

Administrative staff – not unnecessary overheads

This is just ridiculous rhetoric and really we should just discount it. However, such views are, unfortunately, not that uncommon and do have to be challenged.

In order for the academic staff to deliver on their core responsibilities for teaching and research it is essential that all the services they and the university need are delivered efficiently and effectively. There is not much point in hiring a world-leading scholar if she has to do all her own photocopying, spend a day a week on the phone trying to sort out tax issues or cut the grass outside the office every month because there aren’t any other staff to do this work. These services are required and staff are needed to do this work to ensure academics are not unnecessarily distracted from their primary duties.

Although provision of such services is not in itself sufficient for institutional success, it is hugely important for creating and sustaining an environment where the best-quality teaching and research can be delivered. If a university chooses to dispense with the professional staff who deliver these services in order to pursue a mythical ratio, it might find it rather hard to hold on to those outstanding academics for very long.

Most recently there is a piece in THE reporting on the launch of the “Council for the Defence of British Universities”, which notes that:

The council’s initial 65-strong membership includes 16 peers from the House of Lords plus a number of prominent figures from outside the academy, including the broadcaster Lord Bragg of Wigton and Alan Bennett. Its manifesto calls for universities to be free to pursue research “without regard to its immediate economic benefit” and stresses “the principle of institutional autonomy”. It adds that the “function of managerial and administrative staff is to facilitate teaching and research”.

Now whilst I do of course agree that this is a fundamental part of administrators’ roles, and it is splendid that the great and the good accept that administrators exist, there is something in the tone of this comment that makes me think some might take it to mean that we should be “seen and not heard”. I do hope not.

Blots on the information landscape

Exciting new report on redesigning the higher education data and information landscape

A previous post commented on regulatory issues and the work being undertaken on the “information landscape”. A report on part of this work, the imaginatively entitled “Project B”, has recently been agreed by the Interim Regulatory Partnership Group. The report sets out a new way forward for the governance of HE data:

The project report was presented to the Interim Regulatory Partnership Group (IRPG) at its meeting on 15 June. The report envisages a new, collective approach to the governance of the data and information landscape in HE, which could be achieved in the medium term.

IRPG accepted the recommendations of the report and agreed that this work should be scoped as a part of the broader programme of activities being taken forward by the Group.

Part of the evidence considered by the Group was a survey which aimed to establish the totality of external reporting undertaken by HEIs in the UK. The survey identified 550 (550!) separate external reporting requirements and grouped them into seven main categories.

The main recommendation contained in the report is that the key players should get together and agree what data and information is required:

To achieve this IRPG should task some of the key stakeholders in information flows (e.g. HESA, QAA, SLC, UCAS, AoC, Guild HE and UUK) to develop and propose the structure, resourcing and operation of a governance model for the data and information landscape.

This would enable a programme of work, using shared expertise, to create a more coherent set of arrangements for the collection, sharing and dissemination of data. These arrangements would include the identification, development and adoption of data and information standards and the review and scrutiny of data requests.

In order to fulfil this role there would need to be a series of enabling projects, including:

  • Develop a calendar and inventory of data collections across the year as a first step towards streamlining collections and improving the timeliness of information
  • Develop a data model, lexicon and thesaurus for the sector – this would be a purely administrative/reporting model that does not seek to impinge on academic practice or to impact the way business processes are carried out. It may be that this would be a series of linked models using a consistent approach and a common data language.

The establishment of this collective oversight of the information landscape would require each of the organisations involved to make a real commitment to work collaboratively and openly on issues involving data and information.

Whilst these steps will be important, they do seem relatively modest aims in the light of the sheer scale of the regulatory burden identified by the Group. It remains to be seen how much benefit will result from the establishment of a “coherent set of arrangements” in the medium term. Let’s hope it leads to some real reductions in the data and information demands placed on universities.

HESA Performance Indicators: More interesting than you’d think

HESA Performance Indicators: Summary tables and charts

HESA, the Higher Education Statistics Agency, has recently published its annual set of UK Higher Education Performance Indicators. See the HESA site for summary tables and charts.

One particular piece of data seems to have attracted attention – Non-continuation rates of full-time entrants after first year at institution. The details are noted here:

Table series T3 provides an indicator showing the proportion of entrants who do not continue in higher education beyond their first year. Table T3a provides this indicator separately for young and mature full-time first degree entrants to higher education.

In general, a higher proportion of mature entrants than young entrants do not continue in higher education after their first year. For full-time first degree entrants in 2009/10, the UK non-continuation rate was 13.3% for mature entrants compared with 7.2% for young entrants (sourced from table T3a). The non-continuation rate for young entrants was 10% or less at approximately 75% of institutions. For mature entrants it was between 2% and 20% at the majority of institutions.

and the summary chart for young entrants looks like this:

While there has been a small rise for young students, it doesn’t look significant, and the higher rate for mature students is unsurprising. Whether it will change in future remains to be seen: the new funding regime could be predicted to have an impact either way on drop-out rates.

Although the calculations are different, it is interesting to compare the HESA data with the reports prepared by the Chronicle of Higher Education on Graduation Rates and Data for 3,800 Colleges in the USA. They clearly have a problem with “missing” students.

There is plenty of other interesting data in the HESA tables. Well worth a look.

Risk of managers swamping universities?

Some seem to think that management numbers are growing too fast

HESA, the Higher Education Statistics Agency, has recently published its annual summary of staff numbers in higher education. The headline data follows:

Academic staff

Of the 181,185 academic staff employed at UK HEIs, 44.2% were female, 12.4% were from an ethnic minority and nearly a quarter (24.8%) were of non-UK nationality.

17,465 academic staff had contracts conferring the title of ‘Professor’. Of these 19.8% were female, 7.3% were from an ethnic minority and 16.7% were of non-UK nationality.

Non-academic staff

As well as academic staff, there were a further 200,605 non-academic staff employed at HEIs in 2010/11. The majority (62.4%) of these staff were female. 10.0% of non-academic staff were from an ethnic minority and 9.3% were of non-UK nationality.

16,395 non-academic staff were coded as ‘Managers’. Of these 52.4% were female, 6.0% were from an ethnic minority and 5.9% were of non-UK nationality.

This is the definition of ‘Managers’ used by HESA:

Non-academic Managers are defined as those individuals who are responsible for the planning, direction and co-ordination of the policies and activities of enterprises or organisations, or their internal departments or sections. Senior academics who act as vice chancellors or directors/heads of schools, colleges, academic departments or research centres are coded as academic staff.

To summarise this, HESA offers a handy infographic:

On the face of it this all looks pretty innocuous, but it seems that, despite the relatively small number of managers in the sector (around 4% of the staff total, and fewer than the professoriate), the rate of growth in manager numbers has been faster than that in academic numbers. For some, according to Times Higher Education (which seems to use different data in places), this is a bit of a problem:

The percentage increase in the number of managers in higher education in recent years is more than twice that for academics, an analysis of new figures has suggested.

Data released by the Higher Education Statistics Agency reveal there were 15,795 managers in higher education in December 2010 – up by almost 40 per cent on the 11,305 employed in the 2003-04 academic year.

That was compared to the 19.2 per cent increase in academics since 2003-04. It means there is now a manager for every 9.2 academics compared with a ratio of one to 10.8 seven years earlier.

Sally Hunt, University and College Union general secretary, said: “Despite the fact that there has been a large increase in the number of students in recent years, there has been a larger increase in the number of managers than academics.

“We have raised fears about the changing nature of universities as the market in higher education continues to grow. However, institutions and government must never lose sight of universities’ key roles in teaching and challenging students.”

Meanwhile, statistics released by Hesa on 1 March showed staffing levels at universities fell by 1.5 per cent last year.

The figures showed there were 381,790 people working at UK higher education institutions in 2010-11, down by 5,640 from 2009-10.

These numbers, though, really are not large: manager numbers have grown by just under 4,500 at a time when academic numbers have grown by over 16,000 (which makes Sally Hunt’s point factually incorrect).
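A quick check of the THE figures quoted above (a sketch using only the numbers as reported, which are rounded and appear to differ slightly from the HESA headline counts):

```python
# Check the growth rates and ratios quoted in the Times Higher Education piece.
managers_2003_04 = 11_305
managers_2010_11 = 15_795

manager_growth = managers_2010_11 - managers_2003_04          # "just under 4,500"
manager_growth_pct = 100 * manager_growth / managers_2003_04  # "almost 40 per cent"

# Academic numbers implied by the quoted manager:academic ratios
academics_2003_04 = managers_2003_04 * 10.8
academics_2010_11 = managers_2010_11 * 9.2
academic_growth_pct = 100 * (academics_2010_11 / academics_2003_04 - 1)

print(f"Manager growth: {manager_growth:,} ({manager_growth_pct:.1f}%)")
print(f"Implied academics: {academics_2003_04:,.0f} -> {academics_2010_11:,.0f} "
      f"({academic_growth_pct:.1f}%)")
```

The percentages match the article, and the absolute numbers make the point being made here: the larger percentage rise for managers comes off a much smaller base than the rise in academic numbers.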

The UCU comment suggests it is taking its lead from David Willetts, who made a similar point in a speech to a UUK conference back on 9 September 2010:

There are other ways of cutting overhead costs. In 2009 the number of senior university managers rose by 6% to 14,250, while the number of university professors fell by 4% to 15,530. On that trend the number of senior managers could have overtaken the number of professors this year. I recognise that universities now are big, complex institutions with revenues from many sources which need to be professionally managed. But we owe it to the taxpayer and the student to hold down these costs – we are now in a different and much more austere world. Again, we are not going to shirk our share of responsibility for tackling this. We will do away with unnecessary burdens upon you that require the recruitment of more administrators. Do tell me – and HEFCE, of course – of any information requirement or regulation which you believe comes at a disproportionate cost. They have to go: we cannot afford them.

So this is the moment to be thinking even more creatively about cost cutting. I congratulate you on your initiative in inviting Ian Diamond to chair a UUK group on efficiency savings. You are right to get to grips with this. We can work with you on this agenda without getting sucked in to micromanaging our universities. No returning to a time – a century ago, actually – when one vice chancellor reacted to a Board of Education demand for figures on staff teaching hours by complaining that “Nothing so ungentlemanly has been done by the Government since they actually insisted on knowing what time Foreign Office clerks arrive at Whitehall.”

As noted in a recent post, these claims about reducing regulation ring rather hollow and, given that government demands on universities have increased rather than declined, this does perhaps provide one explanation for the growth.

How significant is all this, though? While the staff group ‘managers’ has grown faster than academic professionals at all universities and at Russell Group universities (but not at Nottingham, as it happens), this is a small category of staff representing only 7-8% of all non-academic staff. The definitions of the various staff groups provided by HESA do allow some judgement in the allocation of staff to the various groups, and there is some evidence of differing practice at different institutions. However, the definition of academic professional is straightforward and unambiguous, and it is clear that at Nottingham such staff have grown considerably more than non-academic staff since 2003-04.

Universities need managers to function effectively. They are key to enabling academic staff to focus on delivering excellent research and first-class teaching and to protecting academics from the worst regulatory excesses of government. So this modest growth is really nothing to get excited about.

Junk food for league tables

Perverse incentives for misreporting the data


Inside Higher Ed carries a cautionary tale about a US College Provost who misreported data to external bodies over a number of years to enhance the College’s profile:

Iona College acknowledged Tuesday that its former provost had, for nearly a decade, manipulated and misreported student-related data to government officials, accrediting bodies, bond rating agencies, and others.

“I do think there probably is a pattern” in the case at Iona and other recent incidents involving law schools at the University of Illinois and Villanova University, Clemson University’s reporting to U.S. News and World Report, and even the grade-changing scandal in the Atlanta public schools, said Jane Robbins, senior lecturer in innovation, entrepreneurship, and institutional leadership at the University of Arizona.

While making clear that she did not in any way excuse the “egregious” individual behavior on display at Iona, Robbins said the situation reflects the intense pressure and “perverse incentives” in an “intensely competitive system” in which colleges are often deemed worthy or excellent based on standardized test scores and the giving rates of their alumni.

“It’s the kind of thing that if everybody was audited, we might see a lot more of it,” said Robbins.

Iona officials sought no refuge in an “everybody does it” argument. In an interview Tuesday, Joseph E. Nyre, who became the Roman Catholic college’s president on July 1 and heard within weeks from employees at the college who suspected problems with institutionally provided data, attributed the wrongdoing there to “the actions of a person that, because we didn’t have a proper system of verification, were allowed to go undetected.”

As the table below from Inside Higher Ed shows, there were some significant differences between reported data and reality, particularly in relation to overstated SAT scores, SSRs, completion rates and alumni giving.

Could it happen in the UK? Whilst some creative reporting was feasible in the early days of league tables (I have heard of at least one institution which adopted a somewhat over-optimistic view when much of the data was self-certified), the rigorous data collection standards of HESA, HEFCE, DLHE etc. make such wilful misreporting extremely difficult these days. The system, though, remains intensely competitive, and the perverse incentives undoubtedly exist in relation to data which contribute to league table positioning.

Universities ‘scared of private sector’

Oh no we’re not

Some festive cheer from politics.co.uk.

The analysis here somewhat overstates the case, though:

Massive efficiency savings which could drive down costs in higher education are only possible if university managers get over their suspicion of the private sector, Policy Exchange has claimed. A report by the centre-right thinktank’s Alex Massey published today argues that significant benefits are possible from “productive collaborative arrangements”.

Up to 30% of the total cost of university administration could be saved if more services were shared, the report claims. Across the total higher education sector this amounts to £2.7 billion.

There really is nothing much new in this report from Policy Exchange, the full text of which can be found here.

Four brief points to note:

  1. Universities really do need the VAT changes we have argued for over many years in order to create real incentives for sharing services (the report endorses this).
  2. Simply outsourcing lots of services does not necessarily deliver a better service for students or guarantee savings: it works in certain areas in certain contexts at certain times but is no panacea.
  3. The report rightly acknowledges significant examples of sector-wide shared services which already exist, including UCAS, but what about JANET and jobs.ac.uk? I’m not sure I would really argue that QAA is a shared service in the same sense, although there is a case for HESA.
  4. The savings figures quoted here are just fantasy.

So, overall a modest contribution to the very real challenges facing universities. Yes, we should collaborate more on services but only where it will both deliver savings and improve the quality of the service we provide. But the idea that universities are ‘scared’ of the private sector is very wide of the mark.

Lots of Students in Higher Education

Latest HESA data: Lots of students in Higher Education Institutions 2008/09

The newly released Students in Higher Education Institutions 2008/09 publication from the Higher Education Statistics Agency shows that there were 2,396,050 students in higher education in the UK in 2008/09. Of these 2,027,085 (84.6%) were UK domicile students, 117,660 (4.9%) were from other EU member countries and 251,310 (10.5%) were from non-EU countries.

There are some interesting headlines in here. International (i.e. non-EU) student numbers have grown by 9.4% over the previous year, outstripping the growth in home student numbers, which increased by 3.2%.

Students from China and India accounted for nearly one third of all non-EU domicile students at UK HE institutions in 2008/09. The table below shows the growth in numbers of students from the top ten non-EU countries of domicile from 2007/08 to 2008/09:

Further details available via HESA.

Government needs to help league table compilers

The IUSS Committee’s recent report on students and universities is a most extraordinary document in all sorts of ways. One of the more entertaining propositions relates to university league tables, where the Committee accepts the existence of league tables (wisely, you might argue) and acknowledges the work that HEFCE has recently published. However, its take on such tables is somewhat different from that of many others, in that it suggests that as much data as possible should be published in a way which facilitates the creation of league tables:

In our view, it is a case of acknowledging that league tables are a fact of life and we welcome the interest that HEFCE has taken in league tables and their impact on the higher education sector. We have not carried out an exhaustive examination of league tables but on the basis of the evidence we received we offer the following views, conclusions and recommendations as a contribution to the debate on league tables which HEFCE has sought to stimulate and to improve the value of the tables to, and usefulness for, students. We conclude that league tables are a permanent fixture and recommend that the Government seek to ensure that as much information is available as possible from bodies such as HEFCE and HESA, to make the data they contain meaningful, accurate and comparable. Where there are shortcomings in the material available we consider that the Government should explore filling the gap. We give two examples. First, the results from the National Student Survey are produced in a format which can be, and is, incorporated into league tables. It appears to us therefore that additional information or factors taken into account in the National Student Survey would flow through to, and assist those consulting, league tables. To assist people applying to higher education we recommend that the Government seek to expand the National Student Survey to incorporate factors which play a significant part in prospective applicants’ decisions— for example, the extent to which institutions encourage students to engage in non-curricula activities and work experience and offer careers advice. [Para 104]

Not only, therefore, is it proposed that current data be modified to make the league table compilers’ work easier, but also that they should be provided with additional information where it is lacking. Thus:

Second, Professor Driscoll from Middlesex University considered that league tables neglected “the contribution that universities that have focused on widening participation, like Middlesex, make to raising skills and educational levels in this country”. In other words, the National Student Survey as presently constituted does not assess the “value added” offered by individual institutions. We recommend that the Government produce a metric to measure higher education institutions’ contribution to widening participation, use the metric to measure the contribution made by institutions and publish the results in a form which could be incorporated into university league tables. [para 105]

League table compilers have struggled with this one for some time and will therefore appreciate such kind assistance from government.