Lord Mandelson keynote speech at Lord Dearing Memorial Conference

Lord Mandelson delivered the keynote address at the Lord Dearing Memorial Conference held at the University of Nottingham in February 2010.

Lord Mandelson commented on Lord Dearing’s contribution to higher education:

Lord Dearing was very clear that our higher education system was central to what made our society intellectually curious and critical, what made it socially just and humane. It is the place where we define and redefine our sense of ourselves and the forces that shape us.

The main thrust of his speech, though, concerned the consequences of the cuts in HE funding he had recently announced. In essence, he was uncompromising in presenting the reductions as a necessary contribution to wider public finance savings and as an opportunity for universities to reconsider their spending, one that would help to “focus minds” on the need to seek out new sources of funding (he also commented specifically on the University of Nottingham):

Universities have been able to leverage a steep rise in non-state funding. They have widened their sources of income by exporting their teaching brands, opening their doors to fee-paying international students. Higher education is now a major export industry for the UK and a key comparative advantage – some £5.3 billion in exports in 2008. Nottingham has done this very well. The best university systems in the world are defined by a wide range of public and private funding and British universities need the same diversity. I recognise that sources of additional business income are not limitless and can be irregular, especially during a downturn. But even a small expansion in this work would go a long way in closing the gap created by a period of fiscal constraint.

But a large part of the speech was dedicated to discussing the extension of part-time study and two-year intensive degrees, arguing that these are creative ways to reduce spending:

The push to save costs can and should actually push the system in the direction of the modes of study I have been advocating. Part-time degrees, shorter and more intensive courses all offer the potential to lower student support costs, use resources more intensively and improve productivity.

Not terribly convincing. Whilst strong arguments about the need for savings can be made, the proposals around alternative modes of study are much less persuasive.

“Universities are crumbling”…

…according to a “secret database”

The Guardian claims a bit of a scoop following a Freedom of Information request on building conditions in universities:

Scores of university halls of residences and lecture theatres in the UK were judged “at serious risk of major failure or breakdown” and “unfit for purpose”, a secret database obtained after a legal battle by the Guardian reveals. Some of the most popular, high-ranking institutions, such as the London School of Economics, had 41% of their lecture theatres and classrooms deemed unsuitable for current use, while Imperial College London had 12% of its non-residential buildings branded “inoperable”. At City University, 41% of the student digs were judged unfit for purpose. Universities argue they have spent hundreds of millions in refurbishment since the judgments were made two years ago and use some of the buildings for storage purposes only.

Large amounts have been spent on capital improvements in the last few years but the backlog, following years of underfunding and neglect, was substantial. The position will undoubtedly have improved further since the survey referred to here, but it is inevitable that there will still be poor building stock around the country. So it is not clear quite where the shock lies.

The government agency that holds the information, the Higher Education Funding Council for England (Hefce), was forced to reveal it after an information tribunal ruled in the Guardian’s favour, agreeing that it was in the public’s interest for the data to be made public.

It looks, then, as if the difficulty of securing release of the information became the story here, rather than the data itself which, whilst disappointing, is not exactly surprising.

Annual review of university rankings

EUA to publish ‘annual review’ of worldwide university rankings

EUA (the European University Association) has announced that it intends to publish an annual review of world university rankings. Given the growing number of league tables and rankings, national and international, and their impact on “decision-making and activities in universities across Europe”, this seems a rather helpful initiative.

The aim of this new pilot project will be to provide universities with transparent information about international rankings by critically evaluating their methodologies, assessing potential biases and suggesting improvements. The review will also help universities to develop strategies to cope with rankings, as well as encouraging alternative approaches to enhance transparency.

The annual review – due to appear for the first time in 2011 – will include a compendium of different international ranking initiatives with a thorough critique and analysis, and will be complemented by a series of critical articles by leading experts in the field. To disseminate this information and stimulate debate on the findings, EUA also plans to organise an annual rankings seminar for university leaders across Europe.

This sounds like a timely and potentially very helpful intervention.

International Assessment of Higher Education Learning Outcomes (AHELO)

OECD is undertaking a Feasibility Study for the International Assessment of Higher Education Learning Outcomes (AHELO)

The outline of the programme, published here, is as follows:

The OECD Assessment of Higher Education Learning Outcomes (AHELO) is a ground-breaking initiative to assess learning outcomes on an international scale by creating measures that would be valid for all cultures and languages. Between ten and thirty thousand higher education students in over ten different countries will take part in a feasibility study to determine the bounds of this ambitious project, with an eye to the possible creation of a full-scale AHELO upon its completion.

The 21st Century is witnessing the rapid transformation of higher education. More students than ever before enter higher education and a growing number study abroad. The job market demands new skills and adaptability, and HEIs (“Higher Education Institutions”, which include universities, polytechnic schools and colleges) struggle to hold their own in a fiercely competitive marketplace. Ministers at the Athens Conference agreed that OECD countries needed to take a further step by making higher education not only more available but of better quality, and that current assessment methods were not fully adequate to meet these changes. An alternative had to be found. AHELO is the result.

There are, however, some real problems with this approach. Let’s look first at the elements of the study:

The factors affecting higher education are woven so tightly together that they must first be teased apart before an accurate assessment can be made. The AHELO feasibility study thus explores four complementary strands.

The four strands are:

    – generic skills
    – discipline specific strands in engineering and economics
    – learning in context: physical and organisational characteristics; education-related behaviours and practices, including “student-faculty interaction, academic challenge, emphasis on applied work”; psycho-social and cultural attributes; behavioural and attitudinal outcomes.
    – value-added (fraught with difficulty)

It all seems worthy and laudable stuff, particularly the desire to move away from the reductionism of international league tables in favour of a more rounded view focused on teaching and learning. However, the approach is fundamentally flawed in its core assumptions: that learning outcomes are assessable in a meaningful and comparable way, and indeed that this is desirable in higher education. One of the reasons that the factors affecting HE are so closely woven together is their inter-dependence and inseparability. The approach to assessing learning outcomes which seems to underpin the study has its origins in the Learning By Objectives movement of the last century in the USA: it failed there as a means of assuring standards of education and does not offer a way forward here. Seeking to make the outcomes explicit, judge the standards of those outcomes and then compare them is misguided.

This is because explicitness about such outcomes cannot, in itself, convince us that those outcomes are being achieved or that they are correct or even worthwhile. Even if we were in a position where we were able to describe the standards embodied by such outcomes satisfactorily (a questionable assumption), this could in no way be taken as assurance that such standards were being achieved or indeed that anyone fully understood what was meant by such descriptors. There is no necessary correlation between description and understanding – rather this would represent an extended and complicated version of a naming fallacy.

An extract from my book (with apologies for the self-referencing) ‘Dangerous Medicine: Problems with Assuring Standards and Quality in UK Higher Education’ (pp. 158-9) reinforces this:

Commenting on assessment in US education, Stake highlights the failure of large scale mandatory externally imposed assessment in schools in the USA to improve standards. He argues that the consequences of this assessment regime need to be more fully evaluated in order better to inform policy but the lessons for the UK are instructive. Glass, pursuing a similar theme, criticises the ‘nonchalance’ of ‘experts’ in dealing with the issue of standards, particularly in relation to those concerned, such as Mager, with the setting of behavioural objectives (from which the origins of the UK competence movement can be traced) and observes that the ‘language of performance standards is pseudoquantification, a meaningless application of numbers to a question not prepared for quantitative analysis’. He further examines the evolution of ‘criterion-referenced testing’ in which he describes the meaning of ‘criterion’ as a ‘case study in confusion and corruption of meaning’. Glass takes particular exception to the use of cut-off scores to differentiate performance where, ultimately, a decision on whether ‘to ‘pass’ 30% vs. 80% is judgmental, capricious, and essentially unexamined’, ie totally arbitrary.

It is important to note the culturally specific origins of these ideas which were developed in the United States in the 1930s; adaptations of Tyler’s approach became extremely influential there in the 1960s with the country desperately seeking technological advance and therefore open to an industrially-oriented and rational model which, in providing specified and measurable behavioural objectives, was inevitably attractive to federal and state funders. Although the circumstances are rather different in a post-millennial UK, the HE sector nevertheless appears to be moving towards adopting a new version of a 70-year-old model, a bastardised interpretation of which failed in another country 30 years ago. There are many other problems associated with the learning by objectives approach but it is worth noting Stake in his retrospective on his earlier paper ‘The Countenance of Educational Evaluation’ admitting, in slightly apologetic tone, his error in stating in the paper ‘that evaluators could improve their judgements of quality by identifying congruence between intent and outcome’. As Stake acknowledges, all this does is assist with description and understanding, and such congruence says nothing about the merit of the course, programme or the individual student’s learning evaluated. As Norris observes, objectives-based evaluation inevitably leads to an over-valuing of measurable tasks and assumes that values are relatively unimportant.

So, I would suggest that this is really the wrong approach to be taking.


Stake, R (July 1998), ‘Some Comments on Assessment in US Education’, Education Policy Analysis Archives, 6(14). http://www.olam.ed.asu/epaa
Glass, G V (1978), ‘Standards and Criteria’, Journal of Educational Measurement, 15(4), pp. 237-261.
Stake, R E (1991), ‘Retrospective on the Countenance of Educational Evaluation’, in McLaughlin, M W and Phillips, D C (eds) (1991), Evaluation and Education at Quarter Century: Nineteenth Yearbook of the National Society for the Study of Education, pp. 67-88.
Stake, R E (1967), ‘The Countenance of Educational Evaluation’, Teachers College Record, 68(7), pp. 52-69.
Norris, N (1990), Understanding Educational Evaluation, London: Kogan Page.

“Plagiarism up 700%” at University of Nottingham

Slight misinterpretation of non-comparable data

According to a shocking report in Impact, which doesn’t let facts get in the way of a sensational story, plagiarism is up 700% at the University of Nottingham:

The University of Nottingham has insisted that cheating is not skyrocketing among its students, following the emergence of figures which show a significant rise in recorded plagiarism at the institution in the last five years. According to records released under the Freedom of Information Act, 280 students were caught submitting copied work in coursework last year compared to 38 in 2004: a rise of over 700%. Another 11 were found to have cheated in exams in the 2008/09 session.

The University also reported 123 second offenders caught in the last five years. Only four of the culprits were expelled, while most were allowed to continue studying after having their marks docked.
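As an aside, even the headline arithmetic is loose: a rise from 38 cases to 280 is an increase of about 637%, not “over 700%”; the larger figure only appears if the new total is expressed as a percentage of the old one rather than as a rise. A quick sketch of the two calculations (using the FOI figures quoted above):

```python
# Recorded plagiarism cases at Nottingham, per the quoted FOI figures
cases_2004 = 38
cases_last_year = 280

# Percentage *rise*: the increase relative to the 2004 baseline
rise = (cases_last_year - cases_2004) / cases_2004 * 100
print(f"Rise: {rise:.0f}%")  # ≈ 637%

# The new total expressed *as a percentage of* the old total
ratio = cases_last_year / cases_2004 * 100
print(f"Ratio: {ratio:.0f}%")  # ≈ 737%
```

Conflating the two is a common enough slip, but it does rather undermine a headline built on the number.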

But the real issue is that the means of detecting and recording plagiarism have significantly changed and, most importantly, new software has been introduced to detect and deter plagiarism.

Citing an investment in plagiarism detection software and a change in the system for gathering plagiarism data, a spokesman for the University argued that “There is no clear evidence that plagiarism and cheating have actually increased to this extent.”

In an official statement, the University spokesman said: “In 2006 the university invested in plagiarism detection software to assist our academics. This accounts in part for a noticeable rise in cases detected and proven.

“It is impossible to attempt to extrapolate increases or draw any conclusions from the limited and non-comparable data currently available in relation to plagiarism. Direct comparison is unreliable since the information held by schools within the University from before 2006 is incomplete.”

All absolutely correct. Several of the comments which follow the article also acknowledge the University position and offer a helpful critique of the piece which, far from being a serious piece of investigative reporting, is in fact simply an account of the results of an FOI request. A bit like the recent report in the Sun.

Unofficial university promos

Contrasting unofficial university promo videos

Certainly at least one of these is not endorsed by its University…

Must admit I found it difficult to get past about 4 minutes on this one:

No such trouble with this much more entertaining effort (not entirely suitable for younger viewers in parts).

Double standards on fees?

“Middle-class students face £7,000 wallop”

Grave anxiety in the Times that middle-class students might have to pay higher contributions post-graduation:

Students from middle-class families may be denied grants and cheap loans and be charged higher tuition fees under a “double whammy” to be considered by a government review of university funding. It could add nearly £7,000 a year to the cost of university for a student from a family with an income of £50,000 a year.

The higher charges are being advocated after Lord Mandelson, the first secretary of state, announced £950m of cuts to higher education. Costs are expected to increase, whoever wins the general election. Lord Browne, the chairman of the government review, has the task of producing more money for universities without extra cost to the taxpayer and is expected to look favourably on cuts to what critics claim are middle-class subsidies.

Pure speculation of course but difficult to feel a huge amount of sympathy for this special pleading, especially in the light of another piece in the same edition of the paper which explains how much middle-class parents are prepared to stump up for extra tuition:

As many as half the children in London have received private tuition as parents become more and more desperate to win places at the best schools, new research has found. The latest edition of the Good Schools Guide has found the recession has had no apparent effect on parents’ willingness to pay between £20-£40 an hour to top up their children’s education.

The boom is being fuelled both by parents’ ambitions for children to win places at the best universities and by a glut of unemployed graduates tutoring part-time while they look for a full-time job. Tuition agencies report growth of 15%-100% last year, with popularity growing quickly in cities such as Birmingham and Manchester as well as in the traditional heartlands of London and the southeast.

Double standards?

Some good news on widening participation

“Substantial increases in entry to higher education for disadvantaged young people”

Widening Participation is working according to a new report from HEFCE:

The study, conducted by Dr Mark Corver of HEFCE’s Analytical Services Group, finds that there has been a substantial and sustained increase in the HE participation rate of young people living in the most disadvantaged neighbourhoods since the mid-2000s. The participation rate of young people living in the most disadvantaged areas has increased every year since the mid-2000s. Young people from those areas are now 30 per cent more likely to enter HE than they were five years ago. Participation rates have also increased in advantaged neighbourhoods over this period, but less rapidly.

These recent trends mean that more of the additional entrants to HE since the mid-2000s have come from disadvantaged neighbourhoods than advantaged neighbourhoods. This has reduced the participation difference between advantaged and disadvantaged neighbourhoods. The study places these changes in the context of the large differences in entry to HE that are found by where young people live. In the mid-1990s, one in eight young people from the most disadvantaged areas entered HE. That figure has increased to around one in five today but remains far lower than for the most advantaged areas, where well over half of young people now enter higher education.

So, this really does look like good news and a welcome relief to government given other recent reports highlighting the growth in inequality in Britain since 1980. However, there is still a long way to go. And the concern will be that, in seeking to make savings, universities will reduce spending on both widening participation activities and bursary schemes for those students most in need of additional financial support.