More means worse? (Data, that is)

Lots of information is not necessarily a good thing for prospective students

I’ve written before about concerns about too much data and the importance of quality rather than just quantity in the information provided to applicants to higher education.

Now a new HEFCE report on Improving information for prospective students has come to a similar conclusion.

The report summarises existing research into decision-making behaviour and comes to some interesting conclusions:


Relevant research was identified across a wide range of disciplines, including information science, cognitive and behavioural psychology, behavioural economics and social theory. This research is likely to be relevant to how prospective students make their higher education choices.

The research draws attention to the need to examine fundamental assumptions about how people use information in decision-making.

Key findings in the report include:

  • The decision-making process is complex, personal and nuanced, involving different types of information, messengers and influences over a long time. This challenges the common assumption that people primarily make objective choices following a systematic analysis of all the information available to them at one time.
  • Greater amounts of information do not necessarily mean that people will be better informed or be able to make better decisions.


It’s a really detailed, serious and comprehensive report, and it sets out eight principles which, it proposes, should govern future information provision for prospective HE students. Let’s hope it is taken seriously and that we now take a fresh look at this important issue. Mike Hamlyn has also commented on this report and is entertainingly sceptical about its findings.

Retaining institutional knowledge


Inside Higher Ed has an interesting article on avoiding the loss of valuable institutional knowledge which occurs when employees move on.

An earlier post commented on the importance of institutional history, but this is more about the efficient retention of business-critical information. It’s a thought-provoking commentary:

There are steps organizations can take to reduce the level of institutional knowledge that they lose with the loss of skilled employees. Specialized training, documentation of processes, and job sharing are a few of the ways to combat this loss. One of the more effective methods of lessening the loss of institutional knowledge is having the older and more experienced workers serve as mentors and trainers, allowing them to pass on their knowledge to others within the organization. In order to prepare for the loss of institutional knowledge and plan for knowledge transfer, organizations must develop strategies to ensure business continuity. This is something that many organizations, I believe, are not doing enough.


In my 2012 survey, our Gen-X and millennial employees were asked a number of questions dealing with institutional knowledge. They were asked about the value of their institutional knowledge and their perception of the loss that the institution would suffer if they left. They were also asked about business process and continuity, and other skills that they had acquired while working at the institution, and what outcomes (including gains or losses) the institution would realize if they left. The results of the survey from these questions were not surprising.

The majority of both generational groups believed that what they had learned at the institution was very important and had value. Furthermore, they maintained that this value, or institutional knowledge, would be a critical issue if not addressed by management. Both generational groups believed that their supervisors and managers would be hard-pressed to find replacement employees with similar skills or knowledge. The institution did not have either a tacit or explicit formal plan to transfer knowledge. Though the responses were not surprising and carried a degree of humility, they did bring up the question of what we’re doing to retain, acquire, or transfer this knowledge before these employees leave.


Additional research or studies may be necessary to really understand the importance of institutional knowledge and the methodologies by which to retain or acquire it. Aside from several articles on the subject, there’s not much published on this topic.

On a larger scale, I believe that if efforts aren’t made to address the retention of Gen-X and millennial employees, we could see a significant loss of the institutional knowledge, continuity and history that the earlier generational groups built up or made available. This knowledge may be difficult to replace. Hopefully, additional work on this subject will bring this issue to the forefront and lead to effective implementation of plans to preserve institutional knowledge.

I’m not sure that Generation X and millennial staff are the key group here, but the general point about the need to develop plans to retain institutional knowledge is well made. Part of the solution is systemic, i.e. having the document management systems and processes which encourage and require knowledge retention; the other element is cultural: everyone has to recognise the importance and value of preserving this kind of information in the long-term interest of the institution. This does, however, require a really serious strategic focus on the issue and probably a not insignificant investment of resource.

Higher Ed data – way too much information

Tackling the surfeit of data

I’ve written before here about Higher Education regulation (see for example this general commentary and this post on information provision) and the excess of information provision available to prospective students.

It’s pleasing therefore to see that HEFCE is undertaking a review of providing information about higher education. The aims of the review are set out as follows:

The review will aim to ensure that:

  • wherever possible, the different elements of the provision of information fall within a coherent framework, across UK institutions
  • we gather sound evidence to help us shape future information provision
  • the outcomes of different mechanisms suit the issues they are designed to address
  • information is usable and accessible, and that we are able to make the best use of technology to facilitate this in the future.

The review will reflect on how much this area of our work costs the public purse. It will also consider the role of a range of organisations in providing independent, contextualised, robust, comparable and usable information.


There’s plenty more where this came from

The review will look at the purpose and use of NSS results, at the Unistats site and the Key Information Set data, as well as the Destination of Leavers from Higher Education (DLHE) survey. It is also going to examine how these data are used by prospective students. If all goes well this should be an extremely valuable piece of work and will, it is to be hoped, result in a significant reduction in the quantity of data collected and published (and in the bureaucratic burden on universities), in favour of an improvement in the quality of information available to applicants.

A long way to go but let’s hope that the group overseeing the work, the Higher Education Public Information Steering Group (HEPISG, from which acronym I’m afraid I still derive puerile amusement) will do its job well and we will see some real change in this area.

Broadcasting university performance

Very public reports on institutional performance.

Accessible university performance data.


Rather impressed by this Performance Tracker which is concerned with reporting in a very accessible way on the progress of Michigan’s public universities:


The achievement of Michigan’s public universities is a critical factor as we look to participate in the knowledge economy of tomorrow. A well-educated, skilled talent base will help our state develop and attract new business opportunities. Universities also drive research and development, bring thousands of new faces into our state, and build lasting partnerships that advance our communities.

These goals matter to all of us, and so does the performance of Michigan’s higher education system. This website offers an overview of Michigan’s higher education achievement nationally, and shows how our universities are acting as incubators of future economic growth and change.

There is a great deal of very interesting data in here, from graduation rates to tuition fees and SSRs to salary costs. Sensibly, the benchmarking is against peer institutions. Will we see others adopting a similar approach? And might it catch on in the UK?

HESA Performance Indicators: More interesting than you’d think

HESA Performance Indicators: Summary tables and charts

HESA, the Higher Education Statistics Agency, has recently published its annual set of UK Higher Education Performance Indicators. See the HESA site for summary tables and charts.

One particular piece of data seems to have attracted attention – Non-continuation rates of full-time entrants after first year at institution. The details are noted here:

Table series T3 provides an indicator showing the proportion of entrants who do not continue in higher education beyond their first year. Table T3a provides this indicator separately for young and mature full-time first degree entrants to higher education.

In general, a higher proportion of mature entrants than young entrants do not continue in higher education after their first year. For full-time first degree entrants in 2009/10, the UK non-continuation rate was 13.3% for mature entrants compared with 7.2% for young entrants (sourced from table T3a). The non-continuation rate for young entrants was 10% or less at approximately 75% of institutions. For mature entrants it was between 2% and 20% at the majority of institutions.

and the summary chart for young entrants looks like this:

While there has been a small rise for young students, it doesn’t look significant, and the higher rate for mature students is unsurprising. Whether it will change in future remains to be seen: the new funding regime could be predicted to have an impact either way on drop-out rates.

Although the calculations are different, it is interesting to compare the HESA data with the reports prepared by the Chronicle of Higher Education on Graduation Rates and Data for 3,800 Colleges in the USA. They clearly have a problem with “missing” students.

There is plenty of other interesting data in the HESA tables. Well worth a look.

Too much data?

Will more data help prospective students?

Richard Partington, writing in THE, expresses concern about the ‘data overload’ which the Key Information Set (KIS) will deliver. He notes that the provision of information to applicants via the KIS is intended to work in a similar way to price comparison websites such as those offering car insurance. And that this, despite what Ministers might think, is not necessarily a good thing:

But what really worries me, is how the data will be “innovatively presented” by the third-party providers whom the government envisages will advise applicants. Comparing universities and courses is already really difficult. Unless students are lucky enough to be supported by excellent careers advisers, they struggle to make sense of substantially incomparable information regarding course content, teaching, learning, costs and support. The problem has arguably been exacerbated by newspaper league tables that seek to distinguish themselves by weighting data differently, or including additional delineators – sometimes of comical spuriousness. The impossibility of comparing like with like will only get worse under the new arrangements. Try, for example, comparing the fee-waiver, bursary and scholarship packages of Oxford and Cambridge. Both are, I believe, strong and broadly similar. But they look very different.

An earlier post noted similar issues around the provision of advice to students in the new system. It seems to me to be quite likely therefore that the excessive provision of detailed but not necessarily meaningfully comparable data will, as Partington suggests, baffle rather than enlighten students.

A currently pretty much empty site shows an example of what the KIS data will look like, and it’s easy to see how seductive this might be for those looking for a cheap solution to the provision of advice to prospective students.

Enlightening or baffling? We’ll have to wait and see.

Junk food for league tables

Perverse incentives for misreporting the data


Inside Higher Ed carries a cautionary tale about a US College Provost who misreported data to external bodies over a number of years to enhance the College’s profile:

Iona College acknowledged Tuesday that its former provost had, for nearly a decade, manipulated and misreported student-related data to government officials, accrediting bodies, bond rating agencies, and others.

“I do think there probably is a pattern” in the case at Iona and other recent incidents involving law schools at the University of Illinois and Villanova University, Clemson University’s reporting to U.S. News and World Report, and even the grade-changing scandal in the Atlanta public schools, said Jane Robbins, senior lecturer in innovation, entrepreneurship, and institutional leadership at the University of Arizona.

While making clear that she did not in any way excuse the “egregious” individual behavior on display at Iona, Robbins said the situation reflects the intense pressure and “perverse incentives” in an “intensely competitive system” in which colleges are often deemed worthy or excellent based on standardized test scores and the giving rates of their alumni.

“It’s the kind of thing that if everybody was audited, we might see a lot more of it,” said Robbins.

Iona officials sought no refuge in an “everybody does it” argument. In an interview Tuesday, Joseph E. Nyre, who became the Roman Catholic college’s president on July 1 and heard within weeks from employees at the college who suspected problems with institutionally provided data, attributed the wrongdoing there to “the actions of a person that, because we didn’t have a proper system of verification, were allowed to go undetected.”

As the table in the Inside Higher Ed article shows, there were some significant differences between reported data and reality, particularly in relation to overstated SAT scores, SSRs, completion rates and alumni giving.

Could it happen in the UK? Whilst some creative reporting was feasible in the early days of league tables (I have heard of at least one institution which adopted a somewhat over-optimistic view when much of the data was self-certified), the rigorous data collection standards of HESA, HEFCE, DLHE etc. make such wilful misreporting extremely difficult these days. The system, though, remains intensely competitive, and the perverse incentives undoubtedly exist in relation to data which contribute to league table positioning.

Lots of Students in Higher Education

Latest HESA data: Lots of students in Higher Education Institutions 2008/09

The newly released Students in Higher Education Institutions 2008/09 publication from the Higher Education Statistics Agency shows that there were 2,396,050 students in higher education in the UK in 2008/09. Of these, 2,027,085 (84.6%) were UK-domiciled students, 117,660 (4.9%) were from other EU member countries and 251,310 (10.5%) were from non-EU countries.
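For anyone who likes to check the arithmetic, the headline shares can be reproduced from the HESA totals in a couple of lines (the figures below are taken directly from the release quoted above):

```python
# Headline figures from HESA's Students in Higher Education Institutions 2008/09
total = 2_396_050
groups = {
    "UK domiciled": 2_027_085,
    "Other EU": 117_660,
    "Non-EU": 251_310,
}

# Each group's share of the total, as a percentage rounded to one decimal place
shares = {name: round(100 * n / total, 1) for name, n in groups.items()}
print(shares)  # {'UK domiciled': 84.6, 'Other EU': 4.9, 'Non-EU': 10.5}
```

Reassuringly, the rounded shares match the published percentages.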

There are some interesting headlines in here. International (i.e. non-EU) student numbers have grown by 9.4% over the previous year, outstripping the growth in home student numbers, which increased by 3.2%.

Students from China and India accounted for nearly one third of all non-EU domiciled students at UK HE institutions in 2008/09. The table below shows the growth in numbers of students from the top ten non-EU countries of domicile from 2007/08 to 2008/09:

Further details available via HESA.