The pandemic has rendered university league tables meaningless - but there is one crucial metric that we should all be watching.
We spoke to an anonymous higher education specialist, who believes prospective students and their advisors have little to learn from league tables right now.
They outline how the current metrics are not fit for purpose in a pandemic, but there’s one indicator we should all be paying attention to. So where can marketers find the numbers that matter, and how do they communicate them?
University marketing teams are accustomed to grabbing hold of each league table as it comes out, and turning the data into appealing stories (and attention-grabbing banners) for the student market.
But in these strange times, young people – and their parents and teachers – should rightly regard information coming from league tables with some suspicion.
The tables paint a picture of how well universities did at their old job - educating students via lectures and tutorials, in laboratories and hi-tech learning spaces, with lots of group activities and face-to-face discussion - to prepare them for the job market or further study.
The present task of universities looks rather different. They find themselves teaching students mostly via online lectures and tutorials.
They have to support young people who have not been formally taught for months, who have been through a tumultuous results process and who may well have ended up in an unexpected environment. And they must keep them safe from a virus which seems to like the same things students do – parties and pubs.
Old metrics meet the new normal
How much do the old metrics tell us about how well universities are doing in the ‘new normal’?
Not very much. There is no metric to tell us which universities are ahead of the game in online learning - though we know there is a wide discrepancy between those who committed themselves a while back to digitising and future-proofing their courses and those who didn’t.
The university tables all use the National Student Survey for their student satisfaction rates. But the students who were surveyed for the current crop of tables knew nothing about the experience students are going to face this year and next.
The tables use tariff scores based on A-level results - this year's results, based as they are on teacher assessment, will of course be wildly out of kilter with previous years' numbers.
Some use staff-student ratios - what does this mean in the context of online learning?
And then of course there is the question of graduate employability. This will be a tough criterion for universities in the foreseeable future, as they send their graduates out into an unprecedented global economic mess.
But the employment data was problematic this year too, as the table compilers faced a change in how it was collected.
It used to come from the DLHE survey (Destinations of Leavers from HE) and was collected by universities contacting their own students six months after graduation.
The new Graduate Outcomes survey uses a central agency to contact graduates 15 months after graduation. There were problems with the changeover – it was hard to track down students after such a long break, and less information was gathered.
This meant the tables were particularly volatile, with some universities unexpectedly falling or rising dozens of places. Eyebrows have to be raised when a university carries on much the same as usual but finds itself in a different bracket simply because statisticians changed the way they collected their data.
One way or another, a gap has opened up between the story told by league tables and the reality on the ground.
Drop-out rate
But one metric suddenly looks very important. And that’s the “continuation measure” or, to put it more bluntly, the drop-out rate.
Universities have already admitted they are worried that many freshers won’t make it to the end of the first term this year.
Besides the academic challenge, there will be financial pressures: part-time jobs are hard to come by, and students' families may not be able to provide the financial back-up they might have expected.
University leaders, academics and support staff will have to put an enormous effort into supporting students this term, monitoring the progress and attendance of every individual and trying to meet their emotional, learning and financial needs.
Whether they succeed or fail will be apparent in their drop-out statistics.
And my advice to marketing teams would be to take a look at these at the end of the first term. If your university has excelled, and your students want to stay on, that is a powerful and relevant story to tell.