U-Multirank, the global ranking of different university activities, has a new financial and management structure that allows it to move away from total dependence on the European Union’s Erasmus+ programme.

The Spanish Santander Group and the Bertelsmann foundation from Germany have together agreed to share the cost and running of U-Multirank with the European Commission.

Erasmus+ funding will be cut by 50% over the next two years, but the EU is not abandoning ship, with Tibor Navracsics, European Commissioner for Education, Culture, Youth and Sport, saying the Commission is “committed to the success of U-Multirank and its continued independence as a transparency tool which helps prospective students make informed study choices.”

Will doubters be won over?


Whether that will be enough to win over the doubters about the value of U-Multirank is questionable.

For the European-backed ranking has never been everyone’s cup of tea, particularly in the UK higher education establishment, where the majority of universities appear to ignore the findings and decline to participate actively in data collection – which probably doesn’t help their performance outcomes.

Back in 2012, after the idea of U-Multirank was conceived, Times Higher Education’s ranking editor Phil Baty reported that the influential House of Lords European Union Committee questioned whether it would be “a waste of taxpayers’ money.”

David Willetts, the then UK universities and science minister, went further and suggested U-Multirank could be viewed as “an attempt by the European Commission to fix a set of rankings in which (European universities) do better than they appear to do in conventional rankings.”

And the cynicism is still there!

For as soon as my report appeared in The PIE News under the headline U-Multirank gets new backers as EU cuts funding (26 October, 2017), the registrar at the University of Nottingham, Paul Greatrix, tweeted via @registrarism: “is there any evidence anyone uses this v expensive ranking?”

Fortunately for the consortium running U-Multirank, the leading German philanthropic foundation Bertelsmann Stiftung and the giant Santander banking group seem to think so: they have joined the board and pledged €1 million each over the next two years.

Useful message about political independence


U-Multirank co-founder, Dr Frank Ziegele, says: “We don’t expect any major changes in our approach to the rankings, but it sends out a useful message about our political independence.

“Right from the start we did face criticism, especially from the UK, that U-Multirank was too close to the European Commission and the European political project.”

So what is U-Multirank?

U-Multirank claims to be the largest global university ranking and showcases nearly 1,500 universities from 99 countries.

It was launched with Erasmus+ programme funding in 2013 to provide an alternative approach to comparing university performance and reckons it is more student-orientated than traditional league tables.

Importantly, U-Multirank allows users to choose ‘readymade’ rankings or create their own, covering what interests them – everything from studying abroad to knowledge transfer and co-publications with industry.

According to Frank Ziegele: “The new partners particularly like the idea of an information tool for students to support international mobility.”

To my mind, many of the critics find it hard to understand that U-Multirank is not really looking at the same things as traditional rankers like the Times Higher, QS or ARWU (Academic Ranking of World Universities).

Rather than giving an overall score and a list of the top 200 or 500 ‘best’ universities in the world, U-Multirank attempts to compare ‘like with like’ and grades universities from ‘A’ (‘very good’) to ‘E’ (‘weak’) across different areas of activity.

French institutions, for example, tend to do very well for internationalisation – which includes inward and outward mobility in U-Multirank.

Arbitrary weightings in traditional rankings

Rankings critic Simon Marginson, director of the Centre for Global Higher Education at the UCL Institute of Education in London, touched on some of the benefits of the U-Multirank (and Leiden and Scimago) approach – and weaknesses in the mainstream rankers – in a recent talk, which University World News covered under the headline ‘Overall outcomes of university rankings are “junk”’.

Marginson was particularly critical of ‘multi-indicator rankings’ and the use of arbitrary weightings to give an overall score.


He suggested the specific data is often buried in a single overall score and such league tables lack transparency.
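To see concretely what Marginson is getting at, here is a minimal sketch – with entirely invented universities and indicator scores, nothing to do with any real ranking’s data – showing how two equally arbitrary weighting schemes applied to the same indicators produce two different ‘overall’ orderings.

```python
# Hypothetical, normalised indicator scores (0-100) for three invented universities.
scores = {
    "University A": {"research": 90, "teaching": 60},
    "University B": {"research": 70, "teaching": 85},
    "University C": {"research": 80, "teaching": 75},
}

def overall_ranking(weights):
    """Collapse the indicators into a weighted sum, then rank highest first."""
    totals = {
        name: sum(weights[indicator] * value for indicator, value in indicators.items())
        for name, indicators in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# Two equally 'arbitrary' weighting schemes, same underlying data, different league table.
print(overall_ranking({"research": 0.7, "teaching": 0.3}))
# ['University A', 'University C', 'University B']
print(overall_ranking({"research": 0.3, "teaching": 0.7}))
# ['University B', 'University C', 'University A']
```

Once the indicators are folded into one number, the weighting choice quietly decides the order – and the detail that produced it is hidden from the reader, which is precisely the transparency problem Marginson describes.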

He also suggested there was often an agenda behind the mainstream global rankings.

The ARWU Shanghai rankings were launched in 2003 as a benchmarking exercise to demonstrate the gap in science between China and America, and they focus solely on research.

The Times Higher Education rankings, launched in 2004, set out to differentiate themselves from the ARWU and to position British universities well for the global student market, said Marginson – with the result this year that Oxford and Cambridge are top of the pile, “though outside the UK almost everyone thinks Harvard is number one.”

I’m not sure about Marginson’s suggestion that the mainstream global league tables produce junk. They certainly offer a means of benchmarking institutions as a whole – but their focus on research makes them less useful for first-degree students trying to find a good university in a different country in their chosen discipline.

Judging apples and pears for the best fruit

Traditional rankers often seem to be judging the quality of apples and pears to find the best fruit.

But global university rankings are much admired by governments around the world, from Russia and China to the Anglophone countries. Many nations strive to place five – or whatever number – of their own universities in the Top 100, and are spending fortunes on various ‘excellence’-type initiatives for their ‘best’ universities.

But there is also a place for the U-Multirank subject-by-subject and activity-by-activity approach. Perhaps it is more useful for mobile students and staff, as it allows very specific indicators to be compared – just the kinds of things that would-be students, or staff looking for partner universities, might be interested in!

I also like the way U-Multirank finds hidden gems in universities that wouldn’t normally make the overall rankings.

But I wish its clunky website was updated and made more user-friendly. Hopefully, the new partners can share their expertise here and get that sorted.

For more about the new U-Multirank partners, see my story in The PIE News.

And for the University World News story ‘Overall outcomes of university rankings are “junk”’, see here.