University rankings

Controversial and powerful: university rankings under scrutiny

There are a number of worldwide university rankings, which are often used as a guide for future education and career progression. These include, among others, the Times Higher Education (THE) World University Rankings, the QS World University Rankings, the Academic Ranking of World Universities (ARWU), also known as the Shanghai Ranking, and the recently launched U-Multirank, funded by the EU. While a few universities from Western Europe and North America still dominate most of these rankings, young universities from newly industrialised countries such as China and India are emerging as contenders.

This change in the rankings reflects a change in policy priorities. “Many of the emerging economies, especially in East Asia, have prioritised the development of world class universities as a key strand of their economic growth strategies. This means that they have been investing heavily in drawing in talent and building the facilities to compete on a global stage,” explains Phil Baty, editor of the THE ranking and editor-at-large of Times Higher Education magazine, in the UK. He adds: “In contrast, many European countries, hit by austerity, have been cutting public spending on higher education.” He believes that investment is key to the success of world-class universities.

Targeted investment has resulted in a sizeable shift in the global order of universities. “There has been a lot of attention in recent years in spending funding into higher education and particularly into research activity and research output [in these emerging economies],” points out Ellen Hazelkorn, director of research and enterprise and dean of the graduate research school at the Dublin Institute of Technology, and director of the Higher Education Policy Research Unit (HEPRU), as well as a policy advisor to the Higher Education Authority in Ireland. “The culmination of these factors is shifting the global order; not so much when you look at the top hundred, but when you look on the body of 400 or 500 you get the shifts in the number of institutions and countries.”

University rankings have gained influence over the years. “Largely managed by non-state organisations in the publishing industry or within universities themselves, ranking has become a form of regulation as powerful in shaping practical university behaviour as the requirements of States,” says Simon Marginson, professor of international higher education at the University of London, UK. While some national cultures emphasise competition and position more than others, Hazelkorn finds that institutional rank has become a primary performance measure for research university rectors, vice-chancellors and presidents. She adds: “It is the one universal performance indicator, more important than student numbers—only some universities want to grow—or revenues—money is a means to university status.”

Rankings have their limitations too. Rankings like THE, QS or ARWU use different indicators but, Hazelkorn and Marginson agree, they are mainly research-focused—accounting for academic reputation, Nobel Prizes or bibliometric indicators—with no direct measure of, for instance, teaching quality or learning achievement.

According to the experts, these traditional rankings differ in terms of data quality, materiality, objectivity, reliability and comprehensiveness. The newest ranking, U-Multirank, launched earlier this year with EU support, is somewhat different because it covers a wider range of universities’ activities rather than research alone. And it does not produce league tables, but instead allows users to compare universities on the aspects they are most interested in.

Such an approach may offer some advantages. “When rankers use broad bands not league tables, this reduces the normalising effects of rankings, the tendency to bifurcate the ranked units between winners and losers, and the tendency to close off ranking competition to all but the established institutions,” notes Marginson. However, “it is doubtful whether this can still meet the widespread desires for positional hierarchy.”

U-Multirank has been praised, but also criticised as complicated, expensive and meaningless.

By comparison, “the traditional university rankings just seem to be simple, at first glance,” says Frank Ziegele, director of the Centre for Higher Education Development (CHE) in Gütersloh, Germany, and a co-developer of U-Multirank. All experts agree that university rankings are not perfect. “All rankings have their limitations; there are very important aspects of a university’s performance, such as transforming the life chances of individuals or contributing to democracy, where no useful or internationally comparable data exists. Also, they are all, of course, based on the subjective views of their compilers in terms of the range and balance of different performance indicators, which can dramatically influence the final pecking order.”

Therefore, the key thing for anyone using a ranking to make career choices is to take a good look at the methodology. “Make sure you know what is being measured, and why,” Baty says. Scientists who are looking to change universities should identify which factors matter according to their own criteria and what they want to achieve on a national or international level, instead of trying to be best in all categories. According to Ziegele, scientists can use U-Multirank for benchmarking, to find the indicators they need to boost. And they can also use university ranking results for their own self-promotion as they climb the career ladder.

Ziegele emphasises that in the case of U-Multirank, the subjective choice is made by the user. The ranking offers a broad range of indicators—which of course still entails a choice—but they are sufficiently broad to cover all kinds of individual interests. “University rankings can support people’s decisions by providing information, but people have to make the decisions themselves,” Ziegele concludes. “U-Multirank allows this, but most traditional rankings mislead people into trusting only the position in the league tables.”

Featured image credit: CC BY 2.0 by Martin Fisch

Janna Degener