Münster (upm/kk).
Ranking organisations assess reputation, standing, quality of teaching and research activities.
© AdobeStock - Sichon

When it comes to rankings, data management is key

The role of international university rankings for universities

Ranking organisations assess reputation, standing, and the quality of teaching and research activities. Universities that reach the top ranks in one of the major international university rankings are attractive to prospective students and funding providers, and have a good chance of recruiting the best academic minds and finding excellent cooperation partners worldwide. This, at least, is how the major ranking agencies, such as the QS World University Ranking, the Times Higher Education Ranking or the Shanghai Ranking, advertise their products. But do these marketing messages really reflect reality? International university rankings have been on the market since the early 2000s, tailored to the Anglo-Saxon university system. It is no wonder, then, that the USA and the UK in particular place near the top of the international leaderboards in most rankings.

“German universities were initially reticent about rankings,” says Sarah Spiegel, head of the International University Rankings project at the German Rectors' Conference (HRK). “This has changed in recent years. Even if rankings are primarily marketing tools, at best they can benefit universities.” According to Sarah Spiegel, good rankings are particularly beneficial for internationalisation, as they strengthen the appeal of Germany as a research destination for foreign academics and students.

Even if rankings are primarily marketing tools, at best they can benefit universities.
Sarah Spiegel

The University of Münster has been systematically addressing this topic since 2015. It created an office for the coordination of rankings and adopted its own ranking policy in 2019. “To achieve ranking success, it is crucial that universities consider sustainable and tailored strategies,” explains Dr Linda Schücker, ranking coordinator at the University of Münster. With its ranking policy, the University of Münster is pursuing precisely this path. It comprises a systematic and criteria-led approach both in terms of active participation in rankings and the dissemination and utilisation of the results, for example to identify particularly successful subjects at the University of Münster.

The rankings differ to some extent in their focus and methodology. Some compare entire universities with one another, others assess only individual departments or subjects; some focus on scientific output, such as the Nature Index, and others include sustainability indicators, such as the relatively new THE Impact Ranking. In most rankings, the overall result is based on a weighting of different indicators. “Some rankings require participating institutions to provide the data, while others are based on available information, such as citations or publicly accessible statistical data. In the latter case, the institution is not free to decide whether it wants to participate or not – it simply appears in the ranking. This is the case with the Shanghai Ranking,” points out Sarah Spiegel.

The Rectorate of the University of Münster particularly welcomes initiatives that aim to develop, test and establish academic benchmarking procedures – such as the Leiden Ranking – and welcomes efforts to support such initiatives through participation. “The quality standards of the respective ranking agencies are also very important to us. Among other things, this includes methodological transparency which discloses the procedure for calculating the rankings,” explains Linda Schücker.

Rankings are often the target of criticism. On one hand, a great deal of differentiated informative value is lost when average values are calculated from highly heterogeneous data. On the other, qualitative parameters have to be reduced to numerical values in order to make comparisons possible at all. Some student representatives argue that rankings fuel competition, which has no place in education. There are also increasing reports that some institutions are not playing by the rules – for example, the so-called citation cartels in China and Saudi Arabia, in which academics publish low-quality papers that repeatedly cite the articles of their colleagues, thereby artificially raising the rank of their own institutions.

“We are keeping an eye on these negative developments. That’s why high-quality data management and accurate submission to the ranking agencies are essential,” states Linda Schücker. Universities need to sensitise their academics to this issue and the corresponding rules, such as correctly disclosing affiliation with a research institution. “There are a number of pitfalls and constant changes or innovations on the part of the ranking agencies. The HRK invites all ranking coordinators from member universities to an annual network meeting where they can stay up-to-date on such developments and discuss their experiences with the various rankings,” says Sarah Spiegel. The next symposium will take place on 23-24 September at the University of Münster.
