Whether university rankings are valuable information or rubbish depends on which ranking it is, what it measures, and how you use it.
University rankings became popular over the latter half of the 20th century and they now fill a niche in our data-hungry modern society. But their actual value is contested.
Proponents of university rankings believe that prospective students find them a useful data point when weighing their study options, that they serve a purpose as internationally recognised markers of research excellence and prestige, and that they can attract significant investment by governments, corporations and non-government organisations.
Critics of university rankings claim they are subjective, that what they measure is arbitrary, that they contribute to an industry environment of competition instead of collaboration, and that they encourage universities to disregard attributes that aren’t measured and divert resources to those that are.
The three major international rankings are the Academic Ranking of World Universities (ARWU), the QS World University Rankings and the Times Higher Education (THE) World University Rankings. These ranking systems prioritise delivering a bite-sized, media-friendly comparator of institutions' academic reputations, how well established they are, and their representation in research citations. None of them attributes more than a third of any university's score to teaching quality.
The QS World University Rankings' methodology derives each score from academic reputation via survey (40%), faculty–student ratio (20%), citations per faculty member (20%), employer reputation (10%), international faculty ratio (5%) and international student ratio (5%). This might well be useful to you if you are a researcher seeking industry information.
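To make the arithmetic concrete, here is a minimal sketch of how a weighted composite score like QS's is assembled. The weights are the ones listed above; the per-indicator scores for the example university are invented purely for illustration.

```python
# Weights as published in the QS methodology described above.
QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "employer_reputation": 0.10,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def composite_score(indicators: dict[str, float]) -> float:
    """Weighted sum of per-indicator scores, each on a 0-100 scale."""
    return sum(QS_WEIGHTS[name] * score for name, score in indicators.items())

# Hypothetical university: strong research reputation, middling elsewhere.
# These indicator values are made up for demonstration only.
example = {
    "academic_reputation": 90.0,
    "faculty_student_ratio": 60.0,
    "citations_per_faculty": 80.0,
    "employer_reputation": 70.0,
    "international_faculty": 60.0,
    "international_students": 60.0,
}

print(round(composite_score(example), 1))
```

Note how dominant the 40% reputation-survey weight is: a university can post mediocre teaching-related indicators and still rank highly if surveyed academics rate it well.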
But, of course, research isn't the only responsibility of a university. Its role is complex and multifaceted. One of the commonly cited use cases for rankings is that prospective students can use them to decide which university to attend; if that is how they are being used, it's probably a bad idea.
For example, "quality of education" is one of the ARWU criteria. It is measured by "alumni of an institution winning Nobel Prizes and Fields Medals" and weighted at only 10%. This might surprise a prospective student looking to attain a law degree and be admitted to practice.
The only major ranking system that emphasises teaching quality is the THE World University Rankings, in which 30% of a university’s score comes from five separately weighted factors that fall under “teaching”.
The truth is that rankings are excellent at measuring some attributes and bad at measuring others, and their value depends on understanding what it is they actually measure. So, before you cite a university ranking, look up its methodology and find out what it actually measures—otherwise it’s just a number.