College Ranking Systems Assessment
A Report for Vanderbilt University Illustrates the THE-QS Duopoly
Last year, Vanderbilt University, in Tennessee, commissioned a report on university rankings from NORC at the University of Chicago (formerly National Opinion Research Center), which has published a variety of social science studies.
The report discusses five rankings, three national and two global. I will focus on the two global ones, which are the Times Higher Education (THE) World University Rankings and the QS World University Rankings.
It is regrettable that THE and QS have acquired something close to a duopoly over global rankings. The International Ranking Expert Group (IREG) lists 16 global university rankings, two of which have now ceased publication, in addition to several regional, specialised, and business school rankings. Unfortunately the media, university leaders, governments, and the global economic elite usually consider only QS and THE, and ignore other global rankings such as the Center for World University Rankings (CWUR) World University Rankings, Webometrics, the Shanghai Academic Ranking of World Universities (ARWU), the ex-Russian Round University Rankings (RUR), the National Taiwan University Rankings, the Moscow International University Ranking (MosIUR), the Leiden Ranking, the SCImago Institutions Rankings, and, somewhat surprisingly, the US News Best Global Universities (BGU). These rankings are in most respects as good as QS or THE, and in some respects better. Occasionally, BGU or the Shanghai Rankings are mentioned, but generally QS and THE are the international rankings that count for American higher education.
The report addresses four primary issues related to rankings: lack of clarity about the underlying concepts, problems of data quality, subjectivity in certain ranking decisions, and insufficient characterisation of uncertainty.
These are all relevant problems with the THE and QS rankings, and it is commendable that the report points them out and suggests possible ways in which they might be mitigated. However, the report does not mention the other global rankings, some of which have merits that QS and THE do not. The Leiden Ranking, for example, provides confidence intervals for the metrics that measure high-quality publications. The University Ranking by Academic Performance (URAP), Shanghai, and NTU use only objective public data. SCImago and MosIUR assess universities' social impact, and CWUR attempts to capture the quality of faculty and graduates with public data.
This omission is odd since Vanderbilt performs better in all the above rankings than it does in the QS world rankings, and in most of them it does better than it does in THE. Here is Vanderbilt's rank in the most recent edition of these rankings:
CWUR: 54
Webometrics: 60
National Taiwan University Rankings: 62
US News Best Global Universities: 63
Shanghai ARWU: 66
MosIUR: 85
THE WUR: 90
SCImago: 100 [university sector]
URAP: 111
Leiden Ranking: 130 [publications]
QS WUR: 248
The report ignores those rankings where Vanderbilt does well and mentions two where its performance is relatively poor, although still world-class by any standard. This selectivity diminishes the value of an otherwise interesting discussion.