Citations and Rankings
Citations are popular with university rankers, although researchers and administrators often view them with disdain and suspicion, at least in private. The famous global rankers all have at least one indicator related to citations. Times Higher Education (THE) has four. Other international rankings like SCImago Institutions Rankings, University Ranking by Academic Performance (URAP), Round University Rankings (RUR), and National Taiwan University (NTU) Rankings also use citations to measure research impact or research quality.
There are several issues with citations as a measure of research capability. They are highly gameable (although a cynic might argue that other metrics can be gamed too), and they can be distorted by self-citation or citation circles.
They can also, if not constructed carefully, produce quirky and implausible results.
If rankers attempt to measure research quality by the number of citations per paper, they run into the issue of equivalency across fields. In a heavily cited field, say cancer research, a paper might well receive hundreds of citations within a month. In contrast, a groundbreaking philosophy paper would be lucky to get a dozen citations over a few years.
THE began using field-weighted citations in 2010, when they started collecting their own data for their “robust, transparent and sophisticated” rankings and decided that absolute citation counts were a sin. Accordingly, they began normalizing citations against the world average for each subject area. They ran into trouble almost immediately, when Alexandria University appeared to be the fourth best university in the world for research, on the strength of papers self-published and self-cited by an “independent researcher”, Mohamed El Naschie.
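The idea behind that normalization is what bibliometricians call field-normalized citation impact: each paper's citations are divided by the world-average citations for papers in its field, and the resulting ratios are averaged per institution. A minimal sketch of the idea (the field baselines, paper counts, and citation figures below are illustrative, not THE's actual data or method) also shows why a small institution with a handful of hyper-cited papers can swamp the average:

```python
# Illustrative world-average citations per paper by field (hypothetical figures).
WORLD_BASELINE = {"oncology": 25.0, "physics": 15.0, "philosophy": 2.0}

def field_normalized_impact(papers):
    """Average of (citations / world-average citations for the paper's field).

    `papers` is a list of (field, citations) tuples for one institution.
    A score of 1.0 means the institution is cited exactly at the world average.
    """
    if not papers:
        return 0.0
    ratios = [citations / WORLD_BASELINE[field] for field, citations in papers]
    return sum(ratios) / len(ratios)

# A large university with many papers cited around the world average...
big_university = [("oncology", 30), ("physics", 12), ("philosophy", 3)] * 500

# ...versus a small institution whose few papers include a couple of
# heavily self-cited outliers (the Alexandria-style distortion).
small_institution = [("physics", 400), ("physics", 350), ("physics", 10)]

print(round(field_normalized_impact(big_university), 2))     # ~1.17
print(round(field_normalized_impact(small_institution), 2))  # ~16.89
```

Because the score is an average over a small number of papers, a couple of extreme outliers can lift a tiny institution far above universities with vastly larger and more consistent research output.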
THE did a bit of tweaking, and soon Alexandria began sliding down the charts. But before long more anomalies cropped up as universities reaped the benefits of multi-author papers from CERN projects, and various improbable universities appeared at the top of the citations table. THE fixed that by excluding papers with more than a thousand contributors (I’m not sure the word author is appropriate here). However, the problem reappeared with universities benefiting from participation in the Global Burden of Disease Study, which typically has hundreds, but not thousands, of contributors.
For several years, implausible places appeared at the top of the citations metric: Tokyo Metropolitan University, Florida Institute of Technology, Anglia Ruskin University, Cape Coast University, Federico Santa Maria Technical University, An-Najah National University, St George’s, University of London, Brighton and Sussex Medical School, the University of Reykjavik, Aswan University, the University of Peradeniya, and Arak University of Medical Sciences, to name just the more obvious ones.
Finally, in 2023, for the 2024 rankings, THE overhauled the citations “pillar”, halved the weighting for citation impact, and added three metrics that supposedly measured research strength, excellence, and influence. That year and in 2024 the rankings began to look more sensible at the top, although a little further down there were European and Australian universities with citation scores in the 90s despite much lower research environment scores, raising the question of how such poor environments could produce research of such high quality.
But things appear different when we look at the country level in the 2025 world rankings. In India, the top universities for research quality include Chitkara University, Saveetha Institute of Technical and Medical Sciences, Shoolini University of Biotechnology and Management Sciences, Malaviya National Institute of Technology, and Lovely Professional University, all of which have high scores for research quality and very low scores for research environment. Meanwhile, the Indian Institute of Science, widely regarded as the country’s premier research center, languishes in 50th place for research quality.
University heads and the media were generally pleased with India’s performance, although India Research Watch thought differently:
“Times Higher Education Rankings make no sense at all. No wonder old IITs boycott it. Even IISc should boycott it! IISc is ranked 50th in research quality!!!
Can you believe such a ranking? If yes, hats off to your delusional power.
Look at their rankings of Indian Universities for Research Quality. Chitkara University is Number 1 in India in research quality. The 'infamous' Saveetha is second. Saveetha has already been caught inflating their citation count and flagged multiple times by international agencies like Science and Retraction Watch.”
But that is not all. Looking at Australia, we see that Australian Catholic University and University of Technology Sydney are riding high for research quality, ahead of Melbourne and the Australian National University.
In Egypt, Future University in Egypt is leading in research quality, well ahead of Cairo University and the American University in Cairo.
In Japan, Juntendo is ahead of Tokyo, while Aizu, Yokohama City University, and Fujita Health University are ahead of Kyoto University.
Until THE can produce valid and reliable rankings, particularly for citations and publications, their results should be treated with scepticism by students, researchers, and the public. But I suspect the rankings are now too valuable to the Western elite, and to selected universities elsewhere, for any meaningful change to be made.