Ranking organizations: Who does the ranking?
Rankings by magazines
Rankings were first introduced by magazines as a service to, and an attraction for, their readers; however, they were produced primarily for commercial reasons. Just how economically appealing these tools have become can be judged by the number of magazines and newspapers that publish rankings on a regular basis:
- USA: US News and World Report, Washington Monthly, Newsweek, Forbes, Business Week, Wall Street Journal
- Canada: Maclean's Magazine
- United Kingdom: Financial Times, The Times, Times Higher Education Supplement, The Guardian, Economist
- Germany: Spiegel, Focus, Wirtschaftswoche, Karriere
- France: Le Nouvel Observateur, Libération
Educational authorities and institutions
Educational authorities have also become active in the field of rankings (or ratings). This can be seen in countries like the UK, with its official "teaching quality assessment" and "research assessment exercise", or in the Netherlands, where research ratings are carried out by scientific academies and the university rectors' conference. In other countries, specialized educational institutions have launched their own ranking lists: the teaching and research rankings compiled by the German Centrum für Hochschulentwicklung (CHE) and published in "Die Zeit"; the Shanghai ranking list produced by Shanghai Jiao Tong University in China; the National Research Council's ranking of research doctorate programmes in the US; or, more recently, the Leiden ranking compiled by the Centre for Science and Technology Studies (CWTS) at Leiden University in the Netherlands.
Media rankings, winners, losers and fluctuations of results
The interests motivating these groups and the ways they exert influence are quite different. Ranking organizations from the media side are driven by their readers' market. They favour clear-cut results in the form of league tables, where winners and losers contend with one another in a somewhat dramatic "rankings game", even at the cost of oversimplification or pseudo-accuracy. Large swings in the rankings may even add to this drama. However, it is hard to imagine how major swings from one year to the next could correctly portray real changes at universities. Take, for example, the THES ranking, where Osaka University (Japan) went from position 69 in 2004 to 105 in 2005 and back to 70 in 2006; where the École Polytechnique (France) went from position 27 in 2004 to 10 in 2005 and then to 37 in 2006; and where the University of Geneva went from being unranked in 2004 to position 88 in 2005 and to position 39 in 2006. Differences like these, which cannot be plausibly explained, point towards major methodological flaws and raise doubts about the reliability of the ranking list. They may not pose much of a problem for the economic success of the publishing magazine, but they may very well cause major problems for the universities involved.
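To make the scale of these swings concrete, the following minimal sketch (purely illustrative, in Python) computes the year-to-year position changes from the THES figures cited above; the institution names and numbers are taken directly from the text, and a year in which an institution was unranked is treated as missing data.

```python
# Year-to-year rank changes for the THES positions cited in the text (2004-2006).
# None means the institution was not ranked that year.
positions = {
    "Osaka University": [69, 105, 70],
    "École Polytechnique": [27, 10, 37],
    "University of Geneva": [None, 88, 39],
}

for name, ranks in positions.items():
    changes = []
    for prev, curr in zip(ranks, ranks[1:]):
        if prev is None or curr is None:
            changes.append("n/a")  # no change computable from an "unranked" year
        else:
            # Positive = dropped down the table, negative = climbed up.
            changes.append(f"{curr - prev:+d}")
    trajectory = " -> ".join("unranked" if r is None else str(r) for r in ranks)
    print(f"{name}: {trajectory} (year-to-year change: {', '.join(changes)})")
```

Run as given, this prints, for instance, a drop of 36 places for Osaka University followed by a climb of 35 places the next year, which is the kind of oscillation the paragraph above argues cannot reflect real institutional change.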
Ranking lists established by academic institutions may be more reliable
The ranking organizations from the educational side are more interested in influencing change and development within the university system. They tend to focus on specific activities and regions, and the methodologies they apply are usually more refined, more openly debated and more transparent. Educational authorities may link their ratings to influential financing decisions, as in the UK and Dutch cases. Other bodies, such as the German CHE and the compilers of the Shanghai ranking, rely on public visibility and market pressures to exert influence.