A common attitude to league tables of international universities was summed up by Valerie Pecresse, France’s Higher Education Minister, when she stammered in fury last February that the problem with rankings was that they existed. Despite the strong attention paid by students, university staff and employers to the two internationally recognised rankings (from Times Higher Education-QS and Shanghai Jiao Tong University), many education departments and ministers have tried long and hard to maintain the charade that all university qualifications are equal, and that no one institution offers a markedly better product than any other.
But the “if I just ignore it, maybe it will go away” approach to international rankings at the political level seems to be on the wane. France is proposing that one objective of its tenure of the European Union presidency should be the establishment of “rankings that do justice to the quality of our qualifications”.
Next month, in the context of its EU presidency, France will host a conference in Paris on the international comparison of education systems. It is widely hoped that the conference will work out the basis of at least one possible model for ranking European institutions, one under which those institutions may shine brighter.
To date, there has been little detail about the rankings' criteria or form. Ms Pecresse has said vaguely that the criteria should consider "the quality of education, research, the campus and its facilities". Recently, debate has strayed dangerously towards the possibility of ranking universities by student numbers, which, unless your interest is in long cafeteria queues, will assist no one.
Many bodies, including the European Commission and the Organisation for Economic Co-operation and Development, have cited the rankings compiled by the German higher education reform think-tank, the Centre for Higher Education Development (CHE). At present, the CHE rankings cover Germany, Austria and Switzerland; a trial in the Netherlands and at some Belgian institutions produced mixed results.
The CHE compares faculties and is silent as to overall comparisons. Rather than offering a Top of the Pops-style numerical list, it clumps institutions according to “top”, “medium” and “bottom” categories. It shows trends (improving or slipping), and gives an indication of each faculty’s reputation for research, the quality of its resources and the overall study experience. Its approach to assessment is, however, not vastly different from that of Times Higher Education-QS, which considers peer attitudes and the publications and citations of academic staff.
The model may work well for prospective students who have made some preliminary decisions. For those who know in which country or city they want to study, the CHE rankings provide useful guidance for choosing between neighbouring institutions and may help to identify which course best suits an individual, particularly at undergraduate level.
But the CHE model, if adopted, will neither displace the two accepted international rankings nor minimise the impact of their findings in the world's press or academic circles. Times Higher Education-QS and Shanghai, and the attention they command, will continue to drive students' choices, particularly at postgraduate level where competition for the best students is greatest, and will go on influencing the internal and external reform of universities seeking to lift their rankings.