University rankings (and how to manipulate them)

Helsinki University Performance

Wow. Helsinki University was again able to improve its position in the Shanghai Ranking and is now number 56. However, even the University's own press release had a somewhat gloomy tone, as if this were the last blossom before the final decay. It is true that the basis of the current success has been patiently built by researchers over the last decades. Nevertheless, when politicians now draw the conclusion that the cuts to the education budget are responsible for this success, they are probably right.

Perhaps the cuts actually did contribute to the surge in productivity: all the scientists who suddenly found themselves unemployed obviously continued to work furiously to get their current manuscripts ready for publication. Papers are the overwhelmingly dominant currency in the academic job market. And they might even publish faster than before, since unemployed scientists have no teaching duties…

More importantly, however, the Helsinki numbers for the individual Shanghai Ranking indicators for 2015 and 2016 differ significantly only in the per capita academic performance (publications/number of academic staff) and the number of highly cited scientists. Since the job cuts at Helsinki University had been going on for a while even before the massive cuts of this spring, it appears reasonable to claim that the 56th position is partly due to the job cuts, which directly and rapidly reduced the number of academic staff.

The Shanghai Ranking emphasizes research, whereas in other rankings Helsinki has actually been dropping. In the QS World University Rankings, for example, Helsinki fell from position 67 (2014/15) to 96 (2015/16). Between these two rankings, the QS team made “improvements” to their methodology, and a close look at the data shows that these changes were almost exclusively responsible for the drop. They “normalized” the citation ratio per faculty according to subject, and life science was badly punished for being a “high citation field”.

Very simply put, a citation to a paper in Arts & Humanities now counts about 40 times as much as a citation to a paper in Life Sciences & Medicine. Each of the five big fields (Arts & Humanities, Engineering & Technology, Life Sciences & Medicine, Natural Sciences, Social Sciences & Management) now contributes equally, with 20%, to the final rating in the citations per faculty category (detailed explanation of the normalization here: http://content.qs.com/qsiu/Faculty_Area_Normalization_-_Technical_Explan...).
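To make the arithmetic concrete, here is a minimal sketch of such a subject normalization. The citation rates per field and the scoring function are my own made-up illustration of the general idea (scale each field's citations by a global field citation rate, then weight the five fields equally), not QS's actual data or formula:

```python
# Hypothetical global citations-per-paper rates by field (illustrative only).
GLOBAL_RATE = {
    "Arts & Humanities": 0.5,
    "Engineering & Technology": 5.0,
    "Life Sciences & Medicine": 20.0,
    "Natural Sciences": 10.0,
    "Social Sciences & Management": 2.5,
}

def normalized_citation_score(citations_by_field, faculty):
    """Scale each field's citations by its global rate, average the five
    fields with equal (20%) weight, then divide by faculty head count."""
    per_field = [
        citations_by_field.get(field, 0.0) / rate
        for field, rate in GLOBAL_RATE.items()
    ]
    return sum(per_field) / len(per_field) / faculty

# With these made-up rates, one Arts & Humanities citation (1 / 0.5 = 2.0)
# is worth 40 times one Life Sciences citation (1 / 20.0 = 0.05).
print(2.0 / 0.05)  # 40.0
```

Under this scheme, a university whose citations come mostly from a high-rate field sees them heavily discounted, which is exactly the effect described above.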

Why on earth punish a field for having a bigger impact? Papers in the humanities are not cited as much as papers in life science. Maybe that is because their average publication is not as relevant for its own field as a life science publication is for the life science field? I agree that there are cultural differences between fields, but who is to say how much of the gap is due to such cultural differences and how much to low-quality or irrelevant research? That is exactly what the QS World University Rankings did by making up “normalization” factors that essentially claim equal relevance.

However, all such rankings are easily manipulated. The citations per faculty (member) ratio, for example, can easily be improved by outsourcing services previously performed by faculty scientists (and that is exactly what Helsinki University is doing at the moment). As with the impact factor for journals, university rankings lose their relevance once the goal is to score high rather than to excel at research and teaching (Goodhart's law: when a measure becomes a target, it ceases to be a good measure).
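A toy calculation (with made-up numbers) shows why this works: the citation count in the numerator stays the same while the faculty head count in the denominator shrinks, so the indicator improves without any change in research output:

```python
def citations_per_faculty(citations, faculty):
    """The raw indicator: total citations divided by faculty head count."""
    return citations / faculty

# Hypothetical university: 50,000 citations, 4,000 staff counted as faculty.
before = citations_per_faculty(50_000, 4_000)  # 12.5

# Outsource 500 service positions previously counted as faculty;
# the citations are unchanged, but the denominator drops.
after = citations_per_faculty(50_000, 3_500)   # ~14.3

print(f"{before:.1f} -> {after:.1f}")  # 12.5 -> 14.3
```

The same denominator effect explains the Shanghai per capita indicator discussed above: staff cuts alone push the ratio up.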

The Shanghai Ranking has been criticized a lot. It undervalues teaching, since teaching performance doesn't enter the calculation directly at all, and it favours big universities, since 50% of the rating is based on sheer numbers that are not normalized for the size of the university. Nevertheless, it is the only reproducible (and therefore scientific) dataset of the three big rankings (https://link.springer.com/article/10.1007/s11192-012-0801-y), since both the QS and the Times rankings rely heavily on opinion and reputation surveys, which are notoriously difficult to reproduce.

What is the take-home message? Helsinki University is probably as good as it has been during the last decade, and the big fluctuations are just artifacts resulting from changed ranking methodologies or from administrative adjustments to the financial realities brought upon us by Finnish voters in the 2015 parliamentary elections.