The global rankings such as Times Higher Education, Shanghai (ARWU) and QS all suffer major methodological shortcomings: their methods are unclear and keep changing, and they lean heavily on reputation, statistical outliers and volume rather than quality. The most objective part of them is an indicator of field-weighted citation impact (FWCI), which shows what proportion of a university's scholarly output belongs to the most cited publications in its field – normalising both for the size of the unit and for the publication practices of the field. In the Netherlands, we regard the Leiden-based CWTS as one of the most reliable sources – it is not a ranking, but a transparent tool for calculating this proportion of highly cited publications. CWTS uses data from the Web of Science.
Of the other rankings, Times Higher Education is the clearest and most consistent in also looking at this FWCI, but it draws on Elsevier's Scopus database. One would expect the FWCI information from the two systems to yield only marginally different scores – but that is not the case.
A comparison of citation scores across the Aurora universities – looking at both FWCI scores and how they changed over the last two years – shows that two Aurora universities improved their citation performance in Web of Science but not in Scopus, and two the other way round: up only in Scopus. Two went down in both and three went up in both. So performance in Web of Science and in Scopus correlates positively in five cases, but negatively in four. Let's call it a draw – and hope more research is done on the quality of citation data itself.
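The tally above – agreement in five cases, disagreement in four – can be sketched as a simple concordance count. A minimal illustration, using invented direction-of-change values (the actual Aurora FWCI deltas are not given in this text):

```python
# Hypothetical sketch: classifying trend agreement between two citation
# databases. The per-university deltas below are invented for illustration;
# the real comparison used FWCI-style scores from Web of Science and Scopus.

# (direction of change in Web of Science, direction of change in Scopus)
# for nine fictional universities: +1 = improved, -1 = declined
deltas = [
    (+1, -1), (+1, -1),            # up in WoS only
    (-1, +1), (-1, +1),            # up in Scopus only
    (-1, -1), (-1, -1),            # down in both
    (+1, +1), (+1, +1), (+1, +1),  # up in both
]

# Same sign in both databases -> the two sources agree (concordant);
# opposite signs -> they disagree (discordant).
concordant = sum(1 for wos, scopus in deltas if wos * scopus > 0)
discordant = sum(1 for wos, scopus in deltas if wos * scopus < 0)

print(f"concordant: {concordant}, discordant: {discordant}")
# With these invented deltas: concordant: 5, discordant: 4
```

This is only a counting exercise, not a statistical test – with nine data points, a five-to-four split is well within what chance alone would produce.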