Ranking season is upon us, with the QS rankings of subject areas (not, as is commonly thought, departments) now revealed. Again we find that, despite the hype, Irish universities are stronger in Arts and Humanities than in the STEM areas. This is in stark contrast to the financial flows to these areas, and in even starker contrast to the government and regulatory thrust. Evidence of sustained, internationally recognised quality in the AHSS (arts, humanities and social sciences) area does not translate into funding, support or recognition. Perhaps it's time it did?
So, the THES world rankings are out, and there will no doubt be the usual wailing and finger-pointing. Why are no Irish universities in the top 10? The top 100? And so on (hint: it's expensive to be there).
Yes, there are issues in what the rankings tell us, and I will return to that. But for the moment, look at the chart here. It's a graph of the number of top-200 universities per country, scaled to millions of population. If we were the same size as the larger island next door, this suggests we would have approximately the same number as they do. Is UK higher ed in a terminal crisis? (Feel free not to answer!)
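The per-capita scaling behind the chart amounts to a one-line calculation. A minimal sketch, using round 2013 figures as illustrative assumptions (roughly 4.6 million people and two top-200 universities for Ireland, about 63 million people for the UK; these numbers are not read from the chart itself):

```python
# Back-of-the-envelope version of the chart's per-capita scaling.
# Figures are rough 2013 assumptions, used only for illustration.
ireland_pop_m = 4.6    # population of Ireland, millions (approx.)
uk_pop_m = 63.0        # population of the UK, millions (approx.)
ireland_top200 = 2     # TCD and UCD in the THE top 200

# Top-200 universities per million of population:
rate = ireland_top200 / ireland_pop_m
print(f"Ireland: {rate:.2f} top-200 universities per million people")

# If the UK had universities at Ireland's per-capita rate:
print(f"UK equivalent: {rate * uk_pop_m:.0f} top-200 universities")
```

On these assumed figures, a UK-sized Ireland would hold somewhere in the high twenties of top-200 places, which is indeed in the same region as the UK's actual count.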
The reality is that we have world-class universities. We need to invest more in them. The only question is where the money comes from. But bear in mind: there are 16,000 or so universities in the world. The top 400, where we have five (in addition to TCD and UCD in the top 200, we also have UCC, NUIG and NUIM), represents the top 2.5%. The top 200 is the top 1.25%.
How many of the reporters, commentators, pundits, critics and analysts who will opine on the awful state of universities are in the top 3% of their profession in the world? How many of the politicians…?
Irish universities are world class. Really. So say the Shanghai rankings, the ones that are perhaps most heavily weighted against the strengths of Irish unis. How do we know? Well, the 2013 rankings season kicks off with the Shanghai Academic Rankings. Expect mutterings along the lines of "sure it's terrible, we have no universities in the top 100 of the Shanghai rankings" and "bloody academics, useless, wastes of money, look, no Irish university in the top 100, world class me eye" and so on. No doubt similar wailing will happen when the THES rankings come out, with slightly different metrics.
Let's leave aside the peculiarities of rankings. Let's leave aside the odd rankings methodology of Shanghai, which is very heavily skewed towards a particular set of sciences. Let's look at the big picture.
There are approximately 12,000 universities in the world, defining a university as an institution with some web presence. In the 2013 Shanghai data, TCD comes in in the 200–300 band, and UCD and UCC in the 300–400 band. That's three universities in roughly the "top" 3%. They have been there since 2011, so this is consistent. In the broader-based Times Higher we have TCD at 110 and UCD at 187: two in the top 200, or roughly the top 2%.
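The percentile arithmetic in that paragraph can be checked directly. A short sketch, taking the post's own working estimate of about 12,000 universities worldwide (the exact shares come out a little above the rounded 3% and 2% figures):

```python
# Checking the rough percentile claims against the post's estimate
# of about 12,000 universities worldwide.
world_universities = 12_000

top400_share = 400 / world_universities   # Shanghai: three Irish unis in the top 400
top200_share = 200 / world_universities   # THE: two Irish unis in the top 200

print(f"Top 400 = top {top400_share:.1%} of all universities")  # about 3.3%
print(f"Top 200 = top {top200_share:.1%} of all universities")  # about 1.7%
```

Against the 16,000 figure used in the earlier post, the same division gives the 2.5% and 1.25% quoted there, so the headline claim survives either estimate of the denominator.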
Question : how many Irish institutions, public or private are consistently ranked by peers, customers, and input providers as being in the top 2% of their peers in the world?
Answers on a postcard please.
The Leiden world university rankings are out. We're ranked at 48, up from 63 last year, and have therefore moved into the top 50 in the world in this ranking. UCD is ranked 281, UCC 181. Which is nice… but don't get too carried away.
The Leiden methodology uses research and collaboration metrics only: no 'reputation' surveys and no attempt at those problematic 'teaching' metrics used by the THE. So, like all of these rankings, it is partial and incomplete; but in so far as it goes it is very solid. The ranking is presented based on research impact, but views by other indicators are also available. See here for the full set.
Leiden Ranking – Research impact in Europe:
We rank at 9 in Europe. UCC is at 79 in Europe. Queen’s University Belfast is at 102 in Europe. UCD appears at 121 in Europe.
Leiden Ranking – Biomedical and Health Sciences:
We’re at 36 in the world and 8 in Europe in Biomedical and Health Sciences. UCD 243 worldwide, UCC 119.
Leiden Ranking – Life and earth sciences:
TCD is at 165 in the world and 70 in Europe; UCD 243, UCC 148 worldwide.
Leiden Ranking – Mathematics and computer science:
TCD is at 73 in the world and 14 in Europe; UCD 187, UCC 358.
Leiden Ranking – Natural sciences and Engineering:
TCD is 48 in the world and 10 in Europe, UCD 255, UCC 241.
Leiden Ranking – Social Sciences and Humanities:
TCD is 114 in the world and 31 in Europe; UCD 248.
Making public or institutional decisions on the basis of no measures is bad. It is not, however, necessarily the case that making them on the basis of flawed measures is better. In the parable of the blind men and the elephant, a number of blind men each feel a different part of an elephant. Each can only describe the part they themselves feel, and so each impression of the animal is scattered and incomplete. It's only when they pool their knowledge that they get a better picture.
Similar issues beset university rankings. Rankings are many and varied, and no one ranking (perhaps no ranking at all) can be complete. A new report from the European University Association (available here) on the rankings issue evaluates both the overall rankings concept and the main ranking systems.
Here is the main conclusions section (my emphasis).
4. Main conclusions
1. There have been significant new developments since the publication of the first EUA Report in 2011, including the emergence of a new venture, the Universitas 21 Rankings of National Higher Education Systems, methodological changes made in a number of existing rankings and importantly a considerable diversification in the types of products offered by several rankings providers.
2. Global university rankings continue to focus principally on the research function of the university and are still not able to do justice to research carried out in the arts, humanities and the social sciences. Moreover, even bibliometric indicators still have strong biases and flaws. The limitations of rankings remain most apparent in efforts to measure teaching performance.
3. A welcome development is that the providers of the most popular global rankings have themselves started to draw attention to the biases and flaws in the data underpinning rankings and thus to the dangers of misusing rankings.
4. New multi-indicator tools for profiling, classifying or benchmarking higher education institutions offered by the rankings providers are proliferating. These increase the pressure on and the risk of overburdening universities, obliged to collect ever more data in order to maintain as high a profile as possible. The growing volume of information being gathered on universities, and the new “products” on offer also strengthen both the influence of the ranking providers and their potential impact.
5. Rankings are beginning to impact on public policy making as demonstrated by their influence in the development of immigration policies in some countries, in determining the choice of university partner institutions, or in which cases foreign qualifications are recognised. The attention paid to rankings is also reflected in discussions on university mergers in some countries.
6. A growing number of universities have started to use data compiled from rankings for the purpose of benchmarking exercises that in turn feed into institutional strategic planning.
7. Rankings are here to stay. Even if academics are aware that the results of rankings are biased and cannot satisfactorily measure institutional quality, on a more pragmatic level they also recognise that an impressive position in the rankings can be a key factor in securing additional resources, recruiting more students and attracting strong partner institutions. Therefore those universities not represented in global rankings are tempted to calculate their likely scores in order to assess their chances of entering the rankings; everyone should bear in mind that not all publication output consists of articles in journals, and many issues relevant to academic quality cannot be measured quantitatively at all.
The takeaway I get is one of slight unease: the rankings are here to stay, and they are, by their providers' own admission, flawed and nascent metrics of even that which they purport to measure, yet they are being taken much more seriously than they would be on cold analysis. The face validity is greater than the construct validity. And yet we see UCC has been outed as having taken great pains to ensure that its ranking on one metric was improved, a measure described in an influential online publication as "rigging the rankings?". It is highly improbable that UCC is alone in the world in taking such measures.