Defining a Top University
There are three major national rankings of universities in the UK – the Complete, Guardian and Times University Guides. These take slightly different views of what makes a good university, and in places disagree substantially, but by looking at where they do agree it is possible to get an overall idea of which are Britain's best universities for undergraduate study and put together a combined ranking.
The first element of agreement is on the top spot. The unanimous vote for Britain's best university is Cambridge. Second place is also unanimous, with Oxford hot on the heels of its ancient rival. After that discrepancies start to arise, but surprisingly, although there is no agreement on the top ten, there is agreement on the top thirteen. The same eleven universities fill third to thirteenth place in all three guides, although only six of these are ever-present in the top ten. This can be seen in Figure 1, showing the Guardian and Complete University Guide scores – this group of eleven is clearly separated from the rest (only universities in the top 50 of at least one of the guides are shown).
Figure 1
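The combined ranking described above can be sketched as a simple average of each university's position across the three guides. The positions below are illustrative placeholders, not the guides' actual figures:

```python
# Sketch: combine the three league tables by average position.
# The positions here are invented for illustration only.
rankings = {
    "Cambridge":  {"complete": 1, "guardian": 1, "times": 1},
    "Oxford":     {"complete": 2, "guardian": 2, "times": 2},
    "St Andrews": {"complete": 3, "guardian": 3, "times": 4},
}

def average_position(positions):
    """Mean rank across the three guides (lower is better)."""
    return sum(positions.values()) / len(positions)

# Order universities by their combined (average) position.
combined = sorted(rankings, key=lambda u: average_position(rankings[u]))
print(combined)
```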
So which institutions are these? In order of their average position across the three guides:
St Andrews: Scotland's oldest university is its only representative in the group. It takes third place in two of the three guides, and third place overall. Will and Kate's alma mater is also the top-ranked non-Russell Group institution.
Imperial: London's specialist science and engineering school is fourth in two of the tables, giving it fourth on average position.
Durham: England's third oldest university has the smallest scatter in position of any but the top two, placing fifth or sixth in all of the guides. This helps it to fifth place on average position.
LSE: A placing outside of the top ten from one guide drags it down to sixth on average position, although the LSE is above Durham in the other two.
Warwick: The Midlands plate-glass university places between sixth and eighth in the guides, unsurprisingly putting it in seventh on average position.
Surrey: The home counties' favourite, and another non-Russell Group university, makes the top ten in two guides, taking eighth place on average.
Exeter: The south-west is represented by the home of the Met Office, at ninth the lowest-ranked institution to make all three top-tens.
Bath: Another south-western university, and the third non-Russell Group institution, fills tenth spot, with two of the guides putting it in the top ten.
Lancaster: The north-west's top university makes the top ten in two guides, but slips to eleventh on average position.
UCL: London's Global University does much better on international rankings, only making the top ten in one table and placing twelfth on average.
Loughborough: The famous sporting university isn't so bad academically either. It doesn't break the top ten in any guide, pushing it to thirteenth overall, but it does place above UCL in two of the guides.
What is most striking about this list is the absence of many of the traditional 'top' universities. In particular among the 11 British universities in the Times Higher Education's World Reputation Rankings top 100*, King's College London, Edinburgh, Manchester and Bristol are missing from this top thirteen list, while UCL – fourth out of UK institutions by reputation – fails to make the top ten in two of the three league tables.
*The London Business School also makes the reputation top 100, but as a specialist postgraduate institution is omitted from the rankings.
It's also striking how many institutions from outside the Russell Group, often thought of as "Britain's Ivy League", feature in this top group. It's notable that Exeter is the only Russell Group university in the group not to make the top 100 worldwide by reputation, while none of the non-Russell Group institutions are included. All of those in the top 100 reputation rankings from outside this top 13 are also members of the Russell Group.
Very few of Britain's major cities are represented on the list. According to the 2011 census, the ten largest conurbations outside of London are Manchester, Birmingham, Leeds, Glasgow, Liverpool, Southampton, Newcastle, Nottingham, Sheffield and Bristol. All of these have Russell Group institutions, but none make the top group here. (Southampton comes closest, ranking 14th in two of the three guides.)
The Influence of Research
It would not be surprising if research had some influence on league table position. The universities with more research will, at least in general, have more income, and thus better facilities. Both the Times and Complete university guides explicitly include research as a measure in their league table, although the Guardian does not.
In order to measure research meaningfully, it is necessary to account for the size of the university. For research power, I use the Research Fortnight analysis of the REF results (as given in the Guardian), normalising this by the size of the undergraduate population to get a measure of research intensiveness, i.e. the balance between undergraduate teaching and research at the institution*. The correlation between research power and university size can be seen in Figure 2. From this and Figure 3, three groupings can be seen in the data: the "golden triangle" universities of Oxford, Cambridge, UCL, Imperial and the LSE (KCL, although often listed as part of this group, is not included by this analysis) with a research intensiveness score of 0.8 to 0.9; the "research intensive" universities, representing the bulk of the pre-1992 institutions, with a score of -0.1 to 0.6; and the "teaching focused" universities, made up primarily of post-1992 institutions, below -0.1. (The lone data point near the top with 8.7% of all undergraduates is the Open University.) The 24 Russell Group universities are the 24 with the highest research power, all having 1.4% or more of the national share (log research power share > 0.14), but not the 24 with the highest research intensiveness.
* Research intensiveness score = log(research power share / undergraduate share); undergraduate share calculated from HESA figures for 2012-13.
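The score and the three groupings can be sketched as follows. The grouping boundaries are taken from the ranges described above (the exact cut-off between "research intensive" and "golden triangle" is an assumption, since the text leaves a gap between 0.6 and 0.8):

```python
from math import log10

def research_intensiveness(research_power_share, undergrad_share):
    """log10 of research power share over undergraduate share.
    Both shares are fractions of the respective national totals."""
    return log10(research_power_share / undergrad_share)

def classify(score):
    # Boundaries approximated from the groupings in the text.
    if score >= 0.8:
        return "golden triangle"
    if score >= -0.1:
        return "research intensive"
    return "teaching focused"

# e.g. a university with 3% of research power but only 1% of undergraduates
s = research_intensiveness(0.03, 0.01)
print(round(s, 3), classify(s))
```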
Figure 2
Figure 3
The orange points in Figure 2 (and subsequent figures) indicate the top two, Oxford and Cambridge, the red dots indicate the next eleven, while the blue dots indicate other universities. Figure 3 shows that there is a link between research intensiveness and high positions in the league tables: all of the 'golden triangle' institutions make the top group, and there are no representatives with a 'research intensiveness score' below 0.16 – they are all in the top 75% of the research intensive universities.
The three universities in the 0.5 - 0.6 bin are particularly notable for their absence from the top of the league tables: these are Edinburgh, KCL and Bristol. As noted before, these three universities (along with Manchester, which has a high volume of research diluted in this analysis by a large student population) have a high international reputation; it is likely this is linked to their research rather than their league table position.
As Figure 4 shows, research intensiveness is clearly linked to league table position. But it is not the whole of the picture. What other factors go into making a top university?
Figure 4
City Size
In the early nineteenth century, arguments raged about whether universities should be in small towns like Oxford or Cambridge, where students could study with little distraction, or in the large cities, where students were plentiful. Despite Durham being founded according to the first of these ideas, it was the latter that drove the founding of the civic 'redbrick' institutions. But now Durham is in the top group and all of the redbricks are missing. Could there be something in this idea after all?
Figure 5 certainly seems to show that smaller towns are more likely to host top universities. Looking at universities that make the top 50 in any of the league tables, of the 11 that are in towns or conurbations smaller than 120,000 at the 2011 census, only four are not in the top group. These are Bangor, Falmouth, Stirling and Kent. None of these rank in the top 40 by research intensiveness, with Stirling (1.12) coming closest to the levels of research seen in the top institutions.
Figure 5
Of the six top institutions found in towns larger than 120,000, five are 'golden triangle' universities. Only Warwick, which is actually in the middle of nowhere despite technically being on the outskirts of Coventry, breaks this pattern.
Figure 6 shows that this relationship between town size and high league position is not that tight, and only persists down to about 30th in the combined league table; below this come a lot of universities in smaller towns that don't do well enough in research intensiveness to reach high league table positions. It also doesn't cover London institutions (at the top of the plot), which are either 'golden triangle' or lower down the league tables. It is only in combination with research intensiveness that we really see an effect.
Figure 6
Using the correlation seen in Figure 4 between research intensiveness and league table position, I worked out the expected research intensiveness from the rank in the table and calculated the residuals. This is shown in Figure 7, plotted against the population of the host town. This shows that there is indeed a relationship between the residuals and the host town population for the top 30 institutions, albeit one with a large scatter. Below the top 30 it is hard to see any correlation.
Figure 7
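The residual calculation above can be sketched in a few lines: fit a straight line predicting research intensiveness from combined league table rank, then take each institution's deviation from that line. The data points here are invented for illustration, not the real scores:

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for y = m*x + c."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y, mx, my in
                 zip(xs, ys, [mean_x] * n, [mean_y] * n))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Illustrative ranks and research intensiveness scores.
ranks = [1, 2, 3, 4, 5, 6, 7, 8]
scores = [0.90, 0.85, 0.55, 0.60, 0.40, 0.45, 0.30, 0.20]

m, c = fit_line(ranks, scores)
expected = [m * r + c for r in ranks]
residuals = [s - e for s, e in zip(scores, expected)]
# The residuals are what get plotted against host-town population.
```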
Using this correlation, I corrected the research intensiveness for the effect of population and used this new score to re-rank the 55 universities that feature in the top 50 of any of the three tables. The results for the top 30 are shown in Figure 8, along with the performance of the other ranking methods (all re-ranked to 1 - 55).
Figure 8
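The population correction can be sketched as below. The coefficient and the data are illustrative placeholders, not the fitted values; the sign convention assumes (per the correlation described above) that universities in smaller towns outperform their research intensiveness, so larger host towns are penalised:

```python
from math import log10

POP_COEFF = 0.1  # assumed strength of the population effect (illustrative)

universities = {
    # name: (research intensiveness score, host town population)
    "A": (0.55, 80_000),
    "B": (0.50, 600_000),
    "C": (0.45, 40_000),
}

def corrected_score(score, population):
    """Research intensiveness adjusted for host-town population."""
    return score - POP_COEFF * log10(population)

# Re-rank on the corrected score (higher is better).
reranked = sorted(universities,
                  key=lambda u: corrected_score(*universities[u]),
                  reverse=True)
print(reranked)
```

Note how C leapfrogs B here: its lower raw score is more than offset by its much smaller host town.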
It is clear from Figure 8 that research intensiveness + population size (blue) is not as good a predictor of average league table rank as any of the individual university guides. However, it is a better predictor than research intensiveness (cyan), which is, in turn, a better predictor than research power alone (purple), which is almost uncorrelated with league position. The median absolute deviations (half of the institutions will be closer than this) for the top 30 and all 55 institutions considered are given in Table 1.
Table 1

| Median Absolute Deviation | Research Power | Research Intensiveness | Predicted Rank | Guardian | Complete | Times |
|---|---|---|---|---|---|---|
| Top 30 | 12 | 8 | 4.5 | 2 | 1 | 1 |
| All 55 | 9 | 7 | 5 | 4 | 1 | 1.5 |
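The median absolute deviation used here can be computed as below; the ranks are invented for illustration:

```python
from statistics import median

def median_abs_deviation(predicted, actual):
    """Median of |predicted rank - actual rank| across institutions.
    Half of the institutions deviate by less than this."""
    return median(abs(p - a) for p, a in zip(predicted, actual))

# Illustrative ranks: actual average league position vs a predictor.
actual_rank    = [1, 2, 3, 4, 5, 6]
predicted_rank = [2, 1, 6, 3, 8, 5]
print(median_abs_deviation(predicted_rank, actual_rank))
```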
Outliers from the predicted rank using research intensiveness + population include Surrey and UEA, where the relatively small populations of the host towns only partly compensate for their low rank in research intensiveness; Birmingham and Leeds, where the relatively large populations of the host cities over-compensate for their research intensiveness; and Bristol, Reading and Edinburgh, where the population makes virtually no difference to their outlier status based on research intensiveness. The use of population does, however, remove the discrepancy between research intensiveness and rank seen for KCL, Kent and Loughborough, and halves the discrepancy for UCL.
Conclusions
It is possible to use the league tables to compile a list of thirteen universities that all of the university guides agree on. These consist of two groups: the 'golden triangle' universities, defined as having a very high research power for their size, and universities that have a relatively high research power for their size and are located in towns of less than 120,000 people (ignoring Warwick).
Overall research power bears little relationship to league table position. It is, however, the determinant for Russell Group membership, with the 24 universities in the group being the 24 universities with the highest research power. It therefore appears that, for an undergraduate, choosing a Russell Group institution purely on the basis of its membership of the group is not a wise decision – some Russell Group institutions fare quite poorly in the league tables.
Controlling research power for undergraduate numbers to find which institutions are more research intensive gives a better estimate of league table position. This is fairly easy to understand (at least at a simplistic level) – the extra income from research means more income per student and helps to attract top staff. A university with high research power but a large student population spreads that income more thinly than one with a smaller student population. This analysis also shows quantitatively the division of universities into golden triangle, research intensive, and teaching focused.
The last element identified here is a bit surprising, however: it appears that smaller host town populations correlate with higher league table positions, at least for the top 30 institutions. Taking this into account does a better job than research intensiveness alone in predicting position in the league table. The reasons for this are not obvious, but it could be that (as put forward in the 19th century) students in small towns have fewer distractions and so are better able to concentrate on their university work. It could also be that the university is more important to the local economy, and thus more valued locally; or possibly that house prices are lower, giving students a better chance of getting decent housing and increasing their satisfaction. Possibly it's something else entirely, or a combination of all of the above.
Vice-chancellors wishing to propel their institution up the league tables thus have among their possible options:
1) Growing the research power of the university – everyone's trying to do this, and it's hard.
2) Reducing the student population to increase research intensiveness – this can have unfortunate consequences on tuition income.
3) Reducing the population of your town – this can get you in trouble with the police!