Governance & Strategic Planning

League tables

This page presents a high-level overview of higher education rankings and league tables.

University rankings and league tables

National and regional university rankings are well established in the UK, Canada, the US, Australia and Asia, and world university rankings have also become part of the annual higher education calendar. Although the merits of many aspects of their methodology can be debated, it is clear that rankings can have a significant impact on the reputation and perceived standing of institutions. The following tables present the University of Edinburgh's position in the main UK and world rankings since 2007:

World league tables

League table                                                            2015  2014  2013  2012  2011  2010  2009  2008  2007  Year began
QS                                                                        21    17    17    21    20    22     -     -     -  2010
THE (Times Higher Education)                                              24    36    39    32    32    40     -     -     -  2010
THE-QS                                                                     -     -     -     -     -     -    20    23    23  2004
Academic Ranking of World Universities (Shanghai Jiao Tong University)    47    45    51    51    53    54    53    55    53  2004

UK league tables

League table               2016  2015  2014  2013  2012  2011  2010  2009  2008  2007  Year began
Complete University Guide    19    20    21    18    16    13    11    11    21    16  2007
Guardian                     22    20    18    19    15    16    15     7     9     7  1999
Times                               22    22    22    14    15    11    14    18    13  1993
Sunday Times                         -     -     -    39    27    14    15    15    14  1999

Notes to the tables

  • The Times and Sunday Times rankings were merged into a single ranking in 2013.
  • The Complete University Guide (CUG) is published online by Mayfield University Consultants, which produced the Times league table until 2006. From 2008 to 2010, the CUG was also published by the Independent newspaper.
  • The THE-QS world ranking partnership ended with the publication of the 2009 ranking. QS have continued with the same methodology, adding refinements, while THE adopted a new methodology, publishing in collaboration with Thomson Reuters between 2010 and 2014 and partnering with Elsevier for the 2015 ranking.


  • UK rankings traditionally comprise a range of quantitative measures of input, process and output, including: entry standards, student satisfaction, student:staff ratio, academic services/facilities expenditure per student, research quality, proportion of 1sts/2:1s, completion rates, and student destinations.
  • To construct world rankings, however, compilers have to use measures that translate as reliably as possible across different countries and regions - i.e. measures for which comparable data can be found that are not inextricably linked to national prosperity or other local/regional issues. For this reason, the measures used in the world university rankings are quite different from those used for UK rankings, covering prizewinner affiliations, research bibliometrics, student:faculty ratio, peer review responses, and percentage of international students/staff.
  • Ranking methodologies have always been open to criticism because of the inherently arbitrary nature of weighting scores from different measures and then summing them to give an overall measure of ‘quality’. Some measures are also more controversial and open to bias (e.g. by institution size, subject profile or location) than others.
  • Finally, comparing one institution’s overall score (and hence rank position) against another’s raises statistical reliability issues: analysis of ranking methodologies has shown that small changes in methodology, source data, or indicator weighting can result in large changes in an institution’s relative position within a ranking. Some compilers have tried to mitigate these issues by, for example, weighting data by subject mix, and by employing a z-score methodology to statistically spread the data for individual measures.
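The weighted z-score approach described in the notes above can be sketched as follows. This is an illustrative outline only, not any compiler's actual methodology: the institutions, measure values, and weights below are invented for the example.

```python
# Sketch of a typical league-table scoring approach: each measure is
# converted to a z-score (mean 0, standard deviation 1) so that measures
# on different scales can be combined, then the z-scores are summed with
# compiler-chosen (inherently arbitrary) weights to give an overall score.
# All data here are hypothetical.

from statistics import mean, stdev

# Hypothetical raw data: measure -> {institution: value}
raw = {
    "entry_standards":      {"A": 520, "B": 480, "C": 445},
    "student_satisfaction": {"A": 4.1, "B": 4.3, "C": 3.9},
    "research_quality":     {"A": 3.2, "B": 2.9, "C": 3.0},
}

# Arbitrary weights chosen by the compiler (the crux of the criticism above)
weights = {
    "entry_standards": 0.4,
    "student_satisfaction": 0.3,
    "research_quality": 0.3,
}

def z_scores(values):
    """Standardise one measure so institutions are spread around mean 0."""
    m, s = mean(values.values()), stdev(values.values())
    return {inst: (v - m) / s for inst, v in values.items()}

# Weighted sum of z-scores across all measures
overall = {inst: 0.0 for inst in raw["entry_standards"]}
for measure, values in raw.items():
    for inst, z in z_scores(values).items():
        overall[inst] += weights[measure] * z

# Rank institutions by overall weighted z-score (highest first)
ranking = sorted(overall, key=overall.get, reverse=True)
print(ranking)
```

Because z-scores for each measure sum to zero across institutions, the overall scores also sum to (approximately) zero; only relative position is meaningful, which is why small changes in weights or data can reorder institutions.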



Tracey Slaven

Deputy Secretary, Strategic Planning

Mr Jim Galbraith

Senior Strategic Planner
