In 2008 I compiled various “league tables” of university maths departments, mainly measures of the volume and quality of their activity. The motivation was partly that the newspaper league tables for specific subjects such as mathematics were unhelpful, and partly that I wanted to understand how our own fairly new School of Mathematics compared to others. The website detailing this comparison proved very popular, with over a thousand “hits” each month and plenty of emails to me on the topic. Prospective maths undergraduates proved to be one of the largest categories of readers (unsurprising, as this is the largest group of potential users). I also heard from prospective Ph.D. students, young academics and heads of maths departments. Some of the latter were alerted to the website by their dean! This new version for 2009 is prompted mainly by the publication of the 2008 Research Assessment Exercise (RAE2008). I have also responded to requests by readers for information not included in the last version, notably by readers of the internet forum The Student Room (TSR).
Why do newspaper league tables produce strange rankings for specific subjects such as mathematics? It seems to me there are three main reasons: 1) data that is long out of date, an example being the use of the Teaching Quality Assessment, typically carried out in 1999–2000; 2) data that is unreliable in the way it is divided into specific subject areas, a prime example being “spend per student” on a specific degree subject; 3) data that is methodologically flawed or open to manipulation, for example the student satisfaction survey. Of course some of the data used is flawed in two or all three of these ways! Let us look more carefully at some examples.
Spend per student. I think it would be possible to make a reasonable estimate of the money a university spends on students studying a specific subject, especially if that subject were taught entirely within one school or department. However such a study would take several person-weeks of work by accountants, with complete access to university accounts, for each subject and each university. Such an undertaking is of course beyond the reach of a newspaper. Universities themselves tend to account for money under headings such as “salary costs”, “capital projects”, “maintenance” etc. For example, when the University of Manchester allocated about £40m to construct the Alan Turing Building, half of which is occupied by the School of Maths, it is unlikely to show up in the league tables as £20m spent on maths students. In other universities, where maths is taught in a larger school that also includes, for example, physics or computer science, it will not be easy to disaggregate spending specific to maths. One suspects that many of the costs of educating undergraduates are simply apportioned in the ratio of the number of students studying each subject. Moreover the spend per student, as well as being open to manipulation by university administrators cunning enough to apportion the money in a league-table-friendly way, is a component of league table scores in which a university could elevate its ranking simply by burning £50 in the name of mathematics students.
National student survey. This standardised survey suffers mainly from the fact that, on the whole, students report that they are “satisfied” with their education. Students tend to be relatively loyal to their course and institution, and also have little evidence on which to make an objective comparison with the same course at other universities. The range is therefore compressed, and so small changes in reported “level of satisfaction” make a disproportionate difference to league tables, especially when the number of students on a course is small. Moreover students often know (and in some cases in the news have been told!) that the results of the survey are used to compile league tables, and that potential employers look at these league tables. The students can therefore “raise the value of their degree capital” by filling in the survey in a more positive light. On the other hand a student may also feel a sense of duty to say genuinely what they feel is deficient in their course, in the hope that it might be improved. The tension between these competing effects makes the survey results rather unhelpful.
Destinations of Leavers from Higher Education (DELHE). This is a paper and phone survey conducted by the university, on behalf of the Higher Education Statistics Agency, six months after graduation. The return rate of this survey depends partly on the university's effort in keeping up-to-date contact details of students, as well as priming them to expect the survey rather than treating it as just another annoying marketing survey. Anecdotal evidence suggests that improving the return rate improves the proportion reported as employed in graduate jobs. There is also some freedom in interpreting jobs as graduate positions, and I suspect some universities slant this in their favour. One systematic problem with this survey is that six months is often too early to get a clear picture of graduate destinations. Plenty of students travel, and while doing so might take on casual jobs; others simply concentrate on their finals instead of job hunting, and so take temporary work while they are still looking for a long-term position. Finally it is possible that students may take the same approach I suggested for the student satisfaction survey, giving more positive answers to better the league table position of their alma mater. DELHE survey data used in newspaper league tables is also often out of date.
UCAS points on entry. This data, from HEFCE and reproduced on the UniStats website, is in some sense formulaic, as there is a set UCAS tariff for each grade of each qualification; nevertheless it is not clear that the data is reported in a consistent way. For example, pupils at independent schools more often take General Studies at A-level, although this is often excluded from a university offer. This might raise the UCAS scores for universities that recruit more from independent schools. Also, Scottish universities admitting students mainly on Scottish Highers tend to record larger numbers of points. Finally, for mathematics, especially at Cambridge, Oxford, Warwick and Imperial (COWI), entrance to a mathematics degree requires a STEP paper, or something similar such as an AEA or Oxford entrance exam. The difficulty of getting a certain grade on a STEP paper is hard to judge against UCAS points gained from other subjects.
Examples.
Here is a screenshot of the Guardian table for mathematics taken on 27/12/2008
Table 1: The Guardian University Guide for Mathematics 2009
This table (the link, while it is valid, is here) is sorted on the first column, the “average teaching score”. A link is provided to the methodology, where the average teaching score is described as follows: “For each subject that it teaches, an institution is given a Guardian score, based on these seven measures. The Guardian score for each institution in each subject is derived from a weighted average of the standardised scores for each measure, converted into a scale in which the highest ranked institution gets 100 points and all institutions receive a positive score.” Anyone familiar with mathematics departments in the UK will be surprised to see that Portsmouth and Sheffield Hallam are such desirable places to study mathematics, comparable with Imperial and Warwick for example. We note also that the Guardian does not include data on research performance, as they deem that this is not helpful for undergraduates choosing where to study. This is a position I completely disagree with, and we will return to the issue. Also, we are aiming in this article to be useful to a wider range of interested parties.
The Complete University Guide, as published by the Independent, does include research scores, in the form of the 2001 RAE. We will not dwell on those results as they are now superseded by RAE 2008. But a limitation of using RAE data is that the exercise is performed infrequently, so it tends to give a historical perspective. Although we are lucky to have the latest RAE results, there is clearly a need for indicators that can be updated more regularly, and we will give some below. Here is part of their table (link here).
Table 2: The Independent/Complete University Guide table for maths 2008
Here we see that missing data has been treated in the most favourable possible way: Dundee, despite submitting in only one of the three RAE units of assessment (Applied Mathematics) and having a missing return for the DELHE survey, ranks better than Imperial.
When one narrows the range of interest down to one specific subject area it is possible to obtain data for a range of indicators of both the quantity and quality of activity in mathematical sciences departments. For example, in the case of publications, MathSciNet (MSN), the American Mathematical Society's database of mathematical sciences publications, can identify individual authors (as distinct from other authors with similar names), and its institution code gives the department to which they were affiliated when they wrote the paper. We can also look at grants awarded in mathematics by the EPSRC, and at national mathematics prizes, as useful measures. The allocation of money in the form of the Doctoral Training Account is also a useful annual peer-reviewed indicator, and is obviously related to the number of home PhD students taken on in each department.
Why should a prospective undergraduate, or indeed postgraduate, student be worried about a department's finances? One reason is that you are investing three years of your life in a programme in that department, and at the very least you want to choose one with a secure future. Departments close, or are absorbed into larger amalgamations of departments (a process that is often called Salfordization, perhaps a little unkindly to Salford), largely for financial reasons. A department generates income from three main sources: 1) students – this has two components, funding from HEFCE (or its equivalent in Scotland (SFC), Wales (HEFCW) or Northern Ireland (DEL)) in proportion to the number of home students, and fees from overseas students; for many maths departments the bulk of their income is from students; 2) QR funding, allocated on the basis of the RAE scores and the number of staff; 3) “overheads” (now called Full Economic Costing) from research grant income. Not only does a healthy income, especially relative to other departments in the university, indicate that the department will flourish, but, depending on the way money is distributed in that university, funds from QR and overheads can be used to improve everybody's life in the department. This might include scholarships and bursaries for undergraduate and postgraduate students, more postgraduate demonstrators to help in problems classes, hiring really good staff when the opportunity arises, and providing more or better administrative support. Some universities, most especially Oxford and Cambridge, have other sources of income: colleges fund bursaries and academic posts, and provide tutorial support, for example. More rarely, maths departments gain substantial funding from industrial and commercial partners.
Recent closures of departments include maths at Bangor and Hull, Chemistry and Music at Exeter, and Physics at Reading. Usually a combination of factors determines these closures, including declining student recruitment and research performance lower than the average for that university. I don't want to over-emphasise this: closures of departments are quite rare and student numbers in maths are currently rising. On the other hand the economics seem to favour larger departments.
Prospective students of joint honours degrees should naturally also check the health of the other department in which they intend to study.
The Research Assessment Exercise 2008 looks mainly at the papers published by university academics over the period 2001–2007. Mathematical sciences are divided into three “Units of Assessment” (UoAs): Pure Mathematics, Applied Mathematics, and Statistics & Operations Research. The papers submitted are scrutinised by academics who form a “subpanel” for each of these units, and judged in one of the categories from 4* (“Quality that is world-leading in terms of originality, significance and rigour”) down to Unclassified (“Quality that falls below the standard of nationally recognised work, or work which does not meet the published definition of research for the purposes of this assessment”). A “quality profile” is given for each unit of assessment: roughly, the percentage of the work (typically four papers per academic) in each category. The mathematical sciences subpanels together with computer science form “Panel F”. This panel attempted to make the quality profiles comparable between these four units. It is important to note that you cannot reliably compare quality profiles for mathematical sciences with those for other subjects; Music, for example, might have set a quite different standard for 4*. Actually the papers constitute only 70% of the quality profile. The “research environment” contributes 20%; this covers many things mentioned in the submission written by each department, including research students, infrastructure (libraries, nice new maths buildings etc.) and research income. A final 10% is allocated to esteem indicators: things like how many FRSs and prize winners are on the staff. At the time of writing we do not know the breakdown of each quality profile.
Three main methods are popular for comparing the quality profiles. Let a_k be the percentage of k* outputs; then the Grade Point Average (GPA) is simply the weighted sum 4a_4 + 3a_3 + 2a_2 + a_1. Clearly this is meant to measure quality rather than quantity of activity. The “Research Power” is simply the product of the GPA and the number N of (full-time equivalent) research-active staff submitted: a combination of quality and quantity. Another approach is a “medal table”, in the same spirit as the Olympic Games. Treating 4* outputs as gold medals, the gold medal score is simply N a_4 (such an approach was taken by the Russell Group in this press release). The gold and silver medal score is N (a_4 + a_3). As mathematical sciences covers three units of assessment we have combined the results for each UoA, weighted in the obvious way.
The RAE profiles will determine the “QR” funding, in proportion to a weighted sum that favours 4* more heavily than research power does. The formula is 7a_4 + 3a_3 + a_2. We call this the weighted power. However it does not produce a radically different ranking from research power.
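To make the arithmetic concrete, here is a short Python sketch of the measures just defined. The profile and staff numbers are invented for illustration, not real RAE 2008 data; I have also assumed that the funding-style "weighted power" is scaled by staff numbers N, matching the way Research Power is built from the GPA.

```python
# The RAE 2008 comparison measures described above, with a quality
# profile given as fractions a4..a1 (the article quotes percentages;
# divide those by 100 first) and N full-time-equivalent staff.

def gpa(a4, a3, a2, a1):
    """Grade Point Average: quality only, independent of size."""
    return 4*a4 + 3*a3 + 2*a2 + 1*a1

def research_power(n, a4, a3, a2, a1):
    """GPA times submitted FTE staff: quality combined with quantity."""
    return n * gpa(a4, a3, a2, a1)

def gold_medals(n, a4):
    """Olympic-style gold medal score: 4* outputs scaled by staff."""
    return n * a4

def weighted_power(n, a4, a3, a2):
    """The QR-style weighted sum 7a4 + 3a3 + a2, scaled by staff
    (the scaling by N is my assumption, mirroring research power)."""
    return n * (7*a4 + 3*a3 + 1*a2)

# An invented example profile: 25% 4*, 40% 3*, 25% 2*, 10% 1*.
profile = (0.25, 0.40, 0.25, 0.10)
print(round(gpa(*profile), 2))                 # 2.8
print(round(research_power(50, *profile), 2))  # 140.0
```

Note how a department could trade GPA against research power by choosing N, which is exactly the tactical game discussed below.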
The number N of submitted staff was used tactically, as universities had some discretion as to whom they included as research active. Some institutions chose to submit a large number of staff, for example declaring some postdoctoral staff without a lecturing post as “independent researchers”, or perhaps including staff with lower quality outputs, betting on maximising research power and eventually their funding. Others chose to exclude staff with outputs judged to be of lower quality, to maximise their GPA. In effect N is a mixture of quantity and quality, as it is the number of staff thought to be good enough to be submitted to the RAE. Unfortunately we do not currently have the total numbers of academic staff in each department, which is a shame as it would help in computing, for example, staff-student ratios.
One feature that struck me immediately is that on the whole larger departments score better, even on the GPA, which is meant to measure quality independently of size. We will come back to this point, as well as looking at the exceptions to this trend. The rankings on indicators that do include size are reasonably consistent for the top six. On research power: 1) Cambridge, 2) Oxford, 3) Bristol, 4) Warwick, 5) Imperial, 6) Manchester. The same holds for weighted power and the gold and silver medal score. On the gold medal score: 1) Cambridge, 2) Oxford (but as near as matters equal), 3) Warwick, 4) Bristol, 5) Imperial, 6) Manchester. On these measures Cambridge and Oxford are significantly ahead of the next three. One significant change compared to my study of research performance in 2008 is that Bristol should now firmly be added to the COWI cluster of top universities for mathematical sciences. In terms of research grant income it has been far ahead of COWI for at least the last year, and the RAE now confirms its membership of that club. The order is still debatable but let us for the moment coin the acronym COWBI (at least it is pronounceable, as something like “Coby”). Despite Manchester being in sixth place on these measures I will resist the desire to talk up my own department! I think that Manchester is in a small cluster snapping at the heels of COWBI. It is also worth noting that Manchester is much bigger than the out-of-date HESA statistics indicate, and is on a rising trajectory in both size and quality, but perhaps that might well be said of some of our near rivals. The RAE results must be treated with the caution that they are already out of date. The survey period 2001–07 does not capture recent trends. For example the School of Maths in Manchester was only formed in 2004, and under one roof from 2007. Many other significant changes have happened in other departments during that period.
For example the £13m Oxford-Man Institute of Quantitative Finance was announced in 2007, and the $25m Oxford Centre for Collaborative Applied Mathematics in 2008. This emphasises the need for indicators of research quality and quantity that respond more rapidly, a point we will return to.
The ranking by GPA should be treated with caution, especially for universities that have not submitted in all three UoAs, and those with apparently low staff numbers. The GPA ranking: 1) Oxford, 2) Cambridge, 3) Imperial, 4) Warwick, 5) Bristol, 6) Portsmouth, 7) Aberdeen, 8) Bath, 9) Surrey, 10) Edinburgh, 11) Manchester, 12) Nottingham, 13) KCL, 14) Durham, 15) Southampton. A plot of GPA against N shows an interesting phenomenon: with a few outliers, GPA and size seem to lie approximately on a rising curve. The labels are the rankings by research power. Taking a guess at the form of the curve, plotting log10(32.2 - GPA) against N approximately straightens the graph.
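The straightening transform and the regression residuals can be sketched as follows. I am assuming the expression intended is the subtraction log10(32.2 - GPA), with the constant as printed; the (N, GPA) pairs below are invented for illustration, not the actual RAE figures.

```python
# Transform GPA, fit a line in N by ordinary least squares, and look
# at the residuals. Data is invented: (submitted staff N, GPA) pairs
# shaped like the rising, saturating curve described in the text.
import math

data = [(20, 2.2), (40, 2.6), (60, 2.8), (80, 2.9), (100, 2.95)]

xs = [n for n, _ in data]
ys = [math.log10(32.2 - g) for _, g in data]   # the straightening transform

# Ordinary least squares fit of y = a + b*N.
xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
    / sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

# Residuals from the fitted line. Because the transform decreases as
# GPA rises, a point BELOW this line corresponds to a GPA above the
# trend for that department's size, i.e. "punching above its weight".
residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
```

With rising GPAs the transformed values fall, so the fitted slope b comes out negative; the sign flip is worth keeping in mind when reading the residuals.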
(see below, this time with labels). We see on both these charts that Cambridge and Oxford are outliers, and that Bristol, Warwick and Imperial also seem to form a cluster. I plotted a regression line, and roughly speaking those above the line “punch above their weight”, i.e. their GPA is better than one might predict from their size. This raises the question of why size should be linked to performance. I hazard two fairly obvious reasons. One is that growth follows success, with universities putting more resources into departments that do well, both in recruiting good students and in getting grants and doing well in the RAE. The other is that there are economies of scale: for example, in a larger department it is easier for staff to take regular sabbatical leave, and administrative duties are spread more thinly. So we have two positive feedback mechanisms with delay. Of course there are plenty of other factors at work.
The indistinct cluster after COWBI (Manchester, Edinburgh, Southampton, Nottingham, Leeds, Durham, Bath), which takes us down to 13th place, is a reasonably useful grouping. I am tempted to include Heriot-Watt, who submitted to the RAE jointly with Edinburgh as the Maxwell Institute.
The following table summarises universities where mathematical sciences performs better than would be expected from its size, measured as the difference between GPA and my regression line, and ordered by research power. Bath, for example, submitted in all three units of assessment, and so seems genuinely to punch above its weight. However on the 2006 undergraduate numbers (excluding the Open University) Bath is the 7th biggest, which raises some interesting questions. They did seem to submit most of their staff, so I suspect that other departments have increased their undergraduate numbers faster than Bath has. Portsmouth has always stood out among the universities promoted from polytechnics in 1992. With only 16 staff submitted it is tiny, and its research is of course strong only in specific areas; they submitted only in Applied Mathematics. Aberdeen is another example of “small but good”, in this case submitting only in Pure Mathematics, and Surrey likewise only in Applied.
Some departments are much better than the average for their university, and others noticeably worse. We call them salients and reverse salients. For salient and reverse salient maths departments with respect to the RAE GPA see the separate page.
On the whole other indicators of research that combine quality and quantity tell roughly the same story as research power. We will look at them with some cautions about their interpretation.
Mark Muldoon kindly wrote a Perl script that extracts the total value of EPSRC research grants in mathematics (and statistics) departments. It runs every month on a cron job, so it should update automatically (and will break if EPSRC change their website). The latest version of the table is here. As the script is to some extent written by hand, I had to type in a couple of lines for each university; if a department not listed suddenly gets loads of grants it will not show up unless I add it! You can check the results individually on the EPSRC Grants on the Web site: go to the institution and then select the maths department. If departments have been Salfordized and do not show up as a separate maths or stats department, we cannot easily include them in this table; one would have to go through each grant and decide whether it was really mathematical sciences, or e.g. computer science or physics or whatever else is in that university's School of This, That and The Other. Large programme grants, over £1m, of course have a big effect, and we currently see Bristol, Warwick, Oxford, Imperial, Manchester, then Cambridge. There have been some changes in the order of the last three over the year. Edinburgh in 7th place includes a grant of £4.7m for a high-performance computing centre. Those departments with several millions of pounds include large projects, and we see again the effect of size: a certain critical mass is required to mount a successful bid for a large grant. Sheffield comes out better on this measure than in the RAE, with a £2.2m grant in Statistics. Sheffield also holds the £0.7m grant to run the national MAGIC postgraduate teaching project. More on that later.
As noted, EPSRC grants come with “overheads”, a portion of which, depending on the system in each university, goes into the coffers of the department and funds things other than just that research project. Some limitations of this table: in many cases in applied mathematics, and perhaps statistics as well, a mathematician might have a substantial role in a project as co-investigator, the principal investigator being in another department. As we do not know what proportion of the project is allocated to the co-investigator, such grants do not show up in the table, although they do generate revenue for the maths department. Also there are other funding mechanisms, for example the funding in Oxford from Man Investments and from King Abdullah University of Science and Technology, or the industrial funding at Manchester. There are also important mechanisms (especially important in pure mathematics) such as Leverhulme and Royal Society fellowships. These were of course taken into account in the RAE.
The EPSRC Doctoral Training Account is the main mechanism by which home (as opposed to overseas) PhD students in mathematics get funded. In mathematics this allocation is decided by a panel of mathematicians, whereas in other subjects a formula based on grants held is used. This reflects the fact that plenty of good mathematics is done without grants. The data for the 2007 allocation is on the EPSRC website. Some funds go to non-mathematical-sciences departments that hold mathematical sciences grants. Obviously this is interesting to potential PhD students, as it shows which departments have funding, and gives an idea of the number of students. But of course it does not reflect the numbers of overseas students who have funding from their own country, and some departments will fund postgraduate bursaries from other sources as well. The table should also be of interest to academics thinking of moving to a UK mathematics department, and of course to prospective undergraduates looking for a department with a lively mathematical culture.
The table does not tell such a different story from the RAE or research grants; however it is decided by a peer review panel informed by the latest information on funding and the success of PhD students, as well as other factors, and is updated each year. As such it is useful to study between RAEs.
How many Fellows of the Royal Society currently work in each maths department? It's pretty hard to get this data, but having a few is probably some kind of sign of a healthy, research-active department! Here is the data I have (not counting emeritus and retired staff). There is no systematic list, so I may have missed some people. Also, at Cambridge DAMTP arguably includes some people, Hawking for example, better known as physicists than mathematicians. However even with a tight definition of mathematician Cambridge still comes top of the FRS table.
Fellows of the Royal Society in UK maths departments (incomplete)
University | FRSs | Notes
Cambridge | Barlow, Turok, Coates, Gowers, Kelly | Gowers is a Fields Medallist
Oxford | Ball, J. Ockendon, Silverman, Segal, Birch, Kirwan, James, Lyons |
Warwick | MacKay, Reid, Stewart, Preiss |
Imperial | Donaldson, Hayman, Atkinson | Donaldson is a Fields Medallist
Manchester | Higham, Taylor, Wilkie | Paris is a Fellow of the British Academy
Bristol | Green, Wooley |
Liverpool | Rees | Mazya is a Member of the Royal Swedish Academy of Sciences
Fields Medallists are very rare, so not really useful for league tables. Probably more important, taking a long view, is that a department has a culture that will produce one. For example Donaldson and Atiyah got their Fields Medals while at Oxford. That said, the research culture of a department is likely to be improved by having one there.
It is good for one's research papers to get cited, and ISI Highly Cited researchers are ones that have been cited a lot, an indication that their work is valued by other researchers. Some topics in mathematics, for example numerical analysis, seem to generate more citations than others. But maybe that means these areas are more useful anyway?
ISI Highly Cited researchers in UK Maths Depts
University | Highly cited researchers
Oxford | Ball, Cox, Quillen, Silverman, Trefethen
Cambridge | Clayton, Spiegelhalter, Fokas, Lickorish
Warwick | Roberts, Stewart, MacKay
Imperial | Donaldson, Liebeck
Bristol | Goldstein, Green
Manchester | Higham, Hammarling (hon. post), Dongarra (fractional post)
LSE | Atkinson
King's College London | Davies
UCL | Dawid
Open University | Jones
Queen Mary | Smith
Glasgow | Titterington
The London Mathematical Society Whitehead Prize is interesting in that it is an early-career prize, but most of the recipients go on to great things. The first prize was awarded in 1979, so most of the recipients are still active, and since 1999 four prizes have been given each year, meaning that there are a good few around. Whitehead Prize winners do credit to where they studied and did their early work. It is also interesting to note the departments that are able to attract them to a permanent post: in some way these gifted young mathematicians “vote with their feet”, as they would usually have a choice of position in the UK. The table indicates the number of Whitehead Prize winners working in the mathematical sciences departments of each university. This includes of course many older mathematicians, many of whom will have other honours and prizes. Bristol stands out here in that Browning and Snaith from Bristol both won their prizes in 2008, which I take as an indication of Bristol's upward trajectory.
University | No. of Whitehead Prize winners
The University of Oxford | 14
The University of Cambridge | 10
The University of Warwick | 10
Imperial College of Sci | 6
The University of Manchester | 4
The University of Bristol | 4
University College London | 3
The University of Edinburgh | 2
University of Durham | 2
The University of Bath | 2
The University of Glasgow | 2
The Open University | 2
The University of Nottingham | 1
The University of Leeds | 1
Heriot-Watt University | 1
The University of Sheffield | 1
Queen Mary and Westfield College | 1
The University of Liverpool | 1
The University of Newcastle | 1
The University of York | 1
The University of Strathclyde | 1
King's College London | 1
Cardiff University | 1
The University of Aberdeen | 1
The University of Dundee | 1
Data supplied by HESA gives the number of undergraduate and postgraduate students registered in 2006–07. The very high figure for the Open University includes any student taking mathematics modules, but most do not go on to complete a mathematics degree. The table below lists those above 500. Contacting departments directly, and what can be deduced from university websites, indicates that undergraduate numbers have risen overall, but not uniformly across universities. Manchester and Warwick both have around 500 undergraduates on mathematical sciences single and joint honours in their 2008–09 first year. Certainly Manchester has increased dramatically over the last three years. Oxford and Cambridge tend to have far fewer joint honours students. Bristol has two mathematics departments (one called Engineering Mathematics) and it is interesting that they had so few undergraduates in 2006; one would have expected them to increase in line with their success. For undergraduates, a larger department (more importantly more staff, but that usually follows students) usually means large first-year lectures but more choice of final-year options.
Rank | University | Students (2006–07)
1 | The Open University | 3045
2 | The University of Warwick | 1140
3 | The University of Manchester | 830
4 | The University of Oxford | 805
5 | The University of Bath | 765
6 | The University of Cambridge | 700
7 | Imperial College of Science, Technology and Medicine | 700
8 | The University of Leeds | 650
9 | University of Durham | 640
10 | Queen Mary and Westfield College | 590
11 | University College London | 570
12 | The University of Bristol | 530
13 | The University of Birmingham | 525
14 | The University of Edinburgh | 520
15 | Loughborough University | 505
16 | The University of Nottingham | 505
The UniStats website gives some out-of-date but interesting information, including the mean total UCAS points students have on entry, the percentage of each degree classification (1st, 2.1, etc.), the percentage of male and female students, and the percentage of overseas students. I have included this in my grand table, as it is somewhat tedious to extract from UniStats for each course and university.
Recall that the UCAS tariff gives 120 points for an A grade at A-level, 100 for a B and so on, with half as many points for the equivalent grade at AS-level. For example AAAaa is 480 points and AAAAa is 540 (the average at Cambridge and Oxford). The UCAS tariff includes points for some other qualifications, but not for the STEP papers used as an entrance criterion by Warwick and Cambridge. It is not clear whether there are differences in what is included at different institutions. In any case the UCAS tariff does not reflect the difficulty of the offer precisely, especially for COWI.
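The tariff arithmetic above can be checked with a few lines of Python. The helper name and dictionary here are my own, not part of any official tariff calculator; lower-case letters in the text denote AS-level grades.

```python
# UCAS tariff points per A-level grade: 120 for an A, down in steps
# of 20 to 40 for an E. AS-levels score half the A-level points.
A_LEVEL = {"A": 120, "B": 100, "C": 80, "D": 60, "E": 40}

def tariff(a_level_grades, as_level_grades=""):
    """Total UCAS points for the given A-level and AS-level grades."""
    full = sum(A_LEVEL[g] for g in a_level_grades)
    half = sum(A_LEVEL[g] // 2 for g in as_level_grades)
    return full + half

print(tariff("AAA", "AA"))   # AAAaa: 480
print(tariff("AAAA", "A"))   # AAAAa: 540
```

This also makes the STEP problem obvious: a grade on a STEP paper simply has no entry in the tariff table, so offers built around it are invisible to the points average.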
I have combined much of the data for this study in a big table, opening in another window. To sort by any column click on its heading twice. An explanation of each heading follows.
Column | Meaning
Russell/etc | R indicates a member of the Russell Group; 94 indicates the 1994 Group; P indicates the Million+ group and other former polytechnics
FRS | Number of Fellows of the Royal Society who are members of the academic staff of the maths department(s) (not retired)
ISI HC | Number of full-time-equivalent ISI highly cited authors in the maths department(s)
No Wh P | Number of LMS Whitehead Prize winners in the maths department(s)
UG | Number of undergraduate students in mathematical sciences according to HESA data for the academic year 2006–07
UG/N | UG numbers as above divided by the number of staff submitted as category A in the 2008 RAE
PG | Number of postgraduate students in mathematical sciences according to HESA data for the academic year 2006–07
PG/N | PG numbers as above divided by the number of staff submitted as category A in the 2008 RAE
Grants 6/12/08 | Total value of EPSRC grants held with the PI in the maths department(s) on 6th Dec 2008
G/N | Grant value divided by the number of staff in the 2008 RAE
DTA 2007 | Percentage of the total EPSRC Doctoral Training Account in mathematical sciences allocated to the university in 2007
DTA/N | As above divided by the number of staff in the 2008 RAE
MSN | Number of papers with authors in the maths department(s) in MathSciNet in 2007
MSN/N | Number of papers divided by the number of staff in the 2008 RAE
no cat A | Number of full-time-equivalent category A staff submitted in the 2008 Research Assessment Exercise
GPA | “Grade Point Average” of the 2008 RAE over Pure, Applied and Statistics & OR
Rpow | RAE 2008 “Research Power”: GPA times the number of staff submitted
UCAS | Average UCAS points on entry to mathematical sciences according to UniStats; data from UCAS and HEFCE, and obviously out of date
grad emp | Percentage gaining a graduate job within six months of graduation, according to the DELHE survey
SSS | Percentage of students “satisfied” in the rather spurious student satisfaction survey
male | Percentage of male students according to UniStats
O/s | Percentage of overseas students
1st | Percentage of maths sci students getting a first class degree, according to UniStats
2.1 | Percentage of maths sci students getting an upper second class degree, according to UniStats