In conclusion, it can be seen that these rankings closely resemble those listed in The Times.
- Experimentation to find suitable values of the minimum weights (Appendix B)
The aim of this table was to achieve a “non-biased” ranking that kept all weightings at a minimum level, bar one, in order to ascertain the maximum efficiency a university could achieve if maximum weighting were placed solely on that one variable. Initially the maximum weighting for each category had to be calculated: the weighting was increased until the program would no longer run. A table was then constructed illustrating the efficiency of each university when that one category was measured at its maximised weighting, and Table 2 was then constructed ranking each university by its efficiency under each category.
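On a simplified reading of this procedure, the calculation can be sketched in a few lines of Python. The sketch below is illustrative only: the `scores` matrix is placeholder data standing in for the 12 universities' values on the 9 factors, each university's weighted score is assumed to be capped at 1, and the 0.0001 floor is the minimal weighting level used in Appendix B.

```python
import numpy as np

# Placeholder data: rows = 12 universities, columns = 9 ranking factors.
# In the report these values come from The Times data.
rng = np.random.default_rng(0)
scores = rng.uniform(0.5, 1.0, size=(12, 9))

W_MIN = 0.0001  # minimum weighting kept on every factor

def max_feasible_weight(scores, k, w_min=W_MIN):
    """Largest weighting on factor k such that no university's weighted
    score exceeds 1 while every other factor stays at w_min.  Pushing the
    weighting past this bound makes the program infeasible -- the point at
    which 'the program would not run'."""
    rest = (scores.sum(axis=1) - scores[:, k]) * w_min  # contribution of the other 8 factors
    headroom = 1.0 - rest                               # slack left for factor k, per university
    return np.min(headroom / scores[:, k])

def single_factor_efficiencies(scores, k, w_min=W_MIN):
    """Efficiency of each university with factor k at its maximised weighting."""
    w = np.full(scores.shape[1], w_min)
    w[k] = max_feasible_weight(scores, k, w_min)
    return scores @ w

# Table 2 style: rank the universities on one factor at its maximised weighting.
eff = single_factor_efficiencies(scores, k=0)
ranking = np.argsort(-eff)  # university indices, most efficient first
```

Repeating the last two lines for each of the 9 factors reproduces the structure of Table 2.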
From this table it can be seen that Oxford performs consistently well over all 9 factors, though its weakness lies in student-staff ratio, where it slips from 1st/2nd to 4th out of the 12 universities. Despite this, it still ranks 1st overall.
Cambridge and Imperial also score consistently highly across all criteria, whereas Surrey, Southampton and Bath perform worst on average across the categories, the exception being facilities spend, where Bath, Southampton and Surrey perform considerably better than their average ranking.
Universities between these two extremes, for example Edinburgh and Bristol, fluctuate in rank without settling at either extreme, while the remainder fluctuate more significantly.
This model will be used as a template for understanding the further models investigated below.
- Further analysis (Appendix C)
There are various ways in which DEA could have been investigated in this example: a set number of criteria could have been chosen and examined individually to see how they affect the rankings of the universities, through both individual and combined adjustments.
Instead it was chosen to look at 3 ‘types of person’ looking to go to university, as it stands to reason that different people have different motives for choosing a university. The resulting rankings were then analysed to see if they changed from those of The Times.
I looked at those who might want a high-value degree in order to get a good job, those wishing to pursue a PhD, and those wanting a good time at university and a good job afterwards.
In all cases the weightings were maximised so that efficiency was maximised. A subjective view was taken as to which variables would be used and the size of their corresponding weightings, in order to represent what I thought these types of people would find important to them.
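Each scenario then reduces to the same weighted-sum calculation with a hand-picked weight vector. The sketch below reuses `scores` and `W_MIN` from the earlier sketch; the factor indices and weight values in the example are purely illustrative, as weightings are not quoted for every scenario.

```python
def scenario_efficiencies(scores, chosen, w_min=W_MIN):
    """Efficiency of each university under a scenario's weightings.

    `chosen` maps factor index -> weighting for the factors the scenario
    emphasises; every other factor stays at the w_min floor."""
    w = np.full(scores.shape[1], w_min)
    for k, weight in chosen.items():
        w[k] = weight
    return scores @ w

# Hypothetical example: four factors weighted in decreasing order of
# perceived importance (the values here are illustrative, not the report's).
eff = scenario_efficiencies(scores, {0: 0.004, 1: 0.003, 2: 0.002, 3: 0.001})
ranking = np.argsort(-eff)
```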
3.2.1 Wanting a high-value degree in order to get a good job
4 categories were prioritised over the others for this group, and weighted according to their perceived importance relative to each other (see Appendix C).
Teaching Assessment – considered the most important factor affecting an individual’s education, and subsequently weighted the highest.
Graduate Prospects – deemed second in importance, as it is a clear indicator both of how sought after the graduates are and of the degree’s value in terms of employment.
Good Honours – another important aspect of the degree, ranked 3rd as it attaches a tangible value to it.
Student-Staff Ratio – closely linked with the first factor, included because it reflects whether teaching takes place in large or small groups. It is ranked last because it arguably makes the smallest difference of the four.
From Appendix C it can be seen that the top 3 universities have not moved in league-table position. This is because all 3 have performed consistently well in the 4 categories selected. This is easier to see in Appendix B, where each category’s weighting is maximised individually: Oxford tops the table in 2 of the 4 selected variables (one of which, Teaching Assessment, is the most heavily weighted). It is weakest at ‘student-staff ratio’, but because this is weighted the least, it is not enough to pull Oxford from the top spot.
Further, it can be seen that Cambridge comes top in the second most heavily weighted category and also performs consistently well in the other 3 categories. Imperial College comes 3rd in all bar one of the categories, explaining why it has not moved.
The only other university not to move is Surrey (12th), though it should be noted from Appendix B that a maximised weighting solely on ‘graduate prospects’ could move it up to 8th. It performs consistently poorly in the other 3 variables.
The differences, however, lie in the middle of the rankings. The greatest shift in league-table position is St Andrews, which has dropped from 6th to 11th. This is due to the 4 variables being weighted adversely to its strengths: Appendix B shows where its strengths and weaknesses lie in relation to the other universities. In this scenario its strongest variable, ‘student-staff ratio’, carries the least weight of the 4, while one of its weakest variables carries the second-highest weighting, accounting for its slip in the rankings.
Edinburgh is the other university to slip significantly under these weightings (from 4th to 7th). Again this is likely due to heavy weighting in a “weak” area of the university, in this instance ‘graduate prospects’ (Appendix B).
Conversely, Bristol and Bath have both climbed 4 places in the rankings, largely because both score highly in ‘graduate prospects’, with Bristol also placing 3rd in ‘good honours’.
Therefore students looking to go to university for these reasons would be considering Oxford, Cambridge and Imperial. This matches the ranking given by The Times, but my original DEA model indicates that UCL should be considered as well, since it is joint 1st alongside Oxford, Cambridge and Imperial.
Universities less likely to be considered by these students (under this model) would be St Andrews, Edinburgh and Surrey, though it should be noted that both The Times ranking and the original DEA model placed St Andrews 6th.
3.2.2 Doctorate
This time 2 categories were chosen to represent those wishing to undertake a PhD.
These were ‘research assessment’ and ‘library/computing spend’. Whilst the weightings were designed to represent the importance associated with each category, I was limited by a maximum weighting of 0.0005 for ‘library/computing spend’, whereas ‘research assessment’ could be increased significantly further (its maximum feasible weighting being much higher).
Research Assessment – obviously one of the most important areas for a PhD student when considering their choice of university. It was therefore given the higher of the 2 chosen weightings (0.01598).
Library/Computing Spend – closely linked with the above variable, this too was considered an important factor for a PhD student comparing universities, largely because the library is where a considerable amount of their research is undertaken (0.0005).
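Using `scenario_efficiencies()` from the sketch above, the doctorate ranking follows directly from the two quoted weightings; the column positions of the two factors in the placeholder data are assumptions.

```python
# Doctorate scenario: the two weightings quoted above, all other factors
# held at the 0.0001 minimum.  The column indices are assumptions.
RESEARCH, LIBRARY = 4, 5
phd_eff = scenario_efficiencies(scores, {RESEARCH: 0.01598,  # research assessment
                                         LIBRARY: 0.0005})   # library/computing spend (its maximum)
phd_ranking = np.argsort(-phd_eff)
```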
Looking at Appendix C, it is again immediately noticeable that, as in the prior models, the top 3 universities have not moved; neither has the 12th-ranked university.
Appendix B illustrates the ranking of universities under a single maximised weighting of ‘research assessment’ (0.1119), with all other weightings maintained at the minimal 0.0001 level. From this it can be seen that Oxford, Cambridge and Imperial rank 1st, 2nd and 3rd respectively; this is the same ranking as in this scenario, likely because of the very high weighting associated with this variable. With ‘library/computing spend’ at its maximum (0.0005) the ranking of these 3 universities was Oxford, Imperial, Cambridge, but the inferior weighting assigned to this variable was not enough for Imperial to take 2nd place from Cambridge.
Surrey came 11th on both of these variables, but because the 2 12th places were taken by different universities, Surrey was the weakest overall and was therefore assigned 12th place in the rankings. Surrey was also consistently 11th/12th across the majority of the individually maximised variables.
Significant differences again appear in the middle of The Times rankings. St Andrews again proves contentious: it has dropped once more, this time 3 places, from 6th in The Times ranking to 9th under these weightings. Appendix B shows that, relative to the other universities (and indeed to its own performance across the categories), it ranks low on these 2 categories, and the worse of the 2 carries the higher weighting, accentuating the difference even further.
Conversely, Southampton rose 3 places in the rankings to sit at 8th, compared with The Times ranking of 11th.
Therefore a student considering undertaking a PhD with the above weightings might look at applying to Oxford, Cambridge or Imperial, as these rank highest in this specific model. Beyond those, Edinburgh and UCL might also be considered, as they have swapped places in the rankings yet both remain high in this model. The Times rankings would suggest St Andrews as the next option, but this model illustrates that it may not best suit a PhD student’s requirements, since arguably the two main factors to be considered significantly worsen its position in the rankings. According to this model the universities least likely to be considered are York, Bath and Surrey.
3.2.3 Wanting a good time at university and a good job afterwards
This scenario was chosen for investigation because I was keen to see what a strong weighting on either ‘library/computing spend’ or ‘facilities spend’ would do to the rankings, given that budgets and/or spending in these areas can vary from year to year. For example, if a university were to invest in 2 major new computer labs one year, is it likely to invest as much money in IT the next?
The following 2 categories were chosen to represent the above scenario:
Graduate Prospects – chosen and weighted the most important (0.0019), as it was deemed a key indicator of the number of people successfully finding employment straight after graduation.
Facilities Spend – weighted 2nd, as facilities were considered to include the student union, activities, sports facilities, and generally things that improve the student environment, thereby giving the student a “good time”.
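The same sketch applies here; the graduate prospects weighting of 0.0019 is quoted above, while the facilities spend value and both column indices are hypothetical.

```python
# 'Good time and good job' scenario, again using scenario_efficiencies().
# The facilities spend weighting (0.0010) and the column indices are
# assumptions; only the 0.0019 figure is quoted in the text.
PROSPECTS, FACILITIES = 1, 6
fun_eff = scenario_efficiencies(scores, {PROSPECTS: 0.0019,
                                         FACILITIES: 0.0010})
fun_ranking = np.argsort(-fun_eff)
```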
Looking at Appendix C, it can be seen that the rankings move around considerably compared with the other models, possibly for the reason given above.
Imperial has moved up 2 places in the rankings to 1st. Its strongest asset (facilities spend) carries the higher of the 2 weightings. At first glance it might be asked why Oxford is not 1st, as it ranks consistently 2nd in these 2 categories, with 2 different universities (Imperial and Cambridge) taking the first places. However, looking more closely at the efficiencies, Imperial is so much more efficient than Oxford in ‘facilities spend’ (a difference of 0.117 in efficiency) that the 0.023 by which Oxford leads in ‘graduate prospects’ is clearly not enough to keep Imperial from 1st.
By far the greatest differences are seen at the lower end of the rankings, where the bottom 3 universities – Bath, Southampton and Surrey – have moved up 7, 6 and 5 places respectively, placing them 3rd, 5th and 7th.
Based on these results, students “wanting a good time at university and a good job afterwards” might look at Imperial as the top choice, followed by Oxford. The 3rd-best option is Bath (perhaps the most surprising result of all, considering its prior ranking of 10th), while Southampton and Surrey, the other 2 previously low-ranking universities, now lie in 5th and 7th positions respectively and so might merit closer attention. Edinburgh, UCL and St Andrews are now less likely to be considered, as they each move 4 or 5 places down the rankings. This shows how drastically the entire list of universities can change ranking with a factor likely to fluctuate year on year.
- Conclusion
In conclusion to all of the above scenarios, Oxford appears to be the best option, as it deviates from 1st only once (when compared with The Times rankings), ending up 2nd in the “good time at university and good job afterwards” scenario. Cambridge and Imperial also appear to perform consistently well, though even their positions change in the final scenario.
People cannot be put into boxes, and in a sense DEA could be considered a victim of its own creation: by eliminating so much of the subjectivity associated with ranking, it creates an initial stage of subjectivity, namely that of deciding the weightings. Individuals have different needs and goals, and a model such as this therefore needs to be used cautiously, as evidently there are various other factors that need to be taken into consideration, many of which are unquantifiable without bias.