Thursday 24 May 2018

graduate school - Rankings for university departments


The Internet is so littered with university rankings and comparisons that it is very easy for an applicant to lose track of what exactly they want to know. Rankings that compare entire universities and assign a single overall rank are not very useful from an applicant's perspective, yet these are the most common rankings one can find. I know of the FT rankings, which order MBA schools; they do rank business schools by research as well, but lump the different branches (Accounting, Finance, Operations, etc.) together.




  1. For other departments, are there reputable and reliable portals where one can find rankings based on various criteria, such as published research, number of graduates, time taken to graduate, etc.?




  2. Are there no well-established ranking systems for academic departments? For example, such systems could resemble the rankings used in cricket or football: periodic updates made with every publication and every citation, with journal reputation possibly also brought into the picture. Would that not enhance competitiveness among similar departments in academia?






Answer



The big difficulty with devising formal ranking systems based on numerical measures is that, outside of a handful of areas like sports, anything we can measure is at best a proxy for what we really care about. It may start off as a pretty accurate reflection, but anyone judged on this basis will quickly discover how to manipulate it.


For example, universities in the US are often judged partially by the fraction of applicants they accept. Of course, ambitious universities have adapted by advertising to encourage more applications, not because they intend to accept these applicants, but simply to lower the acceptance rate by increasing the denominator.
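To see the arithmetic, here is a minimal sketch in Python; the acceptance_rate helper and the numbers are purely hypothetical. Soliciting extra applications shrinks the reported rate even though the number of students admitted never changes.

# Illustrative arithmetic only; the numbers below are hypothetical.
def acceptance_rate(admits, applications):
    """Acceptance rate = admitted students / total applications received."""
    return admits / applications

admits = 2_000          # incoming class size, held fixed
honest_pool = 20_000    # applications before the marketing push
inflated_pool = 35_000  # applications after soliciting applicants it won't accept

print(f"before: {acceptance_rate(admits, honest_pool):.1%}")    # before: 10.0%
print(f"after:  {acceptance_rate(admits, inflated_pool):.1%}")  # after:  5.7%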


Similarly, universities are also judged based on "yield," the fraction of admitted students who attend. That sounds at first like a pretty good measure of popularity, but it creates an incentive to game the system by rejecting students you think are likely to choose another university in the end, and universities do just that.


Time taken to graduation can be gamed by kicking out students who are taking too long. Employment rates can be gamed by offering ten-week temporary jobs to unemployed students, timed to coincide with the measurement of employment rate (really!). All sorts of things can be gamed.


And this is not just a theoretical problem. It occurs all the time in practice (if you are in the US, then your school is very likely doing some of these things), and some people are extremely upset about it.


You might think scholarly measures based on citation counts would be less subject to this, but they are not. There are plenty of corrupt journals where editors put pressure on people to cite papers, or even use totally fraudulent methods like publishing review articles that carefully cite every paper the journal has recently published. People have been caught seriously distorting their journals' impact factors by doing this, and I'm sure there are other, more clever editors who are getting away with it. If professional success depends on influencing a number, then people will discover ways to influence it.
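To show why coerced citations move the needle, here is a minimal sketch of the standard two-year impact factor calculation, again with purely hypothetical figures: every citation an editor extracts lands directly in the numerator while the denominator stays put.

# Illustrative arithmetic only; all figures are hypothetical.
def impact_factor(citations_to_recent_items, citable_items):
    """Two-year impact factor: citations received this year by items published
    in the previous two years, divided by the count of those citable items."""
    return citations_to_recent_items / citable_items

citable_items = 200         # papers the journal published over the last two years
organic_citations = 300     # citations those papers attracted on their own
coerced_citations = 150     # citations added by coercion and self-citing reviews

print(f"honest: {impact_factor(organic_citations, citable_items):.2f}")                      # honest: 1.50
print(f"gamed:  {impact_factor(organic_citations + coerced_citations, citable_items):.2f}")  # gamed: 2.25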


In summary, there's an awful tension between transparency and resistance to fraud: if you explain how your ratings work, then people will manipulate them. Nobody has any idea how to avoid this, and the net effect is that serious scholars do not waste time trying to compile numerical rankings. For the most part, the only people who do are those who are naive, trying to make money in unscrupulous ways, or trying to promote a cause through carefully chosen rating methods. The rankings they produce are not worth paying attention to.


P.S. Polls of expert opinion are generally much better than rankings based on numerical measures, but even they have their problems. For example, the U.S. News college rankings are based partly on asking college presidents to rank other schools. Clemson University manipulated the rankings by rating all competing schools as below average, no matter how good they were. I suspect they weren't the only ones to do this; the amazing thing isn't that it happened, but rather that we ever found out.


