Friday 26 January 2018

grades - How to measure entropy of exam results


In this answer to a Quora question, the answerer mentions that the 'entropy' of a set of exam results can be used to measure how well the exam differentiates between students.


Should I be computing the entropy of my students' exam results? How do I do it? How should I interpret the entropy information?


Edit: How is entropy related to standard deviation?



Answer



Entropy measures how much information you learn on average about each student from the exam results. For example, imagine an exam on which everyone gets a perfect score. In that case, you would learn nothing, so the entropy is zero. If half the students get one score and half get another, then you learn one bit of information about each of them. If you want to assign meaningful grades on the usual U.S. scale, you'll need at least several bits of entropy, and the 3.5 or 4 bits mentioned in the Quora answer sound reasonable to me.
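Concretely, the Shannon entropy is H = -sum over scores s of p(s) * log2 p(s), where p(s) is the fraction of students receiving score s. If you do want to compute it, here is a minimal sketch in Python (the function name and the sample classes are my own illustration, not from the Quora answer):

    from collections import Counter
    from math import log2

    def shannon_entropy(scores):
        """Shannon entropy (in bits) of a list of exam scores."""
        counts = Counter(scores)
        n = len(scores)
        return sum(-(c / n) * log2(c / n) for c in counts.values())

    # Everyone gets the same score: you learn nothing, so zero bits.
    print(shannon_entropy([100] * 30))              # 0.0

    # Half the class gets 60 and half gets 90: exactly one bit.
    print(shannon_entropy([60] * 15 + [90] * 15))   # 1.0

For scale, 4 bits corresponds to scores spread uniformly over 2^4 = 16 equally likely values.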



The idea behind the answer you link to is perfectly reasonable: if your exam results have low entropy, then the scores are clumped together on too few distinct values, and you can't distinguish well between students. On the other hand, I don't see much point in actually computing a mathematical measure of entropy (e.g., Shannon entropy), except perhaps for fun if you enjoy that sort of thing. Instead, you can just look at the range of scores and judge how well they distinguish between students. Think about how you might assign grades, and you'll quickly see whether you run into problems, without any need for mathematical computations.


Furthermore, doing it by entropy is a little subtle anyway. Strictly speaking, Shannon entropy pays no attention to the distance between scores, only to whether they are exactly equal. That is, you can have high entropy if every student gets a slightly different score, even if the scores are all very close to each other and thus useless for distinguishing students. The Quora answer obliquely refers to this (in the discussion of bins), but it still means you can't just compute a single number without thinking.
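To see that failure mode concretely, here is a sketch reusing the shannon_entropy helper above (the scores and the 10-point bin width are arbitrary choices for illustration):

    # Thirty students with distinct but nearly identical scores: the raw
    # entropy is maximal, log2(30) ≈ 4.9 bits, yet the exam separates no one.
    clumped = [88.0 + 0.05 * i for i in range(30)]   # 88.00, 88.05, ..., 89.45
    print(shannon_entropy(clumped))                  # ≈ 4.907

    # Binning into 10-point grade bands exposes the clumping: every score
    # lands in the same band, and the entropy collapses to zero.
    binned = [int(s // 10) for s in clumped]         # all in the 80-89 band
    print(shannon_entropy(binned))                   # 0.0

So the binned entropy and the raw entropy can tell opposite stories, and the bin width is a choice you have to make deliberately.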


So I'd view entropy more as a metaphor than as a number most professors should compute.


