I am extremely upset about RateMyProfessors.com’s list of the Top 50 Hottest Professors. First, I didn’t make the list, and I’m the hottest damn professor I know! (With co-blogger Tom a close second, of course.) And second, the ranking system doesn’t make a bit of sense.
The professors are ranked by the total number of student evaluators who have given them “chili peppers,” without taking into account the total number of students who have evaluated the professor at all. As a result, the system gives an advantage to professors who teach more classes, who teach larger classes, or who teach at colleges where students make greater use of RateMyProfessors.com. The rankings may not correlate at all with the fraction of students who actually find the professors hot.
The more sensible approach, of course, is to look at the percentage of student evaluators who found a professor hot (possibly excluding professors without enough total evaluations to constitute a good sample). So I created the graph below using chili peppers as a fraction of all evaluations for the (alleged) Top 50.
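For the statistically inclined, here's a rough sketch of the difference between the two ranking schemes. The professors and numbers below are entirely made up for illustration; the point is just that raw chili-pepper counts and chili-pepper fractions can order the same professors very differently:

```python
# Hypothetical data: (professor, chili peppers, total evaluations).
# All numbers are invented for illustration only.
profs = [
    ("Prof. A", 120, 400),  # many large classes, lots of raw votes
    ("Prof. B", 45, 50),    # few evaluations, but nearly all are peppers
    ("Prof. C", 80, 100),
]

MIN_EVALS = 30  # exclude professors with too small a sample

# The site's approach: rank by raw chili-pepper count.
by_count = sorted(profs, key=lambda p: p[1], reverse=True)

# The sensible approach: rank by fraction of evaluators giving a pepper,
# after filtering out small samples.
by_fraction = sorted(
    (p for p in profs if p[2] >= MIN_EVALS),
    key=lambda p: p[1] / p[2],
    reverse=True,
)

print([name for name, _, _ in by_count])     # ['Prof. A', 'Prof. C', 'Prof. B']
print([name for name, _, _ in by_fraction])  # ['Prof. B', 'Prof. C', 'Prof. A']
```

Prof. A tops the raw-count list purely by teaching more students, while by fraction of peppers the order reverses entirely.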
Notice the absence of anything resembling a downward trend. In fact, there’s actually a slight upward trend. And a full 60% of professors on the list scored better than the professor ranked as #1 (sorry, Steve Joordens!). This is an outrage, and I hope RateMyProfessors.com will rectify it forthwith. Without accurate hotness ratings, how will prospective college students know where to enroll?