Where’s the Evidence? Do Employers React to Grade Inflation?

Joshua D. Wright | Truth on the Market | June 23, 2010

All the rage around the law blogs this week is the question of whether law schools should be engaging in grade inflation.  The issue arises from time to time.  The NYT kicked off the most recent round with its story on the (gasp) retroactively applied GPA bump given to Loyola LA law students.  You can’t miss the discussion on the law blogs (here, here, here and here).  Whether grade inflation helps or hurts students on the job market depends on how employers adjust to different curves across schools in their hiring practices, as well as how they adjust to changes in the curve at a particular school.  One plausible hypothesis is that neither differences between schools’ curves nor changes within a single school affect hiring decisions, because legal employers adjust by relying on alternative measures like class rank or by updating their priors about what a particular GPA from a particular school says about the quality of the candidate.  Call this the irrelevance hypothesis.

The conventional wisdom seems to be that the irrelevance hypothesis doesn’t hold.  Let the anecdote stacking begin!  The NYT story itself notes that “in the last two years, at least 10 law schools have deliberately changed their grading systems to make them more lenient … to rescue their students from the tough economic climate.”  This view appears to match the intuition of many a law blogger.  Over at the Conglomerate, Christine Hurt suggests that small and out-of-town firms may not know about the variance in grade distributions across schools, and that individual schools have an incentive to inflate.  Howard Wasserman notes that:

The problem is an (anecdotal) strong resistance in the legal market to do so. Part of the push to change here came because our dean’s conversations with people in the hiring market convinced him that GPA was the be-all-end-all and class rank did not matter. As a relatively new, lower-tiered school, firms are interested only in our very top students. But many firms seemed to say that a 3.3 GPA was not high enough for them to look at, even if that person was # 3 in the class.

There is a battle of intuitions and priors about the legal market going on here without much evidence.  One underlying assumption about legal employers, or some set of legal employers, is that firms are either not paying attention or are irrationally committed to metrics like GPA when more informative measures like class rank are available.  And of course, the fact that 10-12 schools have run this experiment, presumably with some educated guess as to how employers would react, itself contains valuable information.  As an interesting side note, the Loyola LA Dean’s announcement discussing the change goes the other direction, citing the fact that employers pay “very close attention” to these numbers as a reason for the change!  Of course, even if the irrelevance hypothesis passes muster, it does not strictly follow that moving the mean GPA upward is a bad thing for reasons unrelated to job outcomes.

But let’s get back to basics: invariably, the cost/benefit analysis of grade inflation comes down to whether these changes have an impact on the job market.  Are employers adjusting?  Is there a difference in the reaction time of out-of-town and local firms?  Does the adjustment have a short-term effect while employers figure it out, with things then returning to normal?  Do schools that inflate multiple times pay a reputational penalty on the job market as employers grow frustrated with the gamesmanship?  Is a student who finishes with the same class rank but a higher GPA than a student at a similarly ranked law school better off in some meaningful way?  Do students at the top of the class suffer from these experiments in inflation because it becomes harder for them to stand out from their peers on GPA alone?

The NYT story reports that at least 10 schools have made these changes; one source puts the number closer to 12.  These changes seem like excellent opportunities for empirical testing of some of the underlying assumptions floating around out there about both consumers and producers in the legal education market.  The anecdotes have some limited usefulness, but the same anecdotes have been around for quite some time, and perhaps there is an opportunity to move beyond anecdote and toward empirical analysis here?  Interest in the reform of legal education and outcomes seems to be on the rise in academic circles, though this is not an area I’m too familiar with.
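
To make that concrete, here is a minimal sketch, in Python, of the kind of difference-in-differences comparison such a study might run.  Everything in it is hypothetical: the data are simulated, the variable names (employed_share, inflated, post) are stand-ins for whatever placement and curve-change data a researcher could actually assemble, and the true effect of inflation is set to zero by construction.  It illustrates the shape of the test, not a finding.

    # Minimal difference-in-differences sketch. All data are simulated and all
    # variable names are hypothetical; this shows the shape of the test only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)

    # Toy panel: 40 schools, 6 graduating classes. Half the schools "inflate"
    # their curve starting in year 3; the other half never change.
    df = pd.DataFrame(
        [(s, y) for s in range(40) for y in range(6)],
        columns=["school", "year"],
    )
    df["inflated"] = (df["school"] < 20).astype(int)   # treated group
    df["post"] = (df["year"] >= 3).astype(int)         # after the curve change
    df["treat_post"] = df["inflated"] * df["post"]     # diff-in-diff interaction

    # Simulated outcome: share of the class employed at graduation. The true
    # effect of the curve change is zero here by construction.
    df["employed_share"] = (
        0.70
        + 0.02 * df["inflated"]            # baseline gap between the two groups
        + 0.01 * df["year"]                # common time trend
        + rng.normal(0, 0.03, len(df))     # noise
    )

    # The coefficient on treat_post is the diff-in-diff estimate: the change in
    # outcomes at inflating schools relative to non-inflating schools.
    model = smf.ols("employed_share ~ inflated + post + treat_post", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["school"]}
    )
    print(model.summary().tables[1])

With real data, one would of course want school and year fixed effects, longer pre- and post-periods, and some attention to the fact that schools choosing to inflate are not a random sample, but the basic comparison is the one sketched above.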

Do we have data that can help us answer these questions?  What’s the evidence?