A dean of admissions at a law school regularly runs into his university’s president in the parking lot, and the president always asks the same question: “How are our LSATs going?”
That anecdote is one of many in Engines of Anxiety: Academic Rankings, Reputation and Accountability, which the Russell Sage Foundation just published. The book provides example after example of how the law school rankings of U.S. News & World Report lead admissions officers, law deans and university presidents to obsess about the pecking order and standardized test scores, which are seen as the speediest way to move up in the rankings. There have been plenty of analyses of the negative impacts of rankings, but this one is based on more than 200 interviews -- all anonymous to encourage openness -- with admissions officers, deans and others about how they view and try to game the U.S. News rankings. The authors suggest their analysis has relevance for other rankings, including the U.S. News undergraduate reviews.
The authors of the book are two sociology professors: Wendy Nelson Espeland of Northwestern University and Michael Sauder of the University of Iowa. In interviews, both said that, as sociologists, they were attracted to the topic because of their interest in the quantification of quality, the use of numbers to create status and the anxiety of professionals over how they are evaluated.
In their research, they heard a lot of bashing of U.S. News rankings. Admissions deans said the rankings were “evil,” or they just exclaimed, “Hate them.” One admissions dean at a top law school called U.S. News, based on its rankings, “a half-assed, shitty magazine.” These same law school officials, however, described how they do whatever is necessary to do well in the rankings.
Robert Morse, who heads the college rankings division at U.S. News, declined to comment on the book, saying he had not yet read it. He did respond to questions from the authors, who said he provided data that helped their research.
The LSAT Above All Else
In interviews with law school admissions directors and faculty members who serve on admissions committees, the authors found an overwhelming focus on Law School Admission Test scores -- above everything else and sometimes regardless of other indications of whether an applicant would be a good or bad law student or lawyer. All producers of standardized admissions tests (and most admissions officials) say such exams should be used in concert with other measures and not dominate the process. The new book suggests this is not the case, in part because LSAT scores count more than undergraduate grades (12.5 percent vs. 10 percent) in the U.S. News methodology.
One faculty member of an admissions committee described being told by the dean of admissions that he would not reject anyone with LSAT scores above a certain level -- regardless of other indications of an applicant’s appropriateness. The admissions director said he needed these high-LSAT applicants “to keep the numbers up.” Further, faculty members described being told by an admissions dean pushing for high-LSAT students, “I don’t care what the committee says.”
Other admissions directors reported that their law schools give them specific targets for an LSAT average, typically higher than the average of previous years, that they are expected to hit.
While the pressure is most intense at law schools that aren’t at the top of the prestige hierarchy, the trend extends even to those at the top, the book says. (While protecting anonymity, the book frequently identifies the tiers of the law schools from which admissions and other officials are quoted.)
“You’re not going to be able to push your [grade point average] up very much, and the GPA doesn’t count as much as LSAT anyway,” said one faculty member involved in admissions work at a top law school. “And what [my law school] has done is basically focus its entire decision making on [the] LSAT score. It hasn’t done this formally, but the dean basically controls who is on the admissions committee and makes sure the people on the admissions committee will admit people primarily on the basis of LSAT.”
The point about this being a policy that is not public is common among the law schools in the book. Generally, no law school publicly admits to an emphasis on the LSAT along the lines that admissions deans (and law school deans) freely describe in the book. (In many ways, the findings mirror those of Julie Posselt’s Inside Graduate Admissions, published in January by Harvard University Press, which noted the way Ph.D. admissions committees at top graduate programs focus on the GRE far more than they admit to in public.)
By focusing on LSAT scores, admissions officials said they realized they were making decisions that depressed the enrollments of black and Latino applicants, who on average earn lower LSAT scores than do white and Asian applicants. In part this happens by awarding more and more non-need-based scholarships based on LSAT scores, meaning that large awards are being made to wealthier white and Asian applicants (compared to the pool as a whole).
But admissions directors told the authors that the largest way the LSAT emphasis hurts black and Latino applicants is that law school officials fear they are judged by LSAT averages -- not just in the U.S. News points awarded for test scores, but in the general reputation of a law school. (And the U.S. News methodology gives 25 percent of a ranking to a “peer assessment” in which law school officials are polled about other law schools.)
“The most pernicious change is that I know a lot of schools who have become so driven by the LSAT profile that they’ve reduced the access of people who are nontraditional students,” said one law school official quoted in the book. “The higher [the] echelon you are, the more worried you are that if you let your student [LSAT] numbers slide to reflect your commitment to diversity, you’re going to be punished in the polls for that.”
The pressure to do well in the rankings, the book says, quoting deans and others, isn’t just a matter of law school deans leaning on admissions directors. Deans report a trickle-down effect in which university presidents pressure them, in part by citing the trustee and alumni pressure the presidents themselves receive.
Likewise, the book says law schools will do just about anything to game the system. For instance, U.S. News also gives points for whether law graduates find jobs and whether they find jobs for which a law degree is needed. With the market tightening for new lawyers, especially from nonelite law schools, law school career center officials reported being under pressure to get students jobs, any jobs, rather than focusing on which positions would be a good fit.
One director of a career center told the authors the pressure has reached the point that career counselors might say: “Can you get a job in the beauty salon painting nails until these numbers are in?” (The time reference is to when the law school would report its job placement figures.)
U.S. News and the American Bar Association have toughened their rules on reporting job placement in recent years such that a beauty salon job doesn’t count in the same way as a job at a law firm. But Sauder, in an interview, said such shifts don’t seem to scare law school officials. “It’s a continual game,” he said. “Where U.S. News tries to improve the measure, people find ways to game the measures.”
Could Rankings Be Improved?
The conclusion of the book argues that most of what the authors document in law schools applies in various ways to other parts of higher education in the United States and the world.
Still, the authors said, there could be ways that the law school rankings, and other rankings, might be improved.
Espeland said in an interview that she thought one of the biggest problems with rankings was the use of ordinal numbers to suggest a precision that simply doesn’t exist. This misleads prospective students and also creates more pressure on law school officials, she said.
The greatest pressure she saw in the interviews was at law schools that could be either 49th or 51st in U.S. News and were scrambling to do whatever they could to get the former rating instead of the latter -- even if the measures had nothing to do with actual quality. There is a sense that being in the top 25, or top 50, matters a lot. And the ordinal rankings convey a false sense that the 49th-best law school (per a ranking) is actually better than the 51st, she said.
Sauder suggested several ways to improve rankings. One would be to divide law schools by mission. Currently, he said, a law school that saw its mission as diversifying the legal profession, or as producing more lawyers engaged in helping low-income people, would likely be ranked low, since such a mission would be advanced by looking beyond LSAT scores. Such institutions might be better compared to one another, he said.
Further, Sauder said, rankings would be better if they had competition. He has done other projects studying business school rankings, and while he found many of the same problems, the existence of multiple rankings appears to relieve the pressure to conform to the standard of any one of them, he said.
Still, Sauder doesn’t hold out much hope for reform. He hopes drawing attention to the impact of rankings may make educators more reflective about them, but he doesn’t expect them to disappear. “Rankings are here to stay,” he said.