Building a Better Admissions Test
Most standardized admissions tests -- from the SAT and ACT to those used for admission to graduate and professional schools, such as the Law School Admission Test -- promise one thing: to predict academic success in a student's first year of enrollment. Most standardized tests also face growing skepticism because white and Asian students, on average, tend to outscore black and Latino students.
What if a standardized test predicted much more than first-year success? And what if standardized tests could be designed without ethnic or racial gaps while better predicting long-term success?
Researchers at the University of California at Berkeley have been engaged in a long-term research project to produce such tests for use in its law school -- and they think they have a model that does exactly those things: it predicts success as a lawyer (not just as a first-year student), and it does so across demographic groups. Given that law schools exist to produce lawyers, not first-year law students, Berkeley officials think their findings are significant, and they are now releasing them for public view and -- they hope -- for testing on a national scale.
While Berkeley is at this time calling for more research and not an abandonment of traditional admissions processes, the report it has released suggests that the time is ripe for a major reconsideration of how law students are admitted.
"Rising numbers of law school applicants, concern over litigation and preoccupation with school rankings have pushed overemphasis on the LSAT to the breaking point," says the Berkeley report. "Definitions of 'merit' and 'qualification' have become too narrow and static; they hamper legal education’s goal of producing diverse, talented and balanced generations of law graduates who will serve the many mandates and constituencies of the legal profession. New predictors combined with existing LSAT measures could extend from prediction of law school success to prediction of professional effectiveness in law school admissions."
The Berkeley experiment involved a large-scale effort to identify the qualities that make good lawyers, and then to compare how scores on the Law School Admission Test and on a set of alternative measures correlate with later success as a lawyer. The alternative measures were a range of biographical, personality and "situational judgment" tests. The researchers found that while the LSAT correlates with first-year grades, as promised, it doesn't correlate with later success as a lawyer. Combinations of the other tests do correlate with success as a lawyer -- as defined by the qualities of success measured in the study -- and without racial or ethnic gaps.
In some ways, the experiment is similar to efforts at Tufts University, where traditional admissions measures have been supplemented by questions and exercises designed to capture a much broader range of talents and knowledge. Even the College Board is now working on non-cognitive measures of success.
The Berkeley research project has been going on in one way or another for about 10 years. Boalt Hall, Berkeley's law school, experienced a steep drop in black and Latino enrollments when California barred affirmative action in public higher education admissions decisions, and that in turn prompted a debate about definitions of merit and particularly standardized tests.
"To admit primarily on the basis of LSAT test scores and grades to a professional field that has great importance to our society, seemed short-sighted," the report on the new research says. "Lawyering requires a variety of talents and skills beyond those represented in these important, but limited, measures. Over subsequent years, the emphasis on the LSAT plus grades has actually grown with the advent of such highly publicized rankings as the U.S. News & World Report for whom entering class median LSAT scores are a key factor. These trends were playing out against a desire on the part of law schools to train a diverse population of legal practitioners, a goal that overemphasis on purely cognitive measures suppressed."
So the law school, accepting the premise that a national comparison is needed, had Berkeley professors devise the research project whose results have now been released. First they looked for models to "predict effective lawyering," conducting a series of interviews and surveys with more than 2,000 Berkeley law alumni. Eventually, they focused on 26 factors that relate to success as a lawyer. The researchers then compiled a series of additional tests that could be used to evaluate prospective law students -- these included existing personality tests such as the Hogan Personality Inventory and the Motives, Values, Preferences Inventory. Customized tests were also developed, including a situational judgment test and an analysis of biographical information. The researchers obtained LSAT scores, law school grades and demographic information for 1,148 alumni.
The researchers then compared these alumni's scores on the non-cognitive tests and measures, along with their law school performance and LSAT scores, against their later success as lawyers. The results suggested that the non-traditional measures could predict success as a lawyer, while LSAT scores could not. In particular, the research found strong correlations between the situational judgment and biographical measures and success as a lawyer. Notably, no racial or gender differentials arose from the new potential admissions criteria.
Ellen Rutt, associate dean for admissions at the University of Connecticut's law school and chair of the Law School Admission Council, which runs the LSAT, said that the Berkeley results were "interesting" and were being discussed by the council. She said it was "too early" to know where the research might lead.
At the same time, Rutt called the LSAT an "enormously powerful tool" for predicting the success of first-year law students, and questioned whether the measures in the Berkeley study could do the same. "What lawyers do on a day-to-day basis may be very different from what a first-year law student does," she said. Rutt said her hope was that the new testing approach was "not meant to supplant the LSAT," but perhaps to provide "useful information" on top of it.
Robert Schaeffer, public education director of the National Center for Fair and Open Testing, said he was intrigued by the new research and hoped for additional work on this method of testing. He said his organization "welcomes all attempts to develop new measures that are as good or better predictors of meaningful performance while imposing a less disparate impact on historically excluded groups."