The current existential threat to many law schools represents the canary in the coal mine for higher education.
Law schools have long enjoyed budget surpluses, and the universities in which they sit have benefited. But over the last few years, the financial situation of most law schools has reversed. Facing multiple years of declining enrollment and public support, alongside increasing costs and tuition discounting, law schools often are no longer a source of surplus revenue. Many law schools now rely on financial support from their universities to stay afloat. This reversal is a harbinger for the rest of higher education, which is beginning to face some of the same challenges.
The steps law schools are taking -- in the hope they can survive just long enough for precrisis status quo conditions to return -- represent a doubling down on their traditional strategies. What's so punishing is that because the precrisis status quo is gone forever, these moves only worsen the overall outlook for the sector.
As we write in our new research paper published by the Clayton Christensen Institute, “Disrupting Law School: How Disruptive Innovation Will Revolutionize the Legal World,” the precrisis status quo is gone in large part because of the disruption of the traditional business model for the provision of legal services. Simply put, disruption is lessening the need for lawyers, which means law schools are producing too many lawyers for positions that increasingly do not exist.
Disruptions are bringing three significant changes in the legal services market.
First, from LegalZoom to Rocket Lawyer, more affordable, standardized and commoditized services now exist in an industry long dominated by opaque, highly customized and expensive offerings only accessible on a regular basis to a limited part of the population.
Second, from ROSS to the Practical Law Company, to e-discovery and predictive coding, disruptive innovations are allowing traditional law firms and general counsel’s offices to boost their productivity and perform the same amount of work with fewer lawyers. New technologies are able to do tasks that lawyers -- particularly entry-level lawyers -- performed traditionally. This is hollowing out the job market for newly minted lawyers.
And third, disruptive innovations are breaking the traditional rationale for granting lawyers a monopoly on the practice of law. Just as disrupters like Southwest Airlines and Uber changed who could operate in highly regulated industries, if a nonlawyer aided by software can provide the same service as a lawyer, then it is not the public but the lawyers who are being protected by the legal profession’s monopoly on the provision of legal advice.
State regulators of bar licensure are taking note. Some states are beginning to experiment with providing non-J.D.s limited licenses to provide legal services that until now only J.D.s could provide. The state of Washington was the first to license legal technicians -- non-J.D.s who are specially trained to advise clients in a limited practice area, in this case family law. Akin to a nurse-practitioner, under new regulations, a limited license legal technician (LLLT) can perform many of the functions that J.D.s traditionally performed. Only two years old, this new model is already gaining traction outside of Washington; the bars in California, Colorado, Massachusetts, New York, Oregon and Utah are each considering similar steps.
Because there are fewer jobs for lawyers, fewer people are seeking to enroll in law schools -- hence the crisis.
When disruption is afoot, incumbents typically remain tethered to their longstanding habits to sustain themselves. In the context of an increasingly competitive marketplace for law students, this is playing itself out in a quest to retain prestige in the legacy system for ranking law schools, the U.S. News & World Report rankings. Law schools continue to chase prestige by luring students whose LSAT scores and undergraduate grade point averages will help them move up the rankings. They are attracting students by offering tuition discounts -- during the 2013-14 school year just under 40 percent of law students paid full tuition.
But this push to retain prestige in turn reduces revenues and places the schools in a vicious cycle as the expenditures to remain competitive and improve continue to escalate, as has been true in all of higher education.
Lawsuits challenging the veracity of law schools' claims about job placement are increasing, and a single verdict against a law school could open the floodgates that much wider.
On top of all these challenges, higher education itself is, of course, seeing a variety of potential disrupters emerge, all powered at least in part through online learning.
To this point, disruptive innovators have not directly attacked law schools by offering new versions of a legal education. But were entities to emerge that paired online learning, with its flexibility and competency-based learning attributes, with place-based boot camp-type clinical experiences that trained students to practice law in a more affordable and practice-oriented fashion, the pressure on law schools would only increase.
We see four possible solutions for nonelite law schools.
First, launching an autonomous entity is a proven way to combat the impact of disruption. By harnessing an existing law school’s superior resources to pioneer the disruption and create enough separation so the parent entity’s existing processes and priorities do not stifle the new entity, a law school-based educational start-up could itself become the first disrupter.
Second, schools could use online learning technologies as a sustaining innovation to improve learning and control costs. By blending online learning with face-to-face instruction, law schools could incorporate more active learning and professional skills development into the existing three-year educational model.
Third, they could specialize by creating programs that allow J.D. students to focus deeply on a particular area of law. Students could learn core subjects through online, competency-based programs and their in-person experience would focus on extensive training in a particular area of law through experiential learning courses, live-client clinics, simulations, capstones, directed research and writing, moot court and trial advocacy exercises, and field placements.
And, finally, innovative law schools could build new, non-J.D. degree programs that specialize in training students for careers that combine elements from law, business and government -- in international trade, for example -- but do not fit neatly into existing law, business or government schools and are less time-consuming and expensive than, say, a joint J.D.-M.B.A. Or they could offer new credentials that prepare non-J.D.s for the many fields that intersect with the law but do not require a J.D. degree, such as regulatory compliance.
The future is coming for law schools; the question is whether law schools themselves will play a role in shaping that future or be shaped by the cascading circumstances surrounding them.
Michele R. Pistone is a professor of law at Villanova University's Charles Widger School of Law and an adjunct fellow at the Clayton Christensen Institute. Michael B. Horn is a cofounder and distinguished fellow at the Clayton Christensen Institute and a principal consultant for Entangled Solutions, which offers innovation services to higher education institutions.
The Education Department has released a new list of colleges and universities under heightened cash monitoring, which means that they are subject to far greater oversight than other colleges in federal student aid programs. A variety of compliance issues can land a college on the list. Most of the colleges on the list are for-profit institutions.
The latest list, which has 513 colleges on it, was current as of June 1 and may be found here. That total is down slightly from the 528 colleges on the March 1 list.
The Higher Learning Commission placed MacMurray College on probation and extended the probation of Urbana University at the agency's June meeting, the accreditor announced. The president of MacMurray criticized the accreditor's action as "unjust," saying the agency failed to give the college sufficient time to fix problems created under a previous administration.
The commission also granted accreditation to the reborn Antioch College, in Ohio.
A national outcry regarding the cost of education and the poor performance of institutions in graduating their students has raised questions about the extent to which accreditors are fulfilling their mission of quality assurance. Politicians have expressed outrage, for instance, at the fact that accreditors are not shutting down institutions with graduation rates in the single digits.
At the same time, accreditors and others have noted that the graduation data available from the National Center for Education Statistics' Integrated Postsecondary Education Data System, familiarly known as IPEDS, include only first-time, full-time student cohorts and, as such, are too limited to serve as the yardstick for institutional success -- or for accreditation. But simply noting this problem does nothing to solve it. The imperative and challenge of getting reliable data on student success must be more broadly acknowledged and acted upon. The WASC Senior College and University Commission (WSCUC) has taken important steps to do just that.
As is well known, IPEDS graduation rates include only those students who enrolled as first-time, full-time students at an institution. Of the approximately 900,000 undergraduate students enrolled at institutions accredited by WSCUC, about 40 percent, or 360,000, fit this category. That means approximately 540,000 students in this region, including all transfer and part-time students, are unaccounted for by IPEDS graduation rate data.
The National Student Clearinghouse provides more helpful data on student success: its cohorts include part-time students as well as full-time students and those who combine the two modes, and its data cover students who are still enrolled, have transferred and are continuing their studies elsewhere, or have graduated elsewhere. Six-year student outcomes, however, are still the norm.
Since 2013, WSCUC has worked with a tool developed by one of us -- John Etchemendy, provost at Stanford University and a WSCUC commissioner -- that allows an institution and our commission to get a fuller and more inclusive picture of student completion. That tool, the graduation rate dashboard, takes into account all students who receive an undergraduate degree from an institution, regardless of how they matriculate (first time or transfer) or enroll (full time or part time). It is a rich source of information, enabling institutions to identify enrollment, retention and graduation patterns of all undergraduate students and to see how those patterns are interrelated -- potentially leading to identifying and resolving issues that may be impeding student success.
Here’s how it works.
WSCUC collects six data points from institutions via our annual report, the baseline data tracked for all accredited, candidate and eligible institutions and referenced by WSCUC staff, peer evaluators and the commission during every accreditation review. On the basis of those data points, we calculate two completion measures: the unit redemption rate and the absolute graduation rate. The unit redemption rate is the proportion of units granted by an institution that are eventually “redeemed” for a degree from that institution. The absolute graduation rate is the proportion of students entering an institution who eventually -- a key word -- graduate from that institution.
The idea of the unit redemption rate is easy to understand. Ideally, every unit granted by an institution ultimately results in a degree (or certificate). Of course, no institution actually achieves this ideal, since students who drop out never “redeem” the units they take while enrolled, resulting in a URR below 100 percent. So the URR is an alternative way to measure completion, somewhat different from the graduation rate, since it counts units rather than students. But most important, it counts units that all students -- full time and part time, first time and transfer -- take and redeem.
Interestingly, using one additional data point (the average number of units taken by students who drop out), we can convert the URR into a graduation measure, the absolute graduation rate, which estimates the proportion of students entering a college or university (whether first time or transfer) who eventually graduate. Given the relationship between annual enrollment, numbers of units taken in a given year and the length of time it takes students to complete their degrees -- all of which vary -- the absolute graduation rate is presented as an average over eight years. While not an exact measure, it can be a useful one, especially when used alongside IPEDS data to get a more nuanced and complete picture of student success at an institution.
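The arithmetic behind these two measures can be sketched with a simplified model. Everything below -- the function name, parameters and numbers -- is our illustrative assumption, not WSCUC's actual dashboard formula:

```python
# Illustrative sketch of the two dashboard measures described above.
# The variable names and exact formulas are assumptions for
# illustration -- WSCUC's actual methodology may differ.

def completion_measures(total_units_granted,
                        units_redeemed_for_degrees,
                        avg_units_per_graduate,
                        avg_units_per_dropout):
    """Estimate the unit redemption rate (URR) and an absolute
    graduation rate from aggregate, cohort-free institutional data."""
    # URR: share of all units granted that were eventually
    # "redeemed" for a degree at this institution.
    urr = units_redeemed_for_degrees / total_units_granted

    # Convert units back into estimated headcounts.
    graduates = units_redeemed_for_degrees / avg_units_per_graduate
    unredeemed = total_units_granted - units_redeemed_for_degrees
    dropouts = unredeemed / avg_units_per_dropout

    # Absolute graduation rate: estimated share of all entering
    # students (first time or transfer) who eventually graduate.
    agr = graduates / (graduates + dropouts)
    return urr, agr

# Hypothetical institution: 1,000,000 units granted over the window,
# 800,000 of them redeemed for degrees; graduates average 120 units,
# students who leave without a degree average 40.
urr, agr = completion_measures(1_000_000, 800_000, 120, 40)
print(f"URR = {urr:.0%}, absolute graduation rate = {agr:.0%}")
```

Note how the two measures can diverge: a high URR with many low-unit dropouts still yields a middling graduation rate, which is exactly why counting units and counting students tell complementary stories.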
What is the advantage to using this tool? For an institution like Stanford -- where enrollments are relatively steady and the overwhelming majority of students enter as first-time, full-time students and then graduate in four years -- there is little advantage. In fact, IPEDS data and dashboard data look very similar for that type of institution: students enter, take roughly 180 quarter credits for an undergraduate degree and redeem all or nearly all of them for a degree in four years. For an institution serving a large transfer and/or part-time population, however, the dashboard can provide a fuller picture than ever before of student success. One of our region’s large public universities has a 2015 IPEDS six-year graduation rate of 30 percent, for example, while its absolute graduation rate for the year was 61 percent.
What accounts for such large discrepancies? For many WSCUC institutions, the IPEDS graduation rate takes into account fewer than 20 percent of the students who actually graduate. The California State University system, for example, enrolls large numbers of students who transfer from community colleges and other institutions. Those students are counted in the absolute graduation rate, but not in the IPEDS six-year rate.
Because the dashboard includes IPEDS graduation rate data, along with the percentage of students who fall into the first-time, full-time cohort, it makes it possible to get a better picture of an institution's student population -- and of the extent to which IPEDS data are reliable indicators of student success at that institution.
Here’s an example: between 2006 and 2013, the IPEDS six-year graduation rate at California State University Dominguez Hills ranged between 24 percent and 35 percent. Those numbers, however, reflect only a small percentage of the university’s student population. The low of 24 percent in 2011 reflected only 7 percent of its students; the high of 35 percent in 2009 reflected just 14 percent. The eight-year IPEDS total over those years, reflecting 10 percent of the student population, was 30 percent.
In contrast, looking at undergraduate student completion using the dashboard, we see an absolute graduation rate of 61 percent -- double the IPEDS calculation. Clearly, the dashboard gives us a significantly different picture of student completion at that institution.
And there’s more. To complement our work with the dashboard, WSCUC staff members have begun work on triangulating dashboard data with data from the National Student Clearinghouse and IPEDS to look at student success from various angles. We recognize that all three of these tools have limitations and drawbacks as well as advantages: we’ve already noted the limitations of the IPEDS and National Student Clearinghouse data, as well as the benefit of the inclusion in the latter’s data of transfer students and students still enrolled after the six-year period. In addition, the data from both IPEDS and the clearinghouse can be disaggregated by student subpopulations of gender and ethnicity, as well as by institution type, which can be very beneficial in evaluating institutional effectiveness in supporting student success.
Pilot work has been done to plot an institution’s IPEDS and dashboard data in relation to the clearinghouse data, displayed as a box-and-whisker graph that provides the distribution of graduation rates regionally by quartile in order to give an indication of an institution’s success in graduating its students relative to peer institutions within the region. While care must be taken to understand and interpret the information provided through these data, we do believe that bringing them together in this way can be a powerful source of self-analysis, which can lead to institutional initiatives to improve student completion.
As noted, WSCUC has been working with the dashboard since 2013. While we are excited and encouraged regarding the benefits of the tool in providing a more complete and nuanced picture of student success, we also recognize that we have a great deal of work ahead of us to make the tool as useful as we believe it can be. After two pilot projects including a limited number of WSCUC-accredited institutions, the required collection of data by all WSCUC colleges and universities in 2015 revealed a number of challenges to institutions in submitting the correct data. The dashboard can be somewhat difficult to understand, especially for institutions with large shifts in enrollment patterns. And unlike National Student Clearinghouse data, dashboard data, at least at this point, cannot be disaggregated to reveal patterns of completion for various student subpopulations.
Such issues notwithstanding, we are encouraged by the value of the dashboard that we have seen to date and are committed to continuing to refine this tool. WSCUC staff members have given presentations both regionally and nationally on the dashboard, including one to IPEDS trainers to show them the possibilities of this tool to extend the data available nationally regarding student completion.
We are hopeful that other accreditors and possibly the NCES will find the dashboard a useful tool and, if so, adopt it as an additional completion measure for institutions across the country. In any case, we will continue to do this work regionally so as not just to complain about the available data but also to contribute to their improvement and usefulness.
Mary Ellen Petrisko is president of the WASC Senior College and University Commission. John Etchemendy is provost of Stanford University.
The Association of Public and Land-grant Universities on Wednesday said it backed an approach to the accreditation process in which agencies focus more time and energy on colleges that have problems than on those that don't. The U.S. Department of Education also recently said it supports risk-based accreditation.
“Institutions with weaker outcomes often need a more careful review than schools with stronger track records that end up almost always being reaccredited to no one’s surprise,” said Peter McPherson, the group's president, in a written statement. “A risk-based accreditation process would ensure all schools are subject to a review, but that schools with a strong track record do not need to go through the same extensive and expensive review process as those with weak programs and poor student outcomes.”