
Essay on how nonelite law schools can survive an existential market threat

The current existential threat to many law schools represents the canary in the coal mine for higher education.

Law schools have long enjoyed budget surpluses, and the universities in which they sit have benefited. But over the last few years, the financial situation of most law schools has reversed. Facing multiple years of declining enrollment and public support, alongside increasing costs and tuition discounting, law schools are often no longer a source of surplus revenue. Many now rely on financial support from their universities to stay afloat. This reversal is a harbinger for the rest of higher education, which is beginning to face some of the same challenges.

The steps law schools are taking -- in the hope they can survive just long enough for precrisis status quo conditions to return -- represent a doubling down on their traditional strategies. What’s so punishing is that, because the precrisis status quo is gone forever, these steps only worsen the overall outlook for the sector.

As we write in our new research paper published by the Clayton Christensen Institute, “Disrupting Law School: How Disruptive Innovation Will Revolutionize the Legal World,” the precrisis status quo is gone in large part because of the disruption of the traditional business model for the provision of legal services. Simply put, disruption is lessening the need for lawyers, which means law schools are producing too many lawyers for positions that increasingly do not exist.

Disruptions are bringing three significant changes in the legal services market.

First, from LegalZoom to Rocket Lawyer, more affordable, standardized and commoditized services now exist in an industry long dominated by opaque, highly customized and expensive offerings only accessible on a regular basis to a limited part of the population.

Second, from ROSS to the Practical Law Company, to e-discovery and predictive coding, disruptive innovations are allowing traditional law firms and general counsel’s offices to boost their productivity and perform the same amount of work with fewer lawyers. New technologies can now do tasks that lawyers -- particularly entry-level lawyers -- traditionally performed. This is hollowing out the job market for newly minted lawyers.

And third, disruptive innovations are breaking the traditional rationale for granting lawyers a monopoly on the practice of law. Just as disrupters like Southwest Airlines and Uber changed who could operate in highly regulated industries, if a nonlawyer aided by software can provide the same service as a lawyer, then it is not the public but the lawyers who are being protected by the legal profession’s monopoly on the provision of legal advice.

State regulators of bar licensure are taking note. Some states are beginning to experiment with granting non-J.D.s limited licenses to provide legal services that until now only J.D.s could provide. The state of Washington was the first to license legal technicians -- non-J.D.s who are specially trained to advise clients in a limited practice area, in this case family law. Akin to a nurse practitioner, a limited license legal technician (LLLT) can, under the new regulations, perform many of the functions that J.D.s traditionally performed. Only two years old, this new model is already gaining traction outside of Washington; the bars in California, Colorado, Massachusetts, New York, Oregon and Utah are each considering similar steps.

Because there are fewer jobs for lawyers, fewer people are seeking to enroll in law schools -- hence the crisis.

When disruption is afoot, incumbents typically remain tethered to their longstanding habits to sustain themselves. In the context of an increasingly competitive marketplace for law students, this is playing itself out in a quest to retain prestige in the legacy system for ranking law schools, the U.S. News & World Report rankings. Law schools continue to chase prestige by luring students whose LSAT scores and undergraduate grade point averages will help them move up the rankings. They are attracting students by offering tuition discounts -- during the 2013-14 school year just under 40 percent of law students paid full tuition.

But this push to retain prestige reduces revenues even as the expenditures required to remain competitive and improve continue to escalate, placing schools in a vicious cycle familiar across all of higher education.

Lawsuits challenging the veracity of law schools' claims about job placement are increasing, and if a verdict goes against a law school, the floodgates could open that much wider.

On top of all these challenges, higher education itself is, of course, seeing a variety of potential disrupters emerge, all powered at least in part through online learning.

To this point, disruptive innovators have not directly attacked law schools by offering new versions of a legal education. But were entities to emerge that paired flexible, competency-based online learning with place-based, boot camp-style clinical experiences that trained students to practice law in a more affordable and practice-oriented fashion, the pressure on law schools would only increase.

We see four possible solutions for nonelite law schools.

First, launching an autonomous entity is a proven way to combat the impact of disruption. By harnessing an existing law school's superior resources to pioneer the disruption, while creating enough separation that the parent entity's existing processes and priorities do not stifle the new venture, a law school-based educational start-up could itself become the first disrupter.

Second, schools could use online learning technologies as a sustaining innovation to improve learning and control costs. By blending online learning with face-to-face instruction, law schools could incorporate more active learning and professional skills development into the existing three-year educational model.

Third, they could specialize by creating programs that allow J.D. students to focus deeply on a particular area of law. Students could learn core subjects through online, competency-based programs and their in-person experience would focus on extensive training in a particular area of law through experiential learning courses, live-client clinics, simulations, capstones, directed research and writing, moot court and trial advocacy exercises, and field placements.

And, finally, innovative law schools could build new, non-J.D. degree programs that specialize in training students for careers that combine elements from law, business and government -- in international trade, for example -- but do not fit neatly into existing law, business or government schools and are less time-consuming and expensive than, say, a joint J.D.-M.B.A. Or they could offer new credentials that prepare non-J.D.s for the many fields that intersect with the law but do not require a J.D. degree, such as regulatory compliance.

The future is coming for law schools; the question is whether law schools themselves will play a role in shaping that future or be shaped by the cascading circumstances surrounding them.

Michele R. Pistone is a professor of law at Villanova University's Charles Widger School of Law and an adjunct fellow at the Clayton Christensen Institute. Michael B. Horn is a cofounder and distinguished fellow at the Clayton Christensen Institute and a principal consultant for Entangled Solutions, which offers innovation services to higher education institutions.


The academy polices the gender presentation of scholars (essay)

I have graduate school to thank for the years of tension between my queer gender identity and the norms and expectations of academe, writes Eric Anthony Grollman.


The problem with Trump's proposal on student loans and the liberal arts (essay)

The most significant challenge facing higher education today is our growing economic segregation. College completion rates for those at the lowest socioeconomic rungs continue to lag far behind those of their wealthier peers, not only due to diminished financial resources but also because of a lack of social and cultural capital. Redressing this phenomenon will require offering an education that prepares each and every student for success in work and life, while inspiring them to take seriously their social responsibilities in a society plagued by persistent inequities.

In fact, the board of directors of the Association of American Colleges and Universities, where I serve as president, expanded the organization’s mission in 2012 to embrace both inclusive excellence and liberal education as the foundation for institutional purpose and educational practice. The addition of inclusive excellence as one of AAC&U’s foundational principles reflects the ideal that access to educational excellence for all students -- not just the privileged -- is essential not only for our nation’s economy but, more important, for our democracy. Democracy cannot flourish in a nation divided into haves and have nots.

The equity imperative as an essential component of educating for democracy has been at the forefront of my mind during the past few weeks of nonstop coverage of the Republican and Democratic National Conventions. I have been particularly focused on the potential impact of various higher education policy proposals on AAC&U’s objective of advancing a public-spirited vision of inclusive excellence as inextricably linked to liberal education.

While higher education issues were largely absent from the Republican convention speeches, an earlier proposal by Donald Trump, developed by his educational policy adviser, Sam Clovis, to restrict eligibility for student loans in order to make it more difficult for those at "nonelite colleges" to major in the liberal arts caught my attention. Indeed, I am convinced that, if enacted, it would risk exacerbating what Thomas Jefferson termed an "unnatural aristocracy," where only the wealthy gain the benefits of the kind of broad and engaged liberal education that Clovis himself insists is the absolute foundation for success in life.

Trump’s proposal makes at least two serious errors about the value of a college degree in today’s world. It assumes, first, that one’s undergraduate major is all that matters and, second, that only some majors will prepare students for success in the workplace. The evidence from AAC&U’s own surveys of employers, and from many economists, suggests that this is simply not the case. As noted in the title of our 2013 report of employers’ views, “It Takes More Than a Major,” more than 90 percent of employers agree that “a graduate’s ability to think critically, communicate clearly and solve complex problems is more important than their undergraduate major.” Students can develop such cross-cutting skills in a wide variety of chosen disciplines, if the courses are well designed and integrated within robust, problem-based general education programs.

A student’s undergraduate experience, and how well the experience advances critical learning outcomes, is what matters most, with 80 percent of employers agreeing that all students need a strong foundation in the liberal arts and sciences. A liberal education fosters the capacity to write, speak and think with precision, coherence and clarity; to propose, construct and evaluate arguments; and to anticipate and respond to objections. And it offers what employers value the most: the ability to apply knowledge in real-world settings, to engage in ethical decision making and to work in teams on solving unscripted problems with people whose views differ from one’s own. In a globally interdependent yet multicultural world, it is precisely because employers place a particular premium on innovation in response to rapid change that they emphasize students’ experiences with diverse populations, rather than narrow technical training.

The data confirm what we already know: students in all undergraduate majors can and should gain the outcomes of a broad liberal education. Therefore, we need to be vigilant in rebutting accusations of irrelevance and illegitimacy leveled specifically at the liberal arts and sciences and to recognize those charges for what they are: collusion in the growth of an intellectual oligarchy in which only the very richest and most prestigious institutions preserve access to the liberal arts traditions. Trump's apparent presumption that college is only about workforce training is dangerous to our democratic future.

Of course, it is unclear whether a proposal to use student loans to steer students away from certain majors could be implemented, given the challenges of predicting career trajectories based on majors and types of institutions. (After all, I was a philosophy major who began at a community college under funds from the Comprehensive Employment Training Act, Pell Grants and Perkins Loans.)

Still, in order to restore public trust in higher education and destabilize the cultural attitudes at the basis of Trump’s policy proposal, we need to demonstrate in a more compelling way to those outside of the academy, Democrats and Republicans alike, the extent to which we actually are teaching students 21st-century skills, preparing them to solve our most pressing global, national and local problems within the context of the workforce, not apart from it. To do so, our institutions of higher education must come together to engage in an honest assessment of our effectiveness and undertake a collaborative exchange of best practices. Our shared commitments to equity, democratic and economic vitality, and inclusive excellence demand nothing less.

Lynn Pasquerella is president of the Association of American Colleges and Universities.


Myanmar universities gain some autonomy


Universities appear to be gaining some autonomy.

Democratic platform spurs excitement for advocates of free community college


After taking a backseat to debate over free tuition at four-year public colleges, free community college advocates see chance to build momentum.

Questions to ask before accepting a full-time faculty job (essay)

Keysha Whitaker highlights four pieces of advice she now wishes she’d had.


Lincoln University's decision to suspend its history major ignores W.E.B. Du Bois's important vision (essay)

Lincoln University -- a historically black university located in Jefferson City, Mo. -- suspended its major in history on its 150th anniversary. Explaining why that step was necessary, the president of the university emphasized, “We must make decisions like these as we look toward the future and the needs of the changing workforce.” Embedded within that statement is a declaration about higher education and its purpose: higher education should produce good, highly paid workers. We should step back and ask whether this is really what we want from higher education.

Since I took my first academic position in 2010, I have continually heard in the news media, from visiting speakers and many other people that transforming students into employees is the purpose of higher education. Whenever I hear this, I cannot help but recall one particular graduate seminar when we discussed the writings of Marxist Louis Althusser. The discussion turned to higher education, and some people in the class claimed higher education was little more than part of a plot to provide good and obedient workers to the bourgeoisie. At the time, I thought that was overly reductive. I mean, we were talking about the supposed conspiracy of the bourgeoisie in class at an institution of higher education; surely this was not part of the plan.

Once I got my first academic job, however, I learned that this really was the perennial question in higher education. What should our general education curriculum look like? On which majors should we focus our resources? The answer was always put in the form of another question -- what do employers want from our graduates?

Perhaps because of the rising costs of higher education, politicians have increasingly said that the point of higher education is for students to make lots of money in their chosen careers. Is that what we want from higher education? Maybe a better question is: Is that the only thing we want from higher education?

In her recent article in The American Historian, Nancy F. Cott indicates it is hard for humanities degrees -- like history -- to compete with degrees related to engineering if the only significant variable is potential earnings. One study found that throughout their careers, engineers consistently earned more than graduates in the humanities. But then, not everyone wants to be an engineer. As Cott phrased it, neither would we really want “to see an educated world populated by engineers only.” The fact is people educated in the humanities go on to important, although often not quite as lucrative, careers in education, government, law and a host of other interesting and relevant occupations.

Since students take on significant debt to earn their diplomas, it seems reasonable for them to expect some return on that investment. I hope as we review what we value in education, however, we do not simply ask which majors lead to the most lucrative careers.

Du Bois and Shaping Lives in the Present

What is higher education for? Should it exist solely for the purpose of manufacturing workers who make the greatest amount of money? It’s not a new question. It’s one that the renowned African-American historian W. E. B. Du Bois wrestled with in his speech commemorating Lincoln University’s 75th anniversary in 1941. He worried that the temptation would “come and recur to make an institution like this, a means of earning a living or of adding to income rather than an institution of learning.” Du Bois believed the kind of students Lincoln produced would end up changing the world for the better -- that it would be Lincoln students who would “show the majority the way of life.” Not from privileged and “powerful groups which from time to time rule the world have come salvation and culture,” he said, “but from the still small voice of the oppressed and the determined who knew more than to die and plan more than mere survival.” In short, Du Bois hoped that Lincoln would become “a center where the cultural outlook of this country is to be changed and uplifted and helped in the reconstruction of the world.”

Why did Du Bois believe that students at a university like Lincoln would be so influential? Du Bois recognized the power of history to shape lives in the present, and he rightly believed that this nation needed more diverse students if the status quo was ever going to change. In Du Bois’s day, history was being used to justify violence against African-Americans. In 1915, the original version of The Birth of a Nation premiered in the United States. In that movie, President Woodrow Wilson’s book History of the American People was regularly quoted. Audiences around the country saw Wilson declare through this movie that Reconstruction had been a misguided failure during which “the negroes were the office holders, men who knew none of the uses of authority, except its insolences.”

Wilson and many other people in the academy were part of what eventually became known as the Dunning School of Reconstruction History. For William Dunning, the historian for whom the broader school was named, Reconstruction was a failure because great numbers of the recently emancipated slaves “gave themselves up to testing their freedom. They wandered aimless but happy through the country.”

According to Dunning, it was Southern whites who “devoted themselves with desperate energy to the procurement of what must sustain the life of both themselves and their former slaves.” Lesson learned: black political participation meant misery for all, but exclusive white control meant the best for both black and white Southerners. The Dunning School of Reconstruction History justified the exclusion of black people from politics, and it implicitly justified the violence used to maintain that exclusion.

W. E. B. Du Bois labored to contradict those impressions. In his now widely read The Souls of Black Folk, Du Bois argued that it was not the irresponsible silliness of black people that doomed Reconstruction but rather the impossible problems facing the recently freed slaves. Reflecting upon the failure of efforts to make Southern African-Americans truly free, Du Bois noted that the Freedmen’s Bureau could not even “begin the establishment of goodwill between ex-masters and freedmen,” and perhaps most important, it could not “carry out to any considerable extent its implied promises to furnish the freedmen with land.”

Adding to the impossible challenge was the fact that much of the legislation created during Reconstruction was intended to punish the white South rather than empower the recently emancipated. As viewed by Du Bois, black equality was a cudgel used to punish the rebellious South rather than a goal in and of itself. Without any real support for black equality in either the North or the South, how could we expect anything but failure from Reconstruction? Because of those failures, black people suffered under the weight of white supremacy.

White historians largely ignored Du Bois’s conclusions for years; it was not until higher education expanded to include a wide swath of the American population -- due in large part to the GI Bill -- that more historians came to accept what he had long argued. Today, the vast majority of historians of Reconstruction accept his premise that many capable black politicians participated in Reconstruction. Many worked to expand roads and education to serve a far broader share of the Southern population. At the time, their opponents saw this as waste and corruption, but the vision of those black politicians more closely aligned with our own expectations. We -- like they -- expect our governments to maintain public roads and public education. History looks different from the bottom up.

Reversing Dominant Narratives

Du Bois did not mention the degree in history specifically in his speech in 1941, but his life’s work demonstrated the importance he placed upon the historical imagination. He correctly predicted that making the academy more diverse would change the world for the better. History has been used to justify white supremacy, and it has been used to undermine it as well. As the population of historians changed, so too has the accepted narrative of the academy. That’s why Du Bois did not ask what majors earned the most money upon graduation but had a loftier vision for Lincoln’s future. America needed impassioned graduates from schools like Lincoln. Someone had to help reverse the dominant narratives prevalent in 1941 about black inferiority.

On Lincoln University’s 75th anniversary, Du Bois provided a powerful argument in favor of empowering Lincoln’s students to go and change the world. I fear that the end of history at Lincoln University means students will have less ability to do so in the future. That saddens me, because our national history is particularly relevant today. In 2016, a reinterpretation of The Birth of a Nation is set to debut and will likely make radically different claims from those of its 1915 namesake. Why did the creators of this new movie -- which will document the slave rebellion led by Nat Turner -- give it that name? In 2016, some people have suggested that the civil rights movement of the 1960s was relatively short and its goals were largely accomplished. How then do we explain the emergence of the Black Lives Matter movement? Do these protesters fail to understand just how racially progressive our country has become? In 2016, some politicians have suggested that the United States is a nation founded by white ideas -- or “Western civilization” -- and people of color are guests. Are they right?

Our history as a nation has been used to answer those kinds of questions, and someone is going to be answering them in the future. In addition to asking what employers want our graduates to do, we should also ask whom we want to answer such important questions.

Graduates -- whether in the humanities, sciences or engineering -- will continue to get relevant and interesting jobs. Some will get paid more than others. In finding the right major, students will have to make strategic choices about what they want for their lives. Having spoken with many students, I know many are not so single-mindedly focused upon profit. Many have more philanthropic purposes in mind for their education. By so circumscribing the range of possibilities, however, we are creating a future in which Lincoln’s graduates will be able to get jobs but maybe not make history.

J. Mark Leslie is an associate professor of history at Lincoln University.


Review of Robert Legvold, "Return to Cold War"

Historical analogy is a blunt and clumsy tool, one that serves better as a rhetorical device than as a method of analysis. The so-called law of the instrument -- i.e., “if all you have is a hammer, everything looks like a nail” -- applies to historical analogy with double force. And not just because the stock of examples is usually narrow and cliché-addled, as with the entirely too familiar Munich Pact formula: “X is the new Hitler; Y’s policy resembles that of Neville Chamberlain in 1938; therefore doing Z would exhibit Churchill-like foresight.” Nearly always the analogy is blatant propaganda on behalf of Z. You never find it used for heuristic purposes, such as determining who the current Wernher von Braun might be.

The deeper problem is that historical analogy is always just on the verge of a cognitive short circuit. Finding patterns in the world is one of the evolutionarily adaptive knacks of the human brain, but we’re still learning to test and fine-tune it -- an especially difficult prospect when the patterns we find (or think we find) belong to the realm of human action. What looks like historical parallel from one angle may well turn out to be self-fulfilling prophecy. This can be a problem, especially if large weapons systems come into play.

While never so dramatic as analogies drawn from the Weimar-to-Nuremberg continuum, framing contemporary geopolitics as a Cold War-like standoff between two superpowers has been a regular temptation over the years -- at least, for the one superpower left standing. The main candidates to take the erstwhile Soviet Union’s place have been China and the global jihadist movement, with Putin-era Russia as a more recent nominee.

Indeed, books and articles with “New Cold War” in the title began appearing even before the old one was quite finished -- indications, perhaps, of a wish for a certain degree of familiarity and continuity between eras, a recognizable and navigable lineup of affiliations and hostilities. The passing of a quarter century has also made the bipolar thermonuclear quagmire of an earlier era look more orderly and stable than the anarchic system of free-floating multilateral anxiety that prevails today.

For the past couple of weeks, I had been on the verge of reading Return to Cold War (Polity) by Robert Legvold, a professor emeritus of political science at Columbia University, but kept putting it off. Perhaps it was the lack of a question mark in the title: Return to Cold War sounds like an imperative. The cover shows an upside-down dove, depicted as if in the middle of a kamikaze dive or following airborne contact with a very high wall. The whole thing seemed designed to squelch any flicker of optimism that had somehow survived the day's news.

But once I actually opened the book, I found such apprehensions were misplaced: Legvold is not given to simplistic analogy, nor does he indulge any notion that a return to long-term, two-sided geopolitical stalemate is possible, much less desirable. If relations between the United States and Russia have deteriorated to the point that comparisons to the Cold War status quo are appropriate, it is only within the limits defined by the absence, as yet, of ideological differences that call for a fight to the death of one system or the other. The deterioration was not inevitable, and even with it underway, there have been episodes of cooperation, albeit growing fewer and narrower as the mutual distrust continues. The common denominator between the countries has been the failure to assess things at all equitably: “If one searched for a leader, policy maker or politician on either side who included somewhere in her or his analysis thoughts about missteps or failings on both sides, the quest would have been in vain.”

Not that foresight was impossible. Legvold quotes a striking comment by George F. Kennan, author of the American policy of containment at the start of the Cold War. “Expanding NATO,” wrote Kennan in 1997 in The New York Times, “would be the most fateful error of American policy in the entire post-Cold War era. Such a decision may be expected to inflame the nationalist, anti-Western and militaristic tendencies in Russian opinion; to have an adverse effect on the development of Russian democracy; to restore the atmosphere of the Cold War in East-West relations; and to impel Russian foreign policy in directions decidedly not to our liking.” By no means is that the key explaining the entire course of the past 20 years, but as predictions go, it has its merits.

The author's presentation is succinct, lucid, fairly dispassionate and almost incessantly even-handed. I got the sense that he wrote it as if addressing an assembly of the policy-making elites of both sides, pointing out the confluence of blunders and rationalizations that worsened steadily to create a situation that, if not necessarily irreversible, now looks likely to continue in the same direction for some time to come.


The importance of building a team of wise people with competing viewpoints (essay)

In this hypothetical case study, Barbara McFadden Allen, Ruth Watkins and Robin Kaler explain how college leaders can -- and must -- surround themselves with a team of wise people with competing viewpoints.


How to hold students' attention in the classroom (essay)

Brain Matters

Creating classroom experiences that grab and hold students' interest is not only good teaching, it's good science, writes Karen Costa.

