Politics (national)

Romney and Harvard program put new spin on '60s higher ed

Higher education of the 1960s usually brings to mind student rebellion and campus unrest. Berkeley and Mario Savio are often invoked to symbolize the era of colleges and the counterculture.  But this is distorted because it is incomplete. Why not have a collective student memory that includes Mitt as well as Mario?

This seems counterintuitive to the counterculture -- but only because we have overlooked all the innovations that were taking place on American campuses in these tumultuous years. I want to make the case for adding seats on the historical stage of higher education -- especially with the upcoming November presidential election.

To truly understand the long-term legacy of the 1960s, we need to include Harvard’s joint M.B.A./J.D. program as the institution -- and its famous alumnus, Mitt Romney, as the individual -- that are also part of the higher education lyrics when boomers are “Talkin’ ‘Bout My Generation.”

Scott McKenzie attracted a lot of listeners in 1967 when he sang, “If you’re going to San Francisco, be sure to wear some flowers in your hair....” Mitt Romney, however, was not listening and went in a different direction -- politically and geographically. In 1969 he left San Francisco (well, Stanford and Palo Alto) and -- after a detour to France -- headed east to graduate school at Harvard for its brand-new joint M.B.A./J.D. program, which was founded that year. The rest is history -- and no less than the higher education of perhaps our next president.

Put aside such artifacts as Steven Kelman’s memoir about student protest at Harvard in 1969-70 in his book, Push Comes to Shove. Forget James Simon Kunen’s The Strawberry Statement and its provocative subtitle, “Notes of a College Revolutionary.” Above all, suspend from memory the images of Harvard first-year law students as depicted in Hollywood’s The Paper Chase. It’s time to reconstruct the early years of Harvard’s joint M.B.A./J.D. program and its students, which were a powerful, albeit low-profile counter to the counterculture.

Harvard’s joint program brings to mind the academic equivalent of epoxy cement -- two ingredients (law school and business school), each rigorous in its own right, which when mixed create an incredibly hard bond -- probably impervious to broad humane or societal considerations. Two articles over the past six months in the New York Times provide some insights into both the joint program and Romney as a graduate student: Jodi Kantor’s “At Harvard, A Master’s in Problem Solving,” and Peter Lattmann and Richard Perez-Pena's “Romney, at Harvard, Merged Two Worlds.”

As Lattmann and Perez-Pena wrote: “One of the most exclusive clubs in academe is a Harvard University dual-degree program allowing graduate students to attend its law and business schools simultaneously, cramming five years of education into four. On average, about 12 people per year have completed the program — the overachievers of the overachievers — including a striking number of big names in finance, industry, law and government.  ...In addition to Mr. Romney, founder of Bain Capital, the roughly 500 graduates include Bruce Wasserstein, who led the investment bank Lazard until he died in 2009; leaders of multibillion-dollar hedge fund and private equity firms like Canyon Capital Advisors, Silver Lake Partners and Crestview Partners; high-ranking executives at banks like Citigroup and Credit Suisse; C. James Koch, founder of the Boston Beer Company; and Theodore V. Wells Jr., one of the nation’s top trial lawyers.”

No doubt these graduate students were smart and worked hard. Beyond that, it’s important to note some characteristics that accompanied this program and its work ethic. First, the formal curriculum pulled inward rather than outward.

Second, Romney as a student in the joint program tended to screen out external events as distractions. According to Kantor, “And unlike Barack Obama, who attended Harvard Law School more than a decade later, Mr. Romney was not someone who fundamentally questioned how the world worked or talked much about social or policy topics. Though the campus pulsed with emotionally charged political issues, none more urgent than the Vietnam War, Mr. Romney somehow managed to avoid them.” Kantor reinforces this depiction by quoting one of Romney’s law school study partners, who recalled, “Mitt’s attitude was to work very hard in mastering the materials and not to be diverted by political or social issues that were not relevant to what we were doing.”

The program pushed toward intensive insularity through its case study pedagogy, which relied on no books or contextual sources -- all at a time when genuine interdisciplinary, broad perspectives were finding some breathing space in prestigious professional schools elsewhere. It’s too bad for the education of future business (and political) leaders that the joint program that started in 1969 did not consider the very different perspective offered by Earl Cheit, professor (and later, dean) of the business school at the University of California at Berkeley. In 1964, with support from the Ford Foundation, Cheit invited five scholars outside the field of business to join him in conducting a workshop that for the first time brought together business school professors with others to explore and preserve “the connection between the intellectual adventure and the business adventure.”

What a contrast to the Harvard Business School’s case study approach! Cheit’s Ford Foundation program at Berkeley featured, first as talks and later as readings, a cornucopia of ideas and issues, led off by the economist Robert L. Heilbroner’s “View From the Top: Reflections on a changing business ideology.” John William Ward, historian and president of Amherst College, spoke about “The Ideal of Individualism and the Reality of Organization.”

Henry Nash Smith of Berkeley’s English Department discussed businessmen in American fiction in the “Search for a Capitalist Hero.” Historian Richard Hofstadter asked, “What happened to the Anti-Trust Movement?” The economists Paul Samuelson and Cheit himself analyzed the changing roles of business -- how managers cultivate social responsibility, and how American society balanced personal freedoms and economic freedoms in a mixed economy. Guest speakers from France and Belgium provided American businessmen with perspectives on business in Europe.

Cheit’s knowledgeable involvement in exploring the past and future of higher education did not stop with this Ford Foundation business program. In 1974-75 he sought (and received) permission to teach a graduate course in the School of Education -- one in which he explored how it was that professional schools of business, agriculture, forestry, and engineering came to have a place in the American university. The course content and topic were so novel that they led to publication of a book by the Carnegie Commission, The Useful Arts and the Liberal Tradition.

Once again, it showed that an intellectual and administrative leader in the business school could look outward within the multi-versity and reach outward to the larger society and the economy by complicating the questions rather than doggedly seeking to solve business problems. Cheit also was one of the leading economists to sound an alert to the deteriorating financial condition of the nation’s colleges and universities in his 1971 book on higher education’s “new depression.”

In contrast, what were the aims and goals of the Harvard joint program? One observation provided by NYT reporters is revealing: “But former students and professors say it makes sense that a group of overachievers would be drawn to financial markets, a hypercompetitive field with the promise of immense riches.”

Really? Why were these overachievers necessarily confined to these goals?  What if the teaching and discussion had included some consideration of ethics, public good, and social responsibility – along with pursuit of individual prosperity?  It’s important to remember that there were good alternatives.  For example, Cheit’s Berkeley approach with the Ford Foundation project was to create curiosity, exploration and reasonable doubt about our national obsession with business.  

The Harvard joint program, especially its business school component, emphasized the sharpening of decision-making tools, especially in finance. Each approach, of course, has its place. But if a concern of a university is to ask, “Knowledge for what?,” it is Cheit’s Berkeley model more than Harvard’s joint program that is sorely needed for the thoughtful leadership, whether in business or politics, required for the early 21st century. I’ll be thinking about that on my way to the polls on Election Day in November.

John R. Thelin is a professor at the University of Kentucky.  He was a graduate student at the University of California at Berkeley from 1969 to 1974. He is author of A History of American Higher Education (2011).

The credit hour causes many of higher education's problems, report finds

The credit hour wasn't supposed to measure learning, for good reason, according to a new report, which recommends how to revise or even drop the standard without leading to abuse.

Review of Howard Ball, "At Liberty to Die: The Battle for Death with Dignity in America"

Of the many strange things in Gulliver’s Travels that make it hard to believe anyone ever considered it a children’s book, the most disturbing must be the Struldbruggs, living in the far eastern kingdom of Luggnagg, not covered by Google Maps at the present time.

Gulliver’s hosts among the Luggnaggian aristocracy tell him that a baby is born among them, every so often, with a red dot on the forehead -- the sign that he or she is a Struldbrugg, meaning an immortal. Our narrator is suitably amazed. The Struldbruggs, he thinks, have won the cosmic lottery. Being “born exempt from that universal Calamity of human Nature,” they “have their Minds free and disengaged, without the Weight and Depression of Spirits caused by the continual Apprehension of Death.”

The traveler has no trouble imagining the life he might lead as an immortal, given the chance. First of all, Gulliver tells his audience at dinner, he would spend a couple of hundred years accumulating the largest fortune in the land. He’d also be sure to master all of the arts and sciences, presumably in his spare time. And then, with all of that out of the way, Gulliver could lead the life of a philanthropic sage, dispensing riches and wisdom to generation after generation. (A psychoanalytic writer somewhere uses the expression “fantasies of the empowered self,” which just about covers it.)

But then the Luggnaggians bring him back to reality by explaining that eternal life is not the same thing as eternal youth. The Struldbruggs “commonly acted like Mortals, till about thirty Years old,” one of Gulliver’s hosts explains, “after which by degrees they grew melancholy and dejected, increasing in both till they came to four-score." The expression “midlife crisis” is not quite the one we want here, but close enough.

From the age of eighty on, “they had not only all the Follies and Infirmities of other old Men, but many more which arose from the dreadful Prospect of never dying.” Forget the mellow ripening of wisdom: Struldbruggs “were not only Opinionative, Peevish, Covetous, Morose, Vain, Talkative, but incapable of Friendship, and dead to all natural Affection.”

It gets worse. Their hair and teeth fall out. “The Diseases they were subject to still continuing without increasing or diminishing,” Gulliver tells us. “In talking they forgot the common Appellation of Things, and the Names of Persons, even of those who are their nearest Friends and Relations. For the same Reason they never can amuse themselves with reading, because their Memory will not serve to carry them from the beginning of a Sentence to the end; and by this Defect they are deprived of the only entertainment whereof they might otherwise be capable.”

It is a vision of hell. Either that, or a prophecy of things to come, assuming the trends of the last few decades continue. Between 1900 and 2000, the average life expectancy in the United States rose from 49 to 77 years; between 1997 and 2007, it grew by 1.4 years. This is not immortality, but it beats dying before you reach 50. The span of active life has extended as well. The boundary markers of what counts as old age keep moving out.

From a naïve, Gulliverian perspective, it is all to the good. But there’s no way to quantify changes in the quality of life. We live longer, but it's taking longer to die as well. Two-thirds of deaths among people over the age of 65 in the United States are caused by three chronic conditions: heart disease, cancer, and stroke. The “life” of someone in a persistent vegetative state (in which damage to the cerebral cortex is so severe and irreversible that cognitive functions are gone for good) can be prolonged indefinitely, if not forever.

More horrific to imagine is the twilight state of being almost vegetative, but not quite, with some spark of consciousness flickering in and out -- a condition of Struldbruggian helplessness and decay. “I grew heartily ashamed of the pleasing Visions I had formed,” says Gulliver, “and thought no Tyrant could invent a Death into which I would not run with Pleasure from such a Life.”

Howard Ball’s book At Liberty to Die: The Battle for Death With Dignity in America (New York University Press) is a work of advocacy, as the subtitle indicates. The reader will find not the slightest trace of Swiftian irony in it. Ball, a professor emeritus of political science at the University of Vermont, is very straightforward about expressing bitterness -- directing it at forces that would deny “strong-willed, competent, and dying adults who want to die with dignity when faced with a terminal illness” their right to do so.  

The forces in question fall under three broad headings. One is the religious right, which Ball sees as being led, on this issue at least, by the Roman Catholic Church. Another is the Republican Party leadership, particularly in Congress, which he treats as consciously “politicizing the right-to-die issue” in a cynical manner, as exemplified by the memo of a G.O.P. operative on “the political advantage to Republicans [of] intervening in the case of Terri Schiavo.” (For anyone lucky enough to have forgotten: In 1998, after Terri Schiavo had been in a persistent vegetative state for eight years, her husband sought to have her feeding tube removed, setting off numerous rounds of litigation, as well as several pieces of legislation that included bills in the US Congress. The feeding tube was taken out and then reattached twice before being finally removed in 2005, after which Schiavo died. The website of the University of Miami's ethics program has a detailed timeline of the Schiavo case.)

The third force Ball identifies is that proverbial 800-pound gorilla known as the Supreme Court of the United States. Its rulings in Washington v. Glucksberg and Vacco v. Quill in 1997 denied the existence of anything like a constitutionally protected right to physician-assisted death (PAD). States can outlaw PAD -- or permit it, as Montana, Oregon, and Washington do at present. In the epigraph to his final chapter, Ball quotes a Colorado activist named Barbara Coombs Lee: “We think the citizens of all fifty states deserve death with dignity.” But the Supreme Court of the United States will not be making that a priority any time soon.

“The central thesis of the book,” states Ball, “is that the liberty found in the U.S. Constitution’s Fifth and Fourteenth ‘Due Process’ Amendments extends... [to] the terminally ill person's right to choose to die with dignity -- with the passive assistance of a physician -- rather than live in great pain or live a quality-less life.” The typical mode of “passive assistance” would be “to give painkilling medications to a terminally ill patient, with the possibility that the treatment will indirectly hasten the patient’s death.”

Ball notes that a Pew Research Center survey from 2005 showed that an impressive 84 percent of respondents “approved of patients being able to decide whether or not to be kept alive through medical treatment or choosing to die with dignity.”

Now, for whatever it’s worth, that solid majority of 84 percent includes this columnist. If the time for it ever comes, I’d want my doctor to supersize me on the morphine drip without breaking any laws. Throughout Ball's narrative of the successes and setbacks of the death-with-dignity cause, I cheered at each step forward, and felt appalled all over again while reading the chapter he calls “Terri Schiavo’s Tragic Odyssey,” although it did seem like the more suitable adjective would be “grotesque.” Tragedy implies at least some level of dignity.

The author also introduced me to a blackly humorous expression, “death tourist,” which refers to a person "visiting" a state to take advantage of physician-assisted suicide being legal there.

As a member of the choir, I liked Ball's preaching, but it felt like the sermon was missing an index card or two. As mentioned earlier, the book’s “central thesis” is supposed to be that the due-process guarantees in the Constitution extend to the right to death with dignity. And so the reader has every reason to expect a sustained and careful argument for why that legal standard applies. None is forthcoming. The due-process clauses did come up when the Supreme Court heard oral arguments in 1997, but it rejected them as inapplicable. This would seem to be the point in the story where that central thesis would come out swinging. The author would show, clearly and sharply, why the Court was wrong to do so. He doesn't. It is puzzling.

Again, it sounds very categorical when Ball cites that Pew survey from 2005 showing 84 percent agreement that individuals had a right to choose an exit strategy if medical care were not giving them a life they felt worth living. But the same survey results show that when asked whether they believed it should be legal for doctors to "assist terminally ill patients in committing suicide," only 44 percent favored it, while 48 percent were opposed. With the matter phrased differently -- surveyors asking if it should be legal for doctors to "give terminally ill patients the means to end their lives" -- support went up to 51 percent, while 40 percent remained opposed. This reveals considerably more ambivalence than the 84 percent figure would suggest.

The notion that a slippery slope will lead from death with dignity to mass programs of euthanasia clearly exasperates Ball, and he can hardly be faulted on that score. A portion of the adult population is prepared to believe that any given social change will cause the second coming of the Third Reich, this time on American soil. (Those who do not forget the History Channel are condemned to repeat fairly dumb analogies.) But the slippery-slope argument will more likely be refuted in practice than through argument. Whether or not the law recognizes it, the right to make decisions about one’s own mortality or quality of life will exist any time someone claims it. One of the medical profession’s worst-kept secrets for some time now is that plenty of physicians will oblige a suffering patient with the means to end their struggle. (As Ball notes, this came up in the Supreme Court discussions 15 years ago.)

And the demand is bound to grow, as more and more of us live long enough to see -- like Gulliver -- that there are worse fates than death. Brilliant legal minds should apply themselves to figuring out how to make an ironclad case for the right to a decent departure from this mortal coil. At Liberty to Die is useful as a survey of some obstacles standing in the way. But in the meantime, people will find ways around those obstacles, even if it means taking a one-way, cross-continental trip to the Pacific Northwest. There are worse ways to go.

Presidential race brings scrutiny for candidates on higher education

Romney gets scrutiny for praise of for-profit college led by campaign donor. Biden angers faculty by suggesting their salaries force tuition up. Gingrich attacks fluency in French.

Essay: Educated illegal immigrants are a net financial gain for U.S.

Ronald Reagan once said, “Don’t be afraid to see what you see.” The current flap over Gov. Rick Perry’s defense of in-state tuition for students whose parents are in the United States illegally drives us to take off the lid and take a peek. 

And what we see is that illegal-immigrant students pay back more than they take.

Daniel Griswold, an immigration expert at the Cato Institute, wrote to me recently in response to my inquiry, “In 1997, the National Research Council published a major study on immigration. It found that an immigrant with a college education is a huge net plus for the United States.”

Griswold reports this finding of the NRC study: “Immigrants and their descendants represent a net fiscal gain for the United States. The typical immigrant and all of his or her descendants represent a positive $80,000 fiscal gain to the government. An immigrant with more than a high school education (plus descendants) represents a $198,000 fiscal gain, one with a high school diploma a $51,000 gain, and one with less than a high school education a $13,000 loss.”

Some will counter that college slots for illegal immigrants should be given instead to poor U.S.-born students. But most of these students cannot afford college. Tuition at Texas’ universities, for example, will average about $8,500 this year, and the College Board projects that the average student’s living expenses will be $17,820 -- for a total of $26,320. Multiplying this figure by five -- now the Texas standard for number of years to graduation -- totals $131,600.

But total costs will be higher than this. In Texas between 1999 and 2010, average tuition and related fees at the state’s 10 largest universities rose by 120 percent.  Tuition and fee increases of 10 percent a year will raise the figure of $131,600 to $160,591 in five years.

Let us look at immigrant subsidies, using Texas A&M University as a representative example. In-state tuition there is $8,418, out-of-state tuition $23,808 -- a yearly subsidy to illegal immigrants of $15,390. The total for five years is $76,950, plus a 10 percent annual increase in tuition -- for a grand total subsidy of $93,957. Subtracting $93,957 from the $198,000 fiscal gain that the NRC study documented leaves a net gain of $104,043.
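
For readers who want to check the compounding, here is the arithmetic behind the $93,957 figure, reconstructed as a geometric series (a sketch of the essay's own calculation, assuming the $15,390 annual subsidy rises 10 percent each year over five years):

\[ 15{,}390 \times (1 + 1.1 + 1.1^2 + 1.1^3 + 1.1^4) = 15{,}390 \times 6.1051 \approx \$93{,}957 \]

\[ \$198{,}000 - \$93{,}957 = \$104{,}043 \]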

The Texas Higher Education Coordinating Board reports that Texas colleges and universities currently enroll slightly more than 1.5 million students. Hispanic enrollment numbers are up this year by 4.5 percent -- a very small increase in a state where 40 percent of all residents are Hispanic.

The number of illegal immigrants enrolled in public four-year colleges and universities in Texas totals 4,000, while the number in community colleges totals 12,000 -- still a very small percentage in a state that is 40 percent Hispanic. The in-state tuition subsidy in community colleges to illegal immigrants is about $2,000 a year.  At Lone Star Community College, where I teach, in-state tuition is $1,744, out-of-state tuition $3,844.

This is part of a larger problem and pattern. An October study by the American Enterprise Institute entitled “Cheap for Whom?” finds: “Average taxpayers provide more in subsidies to elite public and private schools than to the less competitive schools where their own children are likely being educated."

The dirty little secret that universities and state and federal legislators don’t want the public to know is that these universities and legislators are de facto agents of class warfare. Note the shocking disparity between the rich and the poor that AEI reports: “Among not-for-profit institutions, the amount of taxpayer subsidies hovers between $1,000 and $2,000 per student per year until we turn to the most selective institutions.... Among these already well-endowed institutions, the taxpayer subsidy jumps substantially to more than $13,000 per student per year.”

It is class warfare. AEI argues, “If the country is to retain its competitive edge, it must reverse the current policies that result in providing the lowest levels of taxpayer support to the institutions that enroll the highest percentage of low-income, nontraditional, and minority students -- the fastest-growing segments of the population.”

And this should include illegal-immigrant students, who are residents of the state and pay sales and property taxes. They will pay back more than they take.

Ronald L. Trowbridge, Ph.D., is a senior fellow at the Center for College Affordability and Productivity, a research center in Washington.

Iowa caucuses provide teaching and researching opportunities

The state's political caucuses are presidential candidates' first opportunity to show their stuff. Iowa's professors, students and researchers also get a chance to dissect the caucuses for the rest of the nation.

Dreams Deferred?

The heated rhetoric surrounding immigration reform legislation in Congress threatens to drown out an important, bipartisan effort to resolve a decades-old inconsistency in federal immigration law concerning postsecondary tuition costs for undocumented students who have graduated from high schools in the United States.

The “DREAM Act,” which was incorporated into the Senate Judiciary Committee’s immigration reform bill last week, would allow states to provide in-state tuition for postsecondary education to undocumented students who have attended (for at least three years) and graduated from high school in their states.

Federal immigration law now prohibits them from doing so, though that has not stopped several states, including “red” states like Utah, Kansas, and Texas, from adopting such legislation in recognition of the fact that more than 50,000 of these students graduate from high school each year as -- in nearly every way -- children of the American dream.

Of the DREAM Act, Sen. Jeff Sessions (R-Ala.) stated, “I find it inconceivable that we would provide greater benefits to persons who are here illegally than to American citizens. It makes a mockery of the rule of law."

However, Congress must ensure the debate over the education of undocumented students is actually grounded in the law, rather than rhetoric. Federal law related to this issue was interpreted more than 20 years ago by the United States Supreme Court’s 1982 Plyler v. Doe decision.

Plyler v. Doe involved a Texas law that effectively banned undocumented minor children from participating in public elementary and secondary education. The Court heard arguments that sounded quite similar to those used to deny in-state college tuition to the same students: that providing K-12 education rewards illegal immigration, and that we should not give public benefits to those in the country illegally. The significance of this case is not that it settled once and for all the ideological arguments surrounding immigration. Rather, the Court created protective legal precedent for minor undocumented students by carefully examining the intersection of immigration law, the distribution of public goods, and individual rights as protected by the Constitution of the United States.

The Supreme Court’s decision addressed the question: Did a child break the law because the parents brought the child into the country illegally as a minor? The Supreme Court said “no.”

The Court ruled that such children, in fact, were entitled to equal protection under the law, one of America’s most cherished legal principles. As cited in the Court’s opinion, the Fourteenth Amendment to the Constitution provides that “[n]o State shall…deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”

As a population within the state’s jurisdiction, undocumented students were, therefore, entitled to equal treatment under the law. In the opinion of the Court, Associate Justice William J. Brennan Jr., wrote, “To permit a state … to identify subclasses of persons whom it would define as beyond its jurisdiction, thereby relieving itself of the obligation to assure that its laws are designed and applied equally to those persons, would undermine the principal purpose for which the Equal Protection Clause was incorporated in the Fourteenth Amendment.”

The Court further argued that federal immigration law, despite “sheer incapability or lax enforcement,” was not a justification for denying children equal protection and access to education.

In recognition of this principle, several state legislatures have passed laws to allow in-state postsecondary tuition for undocumented students who have attended public high schools in state for more than three years. They realize the legal “no-man’s land” these students occupy, and have sought to remedy it under the law.

The central relevance of the Supreme Court’s case to this debate over in-state tuition for undocumented students is that we cannot simply ignore what Justice Brennan called the “shadow population” of students who go about their daily lives and contribute to our society in the same way that we all strive to contribute. Moreover, we cannot deprive these students of the equal protection that our Constitution provides simply because they graduate from the high school setting where the Supreme Court has decided that it applies.

Though the issue is easy to weigh down with heated rhetoric, we hope that the law will, in fact, prevail, and that Congress will pass the DREAM Act. As Justice Brennan concluded, “[W]hatever savings might be achieved by denying these children an education, they are wholly insubstantial in light of the costs involved to these children, the State, and the Nation.”

David Hawkins is director of public policy for the National Association for College Admission Counseling.

The Relevance of the Humanities

The deepening economic crisis has triggered a new wave of budget cuts and hiring freezes at America’s universities. Retrenchment is today’s watchword. For scholars in the humanities, arts and social sciences, the economic downturn will only exacerbate existing funding shortages. Even in more prosperous times, funding for such research has been scaled back and scholars besieged by questions concerning the relevance of their enterprise, whether measured by social impact, economic value or other sometimes misapplied benchmarks of utility.

Public funding gravitates towards scientific and medical research, with its more readily appreciated and easily discerned social benefits. In Britain, the fiscal plight of the arts and humanities is so dire that the Institute of Ideas recently sponsored a debate at King’s College London that directly addressed the question, “Do the arts have to re-brand themselves as useful to justify public money?”

In addition to decrying the rising tide of philistinism, some scholars might also be tempted to agree with Stanley Fish, who infamously asserted that the humanities “cannot be justified except in relation to the pleasure they give to those who enjoy them.” Fish rejected the notion that the humanities can be validated by some standard external to them. He dismissed as wrong-headed “measures like increased economic productivity, or the fashioning of an informed citizenry, or the sharpening of moral perception, or the lessening of prejudice and discrimination.”

There is little doubt that the value of the humanities and social sciences far outstrips any simple measurement. As universities and national funding bodies face painful financial decisions and are forced to prioritize the allocation of scarce resources, however, scholars must guard against such complacency. Instead, I argue, scholars in the social sciences, arts, and humanities should consider seriously how the often underestimated value of their teaching and research could be further justified to the wider public through substantive contributions to today’s most pressing policy questions.

This present moment is a propitious one for reconsidering the function of academic scholarship in public life. The election of a new president brings with it an unprecedented opportunity for scholars in the humanities and social sciences. The meltdown of the financial markets has focused public attention on additional challenges of massive proportions, including the fading of American primacy and the swift rise of a polycentric world.

Confronting the palpable prospect of American decline will demand contributions from all sectors of society, including the universities, the nation’s greatest untapped resource. According to the Times Higher Education Supplement’s recently released rankings, the U.S. boasts 13 of the world’s top 20 universities, and 36 U.S. institutions figure in the global top 100. How can scholars in the arts, humanities and social sciences make a difference at this crucial historical juncture? How can they demonstrate the public benefits of their specialist research and accumulated learning?

A report published by the British Academy in September contains some valuable guidance. It argues that the collaboration between government and university researchers in the social sciences and humanities must be bolstered. The report, “Punching Our Weight: the Humanities and Social Sciences in Public Policy Making,” emphasizes how expanded contact between government and humanities and social science researchers could improve the effectiveness of public programs. It recommends “incentivizing high quality public policy engagement.” It suggests that universities and public funding bodies should “encourage, assess and reward” scholars who interact with government. The British Academy study further hints that university promotion criteria, funding priorities, and even research agendas should be driven, at least in part, by the major challenges facing government.

The British Academy report acknowledges that “there is a risk that pressure to develop simplistic measures will eventually lead to harmful distortions in the quality of research,” but contends that the potential benefits outweigh the risks.

The report mentions several specific areas where researchers in the social sciences and humanities can improve policy design, implementation, and assessment. These include the social and economic challenges posed by globalization; innovative comprehensive measurements of human well-being; understanding and predicting human behavior; overcoming barriers to cross-cultural communication; and historical perspectives on contemporary policy problems.

The British Academy report offers insights that the U.S. government and American scholars could appropriate. It is not farfetched to imagine government-university collaboration on a wide range of crucial issues, including public transport infrastructure, early childhood education, green design, civil war mediation, food security, ethnic strife, poverty alleviation, city planning, and immigration reform. A broader national conversation to address the underlying causes of the present crisis is sorely needed. By putting their well-honed powers of perception and analysis in the public interest, scholars can demonstrate that learning and research deserve the public funding and esteem that have been waning in recent decades.

The active collaboration of scholars with government will be anathema to those who conceive of the university as a bulwark against the ever encroaching, nefarious influence of the state. The call for expanded university-government collaboration may provoke distasteful memories of the enlistment of academe in the service of the Cold War and the Vietnam War, a relationship which produced unedifying intellectual output and dreadfully compromised scholarship.

To some degree, then, skepticism toward the sort of government-university collaboration advocated here is fully warranted by the specter of the past. Moreover, the few recent efforts by the federal government to engage with researchers in the social sciences and humanities have not exactly inspired confidence.

The Pentagon’s newly launched Minerva Initiative, to say nothing of the Army’s much-criticized Human Terrain System, has generated a storm of controversy, mainly from researchers who fear that scholarship will be placed in the service of war and counter-insurgency in Iraq and Afghanistan and will be ideologically distorted as a result.

Certainly, the Minerva Initiative’s areas of funded research -- “Chinese military and technology studies, Iraqi and Terrorist perspective projects, religious and ideological studies," according to its Web site -- raise red flags for many university-based researchers. Yet I would argue that frustration with the Bush administration and its policies must not preclude a dispassionate analysis of the Minerva Initiative and block recognition of its enormous potential for fostering and deepening links between university research and public policy communities. The baby should not be thrown out with the bathwater. The Minerva Initiative, in a much-reformed form, represents a model upon which future university-government interaction might be built.

Cooperation between scholars in the social sciences and humanities and all of the government’s departments should be enhanced by expanding the channels of communication among them. The challenge is to establish a framework for engagement that poses a reduced threat to research ethics, eliminates selection bias in the applicant pool for funding, and maintains high scholarly standards. Were these barriers to effective collaboration overcome, it would be exhilarating to contemplate the proliferation of a series of “Minerva Initiatives” in various departments of the executive branch. Wouldn’t government policies and services -- in areas as different as environmental degradation, foreign aid effectiveness, health care delivery, math and science achievement in secondary schools, and drug policy -- improve dramatically were they able to harness the sharpest minds and cutting-edge research that America’s universities have to offer?

What concrete forms could such university-government collaboration take? There are several immediate steps that could be taken. First, it is important to build on existing robust linkages. The State Department and DoD already have policy planning teams that engage with scholars and academic scholarship. Expanding the budgets as well as scope of these offices could produce immediate benefits.

Second, the departments of the executive branch of the federal government, especially Health and Human Services, Education, Interior, Homeland Security, and Labor, should devise ways of harnessing academic research on the Minerva Initiative model. There must be a clear assessment of where research can lead to the production of more effective policies. Special care must be taken to ensure that the scholarly standards are not adversely compromised.

Third, universities, especially public universities, should incentivize academic engagement with pressing federal initiatives. It is reasonable to envision promotion criteria modified to reward such interaction, whether it takes the form of placements in federal agencies or the production of policy relevant, though still rigorous, scholarship. Fourth, university presidents of all institutions need to renew the perennial debate concerning the purpose of higher education in American public life. Curricula and institutional missions may need to align more closely with national priorities than they do today.

The public’s commitment to scholarship, with its robust tradition of analysis and investigation, must extend well beyond the short-term needs of the economy or exigencies imposed by military entanglements. Academic research and teaching in the humanities, arts and social sciences play a crucial role in sustaining a culture of open, informed debate that buttresses American democracy. The many-stranded national crisis, however, offers a golden opportunity for broad, meaningful civic engagement by America’s scholars and university teachers. The public benefits of engaging in the policy-making process are, potentially, vast.

Greater university-government cooperation could reaffirm and make visible the public importance of research in the humanities, arts and social sciences.

Not all academic disciplines lend themselves to such public engagement. It is hard to imagine scholars in comparative literature or art history participating with great frequency in such initiatives.

But for those scholars whose work can shed light on and contribute to the solution of massive public conundrums that the nation faces, the opportunity afforded by the election of a new president should not be squandered. Standing aloof is an unaffordable luxury for universities at the moment. The present conjuncture requires enhanced public engagement; the stakes are too high to stand aside.

Gabriel Paquette is a lecturer in the history department at Harvard University.

Where Are the Students?

Last year, college students were the most fervent supporters of Obama’s bid for the presidency. Now, the U.S. Senate has taken up what Obama says is the defining legislation of his term: health care reform. Oddly, the voice of college students is nowhere to be found in the national debate -- most likely because the activist set does not realize how much is at stake for them personally.

It might seem that college students have little to worry about. Most full-time students in fact have health insurance right now. Two-thirds are covered through their parents’ insurance plans and another 7 percent are covered through a university plan, according to the Government Accountability Office.

But one thing is guaranteed: College students with the good fortune to have insurance right now will lose their current coverage soon after graduation. Those insured through their parents’ plans will be dropped after they leave school. And students on a university plan will soon learn that the loyalty of their alma mater has limits: It does not extend to a lifetime of affordable health care.

What is a student to do? The current answer, unfortunately, is to get a job. And not just any job: a stable, full-time job with an employer that will offer them health insurance. That, in fact, is the bizarre reality of health care in the United States. We currently live in a system that presumes “employer-sponsored insurance,” in which you must have a steady paycheck before you can get affordable health care.

As college students surely know, however, the prospect of steady full-time work is looking worse than ever. The unemployment rate for young adults is up from 10 percent last year to a whopping 15 percent this year. Recent grads who have the good fortune to land a job will be more likely than older workers to work for small companies. But small employers are also the least likely to offer health insurance, and more small companies have dropped health insurance for their workers every year since 2000.

The alternative is to buy insurance individually rather than to bother with an employer. For recent grads in particular, it’s a pity that the cost of these plans is rising faster than wages. As workers just starting their careers, college students will most likely have the lowest earnings of their lifetimes. Short of a steady job or enough money and know-how to navigate the private insurance market, the Class of 2010 will get insurance under the current system only if they are poor or disabled. Only then would they get scooped up by a government safety net program: Medicaid. But it’s not clear that any college students aspire to that fate.

This scenario does not even take into account the existential question that college seniors may be pondering right now: whether they even want to follow the straight-and-narrow path from college to traditional career. Entrepreneurs, activists, travelers, farmers, parents, artists -- be warned: All of those opportunities would require verve, intelligence -- and the willingness to sacrifice good health if need be. It is little wonder that people in their 20s are more likely to be uninsured than any other age group in the U.S. today.

Right now, the U.S. Senate is debating a bill that could help change this situation for college students. But many senators are not yet convinced that Americans really want health care reform. Do college students?

It is a good time for students to think through their answers. For one thing, Obama is calling for a vote on the Senate bill before Christmas. No doubt, health care bills are complicated and boring -- not exactly end-of-term pleasure reading. But students might start with a blog by the director of the White House budget office, Peter Orszag.

Heading into winter break, students also have the chance to think through the health care debate on a more personal level. They can find out when their current coverage is going to end. For those on a parent’s plan, it may come as a shock to find that they will lose coverage on Commencement Day.

Over the holidays, college students can also chat up their grandparents and other older relatives. Polls consistently show that people over the age of 65 are the most resistant to health care overhaul -- in large part because they want to protect their Medicare coverage.

College students do have a major stake in the outcome of the health care debate. So whether on campuses or on their own, students would be wise to think through the issues -- not for Obama’s sake this time, but for their own.

Laura Stark is an assistant professor of sociology and science in society at Wesleyan University; she co-wrote this essay with several Wesleyan juniors and seniors: Suzanna Hirsch, Samantha Hodges, Gianna Palmer and Kim Segall.

Campuses and Interfaith Cooperation

Last week, leaders from higher education gathered at the White House for a conference on Advancing Interfaith Service on College Campuses. Senior administration officials from the Department of Education, the Corporation for National and Community Service and two White House offices – of Faith-based and Neighborhood Partnerships, and of Social Innovation – addressed the crowd of university presidents, professors, chaplains and students.

That the White House would hold a conference on interfaith cooperation is no mystery; President Obama made the topic a theme of his presidency from the very beginning. But why a gathering that focuses on campuses? I think there are four reasons for this:

  1. College campuses set the educational and civic agenda for the nation. By gathering higher education leaders, administration officials are signaling that they hope campuses make learning about religious diversity a mark of what it means to be an educated person. And just as campuses helped make volunteerism and multiculturalism a high priority on our nation’s civic agenda, staff in the Obama administration are hopeful that higher education can do the same for interfaith cooperation.
  2. College campuses are social laboratories that can illustrate what success looks like. While there may be frigid relations between some religious groups in politics and the public square, a college campus has both the mission and the resources (chaplains, diversity offices, religion departments, resident advisers) to proactively cultivate positive relations between Muslims and Jews, Christians and Buddhists, Hindus and Humanists. They can demonstrate cooperation rather than conflict.
  3. Campuses have the resources and mission to advance a knowledge paradigm – an orientation and body of knowledge that appreciates and positively engages religious diversity. From Samuel Huntington’s clash of civilizations theory to stories of religious conflict on the evening news to the recent spate of bestsellers by ‘the new atheists,’ we are increasingly subject to a knowledge paradigm about religions being the source of violence, bigotry and ignorance in the world. While this paradigm should certainly be acknowledged, another one can be advanced: that diverse religions share positive values like mercy and compassion that can be acted on across lines of faith for the common good.
  4. Campuses train the next generation of leaders. Students who have a positive experience of the “religious other” on campus take that worldview into the broader society. Students who develop an appreciative knowledge of the world’s religions on campus educate their neighbors. Students who learn the skills to bring people from different faith backgrounds together to build understanding and cooperation on the quad apply those skills with their religiously diverse coworkers.

President Obama has shown the way in each of the above categories, and college campuses are uniquely positioned to follow his lead.

In his inaugural address, Obama lifted up America’s religious diversity and connected it to America’s promise: “Our patchwork heritage is a strength, not a weakness; we are a nation of Christians and Muslims, Jews and Hindus, and nonbelievers...." The message: Educated citizens should know of our nation’s religious diversity, and it is a civic virtue to engage this diversity positively.

College presidents in America could sound a similar note in speeches to the incoming freshman class.

With the advisory council for the Faith-based Office, the president created his own laboratory, one that models positive relations between religiously diverse citizens. I had the honor of serving on the inaugural council (a new group of 25 is expected to be appointed soon). There were Orthodox and Reform Jews, Catholic and Protestant clergy, Sunni and Shia Muslims, Hindu civic leaders and Evangelical movement-builders. And that’s not all -- we were Republicans and Democrats, gay and straight, Mexican and Indian and white and African American. And we had to agree on a final report that went to the president.

College campuses could have an interfaith council that works on common projects.

In Cairo, the president advanced a new “knowledge paradigm” with respect to religious diversity. Eschewing the tired clash of civilizations theory, which falsely claims that religions have opposing values that put them in conflict, Obama highlighted the positive interactions between the West and Islam throughout the course of history, the many contributions Muslim Americans make to their nation, and the dimensions of Islam he admired such as the advancement of learning and innovation.

College campuses can have academic courses that do the same.

As a young adult, Obama was a community organizer working under a Jewish mentor, bringing together Catholic, Protestant and Muslim groups to start job training centers and tutoring programs on the South Side of Chicago. In this way, he acquired the competencies of leadership in a religiously diverse world. The president has signaled that he believes this is a valuable experience for today’s young adults, making interfaith cooperation through service a line in his Cairo address and a theme of the Summer of Service program.

College campuses, with the high value they place on service, leadership development and the positive engagement of diversity, are perfectly prepared to launch robust interfaith service initiatives.

Interfaith initiatives have been growing on campuses for several decades. The White House invited the vanguard of the movement to Washington, D.C., last week with a clear message: this administration appreciates what you have been doing, and we think you can do more. A movement goes from niche to norm when a vanguard recognizes its moment. For the movement of interfaith cooperation, this is the moment.

Eboo Patel is the founder and executive director of Interfaith Youth Core, an organization that works with college campuses on religious diversity issues.
