Politics (national)

Review of Howard Ball, "At Liberty to Die: The Battle for Death with Dignity in America"

Of the many strange things in Gulliver’s Travels that make it hard to believe anyone ever considered it a children’s book, the most disturbing must be the Struldbruggs, living in the far eastern kingdom of Luggnagg, not covered by Google Maps at the present time.

Gulliver’s hosts among the Luggnaggian aristocracy tell him that a baby is born among them, every so often, with a red dot on the forehead -- the sign that he or she is a Struldbrugg, meaning an immortal. Our narrator is suitably amazed. The Struldbruggs, he thinks, have won the cosmic lottery. Being “born exempt from that universal Calamity of human Nature,” they “have their Minds free and disengaged, without the Weight and Depression of Spirits caused by the continual Apprehension of Death.”

The traveler has no trouble imagining the life he might lead as an immortal, given the chance. First of all, Gulliver tells his audience at dinner, he would spend a couple of hundred years accumulating the largest fortune in the land. He’d also be sure to master all of the arts and sciences, presumably in his spare time. And then, with all of that out of the way, Gulliver could lead the life of a philanthropic sage, dispensing riches and wisdom to generation after generation. (A psychoanalytic writer somewhere uses the expression “fantasies of the empowered self,” which just about covers it.)

But then the Luggnaggians bring him back to reality by explaining that eternal life is not the same thing as eternal youth. The Struldbruggs “commonly acted like Mortals, till about thirty Years old,” one of Gulliver’s hosts explains, “after which by degrees they grew melancholy and dejected, increasing in both till they came to four-score.” The expression “midlife crisis” is not quite the one we want here, but close enough.

From the age of eighty on, “they had not only all the Follies and Infirmities of other old Men, but many more which arose from the dreadful Prospect of never dying.” Forget the mellow ripening of wisdom: Struldbruggs “were not only Opinionative, Peevish, Covetous, Morose, Vain, Talkative, but incapable of Friendship, and dead to all natural Affection.”

It gets worse. Their hair and teeth fall out. “The Diseases they were subject to still continuing without increasing or diminishing,” Gulliver tells us. “In talking they forgot the common Appellation of Things, and the Names of Persons, even of those who are their nearest Friends and Relations. For the same Reason they never can amuse themselves with reading, because their Memory will not serve to carry them from the beginning of a Sentence to the end; and by this Defect they are deprived of the only entertainment whereof they might otherwise be capable.”

It is a vision of hell. Either that, or a prophecy of things to come, assuming the trends of the last few decades continue. Between 1900 and 2000, the average life expectancy in the United States rose from 49 to 77 years; between 1997 and 2007, it grew by 1.4 years. This is not immortality, but it beats dying before you reach 50. The span of active life has extended as well. The boundary markers of what counts as old age keep moving out.

From a naïve, Gulliverian perspective, it is all to the good. But there’s no way to quantify changes in the quality of life. We live longer, but it's taking longer to die as well. Two-thirds of deaths among people over the age of 65 in the United States are caused by three chronic conditions: heart disease, cancer, and stroke. The “life” of someone in a persistent vegetative state (in which damage to the cerebral cortex is so severe and irreversible that cognitive functions are gone for good) can be prolonged indefinitely, if not forever.

More horrific to imagine is the twilight state of being almost vegetative, but not quite, with some spark of consciousness flickering in and out -- a condition of Struldbruggian helplessness and decay. “I grew heartily ashamed of the pleasing Visions I had formed,” says Gulliver, “and thought no Tyrant could invent a Death into which I would not run with Pleasure from such a Life.”

Howard Ball’s book At Liberty to Die: The Battle for Death With Dignity in America (New York University Press) is a work of advocacy, as the subtitle indicates. The reader will find not the slightest trace of Swiftian irony in it. Ball, a professor emeritus of political science at the University of Vermont, is very straightforward about expressing bitterness -- directing it at forces that would deny “strong-willed, competent, and dying adults who want to die with dignity when faced with a terminal illness” their right to do so.  

The forces in question fall under three broad headings. One is the religious right, which Ball sees as being led, on this issue at least, by the Roman Catholic Church. Another is the Republican Party leadership, particularly in Congress, which he treats as consciously “politicizing the right-to-die issue” in a cynical manner, as exemplified by the memo of a G.O.P. operative on “the political advantage to Republicans [of] intervening in the case of Terri Schiavo.” (For anyone lucky enough to have forgotten: In 1998, after Terri Schiavo had been in a persistent vegetative state for eight years, her husband sought to have her feeding tube removed, setting off numerous rounds of litigation, as well as several pieces of legislation, including bills in the U.S. Congress. The feeding tube was removed and then reinserted twice before being finally removed in 2005, after which Schiavo died. The website of the University of Miami's ethics program has a detailed timeline of the Schiavo case.)

The third force Ball identifies is that proverbial 800-pound gorilla known as the Supreme Court of the United States. Its rulings in Washington v. Glucksberg and Vacco v. Quill in 1997 denied the existence of anything like a constitutionally protected right to physician-assisted death (PAD). States can outlaw PAD -- or permit it, as Montana, Oregon, and Washington do at present. In the epigraph to his final chapter, Ball quotes a Colorado activist named Barbara Coombs Lee: “We think the citizens of all fifty states deserve death with dignity.” But the Supreme Court will not be making that a priority any time soon.

“The central thesis of the book,” states Ball, “is that the liberty found in the U.S. Constitution’s Fifth and Fourteenth ‘Due Process’ Amendments extends... [to] the terminally ill person's right to choose to die with dignity -- with the passive assistance of a physician -- rather than live in great pain or live a quality-less life.” The typical mode of “passive assistance” would be “to give painkilling medications to a terminally ill patient, with the possibility that the treatment will indirectly hasten the patient’s death.”

Ball notes that a Pew Research Center survey from 2005 found that an impressive 84 percent of respondents “approved of patients being able to decide whether or not to be kept alive through medical treatment or choosing to die with dignity.”

Now, for whatever it’s worth, that solid majority of 84 percent includes this columnist. If the time for it ever comes, I’d want my doctor to supersize me on the morphine drip without breaking any laws. Throughout Ball's narrative of the successes and setbacks of the death-with-dignity cause, I cheered at each step forward, and felt appalled all over again while reading the chapter he calls “Terri Schiavo’s Tragic Odyssey,” although it did seem like the more suitable adjective would be “grotesque.” Tragedy implies at least some level of dignity.

The author also introduced me to a blackly humorous expression, “death tourist,” which refers to a person "visiting" a state to take advantage of physician-assisted suicide being legal there.

As a member of the choir, I liked Ball's preaching, but it felt like the sermon was missing an index card or two. As mentioned earlier, the book’s “central thesis” is supposed to be that the due-process guarantees in the Constitution extend to the right to death with dignity. And so the reader has every reason to expect a sustained and careful argument for why that legal standard applies. None is forthcoming. The due-process clauses did come up when the Supreme Court heard oral arguments in 1997, but it rejected them as inapplicable. This would seem to be the point in the story where that central thesis would come out swinging. The author would show, clearly and sharply, why the Court was wrong to do so. He doesn't. It is puzzling.

Again, it sounds very categorical when Ball cites that Pew survey from 2005 showing 84 percent agreement that individuals had a right to choose an exit strategy if medical care were not giving them a life they felt worth living. But the same survey results show that when asked whether they believed it should be legal for doctors to "assist terminally ill patients in committing suicide," only 44 percent favored it, while 48 percent were opposed. With the matter phrased differently -- surveyors asking if it should be legal for doctors to "give terminally ill patients the means to end their lives" -- support went up to 51 percent, while 40 percent remained opposed. This reveals considerably more ambivalence than the 84 percent figure would suggest.

The notion that a slippery slope will lead from death with dignity to mass programs of euthanasia clearly exasperates Ball, and he can hardly be faulted on that score. A portion of the adult population is prepared to believe that any given social change will cause the second coming of the Third Reich, this time on American soil. (Those who do not forget the History Channel are condemned to repeat fairly dumb analogies.) But the slippery-slope argument is more likely to be refuted in practice than in debate. Whether or not the law recognizes it, the right to make decisions about one’s own mortality or quality of life will exist any time someone claims it. One of the medical profession’s worst-kept secrets is that plenty of physicians will oblige a suffering patient with the means to end the struggle. (As Ball notes, this came up in the Supreme Court arguments 15 years ago.)

And the demand is bound to grow, as more and more of us live long enough to see -- like Gulliver -- that there are worse fates than death. Brilliant legal minds should apply themselves to making an ironclad case for the right to a decent departure from this mortal coil. At Liberty to Die is useful as a survey of some of the obstacles standing in the way. In the meantime, people will find ways around those obstacles, even if it means taking a one-way, cross-continental trip to the Pacific Northwest. There are worse ways to go.

Presidential race brings scrutiny for candidates on higher education


Romney gets scrutiny for praise of for-profit college led by campaign donor. Biden angers faculty by suggesting their salaries force tuition up. Gingrich attacks fluency in French.

Essay: Educated illegal immigrants are a net financial gain for U.S.

Ronald Reagan once said, “Don’t be afraid to see what you see.” The current flap over Gov. Rick Perry’s defense of in-state tuition for students whose parents are in the United States illegally drives us to take off the lid and take a peek. 

And what we see is that illegal-immigrant students pay back more than they take.

Daniel Griswold, an immigration expert at the Cato Institute, wrote to me recently in response to my inquiry, “In 1997, the National Research Council published a major study on immigration. It found that an immigrant with a college education is a huge net plus for the United States.”

Griswold reports this finding of the NRC study: “Immigrants and their descendants represent a net fiscal gain for the United States. The typical immigrant and all of his or her descendants represent a positive $80,000 fiscal gain to the government. An immigrant with more than a high school education (plus descendants) represents a $198,000 fiscal gain, one with a high school diploma a $51,000 gain, and one with less than a high school education a $13,000 loss.”

Some will counter that college slots for illegal immigrants should be given instead to poor U.S.-born students. But most of these students cannot afford college. Tuition at Texas universities, for example, will average about $8,500 this year, and the College Board projects that the average student’s living expenses will be $17,820 -- for a total of $26,320. Multiplying this figure by five -- now the Texas standard for number of years to graduation -- totals $131,600.

But total costs will be higher than this. In Texas between 1999 and 2010, average tuition and related fees at the state’s 10 largest universities rose by 120 percent.  Tuition and fee increases of 10 percent a year will raise the figure of $131,600 to $160,591 in five years.

Let us look at immigrant subsidies, using Texas A&M University as a representative example. In-state tuition there is $8,418, out-of-state tuition $23,808 -- a yearly subsidy to illegal immigrants of $15,390. The total for five years is $76,950, plus a 10 percent annual increase in tuition -- for a grand total subsidy of $93,957. Subtracting $93,957 from the $198,000 fiscal gain that the NRC study documented leaves a net gain of $104,043.
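For readers who want to check the compounding, here is a minimal sketch of the arithmetic behind these figures. It is an illustration only: the 10 percent annual increase and five-year horizon are the essay's own assumptions, the helper function name is invented for this example, and the rounding comes out slightly above the $160,591 cost figure cited above.

```python
# Minimal sketch of the essay's compounding arithmetic (illustrative only).
# The 10 percent annual increase and five-year horizon are the essay's assumptions;
# the function name is invented for this example, not an official projection.

def five_year_total(first_year_amount, annual_increase=0.10, years=5):
    """Sum a yearly amount that grows by a fixed rate each year."""
    return sum(first_year_amount * (1 + annual_increase) ** n for n in range(years))

# Cost of attendance: $8,500 tuition plus $17,820 living expenses in year one.
cost = five_year_total(8500 + 17820)      # about $160,686 (the essay rounds to $160,591)

# Texas A&M subsidy: $23,808 out-of-state minus $8,418 in-state tuition in year one.
subsidy = five_year_total(23808 - 8418)   # about $93,957

net_gain = 198000 - round(subsidy)        # about $104,043, matching the essay's figure
print(round(cost), round(subsidy), net_gain)
```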

The Texas Higher Education Coordinating Board reports that Texas colleges and universities currently enroll slightly more than 1.5 million students. Hispanic enrollment numbers are up this year by 4.5 percent -- a very small increase in a state where 40 percent of all residents are Hispanic.

The number of illegal immigrants enrolled in public four-year colleges and universities in Texas totals 4,000, while the number in community colleges totals 12,000 -- still a very small percentage in a state that is 40 percent Hispanic. The in-state tuition subsidy in community colleges to illegal immigrants is about $2,000 a year.  At Lone Star Community College, where I teach, in-state tuition is $1,744, out-of-state tuition $3,844.

This is part of a larger problem and pattern. An October study by the American Enterprise Institute entitled “Cheap for Whom?” finds: “Average taxpayers provide more in subsidies to elite public and private schools than to the less competitive schools where their own children are likely being educated.”

The dirty little secret that universities and state and federal legislators don’t want the public to know is that these universities and legislators are de facto agents of class warfare. Note the shocking disparity between the rich and the poor that AEI reports: “Among not-for-profit institutions, the amount of taxpayer subsidies hovers between $1,000 and $2,000 per student per year until we turn to the most selective institutions.... Among these already well-endowed institutions, the taxpayer subsidy jumps substantially to more than $13,000 per student per year.”

It is class warfare. AEI argues, “If the country is to retain its competitive edge, it must reverse the current policies that result in providing the lowest levels of taxpayer support to the institutions that enroll the highest percentage of low-income, nontraditional, and minority students -- the fastest-growing segments of the population.”

And this should include illegal-immigrant students, who are residents of the state and pay sales and property taxes. They will pay back more than they take.

Ronald L. Trowbridge, Ph.D., is a senior fellow at the Center for College Affordability and Productivity, a research center in Washington.

Iowa caucuses provide teaching and researching opportunities


The state's political caucuses are presidential candidates' first opportunity to show their stuff. Iowa's professors, students and researchers also get a chance to dissect the caucuses for the rest of the nation.

Dreams Deferred?

The heated rhetoric surrounding immigration reform legislation in Congress threatens to drown out an important, bipartisan effort to resolve a decades-old inconsistency in federal immigration law concerning postsecondary tuition costs for undocumented students who have graduated from high schools in the United States.

The “DREAM Act,” which was incorporated into the Senate Judiciary Committee’s immigration reform bill last week, would allow states to provide in-state tuition for postsecondary education to undocumented students who have attended (for at least three years) and graduated from high school in their states.

Federal immigration law now prohibits them from doing so, though that has not stopped several states, including “red” states like Utah, Kansas, and Texas, from adopting such legislation in recognition of the fact that more than 50,000 of these students graduate from high school each year as -- in nearly every way -- children of the American dream.

Of the DREAM Act, Sen. Jeff Sessions (R-Ala.) stated, “I find it inconceivable that we would provide greater benefits to persons who are here illegally than to American citizens. It makes a mockery of the rule of law."

However, Congress must ensure the debate over the education of undocumented students is actually grounded in the law, rather than rhetoric. Federal law related to this issue was interpreted more than 20 years ago by the United States Supreme Court’s 1982 Plyler v. Doe decision.

Plyler v. Doe involved a Texas law that effectively banned undocumented minor children from participating in public elementary and secondary education. The Court heard arguments that sounded quite similar to those used to deny in-state college tuition to the same students: that providing K-12 education rewards illegal immigration, and that we should not give public benefits to those in the country illegally. The significance of this case is not that it settled once and for all the ideological arguments surrounding immigration. Rather, the Court created protective legal precedent for minor undocumented students by carefully examining the intersection of immigration law, the distribution of public goods, and individual rights as protected by the Constitution of the United States.

The Supreme Court’s decision addressed the question: Did a child break the law because the parents brought the child into the country illegally as a minor? The Supreme Court said “no.”

The Court ruled that such children, in fact, were entitled to equal protection under the law, one of America’s most cherished legal principles. As cited in the Court’s opinion, the Fourteenth Amendment to the Constitution provides that “[n]o State shall…deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”

As a population within the state’s jurisdiction, undocumented students were, therefore, entitled to equal treatment under the law. In the opinion of the Court, Associate Justice William J. Brennan Jr., wrote, “To permit a state … to identify subclasses of persons whom it would define as beyond its jurisdiction, thereby relieving itself of the obligation to assure that its laws are designed and applied equally to those persons, would undermine the principal purpose for which the Equal Protection Clause was incorporated in the Fourteenth Amendment.”

The Court further argued that federal immigration law, despite “sheer incapability or lax enforcement,” was not a justification for denying children equal protection and access to education.

In recognition of this principle, several state legislatures have passed laws to allow in-state postsecondary tuition for undocumented students who have attended public high schools in state for more than three years. They realize the legal “no-man’s land” these students occupy, and have sought to remedy it under the law.

The central relevance of the Supreme Court’s case to this debate over in-state tuition for undocumented students is that we cannot simply ignore what Justice Brennan called the “shadow population” of students who go about their daily lives and contribute to our society in the same way that we all strive to contribute. Moreover, we cannot deprive these students of the equal protection that our Constitution provides simply because they graduate from the high school setting where the Supreme Court has decided that it applies.

Though the issue is easy to weigh down with heated rhetoric, we hope that the law will, in fact, prevail, and that Congress will pass the DREAM Act. As Justice Brennan concluded, “[W]hatever savings might be achieved by denying these children an education, they are wholly insubstantial in light of the costs involved to these children, the State, and the Nation.”


David Hawkins is director of public policy for the National Association for College Admission Counseling.

The Relevance of the Humanities

The deepening economic crisis has triggered a new wave of budget cuts and hiring freezes at America’s universities. Retrenchment is today’s watchword. For scholars in the humanities, arts and social sciences, the economic downturn will only exacerbate existing funding shortages. Even in more prosperous times, funding for such research has been scaled back and scholars besieged by questions concerning the relevance of their enterprise, whether measured by social impact, economic value or other sometimes misapplied benchmarks of utility.

Public funding gravitates towards scientific and medical research, with its more readily appreciated and easily discerned social benefits. In Britain, the fiscal plight of the arts and humanities is so dire that the Institute of Ideas recently sponsored a debate at King’s College London that directly addressed the question, “Do the arts have to re-brand themselves as useful to justify public money?”

In addition to decrying the rising tide of philistinism, some scholars might also be tempted to agree with Stanley Fish, who infamously asserted that the humanities “cannot be justified except in relation to the pleasure they give to those who enjoy them.” Fish rejected the notion that the humanities can be validated by some standard external to them. He dismissed as wrong-headed “measures like increased economic productivity, or the fashioning of an informed citizenry, or the sharpening of moral perception, or the lessening of prejudice and discrimination.”

There is little doubt that the value of the humanities and social sciences far outstrips any simple measurement. As universities and national funding bodies face painful financial decisions and are forced to prioritize the allocation of scarce resources, however, scholars must guard against such complacency. Instead, I argue, scholars in the social sciences, arts, and humanities should consider seriously how the often underestimated value of their teaching and research could be further justified to the wider public through substantive contributions to today’s most pressing policy questions.

This present moment is a propitious one for reconsidering the function of academic scholarship in public life. The election of a new president brings with it an unprecedented opportunity for scholars in the humanities and social sciences. The meltdown of the financial markets has focused public attention on additional challenges of massive proportions, including the fading of American primacy and the swift rise of a polycentric world.

Confronting the palpable prospect of American decline will demand contributions from all sectors of society, including the universities, the nation’s greatest untapped resource. According to the Times Higher Education Supplement’s recently released rankings, the U.S. boasts 13 of the world’s top 20 universities, and 36 U.S. institutions figure in the global top 100. How can scholars in the arts, humanities and social sciences make a difference at this crucial historical juncture? How can they demonstrate the public benefits of their specialist research and accumulated learning?

A report published by the British Academy in September contains some valuable guidance. It argues that collaboration between government and university researchers in the social sciences and humanities must be bolstered. The report, “Punching Our Weight: the Humanities and Social Sciences in Public Policy Making,” emphasizes how expanded contact between government and humanities and social science researchers could improve the effectiveness of public programs. It recommends “incentivizing high quality public policy engagement.” It suggests that universities and public funding bodies should “encourage, assess and reward” scholars who interact with government. The British Academy study further hints that university promotion criteria, funding priorities, and even research agendas should be driven, at least in part, by the major challenges facing government.

The British Academy report acknowledges that “there is a risk that pressure to develop simplistic measures will eventually lead to harmful distortions in the quality of research,” but contends that the potential benefits outweigh the risks.

The report mentions several specific areas where researchers in the social sciences and humanities can improve policy design, implementation, and assessment. These include the social and economic challenges posed by globalization; innovative comprehensive measurements of human well-being; understanding and predicting human behavior; overcoming barriers to cross-cultural communication; and historical perspectives on contemporary policy problems.

The British Academy report offers insights that the U.S. government and American scholars could appropriate. It is not farfetched to imagine government-university collaboration on a wide range of crucial issues, including public transport infrastructure, early childhood education, green design, civil war mediation, food security, ethnic strife, poverty alleviation, city planning, and immigration reform. A broader national conversation to address the underlying causes of the present crisis is sorely needed. By putting their well-honed powers of perception and analysis to work in the public interest, scholars can demonstrate that learning and research deserve the public funding and esteem that have been waning in recent decades.

The active collaboration of scholars with government will be anathema to those who conceive of the university as a bulwark against the ever encroaching, nefarious influence of the state. The call for expanded university-government collaboration may provoke distasteful memories of the enlistment of academe in the service of the Cold War and the Vietnam War, a relationship which produced unedifying intellectual output and dreadfully compromised scholarship.

To some degree, then, skepticism toward the sort of government-university collaboration advocated here is fully warranted by the specter of the past. Moreover, the few recent efforts by the federal government to engage with researchers in the social sciences and humanities have not exactly inspired confidence.

The Pentagon’s newly launched Minerva Initiative, to say nothing of the Army’s much-criticized Human Terrain System, has generated a storm of controversy, mainly from researchers who fear that scholarship will be placed in the service of war and counter-insurgency in Iraq and Afghanistan, and ideologically distorted as a result.

Certainly, the Minerva Initiative’s areas of funded research -- “Chinese military and technology studies, Iraqi and Terrorist perspective projects, religious and ideological studies," according to its Web site -- raise red flags for many university-based researchers. Yet I would argue that frustration with the Bush administration and its policies must not preclude a dispassionate analysis of the Minerva Initiative or block recognition of its enormous potential for fostering and deepening links between university research and public policy communities. The baby should not be thrown out with the bathwater. The Minerva Initiative, in much-reformed form, represents a model upon which future university-government interaction might be built.

Cooperation between scholars in the social sciences and humanities and all of the government’s departments should be enhanced by expanding the channels of communication among them. The challenge is to establish a framework for engagement that poses a reduced threat to research ethics, eliminates selection bias in the applicant pool for funding, and maintains high scholarly standards. Were these barriers to effective collaboration overcome, it would be exhilarating to contemplate the proliferation of a series of “Minerva Initiatives” in various departments of the executive branch. Wouldn’t government policies and services -- in areas as different as environmental degradation, foreign aid effectiveness, health care delivery, math and science achievement in secondary schools, and drug policy -- improve dramatically were they able to harness the sharpest minds and cutting-edge research that America’s universities have to offer?

What concrete forms could such university-government collaboration take? There are several immediate steps that could be taken. First, it is important to build on existing robust linkages. The State Department and DoD already have policy planning teams that engage with scholars and academic scholarship. Expanding the budgets as well as scope of these offices could produce immediate benefits.

Second, the departments of the executive branch of the federal government, especially Health and Human Services, Education, Interior, Homeland Security, and Labor, should devise ways of harnessing academic research on the Minerva Initiative model. There must be a clear assessment of where research can lead to more effective policies. Special care must be taken to ensure that scholarly standards are not compromised.

Third, universities, especially public universities, should incentivize academic engagement with pressing federal initiatives. It is reasonable to envision promotion criteria modified to reward such interaction, whether it takes the form of placements in federal agencies or the production of policy-relevant, though still rigorous, scholarship. Fourth, presidents of all institutions need to renew the perennial debate concerning the purpose of higher education in American public life. Curricula and institutional missions may need to align more closely with national priorities than they do today.

The public’s commitment to scholarship, with its robust tradition of analysis and investigation, must extend well beyond the short-term needs of the economy or the exigencies imposed by military entanglements. Academic research and teaching in the humanities, arts and social sciences play a crucial role in sustaining a culture of open, informed debate that buttresses American democracy. The many-stranded national crisis, however, offers a golden opportunity for broad, meaningful civic engagement by America’s scholars and university teachers. The public benefits of engaging in the policy-making process are, potentially, vast.

Greater university-government cooperation could reaffirm and make visible the public importance of research in the humanities, arts and social sciences.

Not all academic disciplines lend themselves to such public engagement. It is hard to imagine scholars in comparative literature or art history participating with great frequency in such initiatives.

But for those scholars whose work can shed light on and contribute to the solution of massive public conundrums that the nation faces, the opportunity afforded by the election of a new president should not be squandered. Standing aloof is an unaffordable luxury for universities at the moment. The present conjuncture requires enhanced public engagement; the stakes are too high to stand aside.


Gabriel Paquette is a lecturer in the history department at Harvard University.

Where Are the Students?

Last year, college students were the most fervent supporters of Obama’s bid for the presidency. Now, the U.S. Senate has taken up what Obama says is the defining legislation of his term: health care reform. Oddly, the voice of college students is nowhere to be found in the national debate -- most likely because the activist set does not realize how much is at stake for them personally.

It might seem that college students have little to worry about. Most full-time students in fact have health insurance right now. Two-thirds are covered through their parents’ insurance plans and another 7 percent are covered through a university plan, according to the Government Accountability Office.

But one thing is guaranteed: College students with the good fortune to have insurance right now will lose their current coverage soon after graduation. Those who are insured through their parents’ plans will be dropped after they leave school. And students on a university plan will soon learn that the loyalty of their alma mater has limits: It does not extend to a lifetime of affordable health care.

What is a student to do? The current answer, unfortunately, is to get a job. And not just any job: a stable, full-time job with an employer that will offer them health insurance. That, in fact, is the bizarre reality of health care in the United States. We currently live in a system that presumes “employer-sponsored insurance,” in which you must have a steady paycheck before you can get affordable health care.

As college students surely know, however, the prospect of steady full-time work is looking worse than ever. The unemployment rate for young adults is up from 10 percent last year to a whopping 15 percent this year. Recent grads who have the good fortune to land a job will be more likely than older workers to work for small companies. But small employers are also the least likely to offer health insurance, and more small companies have dropped health insurance for their workers every year since 2000.

The alternative is to buy insurance individually rather than to bother with an employer. Unfortunately for recent grads in particular, the cost of these plans is rising faster than wages. As workers just starting their careers, college students will most likely have the lowest earnings of their lifetimes. Short of a steady job or enough money and know-how to navigate the private insurance market, the Class of 2010 will get insurance under the current system only if they are poor or disabled. Only then would they get scooped up by a government safety net program: Medicaid. But it’s not clear that any college students aspire to that fate.

This scenario does not even take into account the existential question that college seniors may be pondering right now: whether they even want to follow the straight-and-narrow path from college to traditional career. Entrepreneurs, activists, travelers, farmers, parents, artists -- be warned: All of those opportunities would require verve, intelligence -- and the willingness to sacrifice good health if need be. It is little wonder that people in their 20s are more likely to be uninsured than any other age group in the U.S. today.

Right now, the U.S. Senate is debating a bill that could help change this situation for college students. But many senators are not yet convinced that Americans really want health care reform. Do college students?

It is a good time for students to think through their answers. For one thing, Obama is calling for a vote on the Senate bill before Christmas. No doubt, health care bills are complicated and boring -- not exactly end-of-term pleasure reading. But students might start with a blog by the director of the White House budget office, Peter Orszag.

Heading into winter break, students also have the chance to think through the health care debate on a more personal level. They can find out when their current coverage is going to end. For those on a parent’s plan, it may come as a shock to find that they will lose coverage on Commencement Day.

Over the holidays, college students can also chat up their grandparents and other older relatives. Polls consistently show that people over the age of 65 are the most resistant to health care overhaul -- in large part because they want to protect their Medicare coverage.

College students do have a major stake in the outcome of the health care debate. So whether on campuses or on their own, students would be wise to think through the issues -- not for Obama’s sake this time, but for their own.


Laura Stark is an assistant professor of sociology and science in society at Wesleyan University; she co-wrote this essay with several Wesleyan juniors and seniors: Suzanna Hirsch, Samantha Hodges, Gianna Palmer and Kim Segall.

Campuses and Interfaith Cooperation

Last week, leaders from higher education gathered at the White House for a conference on Advancing Interfaith Service on College Campuses. Senior administration officials from the Department of Education, the Corporation for National and Community Service and two White House offices -- of Faith-based and Neighborhood Partnerships, and of Social Innovation -- addressed the crowd of university presidents, professors, chaplains and students.

That the White House would hold a conference on interfaith cooperation is no mystery; President Obama made the topic a theme of his presidency from the very beginning. But why a gathering that focuses on campuses? I think there are four reasons for this:

  1. College campuses set the educational and civic agenda for the nation. By gathering higher education leaders, administration officials are signaling that they hope campuses make learning about religious diversity a mark of what it means to be an educated person. And just as campuses helped make volunteerism and multiculturalism a high priority on our nation’s civic agenda, staff in the Obama administration are hopeful that higher education can do the same for interfaith cooperation.
  2. College campuses are social laboratories that can illustrate what success looks like. While there may be frigid relations between some religious groups in politics and the public square, a college campus has both the mission and the resources (chaplains, diversity offices, religion departments, resident advisers) to proactively cultivate positive relations between Muslims and Jews, Christians and Buddhists, Hindus and Humanists. They can demonstrate cooperation rather than conflict.
  3. Campuses have the resources and mission to advance a knowledge paradigm -- an orientation and body of knowledge that appreciates and positively engages religious diversity. From Samuel Huntington’s clash of civilizations theory to stories of religious conflict on the evening news to the recent spate of bestsellers by ‘the new atheists,’ we are increasingly subject to a knowledge paradigm that casts religion as the source of violence, bigotry and ignorance in the world. While this paradigm should certainly be acknowledged, another one can be advanced: that diverse religions share positive values like mercy and compassion that can be acted on across lines of faith for the common good.
  4. Campuses train the next generation of leaders. Students who have a positive experience of the “religious other” on campus take that worldview into the broader society. Students who develop an appreciative knowledge of the world’s religions on campus educate their neighbors. Students who learn the skills to bring people from different faith backgrounds together to build understanding and cooperation on the quad apply those skills with their religiously diverse coworkers.

President Obama has shown the way in each of the above categories, and college campuses are uniquely positioned to follow his lead.

In his inaugural address, Obama lifted up America’s religious diversity and connected it to America’s promise: “Our patchwork heritage is a strength, not a weakness. We are a nation of Christians and Muslims, Jews and Hindus, and nonbelievers....” The message: Educated citizens should know of our nation’s religious diversity, and it is a civic virtue to engage this diversity positively.

College presidents in America could sound a similar note in speeches to the incoming freshman class.

With the advisory council for the Faith-based Office, the president created his own laboratory modeling what positive relations between religiously diverse citizens can look like. I had the honor of serving on the inaugural council (a new group of 25 is expected to be appointed soon). There were Orthodox and Reform Jews, Catholic and Protestant clergy, Sunni and Shia Muslims, Hindu civic leaders and Evangelical movement-builders. And that’s not all -- we were Republicans and Democrats, gay and straight, Mexican and Indian and white and African American. And we had to agree on a final report that went to the president.

College campuses could have an interfaith council that works on common projects.

In Cairo, the president advanced a new “knowledge paradigm” with respect to religious diversity. Eschewing the tired clash of civilizations theory, which falsely claims that religions have opposing values that put them in conflict, Obama highlighted the positive interactions between the West and Islam throughout the course of history, the many contributions Muslim Americans make to their nation, and the dimensions of Islam he admired such as the advancement of learning and innovation.

College campuses can have academic courses that do the same.

As a young adult, Obama was a community organizer working under a Jewish mentor, bringing together Catholic, Protestant and Muslim groups to start job training centers and tutoring programs on the South Side of Chicago. In this way, he acquired the competencies of leadership in a religiously diverse world. The president has signaled that he believes this is a valuable experience for today’s young adults, making interfaith cooperation through service a line in his Cairo address and a theme of the Summer of Service program.

College campuses, with the high value they place on service, leadership development and the positive engagement of diversity, are perfectly prepared to launch robust interfaith service initiatives.

Interfaith initiatives have been growing on campuses for several decades. The White House invited the vanguard of the movement to Washington, D.C., last week with a clear message: this administration appreciates what you have been doing, and we think you can do more. A movement goes from niche to norm when a vanguard recognizes its moment. For the movement of interfaith cooperation, this is the moment.


Eboo Patel is the founder and executive director of Interfaith Youth Core, an organization that works with college campuses on religious diversity issues.

Nobel Peace Prize -- for Education?

No one has ever won a Nobel Peace Prize for education. Click here and look for yourself. I can’t be alone in finding this embarrassing for all of us in education. Here in the nation with the self-proclaimed “finest higher education system in the world,” why hasn’t the Big Ten won the Nobel Peace Prize? Or the Ivy League? Or even the Little Three? The opposite of peace would be war and conflict. Isn’t war the ultimate failure to solve a problem by other means? Isn’t our job in education to teach people to solve problems of all sizes?

I wonder because today, again, begins the dark time of year for the world-changing, approval-seeking wannabes with whom I cast my lot. Today’s the day the MacArthur Fellows are announced. As of press time, no phone call or photo request has come, so I have to conclude that I missed again. I’ve been here before. I am resilient. What keeps me going in this season is the knowledge that the Nobels are on the way in October, starting Monday.

These past 12 months have been perplexing. Since President Obama won his Nobel Peace Prize last year, though, I haven’t made any progress on my annual post-MacArthur questions: What would a Nobel Peace Prize for education look like? What would a Norman Borlaug-ian accomplishment by an educator be? I don’t have an answer, and the Nobels will pass by again. Someone out there must have an idea for education.

As a sometime English teacher, I like analogies and metaphors. Norman Borlaug, the 1970 Prize winner, put more grains of wheat on shorter stalks. That was a big step toward reducing hunger and starvation in Mexico and Pakistan. Yes, Borlaug’s Green Revolution, which no one disputes fed millions, has critics. So does whole milk. Before my own critics howl, I do not mean that stuffing more students into smaller classrooms is the Nobel idea to consider. Borlaug’s idea, though, is the kind of global game-changer that perhaps we educators and columnists need to ponder.

I can’t see that any winners are necessarily better thinkers than educators. In 2007, droning Al Gore won for just describing a problem -- global warming. Winners from medicine, though, keep thinking beyond their laboratories and hospitals. The 1985 winner was International Physicians for the Prevention of Nuclear War. In 1999, the winner was Médecins Sans Frontières, "in recognition of the organization's pioneering humanitarian work on several continents." The 2005 winners were Mohamed ElBaradei and the International Atomic Energy Agency "for their efforts to prevent nuclear energy from being used for military purposes and to ensure that nuclear energy for peaceful purposes is used in the safest possible way." Why not a similar focused effort for education?

In 1947, the Quakers won. In 1917, the International Red Cross. What’s to stop a U.S. college or university from doing the same? Or at least a teacher with an education peace plan.

A pox on us all is the recurring Nobel Peace Prize theme of nuclear weapons. As if the invention of the weapons in the first place weren’t already evidence that we teachers have room for improvement. In 1995, the winner was the Pugwash Conferences on Science and World Affairs -- "for their efforts to diminish the part played by nuclear arms in international politics and, in the longer run, to eliminate such arms." I keep thinking that we educators ought to have an answer by now for how to solve problems without even the threat of nuclear war.

My Nobel Peace Prize education contender is Nicholas Negroponte, for the scope and success of his project One Laptop Per Child. The mission statement has a peaceful ring: “To create educational opportunities for the world's poorest children by providing each child with a rugged, low-cost, low-power, connected laptop with content and software designed for collaborative, joyful, self-empowered learning.”

I’m not hoping. The world declared Negroponte crazy for saying he could build a laptop for $100. That same world has now discredited him for having missed by 100% and coming in with a price of $200. I’ve tried one. They are great. I’ll do what I can to move Negroponte out of the No Good Deed Goes Unpunished Hall of Fame. One Laptop Per Child, in my book (or hard drive) anyway, beats Al Gore any day. Negroponte is working on real solutions.

Thinking about all this last year in my office at Bunker Hill Community College, I looked up one day to find that 1992 Nobel Peace Prize winner Rigoberta Menchú Tum was arriving on campus that afternoon. After her talk, I asked what she made of the absence of any Nobel Peace Prizes for education. Through a translator, here’s what she said: “Everything in this world depends on education. All the people who have graduated, with formal educations, they are the ones who are the leaders. But with all the people who are harming the world, we need to take another look at how education focuses on the positive social mission. Part of the learning takes place in the classroom, but I’d move part of the learning out into the street, resolving conflicts, resolving conflicts, solving problems. If we have leaders who think only about war, well?”

I imagine the Nobel committee is wrapping up for this year. What can education have on the table for the 2011 Nobel Peace Prize? Suggestions welcome below.

Wick Sloane

Young Voters and the 'Rally for Sanity'

Last week Inside Higher Ed published a column by Scott McLemee entitled “Rude Democracy,” which discussed Jon Stewart’s Rally to Restore Sanity and apparent trends indicating a lack of political engagement among young people. McLemee’s argument was both intelligent and important, but I believe there’s another side to the story of Stewart’s rally, political civility, and turnout among college students and young voters in the 2010 midterm election.

Unsurprisingly, Republicans were very successful in the midterm election, gaining control of the House of Representatives and cutting into the Democrats’ majority in the Senate. While the politically active on campuses across the country will surely devote much discussion to the results of the election and their implications over the coming days and weeks, it’s less likely they’ll discuss the execrable turnout among 18- to 29-year-olds.

Early exit polling done by CBS News indicates that young people made up roughly 9 percent of all voters in the midterm election. In 2008, young people made up 18 percent of the electorate. Why?

Political scientists and campaign consultants offer several theories. Americans are more likely to vote when enthusiasm abounds for the candidates they support, and young people tend to support Democrats. Young people historically don’t turn out for midterms. Barack Obama’s candidacy in 2008 was uniquely galvanizing for young voters. The agenda of Congress and the president has not adequately addressed energy policy -- a very important issue to college students -- and media coverage of the health care reform bill (which did quietly include benefits for young people) focused mostly on the concerns of older voters. Thus, young people seem to have concluded that voting isn’t worth their time right now.

I, however, believe that something deeper may explain young people’s disengagement in 2010. Scott McLemee, in his “Rude Democracy” piece, discusses Jon Stewart’s Rally to Restore Sanity in light of a book written by the Georgia Tech political scientist Susan Herbst (which is also titled Rude Democracy).

Herbst studied the views of young people and found that “72 percent of students agreed that it was very important for them to always feel comfortable in class.” Herbst argues that “Contrary to the image of college being a place to ‘find oneself’ and learn from others, a number of students saw the campus as just the opposite -- a place where already formed citizens clash, stay with like-minded others, or avoid politics altogether.”

Based on Herbst’s study, McLemee, writing prior to the sanity rally, argued that, while Stewart’s rally was likely to draw lots of young people and provide them with a fun weekend, “the anti-ideological spirit of the event is a dead end. The attitude that it's better to stay cool and amused than to risk making arguments or expressing too much ardor -- this is not civility. It’s timidity.” Clearly, McLemee believes that the unwillingness of young people to engage in political debate -- argument -- is not a political virtue, but rather a democratically harmful form of indifference.

Before accepting McLemee’s assertion, though, I think several important questions need to be answered. Why do the students in Herbst's survey feel that it isn't possible to persuade others? Could it be that such a belief is the product of an uncivil political culture? If students had political role models who successfully persuaded others in civil and respectful ways, would they be more inclined to view the political arena – and the classroom – as a space in which the clash of ideas can occur and yield positive results?

Personally, I can think of two positive things Stewart does: first, he encourages young people to refuse to subscribe to the currently pervasive ultra-partisan view of politics that fosters incivility and acts as a barrier to progress; and second, and more basically, he brings some level of political awareness through humor to people who might otherwise be totally apathetic and ignorant. Stewart’s influence, in my view, doesn’t breed timidity (as McLemee asserts), but rather increased youth engagement of the type that rejects a toxic political culture.

It also seems possible that the “Obama Effect” I mentioned earlier, holding that young voters turned out in 2008 because of their admiration of the current president, is at play. I'm worried that young people, perhaps naively, viewed Candidate Obama as a post-partisan role model and that President Obama’s lack of success thus far may further discourage engagement among young people who believed he had the ability to catalyze change without acting like every other “scumbag politician” they've come to dislike.

Moving forward, two things are clear. First, the perspective of young people has the power to change the nature of partisanship; if we, as a generation, continue to subscribe to the ideals of the Rally to Restore Sanity, we have the potential to improve the tone of politics.

Second, however, the burden most certainly falls on us; politicians are not going to pander to a portion of the electorate they don’t believe will turn out to vote, so if we want to transform Stewart’s rally from a sunny Saturday on the Mall into a new political reality, we’ve got to make our voices heard.


Matthew Lacombe is a senior at Allegheny College, in Meadville, Pa., and a student fellow at Allegheny's Center for Political Participation.
