There is growing anxiety among educators and policy makers that American colleges and universities are not churning out enough science and engineering majors, thereby jeopardizing the economic advantages currently enjoyed by the United States. Gone are the days when university presidents such as Robert Hutchins placed the study of philosophical and literary works at the heart of undergraduate education; now academics are much more likely to recommend (as Claudia Goldin and Lawrence Katz do in The Race Between Education and Technology) an increased attention to developing technological skills.
But are the humanities really only good for soft skills and vague ideals, like self-realization or civic understanding? Do they provide nothing to our “hard” achievements in science and engineering? A comparative look at undergraduate curricula suggests otherwise: the American emphasis on a liberal arts education is key to our success in nurturing innovative thinking among students.
Let me begin by recounting how I came to realize the importance of the humanities not only for cultivating the mind, but also for teaching the profitable art of originality. It was in the context of Stanford University’s obligatory “Introduction to the Humanities” (IHUM) course series for freshmen. Unlike its “Western civ” predecessor, which was the source of so much criticism in the 1960s, IHUM provides a variety of course offerings, ranging from “Ancient Empires” to “Rebellious Daughters and Filial Sons of the Chinese Family.” In the course I teach with my colleagues Robert Harrison and Joshua Landy, “Epic Journeys, Modern Quests,” the students read literary works ranging from Gilgamesh to The Trial, and write a series of papers analyzing the texts along the way.
Speaking with some Chinese students one day before class, I learned that they found these writing exercises utterly baffling. “We are supposed to come up with an original thesis?” they asked. “How are we meant to do that?” Never before had they been encouraged to provide their own interpretation of a text or event. High schools in China focus obsessively on memorization; there is no place in the curriculum for constructing an original argument.
American culture and economy, by contrast, place an almost unrivaled premium on originality: “Invent, invent, invent” was the title of a recent Thomas Friedman column in the Times. The iPhone may be made in China, but, as its packaging proudly declares, it is “Designed in California.” We have exchanged manufacturing for innovation, and seem mostly content with the deal. Rarely do we ask, however, how and where originality is taught. And if we try to answer this question, it becomes clear that humanities courses such as IHUM offer far more opportunities for innovative thinking than most science classes.
To be sure, universities such as Stanford offer seminars in, say, mechanical engineering, in which students are called upon to invent new designs and products. But these courses tend to be reserved for upper-level students. While science educators are beginning to emphasize the importance of problem-solving courses at the entry level, the purpose of most basic math or science classes is not to encourage originality. If you take a calculus exam and get the same answers as 50 other students in the class, you may well get an A. If your essay thesis for a history course is the same as 50 other students in the class, you most likely will not.
The point here is a simple one: humanities courses provide students with lessons in innovation from day one. Good professors model original thinking for their students in their lectures, which is one of the reasons that research and teaching can be mutually beneficial. Students in turn learn how to examine topics in a new light. Whether they go on to become software engineers, surgeons, or physicists, this primary training in innovative thought will help them imagine, invent, and create the world of the future. The modest undergraduate essay on Euripides’ Iphigenia requires the same conceptual skill-set as does devising a new medical procedure, constructing a different architectural schema, or coming up with a creative business model.
University administrators are quick to recognize the importance of creative thinking in academic curricula, but too often assume that creativity is found only in the arts. In fact, while the arts offer more opportunities for creative expression, the humanities provide a better forum for reflecting on innovative processes and, by extension, a better chance to apply these lessons in other fields. This is not to suggest that the arts fail to live up to their pedagogical promises, but rather that the humanities offer a critical supplement.
When we consider the future of American higher education, therefore, we would do well to remember that a long-standing attachment to a liberal arts education has contributed in no small way to its great renown. Why is it, after all, that students from around the world dream of studying in the land of Apple and Google, when such a large chunk of our curricula is dedicated to reading Aristotle and Goethe?
The United States is in fact one of very few countries where college students continue to receive a general education. After graduating from high school, French students dedicate themselves immediately to the study of law, medicine, biochemistry, or another narrow specialization; the same holds true for English, German, Swiss, Italian, and most other European students. In the United Kingdom, specialization begins around age 15 or 16; after that, students usually pursue only three disciplines, often within a single area (e.g. the humanities or sciences). This important difference means that American universities are quite unique in their insistence that all undergraduates receive a humanistic education.
Alumni of American universities do not seem to find that their time reading Jane Austen or Alexis de Tocqueville was in vain. Entrepreneurs, for instance, emphasize the importance of a liberal arts education for business: “Entrepreneurship is a philosophy. It’s a way of looking at the world,” the venture capitalist Randy Komisar recently told Siliconvalley.com, adding how it “dovetails nicely with a liberal arts education.” Even academics are insisting on the parallel: as Mary Godwyn of Babson College wrote in Academe, “Entrepreneurship is a tangible, practical manifestation of a liberal arts sensibility.” Steve Jobs famously dropped out of Reed College, but recalled in his 2005 commencement speech at Stanford how a course in Asian calligraphy transformed his vision of fonts and text.
The thought that students in a Shakespeare class might read The Merchant of Venice for insights on investment banking rightly sends shivers down every English professor’s spine. Rather than consider how we need to “integrate liberal arts and entrepreneurship courses,” as Godwyn suggests, it bears emphasizing the benefits provided by the liberal arts tout court. The fundamental activity that lies at the heart of humanistic studies is practical enough. If business, medicine, and engineering professors see fit to incorporate literary or philosophical material in their syllabi, so much the better; but we would lose many of the other, more intangible values of the humanities if we reduced them to mere “how-to” studies.
This is not to say that professors and researchers in the humanities cannot retool their pedagogical and scholarly strategies in order to convey the excitement and passion of, say, the French Revolution to students who yawn at the mention of a pre-Facebook age. Indeed, as university presses become anemic, now is a good time to rethink the disciplinary pressures toward specialization, which often translate into writing for a choir of a dozen faithful. No doubt we should aspire more to becoming “conversational critics,” in Adam Gopnik’s phrase. But equally important is the need for university administrators, policy makers, and cultural commentators to recognize the important work already being done in freshman seminars and writing classes in the humanities.
There are many steps from Iphigenia to the iPhone, but fostering an innovative, thoughtful, and humanistic environment is the first.
"All of you to whom furious work is dear, and whatever is fast, new, and strange -- you find it hard to bear yourselves; your industry is to escape and the will to forget yourselves. If you believed more in life you would fling yourselves less to the moment. But you do not have content enough in yourselves for waiting -- and not even for idleness."
Reducing the price of higher education by offering a three-year undergraduate degree for all students embarrassingly announces to the world that in America finance and clever marketing trump learning. Lopping off one quarter of the current norm for bachelor's degrees seems a compelling way to achieve the big savings we all long for, particularly when coupled with imaginings that college education is inefficient and readily compacted. Higher ed is easy prey to such imaginings because it deals often with what is not immediately perceived and readily measured. Its subject is the mind and the maturing young person. Its playing field is the duration of time.
It may be that we can no longer afford the four-year standard for an undergraduate education. If economic realities push against our current model, so be it. But before we fast forward college in the name of affordability, let's at least be honest about what is being lost. Three simply does not equal four. Not all results -- especially in education, where "widgets" are not the product -- are available at a lower price and the same quality. Perhaps we can "get undergraduates through" in three years. However, what we may have to alter to achieve that end might severely compromise what we hope to accomplish for our students, particularly in areas vital to a thriving 21st century democracy and economy.
Consider the following areas of concern. Most involve potential dilution of those very educational goals deemed by the marketplace to be critical competencies to a global workforce and by the public to be essential to American democracy.
Global perspective. American higher education has traditionally done a poor job providing students with global perspective, despite the clear importance of globalization for our future. True, more Americans are studying abroad than in the past, but for shorter periods -- this despite convincing evidence that stays of up to an academic year yield markedly superior results. A three-year degree program -- with the "no frills" philosophy that often supports it -- is likely to reduce space for global education across the curriculum; it certainly will constrain study abroad. Not to mention the more specialized but important issue of impact on instruction in critically needed but demanding languages such as Arabic or Chinese.
Interdisciplinarity. The movement in higher education has been steadily toward more interdisciplinary work, and for multiple, good reasons. This is where much of the action is in research and discovery. Think, for example, of such fields as biochemistry, neuroscience, bioinformatics, or environmental studies. More generally, most of the problems we currently face are interdisciplinary in nature (think, for instance, of what one needs to know to address the issue of climate change seriously). Both in the workplace and as citizens the ability to "connect the dots" by drawing insights from multiple fields is becoming ever more essential. As a consequence, we now add "synthesis" to traditional demands for "breadth" and "depth" as a key dimension of undergraduate education. A shortened degree can limit our capacity for interdisciplinary programming, whether in majors or general education.
Complexity. As the foregoing indicates, academic fields -- and the world at large -- are becoming more complex, not less. Look, for example, at what it now requires to be a biologist as compared to 20 (or even 10) years ago. New information, new methods (often borrowed from other fields as in bioinformatics), and new instrumentation have opened doors both to more knowledge and more questions to answer, not fewer. This is not only a matter of mastering an academic discipline but of expanding into contemporary issues from health care to financial markets. We need students who think more, and know more, about the complexity of these issues, not less.
Choice. Growing complexity has meant that academic majors have become both fuller and more hierarchical (i.e., more courses with more prerequisites). Pushing hierarchical majors back from four years to three will inevitably increase the pressure on students to decide on a major immediately, and significantly constrain the possibility of a change in direction. Early specialization is the European model, but should it become ours? Will it maintain America's edge of advancing students and a workforce who are engaged, entrepreneurial and creative in part because they have taken the time to find out who they are and remained open to new possibilities? This question has particular salience as the new, 21st century economy demands ever more flexibility.
Creativity. Much of the foregoing focuses on an arena in which Americans, and American students, have been historically distinguished -- creativity. Can this quality thrive in a "hurry up," "let's get it done" version of higher education? It is certainly more difficult to achieve in a course of study predicated upon early specialization, one in which combining depth with experience in a variety of fields is minimized.
Democracy. It is true that we as a nation must educate for the skills/abilities that fuel our economy, and at reasonable cost. But we educate for the habits of mind and action that fuel a democracy as well. Educating for democracy as opposed to mere academic coursework is a global differentiator of American higher education. Three-year compacting may very well push out opportunity for the broader tools and vision we need for citizenship to unfold over time in a residential setting -- especially among students who are generally the youngest to begin university in the industrialized world. Let us remember that many of the skills of organization and association that since Tocqueville have been identified as guarantors of American civil society are developed in co- and extra-curricular activities that characterize current residential education. These, too, are potential victims of degree acceleration.
Meaning. When advocates of a shorter degree call for "no frills" and an end to "waste," likely targets for substantial cutbacks are the humanities and arts. The press for more practical undergraduate degree programs, intensified by global economic competition, has already reduced enrollment in these fields. Is it worthwhile to have a system in which speculation on what it means to be human and exposure to the range of human creativity and expression in the arts are increasingly pushed aside? Our students already do too little of this.
Technology. New online technology is often offered up as an elixir that shortens time and improves quality for students of any age by simultaneously accelerating and enhancing instruction. But this is far from proven. The emerging reality may be that technology works best among younger students when combined with more traditional, faculty-contact based approaches rather than as a substitute. Moreover, mastery of many technological innovations -- whether general skills of computing or more specialized skills associated with new instrumentation and techniques, especially in the sciences -- itself places time pressures on the undergraduate degree. Not to mention that the substantial costs of developing and applying effective technology in instruction work against promised cost savings through degree acceleration.
It is worth noting that at many institutions the door is already open to a three-year degree. Students can deploy credits earned in high school through Advanced Placement, summer school, and/or a few semesters with a course overload to reduce their time to a bachelor's degree to three or quite easily three and a half years. How many do? Very few. Advocates of degree acceleration would claim, with some reason, that we do not advertise the fast track and have built the system to discourage it.
They might also argue, again with cause, that students who might otherwise finish in three years fear competition from peers who have taken longer to mature, hone their abilities, and develop resumes full of internships, study abroad and senior research experience. But could it also be that many students are in no rush because they sense some of the points made above? Perhaps they have an inkling that four years of study and maturation prepare them better for graduate work, career, and life?
Supporters of the three-year degree often cite the example of Europe as justification for reducing our time in undergraduate study. Indeed, as part of the Bologna Declaration, member countries are required to move by 2010 to a five-year bachelor's-master's degree sequence. Many nations are choosing the 3+2 option. Yet the comparison of the two systems of higher education is highly misleading, most obviously because conditions in Europe are quite different from conditions in America.
Take Germany as an example -- a country that has chosen the 3+2 option. There, high school students prepare for the university with a rigorous liberal arts and science course of study until the age of 19. This study is so demanding that numerous colleges and universities in the United States award a full year of college credit for the completion of the Abitur -- the German high school degree. In essence, the German secondary school experience de facto makes for a four-year degree. Moreover, German men are required by law to complete a period of either military or civil service and thus will begin undergraduate study generally at 20 and finish at 23 or 24.
Certain fields of study, however, have a limited number of seats available in any given year ("Numerus Clausus"), and therefore students may have to wait an additional year or more to begin university. The American three-year proposal would have our students completing college at 20 or 21. The age difference is striking -- and not to our advantage. Moreover, at present only 35 percent of German school students proceed to university study versus approximately 65 percent in the United States. Clearly Germany is subjecting a far more uniformly well-prepared and limited number of students to the "three-year" baccalaureate than would be subject to it in the United States. The comparison of the two nations for advocacy of the three-year degree simply breaks down on several fronts.
Importantly, many European professors, students and educational agencies are awakening to negative outcomes of the three-year degree. For example, increasingly German students are forgoing their former practice of study abroad in order to finish their tightly prescribed three-year program on time. The situation has become so alarming that the German Academic Exchange Service (DAAD) is apparently advocating a four-year undergraduate degree program to accommodate a year of study abroad. Students and professors are also discovering that the three-year course of study is so regimented that there is little to no time to engage in studies across disciplines or to reflect upon what has been learned.
Returning to America and looking at our system as a whole, the key numbers may not be four and three. As the Obama administration has clearly discovered, roughly one third of our students are enrolled in community college, and here the issue is improving the quality of two years. Even in regard to four-year institutions, the real challenge may be increasing the number of students who complete on time. Our focus ought to be on the five, six or more years it often takes to complete. For example, more than 60 percent of students with Advanced Placement credit -- theoretically prime candidates for an accelerated degree -- currently fail to finish in four years. In addition, all of this, as the European example demonstrates, is predicated upon another set of numbers -- K through 12.
Underlying the issue of degree non-completion at all levels of our higher education system are the demographics of access. Opening the doors of college to more Americans, including particularly students from groups historically underrepresented in higher ed, creates challenges in regard to cost. But it also raises the issue of quality. Will we expand opportunity and access only to dilute the product by introducing a three-year degree at such a critical period of opportunity in students' lives?
Of course, the commentary on three-year degrees is typically based on the assumption that it will radically decrease cost. But the new model has yet to be rigorously structured financially. The claims are that three years will save a great deal of money, but are we certain that is so? Implementing an accelerated degree efficiently in regard to scheduling, advising, and facilities will require additional administrative overhead. Offering the necessary courses may well mean additional faculty. And there will be other added instructional costs. These might include extra professors to implement more intensive pedagogy, new monies to support online work, or both. Could it in fact cost the same or at the outside even more to accelerate?
Capturing the spirit of the times, one prominent advocate styled the three-year degree as the "higher ed equivalent of a fuel-efficient car" compared to the "gas guzzling four-year course." A metaphor from the food industry might be more apt. Slow education, as in slow cooking, is enthusiastically replaced by Fast Ed or McEd, with comparable results. Higher education is certainly in need of efficiency. Our current business model, which has yielded steadily increasing costs, needs change and, perhaps, radically so. But let us not be fooled into adopting, across the system, solutions that appear corrective yet may be destructive of the virtue and distinction of American higher education and its ambition -- education for the workforce and for participation and leadership in a democracy.
Reducing the undergraduate program to three years from four is a "quick fix." Much is to be lost and much wagered, ironically at a time when the four-year program that has helped so many to success is being made available across American society. We can do better.
William G. Durden and Neil B. Weissman
William G. Durden is president of Dickinson College. Neil B. Weissman is provost and dean of the college at Dickinson.
Without intending it, I offended my friends by speaking a foreign language.
When I left a research center for the humanities and started work in a philanthropic foundation over five years ago, I wanted to know if a foundation could make a difference to the extent and depth of student learning in the liberal arts. To answer that question, I had to learn as much as I could about how students learn and how we know about their learning. Before long, I was studying reports such as the one produced by the Association of American Colleges and Universities’ Liberal Education and America’s Promise initiative (LEAP) that argued that liberal education ought to be understood not as exposing students to certain fields of knowledge, but as helping them to develop long-lasting cognitive and personal capacities. When I started using that phrase, I was on a slippery slope.
The next thing I knew, I was asking whether colleges and universities were translating that understanding of liberal education into clear learning outcomes. The phrase did not come tripping off the tongue, but the question was such an important one that I went right ahead and asked whether their practices were truly and effectively aligned with these outcomes. Were scaffoldings in place to help students move from one cognitive level to a higher one?
For all its efforts to strengthen teaching, the humanities center was a place where almost no one spoke this lingo -- or asked such questions. When I started to do so, I found myself making the strange hissing sound of “assessment,” a sound so savagely obnoxious that my friends began to hint that I was opening the gates to the barbarians.
I tried to conciliate them by substituting the term “evidence” for “assessment,” but they were too smart for that. And when I found I needed to investigate the various instruments that had been developed to help measure student learning, it was clear to many friends that I had gone over to the dark side. Terms such as NSSE, CLA, HERI, and CIRP were shibboleths that marked me as one of them.
It did no good to explain these were just convenient acronyms for titles in plain English. The titles themselves gave the show away: the National Survey of Student Engagement, for example, was clearly code for an alien view of education. The surveys were quantitative, a classicist friend noted with horror, warning me that “You can’t measure the human soul with numbers.”
Even worse, when I learned that the NSSE surveys had produced an empirical base for identifying a few high-impact practices -- ones that demonstrably improved student engagement, learning, retention and graduation rates -- the terms were so off-putting that in some quarters the ideas behind them could, as they say, gain no traction.
One friend -- who has somehow remained so despite my wayward behavior -- told me I needed to find some way to “translate” phrases such as high-impact practices into language more acceptable in the more ethereal reaches of the academy.
But I had done enough translating in my days as a classicist; now I was more interested in changing practice, and that, I realized, meant changing discourse. My theoretically minded friends had taught me one thing, after all. Discourse shapes practice.
Or, freely translated, “You have to talk the talk before you can walk the walk."
So I went on to other ophidian sounds, asking how higher education could successfully make systemic and systematic changes. Teagle Foundation grants for this purpose were going well, but the sibilants still sounded pernicious to many ears. Nor did it help to “translate” systematic into the phrase continuous quality improvement. That had few sibilants, but an unmistakable whiff about it of a Toyota factory or some other banausic enterprise.
The new mode of speech had a disconcerting inflection as well as an annoying vocabulary. For example, the stress in “teaching and learning” moved from the first syllable of the dactyl, “teach’ing and … ,” to the penult in the spondee, “learn’ing.” That reflects the emphasis in the new discourse on student learning. It expects students to take responsibility for their education rather than leaving the burden on “great teachers” and “good pedagogy.” Goodbye, Mr. Chips. Hello, daily development of cumulative cognitive and personal capacities.
Although it continues to give offense, the new discourse has in the last year or two passed a tipping point. It has now become the dominant mode of arguing, thinking and doing something about higher education.
There are two reasons, I believe, for this. First, the accrediting organizations now insist on clear learning goals and rigorous assessment of progress toward them. And they are “drilling down” to the department and even course level to see what is being achieved.
More important, however, is a second reason: Faculty members who approach teaching in this way report that it is energizing, empowering, refreshing. It’s a welcome change from endless debates about the literary canon, or the curriculum. They say the terminology is no more opaque than the vocabulary of the economists, or the language we philologists use in establishing the stemmatics of ancient texts, or the useful technical terminologies developed in reader-response theory, deconstruction, and subaltern studies.
Every craft has its discourse, and every discourse shapes practice. It’s the results that count. It’s worth learning some new vocabulary when new friends whose speech I have come to understand are saying that they like having students who are more intensely engaged in learning, and taking greater responsibility for their education. They even talk about greater “satisfaction.”
How’s that for a change in discourse?
Robert Connor is president of the Teagle Foundation.
A variety of scholars have weighed in on the current debate about American political civility, noting brutal fights on the floor of Congress in the 19th century, nasty mud-slinging of U.S. presidential campaigns throughout history, and other less than impressive aspects of our cultural past. And of course, they are correct that incivility is nothing new. What makes incivility seem omnipresent is the communication environment of our day: the pressure on our 24/7 journalists to fill airtime, new venues for citizens to state their opinions -- thoughtful or lunatic -- online, and a culture that encourages unabashed self-expression.
Who thought we would see the day when CNN news anchors would read incoming “Tweets” from viewers to us in serial fashion, opening an international information channel to faceless, opinionated people with no qualification for broadcasting except time on their hands?
It was difficult not to be appalled by the excesses of campaign rally crowds during the 2008 presidential election, the displays at some health care town hall meetings this past summer, and Congressman Joe Wilson’s outburst ("You lie!"). Students of American political history put these events in context, easily, because incivility is manifest in a variety of ways during different eras. But that scholarly response seems a very unsatisfying reaction to the ill-mannered eruptions, name-calling, and sheer meanness that we find on television and our favorite internet sites, now on a regular basis. The incivility is still worrisome, even if historically predictable, and we look for a way to cope with it.
The scholarly literature on trends in civility is mixed in its conclusions, with some arguing for either a bumpy or near-linear increase of incivility in both the United States and Western Europe, others arguing that we are actually more polite now than ever in public, and still others, myself included, positing that civility and incivility are both timeless strategic rhetorical weapons. Some people are better than others at using these tools to achieve their goals, but a macro-historical argument about collective civility is probably a bit of a stretch and difficult to demonstrate empirically, to say the least.
The “incivility as strategy” approach fits our current circumstances, particularly the health care reform debate, fairly well. The political right now draws on Saul Alinsky’s mid-century tactics on behalf of the poor in Chicago for instruction on town meeting behavior, and the political left tries to come up with brutally effective broadcast advertisements, guided by the Republican “Harry and Louise” spots that undermined the Clintons in the 1990s. Civility and incivility are weapons, as are facts, logic, demonstrating, teaching, striking, and all the other means of persuasion one finds in the arsenal of public expression.
But perhaps the essential issue is that incivility is simply more interesting than measured, calm discussion. Incivility is intriguing, almost always. It can be downright exciting, as when blows are exchanged at a town meeting, and replayed like a train wreck on YouTube by millions of viewers. And who is not fascinated by citizens (apparently on the same side of the issues) marching with pictures of the president portrayed as both Lenin and Hitler? It is bizarre, and also hard to look away from.
As President Obama put it on a recent broadcast of 60 Minutes: "I will also say that in the era of 24-hour cable news cycles that the loudest, shrillest voices get the most attention. And so, one of the things I'm trying to figure out is, how can we make sure that civility is interesting. And, you know, hopefully, I will be a good model for the fact that, you know, you don't have to yell and holler to make your point, and to be passionate about your position."
Obama might, over the longer term, fight incivility in part by maintaining his own preternatural calm throughout incessant appearances on television. But my sense is that exciting nasty discourse needs to be matched by something that gets the blood boiling just as well, or incivility will indeed triumph in any given situation.
Soaring rhetoric from President Reagan in the past, Obama today, and others with their talents in the future may be passionate, but as rhetoric soars, it does not always argue. Great oratory gets steamed up when it expresses hopes and beliefs (e.g., Americans cannot always support other citizens financially, or health care is an inalienable “right”), not when it argues for, say, the "public option" or insurance cooperatives. So, the trick is to find mechanisms for public policy discussion that are exciting, passionate, creative, and thoughtful all at the same time.
From the ancient philosophers onward, a variety of academics across disciplines have tackled the questions of rhetoric, persuasion, political debate, and civility, and as a result, we can offer a tremendous amount of theorizing and empirical research on these topics. But that complex material simply will not penetrate or guide contemporary American public discourse any time soon. And pointing to our campuses as models -- underscoring the ways we debate and argue with respect for each other every day (or nearly every day), in classrooms, faculty meetings, symposiums, and beyond – doesn’t go very far either. It’s hard to explain unless you have lived it: Imploring political leaders or fellow citizens to look to universities as exemplars of "cultures of argument" will not work because it is too experiential in nature.
However, colleges and universities do have practical ideas and tools to offer American lawmakers, journalists, and interest group leaders. There is the wonderful work by Gerald Graff and others on teaching argument and conflict, demanding that our students know how to make an argument in class, in papers, and as they go about their lives. These scholars have made a difference over the years, and my bet is that their impact will be even greater as a younger generation of faculty learns how to incorporate argument into its teaching, no matter the discipline or class size.
But even more accessible than these pedagogical paradigms and tools is formal debate itself, from the policy debate modeled by national championship college and university teams, to Lincoln-Douglas-style debate, to a variety of other formats that have emerged across nations. I was only a high school debater myself, and I am now far outside both the high school and collegiate debate “circuits,” but it is clear to me that if we can train our students in debate – not only our student leaders and teams – and make it a stronger presence on campuses, we might build a more constructive public discourse with generational change. Anyone can learn to debate – to make an argument, marshal evidence, rebut – with some instruction and practice. And these skills, once gained, can be translated into the sorts of forums our students will eventually find themselves in: workplace meetings, the PTA, community organizations, and in some cases, city halls and legislatures. We do not need to train a generation of lawyers, but we do need to train a generation of students who can do what attorneys and great debaters do as a matter of course.
There are many people, organizations and institutions that teach debate, whether for the classroom or for regional or national competitions, in the United States, abroad, and online. But the basic elements are the same across formats: argument, evidence, forced reciprocity and dialogue, equal time, and mandatory listening. These are precisely the elements missing from much of the contemporary debate about health care reform, and I predict they will be absent as well from the worrisome debates coming next, immigration policy reform in particular. These aspects of communication are the very building blocks of civility, and at this point at least, we have a deficit of them.
Those of us who study political communication used to hope – and perhaps many scholars still do – that the best American journalists would educate the public on the quickly-evolving policy issues before us, leading reasoned debate through newspapers and television programs. Some journalists give it an honest try, when they hold jobs that allow it. And we can locate a few lone heroes among the Sunday morning talking heads, if we wade through all the worthless talk of presidential popularity polls, embarrassing gaffes, and who is spinning whom. But with the financial struggles and disappearance of so many news organizations, it is difficult for any journalist – no matter how talented – to get our attention.
They compete, for better or worse, with bloggers and Twitterers, and wise information “gate-keepers” are leaving us with every passing year. It may be up to academic leaders to take on unexpected and much greater responsibility in shaping citizens, not just in our conventional ways of teaching liberal arts or specialized disciplinary knowledge. Of course we shape citizens already, but we must also figure out how to train our students for the rough and tumble they will find after they leave our contemplative campuses. It’s a jungle out there in the world of American political discourse, and our students will need to give it all some logical structure, and simultaneously invent new forms of civility for their generation.
Many colleges and universities teach public speaking at present, and some have made introductory courses mandatory in core curricula or as part of major requirements in fields like communications. Why not, similarly, consider formal debate training as a mandatory – or at least strongly encouraged – part of a college curriculum? To my mind, it should at least be a consideration for all educators watching our national political debate in the fall of 2009. We can shut off CNN in disgust, or sit in awe of some truly horrendous town meetings. But we can help matters somewhat by teaching our students both how to argue and why it is exciting to do so. College and university faculty can enhance the long-term health of political communication by focusing on the development of argumentation, in whatever form fits their courses, disciplines, institutions, and communities.
Along these lines, Model United Nations is another excellent tool for teaching students how to argue respectfully and take positions they would not normally take. These programs demand more of students in a course than debate might, but as with teaching debate (in person or online), there is extensive support for instructors available for free on the Web. As with debate, the general structure of Model U.N. can be altered to fit a particular curricular goal or theme. For example, in teaching the Middle East conflicts and issues, the National Council on U.S.-Arab Relations supports a network called "Model Arab League" at both the high school and college levels. And of course, more ambitious faculty can try to fashion entirely new stakeholder-based deliberation programs, using the general rules of more established activities like Model U.N.
Our students – no matter how compelling and well-trained – will not be able to demand that their local school board follow the tight structure and rules of policy debate, or of Congress (on a good day). That would be an absurdity. But they will have an ideal-typical model of what logical, evidence-based debate should look like, and will inevitably bring some elements of it to whatever table they find themselves at. I have found in many groups and organizations that people are generally starved for rules about how to conduct their discussions – a rationalized (in Weber’s sense) approach that might bring fairness, civility, and progress. The point is that we need to give students exemplars, somehow, so they can lead others toward structures for talking, listening, and constructive exchange, based on mutual respect and decency. And they might even bring civility to the internet, developing new ways to harness free communication in the service of democratic talk.
The truth is that while Americans pioneered a kind of democracy, we have never been particularly good at debate -- not during Alexis de Tocqueville’s era, and not today. We certainly don’t seem to have the patience for it. There have been some intriguing presidential campaign exchanges here and there, memorable moments in congressional hearings, and of course many moving orators in mainstream politics and outside of it. But we will never see the sort of civil, thoughtful, inventive debate that enables good public policy making until we teach the young adults in our midst how to pursue it themselves.
Susan Herbst is chief academic officer for the University System of Georgia and professor of public policy at Georgia Institute of Technology.
Early on, as the financial markets spiraled down and unemployment surged, some commentators argued that the national environment would provide the impetus to effect serious change in higher education. After all, they reasoned, campus stakeholders understood the seriousness of the events around them as massive layoffs were occurring, 403(b) funds were being reduced to 203(b)s and it was universally understood that no job on campus was safe, potentially even faculty jobs.
As a variety of troubling conditions became almost simultaneously woven together, it appeared as though a sea change for institutions was inevitable -- a perfect storm for change was developing over higher education. The economic downturn and associated collateral damage created urgency for all stakeholders to come together in a more politically civilized environment to effect major shifts in how the academy operates as an organization and as a learning community.
However, generally absent from cost containment and revenue sustainability decisions are cost reallocation decisions regarding the relevance and viability of the academic portfolio. The extent to which institutions explore the financial performance, market demand and mission impact of academic programs (e.g., programs, concentrations, courses, sections) across the program portfolio is largely unknown. It is unclear if institutions have a structured process, access to the data and reporting mechanisms to inform review of programs and, subsequently, if they have the capacity to make decisions to retire/eliminate programs.
Given the significant resources allocated to academic programs, the time many programs have been in existence, and the changing market place and challenging economic conditions, a rigorous, objective review is a reasonable and necessary part of an institution’s due diligence. However, these decisions may be the most challenging of all.
Even in the face of unprecedented financial challenge, are the traditions, political forces, mission arguments and ideological posturing within the academy trumping the ability to restructure the academic portfolio and the decision-making and resource allocation structures that currently exist? Or, alternatively, is the eye of the storm of such magnitude that this level of macro change will be deferred until stimulus funding evaporates and there is a public moratorium on tuition and fee increases?
Perhaps for some regions, major restructuring will occur only when the reality of large declines in the high school pipeline make their way into annual operating budgets, and community colleges begin cannibalizing enrollments from neighboring four-year institutions.
A Case Illustration
Consider a view of the national academic program portfolio. In 2007, higher education produced 2,189,315 degrees in total across 1,079 fields of study. The distribution of degree conferrals across fields of study varies greatly, ranging from 0 to 218,212. Despite the volume of degrees conferred annually, focused on an extensive variety of fields of study, it is a reasonable assumption that not all of these programs possess either the recent historic evidence or market opportunity to support their continuation.
For illustration purposes, review the set of program viability metrics below. These are real data points for an academic program currently offered by an accredited institution. Enrollments have not grown over the past five years, degrees conferred nationally have declined by 20.5 percent, projected employment of graduates in this field within the state is relatively static through 2014, and the regional competitive landscape is saturated with similar programs, as summarized below:
Has enrollment for this specific program grown at the institution?
Enrollment for the program has witnessed 0 growth from 2004-2007 with 17 degrees conferred during each of those years.
Nationally, have conferrals in this or similar degrees grown?
From 2002 to 2007, bachelor’s degrees conferred nationally in this field declined from 468 to 372 degrees, or a 20.5% decrease.
Regionally, are relevant occupations for graduates of this degree expected to increase?
Employment of graduates in this state is low, and growth is expected to remain static. Specifically, employment is expected to increase minimally, from 99 positions in 2004 to 122 in 2014.
Nationally, are relevant occupations for graduates of this degree expected to increase?
Employment prospects for this field will remain relatively static at a 3.7% growth rate from 2006-2016 (or 1,000 jobs dispersed nationally) with no (0) expected annual average job openings due to growth and net replacements.
Is there a strong market opportunity for this degree program?
There are 12 regional competitors offering a similar bachelor’s degree.
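The percentages quoted in this case rest on simple arithmetic. As a minimal sketch (the helper function below is illustrative, not part of any institution's actual review process), the figures can be recomputed directly from the raw numbers given above:

```python
# Illustrative only: recompute the percentage changes quoted in the
# case illustration from the raw figures in the text.

def pct_change(old: float, new: float) -> float:
    """Signed percentage change from old to new."""
    return (new - old) / old * 100

# National bachelor's conferrals in the field, 2002 -> 2007
conferral_change = pct_change(468, 372)   # about -20.5%

# Projected state employment, 2004 -> 2014: a large relative jump,
# but only 23 positions in absolute terms over a decade.
state_employment_change = pct_change(99, 122)

print(f"National conferrals, 2002-2007: {conferral_change:+.1f}%")
print(f"State employment, 2004-2014:    {state_employment_change:+.1f}%")
```

Seeing the state figure both ways (roughly +23 percent relative, but only 23 jobs absolute) illustrates why a review should weigh absolute magnitudes alongside growth rates.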
Institutional leaders can use this type of analysis to make difficult, but evidence-based, decisions. There are, of course, other variables that should be considered in this context. For example, is the program directly aligned with the institution’s mission and strategic plan, and/or does it support the goals of a liberal arts education? Ultimately, a decision to maintain the program should be based on a review of a comprehensive set of program metrics, including projected market demand.
Adopting a Portfolio Review Process
An academic portfolio review process differs from the traditional internal review process. The internal review often focuses on such academic program elements as student achievement and learning outcomes, course scheduling, strengths of faculty, course/adviser workload and resource utilization. The review of the academic portfolio is focused on sustainability, market relevance, and viability of programs moving forward.
The results of a regular and systematic academic program viability review can help institutions creatively address a number of key challenges. Even as institutions identify emerging program growth areas, many have a severely restricted capacity to add new programs -- programs that make sense in the context of emerging and evolving fields, occupations and sectors such as sustainability, energy and the health sciences. Absent grant awards and major gifts from donors, these and other necessary new programs will not have access to the significant capital needed to launch and sustain them over time.
Beyond new program development, there are also competing needs for resources to improve student retention and success: advising and mentoring, faculty enrichment, assessment, and focused student support. The academic resource pool should be dynamic and fluid. Programs that might be missed but are no longer necessary or relevant (based on market demand, financial performance, competitive landscape, quality, etc.) should have their resources repurposed for emerging needs or opportunities. The tradition of adding programs without changing the base is simply no longer feasible.
So, to what extent are institutions engaged in a systematic and regular evaluation of their academic program portfolios? Consider the following set of questions as an entry point to such a process:
1. If a program has neither the demand (marginal or declining enrollments) nor the market for its graduates, what other factors or rationale is used to support the program’s continuance?
2. To what extent are academic offerings directly aligned with the vision, mission and strategic objectives of your institution’s priorities? If a program is not financially viable but is clearly aligned with the mission of the institution, can the institution afford to have that program subsidized by other financially viable programs?
3. What impact does the competitive landscape for a program have on the institution’s capacity to successfully recruit students, retain faculty and sustain the resources needed to make the program viable in the long term?
4. Do the characteristics of the program lend themselves to an alternative delivery mode such as online learning?
5. If analysis suggests that a program is not financially viable, is without a market and is not mission critical, how could its instructional, program and physical space resources be re-tasked to address emerging needs or other mission-specific needs of the institution?
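For institutions experimenting with such a review, the first-pass logic of these questions can be sketched as a simple screen. The thresholds, field names, and the `needs_review` function below are purely illustrative assumptions, not a prescribed methodology; a real review would weigh many more variables, as the surrounding discussion makes clear:

```python
# A minimal, hypothetical first-pass screen: flag a program for deeper
# review only when demand, market, and mission screens all fail.
# Thresholds and field names are illustrative assumptions.

def needs_review(program: dict) -> bool:
    weak_demand = program["enrollment_growth_pct"] <= 0
    weak_market = program["projected_job_growth_pct"] < 5
    not_mission_critical = not program["mission_aligned"]
    return weak_demand and weak_market and not_mission_critical

example = {
    "name": "Illustrative bachelor's program",
    "enrollment_growth_pct": 0.0,      # flat enrollment, 2004-2007
    "projected_job_growth_pct": 3.7,   # static national outlook
    "mission_aligned": False,
}

print(needs_review(example))  # flagged: all three screens fail
```

The design point is that no single weak metric triggers the flag; only the conjunction of weak demand, weak market, and weak mission alignment does, mirroring the "neither demand, nor market, nor mission" framing of the questions above.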
There is no question that this is a challenging area to address. There can be strong arguments to maintain programs even if those programs are not directly reflected in present or future market demand, or are merely financially neutral. It may be that they are “untouchable” due to core values and a commitment to a broad-based education. But it seems implausible that this can be the case for all academic programs.
Creating a program viability assessment culture that objectively organizes the metrics for market demand, financial performance, mission impact and program quality appears a necessary part of institutional due diligence, especially during these economic times.
Your “frill” is not my “frill.” My frill, in fact, is an essential component of the work I do, which is an equally essential aspect of our institution’s mission. Maybe you say the same about yours.
And therein lies the heart of the difficulty in discussing what has recently become a phrase bandied about in the world of higher education. “No-frills education” has been touted by the Pennsylvania State Board of Education, the president of Southern New Hampshire University in recent attention-getting interviews, and pundits commenting on the out-of-control costs of college. If we can just strip the college experience down to its most basic form, the argument goes, we can restore sanity to the price structure and access to those who need it.
But the first challenge comes when we begin to discuss, and decide on, what constitutes a “frill.” Unfortunately, the contentious and fractured nature of higher education, long a hotbed of competing priorities, makes that a difficult conversation.
Shopping for a college education is not like buying a new car, and building an effective institution to provide that education is not like building a car. If one of us goes into a dealership with a plan to buy the most stripped-down vehicle on the lot, and we stick to that plan, we have a pretty good idea of what we will drive away owning: a car without many of the nifty features now available. No GPS, no satellite radio. We will have a smaller engine, which we understand will leave our simple little car a bit underpowered on the highway.
But we know too that we will have a car equipped with the basic safety features required by law -- seatbelts and airbags -- and that it will have the components necessary to drive off the lot: four wheels supporting a frame, powered by an engine.
But what is it about a college education that is truly essential? And how do we arrive at that conclusion? We can start with the curriculum, but if there is an institution out there that has not suffered through lengthy debates about the components of that curriculum, neither of us knows where it is. The only thing constant about the “essential” components of a curriculum has been the regular change each institution imposes on it.
Foreign languages, for example, have been a mainstay of a liberal arts education. But as demand has lessened and resources have dwindled, a number of institutions have reduced or eliminated this requirement. Skill in writing has long been one hallmark of a college education, but at many large research institutions, students can graduate having written fewer than a dozen substantive papers, many of those having been graded and returned with few comments and corrections. Colleges and universities have added, and then removed, requirements for courses addressing diversity, gender issues, global concerns.
What was essential in one decade is seen as frivolous in another. At the furthest extreme is an institution as esteemed as Brown University, which has no required courses among its thousands of offerings.
Is academic support a “frill”? If one agrees that writing is indeed an essential component, then is a writing center that provides intensive tutoring in this skill also an essential component? That’s a fairly easy argument to make. And yet, in a time of budget cuts, we have seen writing centers forced to reduce their hours and staff. At what point does this essential component become so limited that an institution’s mission is threatened?
To return to the car-buying analogy, we know that tastes and needs have an impact on standard equipment in a car, and that over time, we adjust our expectations of that equipment upward. One would be hard-pressed, for example, to find a car without a radio today. It doesn’t mean the radio hasn’t added to the cost of the car, just that we are in agreement that we will accept the cost as part of the price of the car.
But easy acceptance has never been part of academic culture. We can, and do, argue over everything from the lack of vegetarian options in the dining halls to class schedules, from the awarding of tenure to a less-than-stellar instructor to the political correctness of a mascot. Debate is, one could argue, an essential component of our mission (though we have to admit there are days when we wish it were a frill that we might be willing to do away with). The risk for our institutions is not in the content of this debate, but in the oft-reflexive assumptions we bring to the debate, which can then degenerate into a harsh and morale-sapping exchange between groups of colleagues.
“No-frills education” discussions have their common fodder: gleaming recreation centers, posh residence halls with concierge desks, heavily-funded student activities events, athletics and all its attendant costs. These are among the items that proponents of “no-frills” education seek to eliminate. The “no-frills” education offered by Southern New Hampshire University, for example, is a commuter-based approach to garnering credits; many classes are taught by the same faculty who teach at the university’s “heavily frilled” other campus. But are those students getting the same education as their peers down the road? Perhaps they don’t need a recreation center, but is there any doubt that students learn valuable skills from activities outside the classroom?
Over the past 20 years, service learning as a component of the curriculum has become increasingly common as faculty and students alike, supported by data, acknowledge the deep level of learning that takes place when students must put their classroom skills to good use in the community. What about learning to develop a budget for an organization, motivating volunteers, evaluating the success of an effort? And practically speaking, how does a no-frills education impact a student’s relationship with the institution? Will these students be loyal alums 10 or 20 years after graduation?
It’s equally critical that we remember that very few frills are either/or propositions. Most exist on a continuum of cost and usefulness. Perhaps a climbing wall (a “frill” often cited as an example of an unnecessary expenditure) isn’t a good use of campus dollars. But is a fitness center with basic cardio equipment that gives students, as well as faculty and staff, a convenient way to relieve stress and stay healthy in that same category? Similarly, a residence hall with a spectacular view of Boston’s skyline, such as the luxury accommodations recently opened by Boston University, can hardly be discussed in the same conversation as the standard double-room, shared-bath residence halls still operating on most campuses.
These debates about “amenities” versus “necessities,” about what our students need versus what they want, rage on, as they should. It is our responsibility as the keepers of our institution’s educational integrity to own these debates and decisions. If we abdicate that responsibility, someone else, like a state legislator or policy maker or a popular magazine that makes a bundle on its “rankings” issue, will step in.
Who should get to decide that a particular outside-the-classroom activity is a frill? Living on campus is a “frill” in the minds of some higher education policy makers, and certainly the community college system in America has shown for a century that students can receive a good education without experiencing dorm life. But who would argue that learning to live with others isn’t a valuable skill? It’s certainly one we hope our neighbors have learned before they move into the townhouse next door.
Is residence life essential? No. Is it a frill? No. Is it somewhere in the middle? Most likely. So who on any given campus is best positioned to determine whether it stays or goes as part of a move toward “no-frills” education?
An athletics program is similarly difficult to gauge. At one of our institutions, a small, professionally focused college, athletics was eliminated without much of a fight, and the college hasn’t missed a step.
At the other of our institutions, a small, selective liberal arts college, a quarter of the students participate in an intercollegiate sport. The budget to support these efforts, while modest compared to larger schools, is not insubstantial at a time when every dollar is scrutinized. There are on this campus, as we’re sure there are on every campus, those who would characterize athletics as a “frill.”
But if we eliminated the entire program, or even a few sports, enrollment would suffer greatly as those student-athletes sought other opportunities to continue their athletic pursuits, and we would have a hard time keeping our doors open for the rest of our students. It’s also worth pointing out that on this campus, as is the case on many small college campuses, our athletes are retained at a higher rate, and receive less financial aid, than the student body in general.
Some of the “no-frill” efforts being proposed are closely aligned with a view of higher education that is more vocational in nature, more targeted at providing students with skills essential to building an effective and pliable work force to rebuild the American, and global, economy. Setting aside the enormous question of whether this should be the true purpose of a college education, we nonetheless need to consider the role of career services in this equation.
Does a “no-frills” institution help its students find jobs after graduation? Perhaps, but how? Does it help students identify possible internships with employers? That would be a good idea. Does it invite recruiters to campus to interview students? That makes sense. Does there need to be an employee whose responsibility it is to arrange these internships and visits? That is helpful. Should someone work to prepare students for these interviews? Review their resumes? Help them determine which recruiters might be of interest to them? Offer a workshop on interviewing skills? Those services make sense if the institution is truly committed to helping students move successfully into the workforce. So now perhaps this institution needs a career services office to provide these opportunities, replete with staff, a small resource library, and some career-oriented software on office computers.
Frills? Yes, no, and somewhere in between, depending on the vantage point from which you approach the matter.
The point of these examples is not to lead us down a path of endless debate about residence halls, athletics, career services, student activities, or any of the “frills” that proponents of “no-frills” would like to eliminate. It’s to point out that we have, at this point, no agreed-upon framework with which to discuss and define “essential” versus “frill.”
Will these “no-frills” campuses take a pass on academic support services? How about orientation or a campus conduct system? Will faculty at these no-frills institutions be any more comfortable dealing with students in serious academic or emotional distress than our faculty colleagues are now, most of whom appear grateful to have a counseling center (which some might consider a “frill”) to refer these students? Will students with learning and physical disabilities still be able to get the assistance they need, or will anything beyond the bare minimum required by the federal government be considered a “frill” and cast aside along with the climbing wall, spring concert, turf field and whatever else is the frill-of-the-day as portrayed in the media?
We can’t, and won’t, answer yes or no to these, though we each have our opinions. We just want to propose that each institution should own its discussion about these matters. Casting aspersions on the work of others, on the contributions of that work to students and to an institution’s core mission, is not productive. What is productive is an ongoing, civil conversation about those students and that core mission, and an effort to first build a framework for that conversation that educates each of us in the work of one another.
Every institution must have its own conversation, and no two institutions will reach identical conclusions. One institution’s frill is another institution’s essential service: ours to decide, and ours to defend. Leaving the definition of “frill” to others puts us at grave risk of losing control over our very purpose. We must look inward for the anchor points of this conversation. Who are our students, and what do we owe them? What do they need from us (rather than want from us) to ensure they have the best chance of succeeding at whatever it is we have crafted as our institution’s goals? And then we must measure what we offer against those goals, rather than against the college down the road that is awash in apparent frills (which, perhaps, they don’t define that way, and that is, of course, their prerogative).
What each one of us believes is essential may not be what another believes is essential, but we do share, at our best, a deep commitment to this work of educating college students, and we each deserve a voice in the conversation.
Lee Burdette Williams and Elizabeth A. Beaulieu
Lee Burdette Williams is vice president and dean of students at Wheaton College, in Massachusetts, and Elizabeth A. Beaulieu is dean of the core division at Champlain College.
It’s difficult to believe now, but not so long ago, I looked forward to making up syllabuses.
Once the grand meal of the course had been structured and I’d chosen an exciting title, the syllabus design was my dessert. I took the word “design” quite literally, having fun with frames and borders, trying out different fonts, fiddling with margins.
Then, after printing out the final document, I’d sit at my kitchen table and add images saved for the purpose from old magazines, vintage catalogs, pulp advertising, obscure books, and other ephemera. Fat cherubs blowing their trumpets would announce Thanksgiving break; a skull and crossbones marked the spot of the final exam. My masterpiece was a course on the work of Edgar Allan Poe, whose syllabus was a gothic folly with a graveyard on the front page and cadaver worms crawling up the margins.
Over time, my syllabuses grew less creative. I still gave my courses what I hoped were enticing titles, and I’d usually add an image to the front page, but nothing more. In part, I was afraid my quirky designs might make the course seem less serious; I also had far less free time than I used to. But mostly, it was the number of disclaimers, caveats and addenda at the end of the syllabus that made my designs seem out of place. All these extra paragraphs made the syllabus seem less personal, and more institutional -- but then, I realized, perhaps it was time I grew up and began to toe the party line.
Those were the good old days. Now, at a different institution, I teach in a low-residency program whose courses are taught, in part, online. The institutional syllabus template is provided in advance: Times New Roman, 12-point font, 1-inch margins -- and don’t forget the “inspirational quote” at the top of the page.
The Course Description is followed by the list of Course Objectives, Learning Outcomes, Curriculum and Reading Assignments, Required Reading, Assessment Criteria and so on, all the way down to the Institute’s Plagiarism Policy and Equal Opportunity Provisions. Colleagues tell me it’s the same almost everywhere now; the syllabus consists mainly of long, dry passages of legalese.
I no longer design my own course titles -- or, if I do, they need to be the kind of thing that looks appropriate on a transcript, which means “Comparative Approaches to the Gothic Novel,” not “Monks, Murder and Mayhem!” There’s an extra plague in online teaching, however, in that -- at least, at the institution where I’m currently employed -- all course materials, including weekly presentations, must be submitted months in advance.
This, I’m told, is not only to ensure that books are ordered and copyrights cleared, but also for the various documents to pass along the line of administrative staff whose job includes vetting them in order to be sure no rules have been violated, then uploading them in the appropriate format. Moreover, a syllabus, we are constantly reminded, is a binding legal document; once submitted, it must be followed to the letter. Omissions or inclusions would be legitimate grounds for student complaint.
Gone, then, are the days when I could bring my class an article from that morning’s New York Times. Now, when I stumble on a story, book or film that would fit perfectly with the course I’m currently teaching, I feel depressed, not excited. I can mention it, sure, but I can’t “use” it in the class. Nor can I reorient the course in mid-stream once I get to know the students; I can’t change a core text, for example, if I find they’ve all read it before; I can’t change the materials to meet student interests or help with difficulties, as I once did without a second thought.
This is especially perplexing in online teaching, where it’s so easy to link to a video, film clip, or audio lecture. We have an institution-wide rule that such materials may not be used unless accompanied by a written transcript for the hearing impaired. When I object that there are no hearing-impaired students in my small class of six, I am told that no, there are currently no students who have disclosed such an impairment. The transcripts are needed in case any of them should do so -- in which case, they would be immediately entitled to transcripts for all audio-visual material previously used in the course. Sadly, those who pay the price for this assiduous care of phantom students are the six real students in the course.
In brief, what used to be a treat is now an irksome chore.
Instead of designing a syllabus, I’m filling out a template, whose primary reader is not the student, not even the phantom potential-hearing-impaired student, but the administrators and examiners who’ll be scanning it for potential deviations from standard policy.
Sitting at my kitchen table with scissors and glue, I always felt as though the syllabus -- and, by implication, the course -- was something that came from within me, something I had literally produced, at home, with pleasure and joy.
Now, by the time the course is finally “taught” months after the template has been submitted, it feels like a stillbirth from a mechanical mother.
Mikita Brottman is chair of the humanities program at Pacifica Graduate Institute.
At a recent gathering of junior faculty, convened by the Teagle Foundation to discuss the future of liberal education, a remarkable fact appeared so clearly that it went unremarked. Discussions about the value and purpose of higher education had lost the acrimonious and partisan tone that defined the culture wars of the '80s and '90s. To be sure, those present (myself included) were no doubt fairly homogeneous in our political and academic backgrounds. And we were a self-selecting group, as all had expressed interest in the value of liberal education, even if we did not agree on (or even know for sure) what exactly it was. It was nonetheless an encouraging sign -- no doubt prepared by such reasoned criticisms of the academy as those offered by Derek Bok -- that liberal education no longer appeared as a minefield of partisanship, but rather as the site of constructive and rational debate.
One reason, I suspect, for this development may be that some of the institutions most committed to liberal education have transformed the way in which it is taught. At Chicago, Harvard, and Stanford, for instance, freshmen are still required to take a version of a "core curriculum." But unlike Columbia’s venerable core, these newer versions all allow students to make their own choices from a selection of classes. At Chicago, students compose a three-course meal from offerings in the humanities, civilization studies, and the arts. Stanford’s “Introduction to the Humanities” (IHUM) program presents students with a slightly leaner diet: they choose from a collection of starters chosen to “demonstrate the [...] productive intellectual tensions generated by different approaches,” before tucking into a two-quarter entrée that “promote[s] depth of study in a single department or discipline.” Finally, Harvard just introduced last fall a "Program in General Education" that is more buffet style: students select courses from eight different groups, roughly half of which satisfy humanities requirements.
While in no way revolutionary, these curricular developments, I argue here, may justly be regarded as harbingers of a third way in liberal education. This new way bypasses the old battleground of the culture wars — the canon — by recognizing the privileged place that certain works and events occupy in past and present societies, without dictating which of these must absolutely pass before every student’s eyes. As opposed to the more common "general education requirements," moreover, the courses in this model also provide students with an intellectual meta-narrative, that is, a synoptic perspective linking different periods, cultures, and even (ideally) disciplines. Finally, this model can offer scholars, administrators and policy makers a new language with which to define the goals and ideals of liberal education, and to help define criteria for their evaluation.
The language currently employed to discuss liberal education has itself proven remarkably apt for avoiding partisan flare-ups. Who can object to a pedagogical program designed to improve thinking, moral reasoning, and civic awareness? Glaringly absent from such skills-oriented definitions is, of course, curricular content. While this strategy of omission has conciliatory advantages, it also carries risks: Discussions about liberal education can end up sounding terribly formalist, as though students were destined to perform ghostly mental operations in a vacuum (“practice citizenship!”). The very idea of liberal education can suffer from such excessive formalism, since, emptied of content, it risks becoming little more than a talking point or sales pitch.
This approach also ignores a penetrating criticism, made with particular (if somewhat hysterical) emphasis by Allan Bloom in The Closing of the American Mind. In the absence of any overarching curricular structure, students can easily end up losing themselves in a labyrinth of unrelated courses. These courses may individually belong to disciplines traditionally associated with liberal education, and may each, in their own way, contribute to the development of important and worthy skills. But they may also leave puzzled students wondering how, say, their knowledge of Russian history relates to their classes on French literature. Of course, there are not always clear bridges between disparate subjects. And finding your way from one point to another can itself be an intrinsic part of education. At the same time, teaching students how to integrate knowledge from different fields is a valuable skill, one that it would be rather perverse to withhold from them, particularly when it is requested.
Beneath the geographical metaphors proliferating in the above paragraph lurks, of course, the familiar fault line of curricular content. But this is precisely where the reforms of core curriculum courses at the universities listed above can provide a less contentious framework for discussion. Indeed, the dominant feature of these courses is that they combine requirement and choice; students are obliged to choose from a selection of courses. This means that a) there is a degree of personal tailoring: for instance, hardcore “techies” at Stanford can take a course on the history of science and technology; and b) the emphasis is shifted from a debate over which exact texts every student should read – inevitably a source of heated disagreement – to a debate over which different sets of texts (or historical events, or works of art, etc.) form a coherent and meaningful syllabus.
The advantages of this system are numerous, but I would like to emphasize two ways in which it offers a valuable framework for liberal education. First, in addition to the benefits gained from studying individual texts or topics, these courses provide students with an overarching narrative. It is not necessarily a teleological or master-narrative, nor need it even be a story of progress with a happy end. But it is a narrative that allows students to perceive how events or ideas transform over a considerable stretch of time and space. The IHUM course that my department offers, for example, takes the students from the Mesopotamia of Gilgamesh to the Caribbean of Maryse Condé’s Crossing the Mangrove. Our syllabus is primarily literary, but the lectures draw heavily on each text’s historical, religious, cultural, and philosophical context. In this way, such narratives also illustrate how frontiers between humanistic disciplines are not closed borders, but can be freely crossed.
Ironically, the narratives transmitted in these classes are ultimately destined to fade away, or at least be significantly transformed, over the course of a student’s education and life. Their purpose is primarily structural: to borrow a hallowed metaphor, they allow students to attach the ideas they will later acquire onto different, yet connected branches of a single tree of learning. But this metaphor is somewhat misleading, since narratives are far less rigid than wooden frames. Subsequent coursework will complicate or contradict episodes of the story students began with; and at the end of their college education, they will ideally have written their own narrative with the knowledge they have gained. But even if the initial story they were told disappears in the process, it will have served its purpose, and taught the students a valuable lesson along the way -- namely, that to be persuasive citizens and scholars, we need to know how to tell a compelling narrative. The ability to piece disparate facts and ideas into a coherent whole is a critical part of liberal education. We are always putting Humpty Dumpty together again.
Second, an important criterion for composing the syllabus of these courses is that their contents be sufficiently authoritative. Here we brush up again against the touchy subject of the canon, which cannot be completely avoided, even if the model under discussion does not advocate including specific books at all costs. But the inclusion of “authoritative” works or events – and I choose this word deliberately – does strike me as a necessary part of liberal education. This is not because some works contain The Truth and others only pale reflections of it. This argument of Bloom’s, and of his predecessor at the University of Chicago, Robert Maynard Hutchins, is more likely to puzzle than to offend today (how do you teach Homer as "the truth"?). But as John Guillory pointed out in Cultural Capital, certain texts simply have (or had) greater authority in our societies: not to engage with at least some of them leaves students at a social disadvantage.
I would also argue that understanding these authoritative texts is key for achieving what Montaigne identified as the ultimate goal of education – the ability to challenge existing authorities, an ability we would today call critical thinking. If students are to challenge authorities, they must begin by knowing who those authorities are and what they argued. Only in this fashion can the students acquire both a better understanding of how and why our societies came to be the way they are, and the ability to counter authoritative accounts in a knowledgeable and evidence-based manner.
It is to be hoped that liberal education will always remain a fertile topic of discussion, and the model that the universities discussed here have adopted – with a number of differences, to be sure, which I did not address – is certainly not the only solution. Indeed, I hope that other colleges will experiment with different models, so that our collection of experience continues to grow. But the promise of the current model is that it does offer a way past the opposing camps of the canon wars, and in this regard, may come to be regarded as a third way in liberal education.
Dan Edelstein is assistant professor of French at Stanford University.
One of the main thrusts of what has come to be called "the undergraduate student success movement" is misguided. Yes, we did mean to use the term "misguided." A strong word and a strong assertion, but we have equally strong evidence. Simply stated, higher education institutions in the United States focus heavily on student success programs, but rarely do they have a comprehensive plan to guide those programs. In the absence of a plan, redundancies and gaps occur, and retention stagnates. In short, a program or programs do not a successful plan make.
Of course, making this assertion means that John Gardner, one of this essay’s authors and a key architect in the national student success movement, has to admit that over the years he may not have given the best advice to all people at all times. For about three decades, Gardner has gone around the country telling college educators that their institutions need to adopt or adapt one form of student success program or another. Drawing from his experiences, the recommended program was often a first-year seminar -- a contemporary staple in the American college curriculum that dates back to the 1880s. And, in fact, research does correlate participation in first-year seminars with positive differences in student retention and graduation rates.
At the same time that Gardner was advocating for first-year seminars in particular, he was also advocating for a broader philosophical approach to the first year. He coined the term “the first-year experience,” and meant it to encompass a total campus approach to the first year, not a single program. Upon reflection, it seems that speaking about one program extensively while at the same time advocating for a collective approach may have fostered a bit of confusion. And today the “first-year experience” can mean anything from a single course to a full-fledged coordinated effort to improve the first year. But it was the single course that gained the most national and international interest.
Gardner himself ran University 101, a first-year seminar at the University of South Carolina, for 25 years, and then helped replicate this course type at many other institutions. Colleges and universities often adopted first-year seminars because they increased retention rates, and thus increased tuition revenue. Educators were hunting for the silver bullet -- the “program” that would bring about miraculous student-saving and money-making results. This search for the ideal program also became subsumed under the language of “best practices.” The idea was very simple: there are best practices out there, they can be identified and replicated with minimal thought given to context, and these best practices should yield the same results everywhere. But retention improvements that resulted from one-shot programs have generally been short-lived and, taken together, have failed to move the national retention statistics in a positive direction.
Fast-forward several decades, and this search has been intensified. A plethora of organizations and consultants now exist to feed the hunger for specific programmatic solutions to the retention problem. Clearly it is time for a change.
Beginning in 2003, with support from several foundations, the Gardner Institute for Excellence in Undergraduate Education launched a process, called Foundations of Excellence in the First College Year -- a self-study and planning process designed to help campuses move beyond “programs” and “best practices” to the development of a comprehensive intentional plan for the first year. Participants in the Foundations of Excellence process are encouraged to answer a fundamental educational question: What does our college or university need to do to provide an excellent beginning experience for all students relative to our unique mission, location, and student characteristics? To answer that question an institution first needs to assess how it is currently performing vis-à-vis standards of excellence for the first college year. The process provides nine such standards. Finally, once the plan has been created, institutions must implement it.
But implementing a plan is more easily said than done. Our own research on the effectiveness of the institute’s work with 197 institutional participants has found that the two most significant variables that interfere with executing a plan are a change of senior leadership with its resulting destabilizing effects, and the impact of unforeseen budget cuts.
We have also learned from successes. Over 95 percent of the campuses with which we have worked report implementing action plans. An independent analysis of Foundations of Excellence found that campuses that implemented the plans to a self-reported “high degree” recorded significant first-to-second year retention rate increases -- an aggregate 5.62 percentage points or 8.2 percent higher over four years as reported by IPEDS. Institutions that did not implement their FoE action plans experienced a 1.4 percentage point decrease in retention -- in other words, if you don’t implement the plan you have, you seem to get attrition. To plan is not enough. The executed plans included a combination of changes in institutional policies, a renewed focus on pedagogy in first-year courses, and particular programs -- yes, programs -- that were intentionally selected to address the unique needs of the institution and its students. For example, institutions connected their learning community offerings with their evolving core curriculums to maximize the success of both efforts; orientation programs were expanded to include and serve previously underserved and/or completely unserved populations such as low-income and transfer students; and oversight offices and/or committees were created to intentionally connect previously disparate pieces so that learning opportunities were not left to chance.
In conclusion, our experience leads us to conclude that while programs are necessary, unless they are conceived and carried out as parts of a whole, they are not sufficient. What we believe is that institutions need to undertake a thorough planning process focused on excellence in the first year. Appropriate programs and best practices can then organically emerge and/or be modified, executed, assessed, and refined in context.
Institutions cannot fulfill their potential for improving student success without a comprehensive vision for excellence in the first year. Thus, we encourage you to recognize that the future of our students is too important to leave to chance. Instead, we hope you and your institution will become more intentional and deliberate in the way you commit to first-year excellence. In the process, you will be contributing nationally as you act locally to create the change and foster growth that our students and country require.
John N. Gardner and Andrew K. Koch
John N. Gardner is president of the John N. Gardner Institute for Excellence in Undergraduate Education and distinguished professor emeritus and senior fellow at the National Resource Center for the First-Year Experience and Students in Transition, at the University of South Carolina.
Andrew K. Koch is vice president for new strategies, development, and policy initiatives of the John N. Gardner Institute for Excellence in Undergraduate Education.
Most attention is paid at institutions of higher education to the beginning and end of undergraduate studies. Curriculum committees debate the nature and number of requirements that students must fulfill, mostly in their freshman year; and departments spend a great deal of time evaluating the content and structure of majors, which tend to occupy students in their junior and senior years. No one gives much thought to what students do in the middle, when they're generally encouraged to explore whatever topics they wish.
The principal philosophy that governs this middle period of a student's education is of course the elective system. The right of all students to take a class on the subject of their choosing is a hallmark and admirable feature of the American university. It is often through such chance encounters with less common subjects that scholarly passions are born and majors are chosen. No one studies linguistics or anthropology in high school.
But because the elective system is so fundamental to higher education, and because the major is under departmental control, we rarely step back and ask whether this combination of general ed requirements, electives, and specialization actually meets the objectives of a liberal education. Of course, the answer to this question depends largely on how one defines liberal education. For the sake of argument, let’s take the definition offered in the 2009 Modern Language Association Report to the Teagle Foundation on the Undergraduate Major in Language and Literature. This report identified the acquisition of broad, cross-disciplinary and transhistorical "literacy" as a central component of liberal education (scientific literacy would be another component, but that’s a different story). In other words, students should be sufficiently well versed in an array of humanistic fields, canons, methodologies, and periods, for them to engage with sources (and pursue further research, if they wish) in a large number of areas. To be sure, we expect a lot more from liberal education than this single aim; this is simply a minimalist definition.
Given this definition, it seems fair to say that we place blind faith in the academic virtues of our current system. We simply assume that somewhere along the way, between fulfilling their general education and major requirements, students will pick up enough knowledge about other fields to meet the demands of a liberal education.
It is easy to understand why we place such faith in this system, since there is no obvious, acceptable alternative. Institutions such as St. John’s College, whose curricula are set in stone, will only ever cater to a tiny minority of students; even Columbia University’s two-year core curriculum is highly exceptional. As Louis Menand recently noted in The Marketplace of Ideas, it is virtually impossible to imagine introducing a curriculum such as Columbia’s core today; such highly regimented courses could only evolve under particular historical circumstances. The vast majority of students today desire a greater say about the content of their education. And we must honor this desire, if only because students who do not buy into their educational program are unlikely to be good learners.
There are other ways, however, to think about the middle part of undergraduate education, particularly in the humanities. Let us focus momentarily on students who major in the humanities. Whether students choose to major in English, religious studies, anthropology, or history, there are in fact no structures in place to encourage or enable them to acquire a solid foundation in other disciplines, cultures, literatures, and historical periods. The student writing her honors thesis on Alexander Pope often does not know who Pope Alexander VI was.
Moving now to all undergraduates, I would push this argument even further. Why is it that the vast majority of humanities courses are taught as if we were training students to professionalize in a given field (say, French), when only a tiny fraction of these students – non-majors and majors alike – are actually going to pursue a graduate degree in the field? Whether a student is majoring in engineering and taking a French class out of a love for French literature, or whether she’s a French major and is required to take a French class, chances are that she is not going to become a professor of French. And yet our humanities majors, and our undergraduate curricula more broadly, are designed to produce budding experts in fairly narrow fields. This design is understandable in fields such as economics or engineering, where students often do go on to take jobs in which they need specific skills and knowledge. But why is it so in the humanities?
To be sure, specialization, even at the undergraduate level, has its virtues: engaging with material at a higher level of expertise allows students to hone their research skills and to produce more consequential bodies of work (such as an honors thesis). Still, I would ask whether our primary objective, as humanities professors, should be training students as though they will all go on to become scholars, or whether our primary objective shouldn’t be something else -- such as offering all undergraduate students a broader and less discipline-focused foundation for their future lives.
This issue seems particularly pressing today, as the humanities have gone from facing an existential crisis to literally fighting for their existence. If smaller departments (such as those that were just axed at the State University of New York at Albany) continue to justify their academic purpose chiefly in terms of number of majors, then they will perennially fear (and often face) the chopping block. Admittedly, such a change would also require a shift in perspective on the part of the administrative powers-that-be. But if humanists made a stronger case that the chief purpose of a liberal education is not disciplinary specialization, but broad historical and cultural literacy, then universities simply could not make do without Greek epics, French classical theater, German philosophy, or Russian novels.
What would a curriculum reconfigured along these lines look like? One option would be for humanities departments to join forces to offer genuinely interdisciplinary core courses on major topics of interest. An art historian could team up with a literature professor and religious studies scholar to teach a course on the Renaissance; a historian, political theorist, and Spanish professor could offer a course on the discovery of the New World; or a philosopher, psychologist, and musicologist could lead a course on Modernism. These courses, which would need to be vetted by appropriate faculty committees, would stem from faculty interest, and could vary over time.
This curricular structure presents a number of advantages over the existing one. First, because these courses would be team-taught rather than placed under the auspices of a single department, they would not have a narrow disciplinary focus, but would open up key events or questions to a variety of approaches. (This is currently the structure adopted at Stanford for the fall Introduction to the Humanities courses.) At the same time, professors could underscore the methodological differences between their disciplines, thereby providing students with a roadmap of how knowledge is divided between the various academic departments (and where to look for classes in the future).
Secondly, by requiring these courses to cover broad topics, they would collectively constitute an overarching panorama of the humanities. This would be a disjointed panorama, to be sure, yet that might be a virtue, since it would avoid the problems associated with establishing a grand récit. If this panorama resembles an exploded version of an ideal, inaccessible core curriculum (“These fragments I have shored against my ruins”?), this is ultimately a misleading resemblance. Since the various pieces of this series would constantly be changing, it is not a substitute for a "Great Books" curriculum, in an age that has turned against such courses, but rather the product of a different pedagogical philosophy. Rather than valuing certain specific texts more than others, this philosophy places value on the breadth of knowledge, and the ability to synthesize very different forms and genres of information, from plays and paintings to maps and graphs.
The truly thorny issue that every curricular reform faces is that of requirements. If we build a new program, will anyone come, if they’re not obliged to? One option would be to require students to take, say, two or three of such courses at some point during their studies. This arrangement grants students a degree of choice and a good deal of scheduling flexibility. Other incentives could be found to encourage students to take more than the bare minimum of courses: completion of additional courses could lead to some sort of certification, or could form part of an honors program.
Since a central objective of a liberal education is to ensure breadth of knowledge, it follows, to my mind at least, that a significant humanities requirement is needed. In cases where this is impossible for pragmatic or philosophical reasons, I would argue that it is still important to provide students with a curricular structure that would allow them to achieve the goals of a liberal education on their own. This is particularly true for non-humanities majors, who often do not venture into humanities classrooms, not necessarily due to a lack of interest, but because of the highly specialized focus of most courses. They also may simply not know where to look: our courses are not listed in a central place, but buried behind individual department nomenclatures.
Our academic divisions may make sense for research purposes, but are often at odds with our pedagogical goals. The MLA Report to the Teagle Foundation identified four “constitutional elements” that it considered key to liberal education – "a coherent program of study, collaborative teamwork among faculty members, interdepartmental cooperative teaching, and the adoption of outcome measurements" – yet the first three of these four elements cannot be achieved at the departmental level alone. To fulfill the promise of liberal education, we must ensure that students can build “coherent programs of study” that cut across disciplines.
Finally, perhaps we should have more confidence in the wares we’re vending. Wide-ranging courses that combine powerful texts, vivid iconic material, controversial ideas, and dramatic historical episodes with insightful analysis should not fail to exhilarate students. Of course, good professors, catchy titles, and intriguing perspectives are also needed to invigorate the study of our disciplines; a dry "introduction to X" approach will never be sufficient to meet the goals of a liberal education. But there is also a real thirst for this kind of knowledge, and not only among students in the humanities. Who knows? Maybe if we build it, they will come.
Dan Edelstein is assistant professor of French at Stanford University.