Over the last year there has been a steady stream of articles about the “crisis in the humanities,” fostering a sense that students are stampeding from liberal education toward more vocationally oriented studies. In fact, the decline in humanities enrollments, as some have pointed out, is wildly overstated, and much of that decline occurred in the 1970s and 1980s. Still, the press is filled with tales about parents riding herd on their offspring lest they be attracted to literature or history rather than to courses that teach them to develop new apps for the next, smarter phone.
America has long been ambivalent about learning for its own sake, at times investing heavily in free inquiry and lifelong learning, and at other times worrying that we need more specialized training to be economically competitive. A century ago these worries were intense, and then, as now, pundits talked about a flight from the humanities toward the hard sciences.
Liberal education was a core American value in the first half of the 20th century, but a value under enormous pressure from demographic expansion and the development of more consistent public schooling. The increase in the population considering postsecondary education was dramatic. In 1910 only 9 percent of young Americans received a high school diploma; by 1940 the figure was 50 percent. For the great majority of those who went on to college, that education would be primarily vocational, whether in agriculture, business, or the mechanical arts. But even vocationally oriented programs usually included a liberal curriculum -- a curriculum that would provide an educational base on which one could continue to learn -- rather than just skills for the next job. Still, there were some then (as now) who worried that the lower classes were getting “too much education.”
Within the academy, between the World Wars, the sciences assumed greater and greater importance. Discoveries in physics, chemistry, and biology did not seem to depend on the moral, political, or cultural education of the researchers – specialization seemed to trump broad humanistic learning. These discoveries had a powerful impact on industry, the military, and health care; they created jobs! Specialized scientific research at universities produced tangible results, and its methodologies – especially rigorous experimentation – could be exported to transform private industry and the public sphere. Science was seen to be racing into the future, and some questioned whether the traditional ideas of liberal learning were merely archaic vestiges of a mode of education that should be left behind.
In reaction to this ascendancy of the sciences, many literature departments reimagined themselves as realms of value and heightened subjectivity, as opposed to so-called value-free, objective work. These “new humanists” of the 1920s portrayed the study of literature as an antidote to the spiritual vacuum left by hyperspecialization. They saw the study of literature as leading to a greater appreciation of cultural significance and a personal search for meaning, and these notions quickly spilled over into other areas of humanistic study. Historians and philosophers emphasized the synthetic dimensions of their endeavors, pointing out how they were able to bring ideas and facts together to help students create meaning. And arts instruction was reimagined as part of the development of a student’s ability to explore great works that expressed the highest values of a civilization. Artists were brought to campuses to inspire students rather than to teach them the nuances of their craft. During this interwar period a liberal education surely included the sciences, but many educators insisted that it not be reduced to them. The critical development of values and meaning was a core function of education.
Thus, despite the pressures of social change and of the compelling results of specialized scientific research, there remained strong support for the notion that liberal education and learning for its own sake were essential for an educated citizenry. And rather than restrict a nonvocational education to established elites, many saw this broad teaching as a vehicle for ensuring commonality in a country of immigrants. Free inquiry would model basic democratic values, and young people would be socialized to American civil society by learning to think for themselves.
By the 1930s, an era in which ideological indoctrination and fanaticism were recognized as antithetical to American civil society, liberal education was acclaimed as key to the development of free citizens. Totalitarian regimes embraced technological development, but they could not tolerate the free discussion that led to a critical appraisal of civic values. Here is the president of Harvard, James Bryant Conant, speaking to undergraduates just two years after Hitler had come to power in Germany:
To my mind, one of the most important aspects of a college education is that it provides a vigorous stimulus to independent thinking.... The desire to know more about the different sides of a question, a craving to understand something of the opinions of other peoples and other times mark the educated man. Education should not put the mind in a straitjacket of conventional formulas but should provide it with the nourishment on which it may unceasingly expand and grow. Think for yourselves! Absorb knowledge wherever possible and listen to the opinions of those more experienced than yourself, but don’t let any one do your thinking for you.
This was the 1930s version of liberal learning, and in it you can hear echoes of Thomas Jefferson’s idea of autonomy and Ralph Waldo Emerson’s thoughts on self-reliance.
In the interwar period the emphasis on science did not, in fact, lead to a rejection of broad humanistic education. Science was a facet of this education. Today, we must not let our embrace of STEM fields undermine our well-founded faith in the capacity of the humanities to help us resist the “straitjacket of conventional formulas.” Our independence, our freedom, has depended on not letting anyone else do our thinking for us. And that has demanded learning for its own sake; it has demanded a liberal education. It still does.
Michael Roth is president of Wesleyan University. His new book, Beyond the University: Why Liberal Education Matters, will be published next year by Yale University Press. His Twitter handle is @mroth78.
The liberal arts are dead, or — at best — dying. That's the theme of story after story in today’s news media.
Professional skills training is in. The STEM (science, technology, engineering, and math) fields are in. Practical, vocational higher education is in. The liberal arts are out, relics of a “traditional” way of thinking that has been overtaken by the pressing demands of our dizzyingly complex digital age.
As new students arrived on college campuses this fall, the message many of them heard is that majoring in history, or English, or anthropology is a surefire recipe for a life of irrelevance and poor job prospects. These “conventional” disciplines cannot possibly train students for productive, enriching careers in the high-tech information age whose future is now.
Although this viewpoint is rapidly gaining the status of settled wisdom, it is tragically misguided. It is based on a false dichotomy, namely that the liberal arts and the more vocational, preprofessional, practical disciplines — like, say, computer science — are fundamentally different and opposed. But this misunderstands both the age we’re living in and the challenges we face, not to mention one of the most significant trends in higher education over the last few decades — the evolution of interdisciplinarity.
In essence, this whole debate comes down to skills. The liberal arts are often said by critics to provide little that is of “practical value” in the “real world.” In reality, though, liberal arts curriculums can and do give students skills that are just as professionally useful as those in more “relevant” occupationally specific fields of study.
At my university, the University of Maryland-Baltimore County, students this fall can declare a new major called global studies, which integrates courses in 12 liberal arts departments — including economics, geography and environmental systems, history, media and communication studies, and political science — into a rigorous interdisciplinary curriculum. Majors are required to study abroad and to achieve fluency in at least one foreign language. By graduation, they will have demonstrated their research, analytical, critical-thinking, and writing skills in a substantial, “capstone” research project. Our students will also do internships with companies, not-for-profits, and government agencies.
Equally important, they will develop “global competence,” which employers in many professions have identified as one of the most desirable, but grossly lacking, sets of skills required of their new employees. Broadly defined, global competence is “the capacity and disposition to understand and act on issues of global significance.” Its central elements include knowledge of world affairs — cultural, economic, and political; proficiency in communicating with people in and from other societies, both verbally and in writing; the ability to appreciate multiple perspectives and respect cultural diversity; and the intellectual and psychological flexibility to adapt to unfamiliar and rapidly changing circumstances.
Developing the skills that we hope to instill in UMBC’s global studies majors is an inherently interdisciplinary mission. In a recent New York Times column, Yale professor Nicholas Christakis argues that the social sciences (a subset of the liberal arts) badly trail the natural sciences in generating innovative “institutional structures” that can produce the kind of cutting-edge science necessary for solving some of the world’s most intractable -- often intrinsically interdisciplinary -- problems. However, he also notes that this is beginning to change, for example, in the form of a new global affairs major at Yale.
Whether it’s global studies at UMBC or global affairs at Yale, these exciting new programs demonstrate concretely why pitting liberal arts education against practical training creates the false perception that the two enterprises are essentially at odds. At UMBC, it's the combination of interdisciplinary liberal arts education; substantial research, writing and analysis; rigorous foreign language training; study abroad; and experiential learning in the form of internships and other applied opportunities that will give students the skills they will need to thrive and “do good” in the 21st century.
The tragedy is that we might blow it. If we continue to present students with a false choice between the liberal arts and “real-world” vocational training, we will produce what social scientists like to call “suboptimal” outcomes. Too many talented, energetic, hard-working students will choose “safe” educational and career paths, and too many truly global problems will go unsolved.
Devin T. Hagerty is a professor of political science and director of global studies at the University of Maryland-Baltimore County.
If you’re an adjunct, I have a small but important task for you:
Ask your students what "adjunct professor" means to them. You might hear something like, It means you don’t have a Ph.D., or You don’t have tenure yet. (Yet ... if only.) Don’t be bitter or cynical, and don’t barrage them with statistics, stories of unfair working conditions, and vitriol against "the administration." Try to be as calm and diplomatic as you can, and simply listen. Some might understand and empathize, or some may simply brush it off. If you’re a multi-campus adjunct (or "road scholar," as we’re sometimes called), students may understand that their class and campus aren’t the only things demanding your attention. Carve out some time in class, and ask your students what "adjunct" can or does mean. Maybe they’ll like the break from talking about another scarlet A or going over their next writing assignment.
Better yet, ask your students regardless of whether you’re an adjunct, on the tenure track, or tenured. Part- and full-time faculty, regardless of discipline, need to be collaborating -- both as part of this conversation and more broadly across our disciplines and campuses.
Whenever I’ve asked my students what "adjunct professor" means, I’ve told them about some of the differences between being on and off the tenure track, how pay can differ, how we don’t get paid as much as other professors or administrators, and so on. Thanks to some Modern Language Association networking I did with New Faculty Majority, I was part of a "PBS NewsHour" story in March, and I mentioned it to a few students in case they wanted to watch. I had hoped the piece would be substantive and lead to others focused exclusively on adjunct labor -- and thus reach students’ parents -- but it wasn’t and it didn’t.
I’m ultimately trying to raise questions and spark discussions -- debates, even -- about how and why to talk about labor conditions with students. I’m not intending to provide a script or list of directives, short of saying that by no means should “Job Information List” or “search committee” be said in the classroom. Our students have a right to know that all professors aren’t treated and promoted equally -- and, more importantly, that this affects how we educate them. Anecdotally, I’ve had to limit my accessibility, office hours, and even designs for more ambitious courses in the past when I’ve taught at two campuses and had 70-80 students. This semester, for the first time since 2006, I’m only on one campus and have 26 students. I have a lot more time and energy for important teaching tasks: slowing down when I grade to write fuller, more meaningful comments; spending more time and energy to design new assignments and "know" my students more fully; and, simply enjoying more time to grade, prep, and meet with students.
Having been an adjunct for 14 years and counting -- the first six while finishing my doctorate, the last eight as job seeker and teacher-scholar -- I find myself thinking a lot about how to involve students in discussions of academic working conditions. Regardless of how we raise the adjunct question in our classes, we need to do so constructively, meaningfully, and diplomatically, and without simply airing grievances or ranting against "the administration." Tone is key. My program director once described me as calm and articulate -- which is typically how I discuss such professional matters in the department or online. There are private Facebook groups (e.g., Con Job), direct Twitter messages, and hallway conversations for the vitriol and the ranting -- which, believe me, is therapeutic.
By the same token, I’ve also been thinking about how to reach students’ parents in equally meaningful and constructive ways. I’d never advocate direct-emailing them or aggressively interrupting freshman orientation, but perhaps our conversations with students will trickle down to their parents. Or, perhaps we might find ways to talk with parents at orientations and move-in weekends. They, too, have a right to know that some of their children’s professors have limited availability, minimal financial support, and overbooked schedules across campuses.
Particularly at universities that (claim to) value the first-year student experience, new students and their parents should be aware that the professor of their intro-level course may work multiple jobs -- teaching or otherwise -- while acquainting students with college-level learning, or that s/he doesn’t have a TA to handle some grading. Parents might also need reminding that, contrary to a common assumption, most adjuncts don’t have other full-time jobs and merely teach "on the side" or "in the evenings." For some of us, teaching and working in several part-time positions is our full-time job.
It's crucial that we ask these questions and talk with our students. In my case, the adjunct question has come up in a few different ways with my students. Last fall, one student was a little impatient (albeit well-intentioned) about my replying to an email she sent asking about feedback on a paper idea, so she sent me one of those, "Did you get my last email?" messages. I replied that I’d seen the first one and planned to respond soon -- while reminding her that she was one of about 75 students I had that semester across four courses and two campuses. In a few other instances, students’ schedules have conflicted with my office hours, and I’ve sometimes had to teach on another campus when students requested to meet. We work it out, often with a little finessing of the schedule, but I seize the chance to tell them why I’m not around as much as they or I might like.
I posed the adjunct question to an honors-level Shakespeare course last fall, and it led to a short discussion of some differences in faculty rank and course assignments. Most recently, I asked a student (also a campus tour guide) what the university told her to say to prospective students and their parents about different faculty ranks and working conditions. Tour guides do acknowledge that the university has different levels of faculty, and that several professors are part-time and teach at other local universities. (About 13 years ago, another student-tour guide told me he was instructed to say that all university faculty were full-time, even though he knew I was an adjunct without a Ph.D. yet. At least some things have improved.) At this point, though, simply acknowledging that there are different faculty levels at the university -- while still knowingly maintaining an uneven playing field -- is problematic at best, and unconscionable at worst. Don’t just tell students you have different faculty ranks; help the part-timers earn more and move up those ranks.
Clearly, some things can’t be changed by one conversation with students. Tenure-track positions aren’t going to multiply overnight, department chairs and deans aren’t going to automatically promote adjuncts, and students aren’t going to march on the university president’s office. Although there won’t be immediate big-picture effects of such a conversation, we should still have it with our students. Ask them and see where the conversation leads.
I now have another small task for you: Start talking about how we can -- indeed, should -- involve our students in discussions of academic labor. Share and tweet this piece. Comment, answer, even disagree. Remember that there are different kinds of action: from simply reading and sharing this piece, to talking with your students, to figuring out what has and hasn’t worked well. And maybe see how and when you can reach a student’s parents.
Clearly, these conversations are fluid and ongoing. Your job -- our job -- is to take them off the page or screen and into our social media feeds, department meetings, and, perhaps more pressingly, classrooms.
This article is adapted with permission from one part of a series in Hybrid Pedagogy on contingent faculty members. Joseph Fruscione teaches first-year writing at George Washington University; he also works as a freelance tutor and editor. He has taught American literature, adaptation studies, and first-year writing at the university level since 1999.
In all those years I was pursuing a Ph.D. in religious studies, the question of what my profession really stood for rarely came up in conversation with fellow academics, save for occasional moments when the position of the humanities in higher education came under criticism in public discourse. When such moments passed, it was again simply assumed that anyone entering a doctoral program in the humanities knowingly signed on to a traditional career of specialized research and teaching.
But the closer I got to receiving that doctorate, the less certain I became that this was a meaningful goal. I was surrounded by undergraduates who were rich, well-meaning, and largely apathetic to what I learned and taught. I saw my teachers and peers struggle against the tide of general indifference aimed at our discipline and succumb to unhappiness or cynicism. It was heartbreaking.
Fearing that I no longer knew why I studied religion or the humanities at large, I left sunny California for a teaching job at the Asian University for Women, in Chittagong, Bangladesh. My new students came from 12 different countries, and many of them had been brought up in deeply religious households, representing nearly all traditions practiced throughout Asia. They, however, knew about religion only what they had heard from priests, monks, or imams, and did not understand what it meant to study religion from an academic point of view. And that so many of them came from disadvantaged backgrounds convinced me that this position would give me a sense of purpose.
I arrived in Bangladesh prepared to teach an introductory course on the history of Asian religions. But what was meant to be a straightforward comparison of religious traditions around the region quickly slipped from my control and morphed into a terrible mess. I remember an early lesson: When I suggested during a class on religious pilgrimage that a visit to a Muslim saint’s shrine had the potential to constitute worship, it incited a near-riot.
Several Muslim students immediately protested that I was suggesting heresy, citing a Quranic injunction that only Allah should be revered. What I had intended was to point out how similar tension existed in Buddhism over circumambulation of a stupa — an earthen mound containing the relics of an eminent religious figure — since that act could be seen as both remembrance of the deceased’s worthy deeds and veneration of the person. But instead of provoking a thoughtful discussion, my idea of comparative religious studies seemed only to strike students as blasphemous.
Even more memorable, and comical in hindsight, was being urged by the same Muslims in my class to choose one version of Islam among all its sectarian and national variations and declare it the best. Whereas Palestinians pointed to the "bad Arabic" used in the signage of one local site as evidence of Islam’s degeneration in South Asia, a Pakistani would present Afghans as misguided believers because -- she claimed -- they probably never read the entire Quran. While Bangladeshis counseled me to ignore Pakistanis from the minority Ismaili sect who claim that God is accessible through all religions, Bangladeshis themselves were ridiculed by other students for not knowing whether they were Sunni or Shi’a, the two main branches of Islam. In the midst of all this, my call to accept these various manifestations of Islam as intriguing theological propositions went unheeded.
With my early enthusiasm and amusement depleted, I was ready to declare neutral instruction of religion in Bangladesh impossible. But over the course of the semester I could discern one positive effect of our classroom exercise: students’ increasing skepticism toward received wisdom. In becoming comfortable with challenging my explanations and debating competing religious ideas, students came to perceive any view of religion as more an argument than an indisputable fact. They no longer accepted a truth claim at face value; instead they analyzed its underlying logic in order to evaluate the merit of the argument. They expressed confidence in the notion that a religion could be understood in multiple ways. And all the more remarkable was their implicit decision over time to position themselves as rational thinkers and to define their religions for themselves.
An illustrative encounter took place at the shrine of the city’s most prominent Muslim saint. I, being a man, was the only one in our group allowed into the space. My students, the keeper of the door said, could be "impure" -- menstruating -- and were forbidden to enter. Instead of backing down as local custom expected, the students ganged up on the sole guard and began a lengthy exposition on the meaning of female impurity in Islam. First they argued that a woman was impure only when she was menstruating and not at other times; they then invoked Allah as the sole witness to their cyclical impurity, a fact the guard could not be privy to and thus should not be able to use against them; and finally they made the case that if other Muslim countries left it up to individual women to decide whether to visit a mosque, it was not up to a Bangladeshi guard to create a different rule concerning entry. Besieged by a half-dozen self-styled female theologians of Islam, the man cowered and withdrew his ban.
I was incredibly, indescribably proud of them.
Equally poignant was coming face to face with a student who asked me to interpret the will of Allah. Emanating the kind of glow only the truly faithful seem to possess, she sat herself down in my office, fixed the hijab around her round alabaster face, and quietly but measuredly confessed her crime: She had taken to praying at a Hindu temple because most local mosques did not have space for women, and she was both puzzled and elated that even in a non-Islamic space she could still sense the same divine presence she had been familiar with all her life as Allah. She asked for my guidance in resolving her crisis of faith. If other Muslims knew about her routine excursions to a Hindu temple, she would be branded an apostate, but did I think that her instinct was right, and that perhaps it was possible for Allah to communicate his existence through a temple belonging to another religion?
In the privacy of my office, I felt honored by her question. I had lectured on that very topic just before this meeting, arguing that sacred space was not the monopoly of any one religion, but could be seen as a construct contingent upon the presence of several key characteristics. This simple idea, which scholars often take for granted, had struck her as a novel but convincing explanation for her visceral experience of the Islamic divine inside a Hindu holy space. Though she had come asking for my approval of her newly found conviction, it was clear that she did not need anyone’s blessing to claim redemption. Humanistic learning had already provided her with a framework under which her religious experience could be made meaningful and righteous, regardless of what others might say.
And thanks to her and other students, I could at last define my own discipline with confidence I had until then lacked: The humanities is not just about disseminating facts or teaching interpretive skills or making a living; it is about taking a very public stance that above the specifics of widely divergent human ideas exist more important, universally applicable ideals of truth and freedom. In acknowledging this I was supremely grateful for the rare privilege I enjoyed as a teacher, having heard friends and colleagues elsewhere bemoan the difficulty of finding a meaningful career as humanists in a world constantly questioning the value of our discipline. I was humbled to be able to see, by moving to Bangladesh, that humanistic learning was not as dispensable as many charge.
But before I could fully savor the discovery that what I did actually mattered, my faith in the humanities was again put to a test when a major scandal befell my institution. I knew that as a member of this community I had to critique what was happening after all my posturing before students about the importance of seeking truth. If I remained silent, it would amount to a betrayal of my students and a discredit to my recent conclusion that humanistic endeavor is meant to make us not only better thinkers, but also more empowered and virtuous human beings.
So it was all the more crushing to be told to say nothing by the people in my very profession, whose purpose I thought I had finally ascertained. In private chats my friends and mentors in academe saw only the urgent need for me to extricate myself for the sake of my career, but had little to say about how to address the situation. Several of my colleagues on the faculty, though wonderful as individuals, demurred from taking a stance for fear of being targeted by the administration for retribution or losing the professional and financial benefits they enjoyed. And the worst blow, more so than the scandal itself, was consulting the one man I respected more than anybody else, a brilliant tenured scholar who chairs his own department at a research university in North America, and receiving this one-liner:
"My advice would be to leave it alone."
It was simultaneously flummoxing and devastating to hear a humanist say that when called to think about the real-life implications of our discipline, we should resort to inaction. And soon it enraged me that the same people who decry the dismantling of traditional academe under market pressure and changing attitudes toward higher education could be so indifferent, thereby silently but surely contributing to the collapse of humanists’ already tenuous legitimacy as public intellectuals.
While my kind did nothing of consequence, it was the students — the same students whom I had once dismissed as incapable of intellectual growth — who tried to speak up at the risk of jeopardizing the only educational opportunity they had. They approached the governing boards, the administration, and the faculty to hold an official dialogue. They considered staging a street protest. And finally, they gave up and succumbed to cynicism about higher education and the world, seeing many of their professors do nothing to live by the principles taught in class, and recognizing the humanities as exquisitely crafted words utterly devoid of substance.
As my feeling about my discipline shifted from profound grief to ecstatic revelation to acute disappointment, I was able to recall a sentiment expressed by one of my professors, who himself might not remember it after all these years. Once upon a time we sat sipping espresso on a verdant lawn not far from the main library, and he mused that he never understood why young people no longer seemed to feel outrage at the sight of injustice. He is a product of a generation that once rampaged through campuses and braved oppression by the Man. On first hearing his indictment, I was embarrassed to have failed the moral standard established by the older generation of scholars like him. But now I see that it is not just young people but much of our discipline, both young and old, that at present suffers from moral inertia. With only a few exceptions, humanists I know do not consider enactment of virtue to be their primary professional objective, whether because of the more important business of knowledge production or the material exigencies of life. And I can only conclude, with no small amount of sadness, that most humanists are not, nor do they care to be, exemplary human beings.
Maybe I should move on, as did a friend and former academic who believes that the only people we can trust to stand on principle are "holy men, artists, poets, and hobos," because yes, it is true that humanists should not be confused with saints. But the humanities will always appear irrelevant as long as its practitioners refrain from demonstrating a tangible link between what they preach and how they behave. In light of the current academic penchant for blaming others for undoing the humanities, it must be said that humanists as a collective should look at themselves first, and feel shame that there is so much they can say — need to say — about the world, but that they say so little at their own expense.
After a year and a half in Bangladesh, I do not doubt any longer that the humanities matters, but now I know that the discipline’s raison d’être dies at the hands of those humanists who do not deserve their name.
Se-Woong Koo earned his Ph.D. in religious studies from Stanford University in 2011. He currently serves as Rice Family Foundation Visiting Fellow and Lecturer at Yale University.