In all those years I was pursuing a Ph.D. in religious studies, the question of what my profession really stood for rarely came up in conversation with fellow academics, save for occasional moments when the position of the humanities in higher education came under criticism in public discourse. When such moments passed, it was again simply assumed that anyone entering a doctoral program in the humanities knowingly signed on to a traditional career of specialized research and teaching.
But the closer I got to receiving that doctorate, the less certain I became that this was a meaningful goal. I was surrounded by undergraduates who were rich, well-meaning, and largely apathetic to what I learned and taught. I saw my teachers and peers struggle against the tide of general indifference aimed at our discipline and succumb to unhappiness or cynicism. It was heartbreaking.
Fearing that I no longer knew why I studied religion or the humanities at large, I left sunny California for a teaching job at the Asian University for Women, in Chittagong, Bangladesh. My new students came from 12 different countries, and many of them had been brought up in deeply religious households, representing nearly all traditions practiced throughout Asia. They, however, knew about religion only what they had heard from priests, monks, or imams, and did not understand what it meant to study religion from an academic point of view. And that so many of them came from disadvantaged backgrounds convinced me that this position would give me a sense of purpose.
I arrived in Bangladesh prepared to teach an introductory course on the history of Asian religions. But what was meant to be a straightforward comparison of religious traditions around the region quickly slipped from my control and morphed into a terrible mess. I remember an early lesson: When I suggested during a class on religious pilgrimage that a visit to a Muslim saint’s shrine had the potential to constitute worship, it incited a near-riot.
Several Muslim students immediately protested that I was suggesting heresy, citing a Quranic injunction that only Allah should be revered. What I had intended was to point out how similar tension existed in Buddhism over circumambulation of a stupa — an earthen mound containing the relics of an eminent religious figure — since that act could be seen as both remembrance of the deceased’s worthy deeds and veneration of the person. But instead of provoking a thoughtful discussion, my idea of comparative religious studies seemed only to strike students as blasphemous.
Even more memorable, and comical in hindsight, was being urged by the same Muslim students in my class to choose one version of Islam among all its sectarian and national variations and declare it the best. Whereas Palestinians pointed to the "bad Arabic" used in the signage of one local site as evidence of Islam’s degeneration in South Asia, a Pakistani would present Afghans as misguided believers because — she claimed — they probably never read the entire Quran. While Bangladeshis counseled me to ignore Pakistanis from the minority Ismaili sect, who claim that God is accessible through all religions, the Bangladeshis themselves were ridiculed by other students for not knowing whether they were Sunni or Shi’a, the two main branches of Islam. In the midst of all this, my call to accept these various manifestations of Islam as intriguing theological propositions went unheeded.
With my early enthusiasm and amusement depleted, I was ready to declare neutral instruction of religion in Bangladesh impossible. But over the course of the semester I could discern one positive effect of our classroom exercise: students’ increasing skepticism toward received wisdom. In becoming comfortable with challenging my explanations and debating competing religious ideas, students came to perceive any view of religion as more an argument than an indisputable fact. They no longer accepted a truth claim at face value but analyzed its underlying logic in order to evaluate the merit of the argument. They expressed confidence in the notion that a religion could be understood in multiple ways. And all the more remarkable was their implicit decision over time to position themselves as rational thinkers and to define their religions for themselves.
An illustrative encounter took place at the shrine of the city’s most prominent Muslim saint. I, being a man, was the only one among our group to be allowed into the space. My students, the keeper of the door said, could be "impure" — menstruating — and were forbidden to enter. Instead of backing down as the local custom expected, the students ganged up on the sole guard and began a lengthy exposition on the meaning of female impurity in Islam. First they argued that a woman was impure only when she was menstruating and not at other times; they then invoked Allah as the sole witness to their cyclical impurity, a fact the guard could not be privy to and thus should not be able to use against them; and finally they made the case that if other Muslim countries left it up to individual women to decide whether to visit a mosque, it was not up to a Bangladeshi guard to create a different rule concerning entry. Besieged by a half-dozen self-styled female theologians of Islam, the man cowered, and withdrew his ban.
I was incredibly, indescribably proud of them.
Equally poignant was coming face to face with a student who asked me to interpret the will of Allah. Emanating the kind of glow only the truly faithful seem to possess, she sat herself down in my office, fixed the hijab around her round alabaster face, and quietly but measuredly confessed her crime: She had taken to praying at a Hindu temple because most local mosques did not have space for women, and she was both puzzled and elated that even in a non-Islamic space she could still sense the same divine presence she had been familiar with all her life as Allah. She asked for my guidance in resolving her crisis of faith. If other Muslims knew about her routine excursions to a Hindu temple, she would be branded an apostate, but did I think that her instinct was right, and that perhaps it was possible for Allah to communicate his existence through a temple belonging to another religion?
In the privacy of my office, I felt honored by her question. I had lectured on that very topic just before this meeting, arguing that sacred space was not the monopoly of any one religion, but could be seen as a construct contingent upon the presence of several key characteristics. This simple idea, which scholars often take for granted, had struck her as a novel but convincing explanation for her visceral experience of the Islamic divine inside a Hindu holy space. Though she had come asking for my approval of her newly found conviction, it was clear that she did not need anyone’s blessing to claim redemption. Humanistic learning had already provided her with a framework under which her religious experience could be made meaningful and righteous, regardless of what others might say.
And thanks to her and other students, I could at last define my own discipline with confidence I had until then lacked: The humanities is not just about disseminating facts or teaching interpretive skills or making a living; it is about taking a very public stance that above the specifics of widely divergent human ideas exist more important, universally applicable ideals of truth and freedom. In acknowledging this I was supremely grateful for the rare privilege I enjoyed as a teacher, having heard friends and colleagues elsewhere bemoan the difficulty of finding a meaningful career as humanists in a world constantly questioning the value of our discipline. I was humbled to be able to see, by moving to Bangladesh, that humanistic learning was not as dispensable as many charge.
But before I could fully savor the discovery that what I did actually mattered, my faith in the humanities was again put to a test when a major scandal befell my institution. I knew that as a member of this community I had to critique what was happening after all my posturing before students about the importance of seeking truth. If I remained silent, it would amount to a betrayal of my students and a discredit to my recent conclusion that humanistic endeavor is meant to make us not only better thinkers, but also more empowered and virtuous human beings.
So it was all the more crushing to be told to say nothing by the people in my very profession, whose purpose I thought I had finally ascertained. In private chats my friends and mentors in academe saw only the urgent need for me to extricate myself for the sake of my career, but had little to say about how to address the situation. Several of my colleagues on the faculty, though wonderful as individuals, demurred from taking a stance for fear of being targeted by the administration for retribution or losing the professional and financial benefits they enjoyed. And the worst blow, more so than the scandal itself, was consulting the one man I respected more than anybody else, a brilliant tenured scholar who chairs his own department at a research university in North America, and receiving this one-liner:
"My advice would be to leave it alone."
It was simultaneously flummoxing and devastating to hear a humanist say that when called to think about the real-life implications of our discipline, we should resort to inaction. And soon it enraged me that the same people who decry the dismantling of traditional academe under market pressure and changing attitudes toward higher education could be so indifferent, thereby silently but surely contributing to the collapse of humanists’ already tenuous legitimacy as public intellectuals.
While my kind did nothing of consequence, it was the students — the same students whom I had once dismissed as incapable of intellectual growth — who tried to speak up at the risk of jeopardizing the only educational opportunity they had. They approached the governing boards, the administration, and the faculty to hold an official dialogue. They considered staging a street protest. And finally, they gave up and succumbed to cynicism about higher education and the world, seeing many of their professors do nothing to live by the principles taught in class, and recognizing the humanities as exquisitely crafted words utterly devoid of substance.
As my feelings about my discipline shifted from profound grief to ecstatic revelation to acute disappointment, I recalled a sentiment expressed by one of my professors, who himself might not remember it after all these years. Once upon a time we sat sipping espresso on a verdant lawn not far from the main library, and he mused that he never understood why young people no longer seemed to feel outrage at the sight of injustice. He is a product of a generation that once rampaged through campuses and braved oppression by the Man. On first hearing his indictment, I was embarrassed to have failed the moral standard established by the older generation of scholars like him. But now I see that it is not just young people but much of our discipline, both young and old, that at present suffers from moral inertia. With only a few exceptions, the humanists I know do not consider enactment of virtue to be their primary professional objective, whether because of the more important business of knowledge production or the material exigencies of life. And I can only conclude, with no small amount of sadness, that most humanists are not, nor do they care to be, exemplary human beings.
Maybe I should move on, as did a friend and former academic who believes that the only people we can trust to stand on principle are "holy men, artists, poets, and hobos," because yes, it is true that humanists should not be confused with saints. But the humanities will always appear irrelevant as long as its practitioners refrain from demonstrating a tangible link between what they preach and how they behave. In light of the current academic penchant for blaming others for undoing the humanities, it must be said that humanists as a collective should look at themselves first, and feel shame that there is so much they can say — need to say — about the world, but that they say so little at their own expense.
After a year and a half in Bangladesh, I do not doubt any longer that the humanities matters, but now I know that the discipline’s raison d’être dies at the hands of those humanists who do not deserve their name.
Se-Woong Koo earned his Ph.D. in religious studies from Stanford University in 2011. He currently serves as Rice Family Foundation Visiting Fellow and Lecturer at Yale University.
Because of my experience as former CEO of the Seagram Corporation, young business students and aspiring entrepreneurs often seek my advice on the best way to navigate the complex and daunting world of business. As college students begin to think about selecting their majors, they may be influenced by the many reports coming out this time of year that tell them which majors provide the highest post-college earning potential. Last month, PayScale released its 2013-2014 report, lauding math, science and business courses as the most profitable college majors.
My advice, however, is simple but well-considered: Get a liberal arts degree. In my experience, a liberal arts degree is the most important factor in forming individuals into interesting and interested people who can determine their own paths through the future.
For all of the decisions young business leaders will be asked to make based on facts and figures, needs and wants, numbers and speculation, all of those choices will require one common skill: the ability to evaluate raw information, be it from people or a spreadsheet, and to make reasoned and critical decisions. The ability to think clearly and critically -- to understand what people mean rather than what they say -- cannot be monetized, and in life should not be undervalued. Of all the people who have worked for me over the years, the ones who stood out the most were those who were able to see beyond the facts and figures before them and understand what they meant in a larger context.
Since the financial crisis of 2008, there has been a decline in liberal arts disciplines and a rise in pragmatically oriented majors. Over the same period, employment among college graduates rose by 9 percent, while employment among high school graduates fell by 9 percent. What this demonstrates, in my mind, is that the workplace of the future requires specialized skills that will need not only educated minds, but adaptable ones.
That adaptability is where a liberal arts degree comes in. There is nothing that makes the mind more elastic and expandable than discovering how the world works. Developing and rewarding curiosity will be where innovation finds its future. Steve Jobs, the co-founder of Apple, attributed his company’s success in 2011 to being a place where “technology married with liberal arts, married with the humanities … yields us the results that makes our heart sing.”
Is that reflected in our current thinking about education as a return on investment? “Chemistry for the non-scientist” classes abound in universities, but why not poetry for business students? As our society becomes increasingly technologically focused and we build better, faster and more remarkable machines, where can technology not replicate human thinking? In being creative, nuanced and understanding of human needs, wants and desires. Think about the things you love most in your life and you will likely see that you value them because of how they make you feel, think and understand the world around you.
That does not mean forsaking practical knowledge or financial security, but in our haste to make everyone technically capable, we risk losing sight of creating well-rounded individuals who know how to do more than write computer programs.
We must push ourselves as a society to make math and science education innovative and engaging, and to value teachers and education. In doing so, we will ensure that America continues to innovate and lead, and to provide more job and economic opportunities for everyone. We must remember, however, that what is seen as cutting-edge practical or technological knowledge at the moment is ever-evolving. What is seen as the most innovative thinking today will likely be seen as passé in ten years. Critical to remaining adaptable to those changes is to have developed a mind that has a life beyond work and can track the changes of human progress, by having learned how much we have changed in the past.
I also believe that business leaders ought to be doing more to encourage students to take a second look at the liberal arts degree. In order to move the conversation beyond rhetoric it is important that students see the merits of having a liberal arts degree, in both the hiring process and in the public statements of today’s business leaders.
In my own life, after studying history at Williams College and McGill University, I spent my entire career in business, and was fortunate to experience success. Essential to my success, however, was the fact that I was engaged in the larger world around me as a curious person who wanted to learn. I did not rely only on business perspectives. In fact, it was a drive to understand and enjoy life -- and be connected to something larger than myself in my love of reading, learning, and in my case, studying and learning about Judaism -- that allows me, at 84, to see my life as fully rounded.
Curiosity and openness to new ways of thinking -- developed in learning about the world around you, in the ability to critically analyze situations that is nurtured every time we encounter a new book, and in the engagement with the abstract that comes with art, music or theater -- ensure future success more than any other quality. Learn, read, question, think. In developing the ability to exercise those traits, you will be successful not only in business, but in the business of life.
Edgar M. Bronfman was chief executive officer of the Seagram Company Ltd. and is president of the Samuel Bronfman Foundation, which seeks to inspire a renaissance of Jewish life.
For the layperson, the solution to the problem of low wages for part-time workers might be simple: a full-time job. But for part-time professors in U.S. postsecondary education, the hopes of landing a full-time job can be about as remote as winning the lottery -- especially in disciplines where part-timers outnumber their tenured full-time colleagues.
The layperson might also assume that full-time professors and their unions naturally favor equal pay since their jobs could be undercut by cheaper part-timers. But in the case of tenured faculty, their full-time jobs are guaranteed by tenure and thus they have little to gain from equal pay for part-timers. The fact that full-time faculty are tenured and are paid more per course has given rise to the elitist notion that full-time faculty are superior and more deserving.
And that in turn affects consideration of proposed legislation like California’s AB 950, introduced this year, which would protect the ability of full-timers to teach up to 150 percent of full-time while depriving part-timers of the chance to teach those sections. Often seen as an entitlement by the full-timers, faculty overtime has caused disputes; in Wisconsin, an American Federation of Teachers part-time union challenged its AFT full-time counterpart. But inequities go far beyond overtime pay. For faculty and administrators, it may take exposure to the egalitarianism of a system like Vancouver Community College’s in British Columbia -- where part-timers have equal pay, equal work, and job security -- to realize how internalized notions of full-time faculty elitism are manifest in U.S. higher education:
Disguised Low Pay
Not only are professors off the tenure track paid at a much lower rate -- in violation of the principle of equal pay for equal work -- their low wages are rarely disclosed in a forthright manner. The Washington State Board for Community and Technical Colleges, for example, has long reported "part-time" faculty salaries as a percentage of what "full-time" faculty earn: 60 percent of the average full-time faculty salary of $58,000, or $34,800 a year, which to the layman would seem a reasonably handsome average income for “part-time” work.
But a note in the same board report explains that $34,800 is not actual but hypothetical earnings -- what a part-timer would earn if he or she taught full-time. A more realistic average part-time faculty workload would be half-time, which would yield $17,400 a year, below the 2013 federal poverty level of $19,530 for a family of three. Indeed, since part-time faculty are not allowed to work full-time, reporting their income as if they were full-time is misleading.
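The arithmetic behind the disguised figure can be checked directly. A minimal sketch, taking the $58,000 average and the 60 percent rate from the board report described above and the half-time workload as the assumption made in the text:

```python
# Sketch of the "disguised low pay" arithmetic described above.
# FULL_TIME_AVG and PART_TIME_RATE are the figures the Washington board
# report uses; the half-time workload is the more realistic assumption
# made in the text, not a reported statistic.

FULL_TIME_AVG = 58_000      # average full-time faculty salary ($/year)
PART_TIME_RATE = 0.60       # part-time pay as a fraction of full-time pay

# The "part-time salary" as reported: what a part-timer would earn
# if he or she taught a full-time load at the discounted rate.
reported = FULL_TIME_AVG * PART_TIME_RATE       # hypothetical earnings

# Actual earnings at a more realistic half-time workload.
actual_half_time = reported * 0.5

POVERTY_LINE_2013 = 19_530  # 2013 federal poverty level, family of three
print(round(reported), round(actual_half_time),
      actual_half_time < POVERTY_LINE_2013)
```

The reported figure ($34,800) thus overstates realistic half-time earnings ($17,400) by a factor of two, and the latter falls below the poverty line.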
Lack of Raises for Experience and Professional Development
Many full-time faculty are granted automatic step raises in recognition of experience, promotion in rank, and professional development, but most part-timers are not. In Washington State, from 1999 to 2004, 90 percent of all appropriations for salary step increments went to the full-timers (who represented only one-third of the faculty), which contributed to the state’s current biennial disparity of over $115 million between part-time and full-time faculty salaries.
Smaller Raises and Cost of Living Adjustments (COLAs)
Bargained pay raises and cost of living adjustments are routinely calculated on an equal percentage basis for part-time and full-time faculty. But since part-time salaries are lower, their raises and COLAs are smaller in dollar terms, which widens the pay disparity further.
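The effect is easy to see with numbers. A small illustration, using the salary figures from the Washington example above and a hypothetical 3 percent across-the-board raise (the raise percentage is invented for illustration, not taken from any contract):

```python
# Equal-percentage raises widen the dollar gap between full-time and
# part-time pay. Salaries echo the Washington example; the 3 percent
# raise is a hypothetical illustration.

full_time = 58_000
part_time = 17_400
raise_pct = 0.03            # hypothetical across-the-board raise

gap_before = full_time - part_time

full_time *= 1 + raise_pct  # both sides get the same percentage
part_time *= 1 + raise_pct

gap_after = full_time - part_time
print(round(gap_before), round(gap_after))
```

The gap grows from $40,600 to $41,818: an "equal" raise enlarges the absolute disparity by exactly the raise percentage.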
Workload Limitations or Caps
Few people are aware of the limitations on non-tenured faculty workload, often called “caps,” which prohibit non-tenured faculty from working full-time and from qualifying for tenure.
Rather than pressuring institutions to create more full-time positions, caps have encouraged an "easy come, easy go" approach to using cheaper contingents who can be hired or laid off at will and who have become the majority of faculty.
In California, a part-time faculty member’s workload within a community college district is now capped at only 67 percent of full-time; this restriction is set by state law, section 87482.5 of the California Ed. Code. But whether 60 percent, 67 percent, or some other percentage short of full-time, this limited workload means hardship for many of California’s 38,000 part-time professors who teach in the state’s 72 college districts, especially given the discounted part-time rate of pay.
Overtime for Full-Time Faculty
The cap, however, does not apply to the system’s 14,000 full-time faculty. Full-timers not only have a guaranteed full-time teaching load, they have the right to teach overtime if they desire. Full-time faculty are allowed to select their classes, including overtime assignments, before the remaining classes are offered to part-time faculty.
In Washington State, these “overloads” make up 13 percent of all full-time instruction in community colleges, and over the last five years, the “moonlighting” of full-timers has increased by 8 percent in Washington (page 58 of this report).
California Bill AB 950
This year California legislators considered but didn't pass (though it could come back next year) AB 950, a bill backed by the California Federation of Teachers (CFT), the state’s affiliate of the American Federation of Teachers. AB 950 would have instituted a new state regulation on full-time faculty workload, not by prohibiting overtime, but by formally allowing full-time faculty to work overtime, up to 150 percent of full-time.
The bill is frequently pitched as benefiting part-time faculty, and some of California's 38,000 part-timers support it: limiting full-time faculty overtime (overloads) to no more than 150 percent of full-time would seem to protect the jobs of part-time faculty. Also, since the bill is sponsored by the CFT, some part-timers who are union members feel it is a show of good faith to stand in solidarity with their union to support the bill.
But the bill would NOT benefit part-time faculty. Of the roughly 14,000 full-time faculty in the California community college system, only 172 have accepted workloads in excess of 150 percent.
What the bill would do, however, would be to write full-time faculty overtime into state law, giving sanction to all 14,000 to teach up to 50 percent more, that is, up to 150 percent of a full-time load. The bill could result in more full-time faculty overtime and thus undermine part-time faculty jobs.
One consequence of enactment would be to doom appropriations in future years for faculty pay increases too -- if some full-time instructors are customarily teaching 125 or 150 percent of full-time, it makes it very difficult to claim that all full-time instructors are overworked and deserving of higher pay.
Also, since the general public surely expects full-time tenured professors to be working full-time, establishing the voluntary option of full-time faculty overtime in state law would seem to be a terrific public relations liability, especially at this moment, as many federal civil servants are taking forced furloughs and could resent the unfairness of tenured faculty working overtime for additional pay at will.
The practice of allowing full-time faculty to teach overtime whenever they wish reflects a failure of full-time faculty to self-regulate, since the workloads of full-time instructors’ non-teaching duties — campus governance, research, curriculum development, etc. — are fundamentally self-directed and are the primary justification often offered for the pay differential between full-time and part-time faculty.
Whenever a full-time faculty member elects to teach more than a full-time load, he or she displaces a part-time faculty member’s job. If AB 950 passes in some future year, and a higher percentage of the 14,000 full-time faculty members feel empowered to teach course overloads, fewer classes will be left for part-time faculty to teach. In this way, AB 950, presented as a means to limit full-time faculty who abuse their ability to teach course overloads, could actually end up taking away part-time faculty jobs and thus should be seen as a wolf in sheep's clothing.
At a time when U.S. higher education is seen as a job prerequisite and is the subject of reform at the highest levels of public policy discourse, it is long past time for non-tenured faculty to be respected as professionals, not treated as casual workers without job security, job protection or equal pay, whose jobs can be raided at will. The true solution would be to move toward the Vancouver model, where full-time faculty may teach full-time and no more -- which in turn gives part-time faculty job security and the chance to work their way up to full-time.
Jack Longmate and Keith Hoeller are members of the Washington Part-Time Faculty Association. Longmate teaches English at Olympic College and Hoeller teaches philosophy at Green River Community College. Both are contributors to Equality for Contingent Faculty: Overcoming the Two-Tier System, edited by Keith Hoeller, forthcoming with Vanderbilt University Press (January, 2014).
The current state and future prospects of the humanities are occasioning considerable anxious comment. Many humanists are sadly resigned to a belief that the humanities have irrevocably ceded pride of place to the social sciences and sciences; and, indeed, the social sciences and sciences generate and command much intellectual energy in the 21st-century university, for understandable reasons.
The usual remedies proposed for this state of affairs have seemed to me to be limited at best and perhaps even misguided. A typical argument for the utility of the humanistic disciplines is that studying them enhances critical thought and powers of expression, and one would certainly agree.
But I wonder whether such an argument will gain much traction with college-age students and especially their parents. The data suggest a clear national trend away from the humanistic disciplines toward those that seem to offer a different kind of promise or outcome: a vocational utility or practical applicability. Under such circumstances, abstract arguments about the enhancement of critical thought – no matter how skillfully they are advanced, no matter how much one might agree with them – are less likely to prevail.
I propose here one different kind of case for the humanities, one that identifies – and celebrates – their specific vocational utility.
Now, many of my fellow humanists, I suspect, will be troubled – even offended – by such an argument: the humanities ought not to be sullied by vulgar assertions about their supposed practicality. But there would be an irony in that response to my argument.
As a historian, I – like all historians – have invariably found it informative, illuminating and useful to consider the historical context and precedents for the issue at hand. And as a student of the Italian Renaissance, I have always found it ironic that, notwithstanding likely present-day resistance to evaluating the humanities in terms of their vocational utility, the humanities enjoyed their considerable prestige during the Italian Renaissance and thereafter precisely because of their perceived practical utility.
The humanities today, relative not only to the current place of the sciences but also to the place of the humanities during the Italian Renaissance, have withdrawn from a prominent role in the public arena, and this, I suspect, is one of the causes of their precarious state. During the Italian Renaissance, on the other hand, the humanistic disciplines were prestige subjects of study expressly because they enjoyed a relationship to the political and social order -- because those with political authority saw real practical value in encouraging humanistic study and employing those who had undertaken and completed it.
The adherents of the studia humanitatis held posts in the governments of the Italian cities and courts of the 15th and 16th centuries; their skills enabled them to serve their employers effectively as speech and letter writers, historians of the state, diplomats and government magistrates. They wrote elegant prose that was then deployed in diplomatic dispatches and letters and in speeches that they or their employers – the bearers of political authority – delivered effectively and persuasively, in part due to the elegance of the language, in part to the emphasis characteristic of the humanist program on skilled oratorical delivery. If I understand correctly, this is the collective opinion of a succession of distinguished historians of the Italian Renaissance: Paul Oskar Kristeller; Lauro Martines; Anthony Grafton and Lisa Jardine; James Hankins; and others.
Precisely how were such linguistic and literary skills leveraged as professional assets? In the words of one student of Renaissance humanism, rhetoric “was ... effective in the daily encounters of the tribunal, marketplace, and political forum, not to mention in diplomatic and personal correspondence. Artful communication ... became a[n] ... instrument for gaining or maintaining power.” Grafton and Jardine have written that the skills
...inculcated had an established practical value in fifteenth-century Italy. The ability to speak extempore on any subject in classical Latin, the ability to compose formal letters to order in the classical idiom... were... valuable assets. Equipped with them the student could serve as an ambassador, or secretary to a government department... In other words, although the programme was strictly literary and non-vocational, it nevertheless opened the way to a number of careers....[T]he independence of liberal arts education from establishment values is an illusion. The individual humanist is defined in terms of his relation to the power structure, and he is praised or blamed, promoted or ignored, to just the extent that he fulfils or fails to fulfil those terms. It is ... a condition of the prestige of humanism in the fifteenth century, as Lauro Martines stresses, that “the humanists ... were ready to serve [the ruling] class.”
“In this setting,” Grafton and Jardine continue, “the rhetoric of humanism represents the power of Latinity and eloquence as actual power – as meshed with civic activity in a close and influential relationship.”
As models for their linguistic practices, the Italian Renaissance humanists turned to familiar and newly recovered classical texts, and the classicizing character of university education in the post-Renaissance European and Europeanized world is directly attributable to the influence of the Renaissance humanists, who advocated strenuously and successfully for the virtues of their particular disciplines. As late as the mid-to-late 19th century, venerable American liberal arts colleges offered a course of study for the A.B. degree that continued to feature classical texts, almost to the exclusion of other subject matter. (The course of study for the A.B. at such institutions also included some more limited course work in “geometry and conic sections,” algebra, plane and spherical trigonometry, mechanics, “general chemistry and the non-metals,” and additional subjects other than classical languages and literatures.)
So persuasive had the Italian humanists been in their advocacy that, centuries later, the course of study in the classic 18th- and 19th-century American liberal arts college continued to reveal the influence of the Italian Renaissance, notwithstanding the challenges one would have faced in arguing compellingly for the continuing utility of such an educational tradition in 18th- and 19th-century America. The Harvard historian Bernard Bailyn wrote that “[t]he classics of the ancient world are everywhere in the literature of the [American] Revolution,” “everywhere illustrative… of thought. They contributed a vivid vocabulary..., a universally respected personification...of political and social beliefs. They heightened the colonists’ sensitivity to ideas and attitudes otherwise derived.” And, indeed, James Madison, A.B., LL.D. Princeton University, 1771, 1787, mastered several ancient languages before “fathering” the American Constitution.
Harvard president and chemist James Bryant Conant could write as late as the 1950s that “[in] Europe west of the Iron Curtain, the literary tradition in education still prevails. An educated man or woman is a person who has acquired a mastery of several tongues and retained a working knowledge of the art and literature of Europe.”
Now, what does one learn from this brief primer on the historical context? First, that advocacy – the kind of advocacy characteristic of the Italian Renaissance humanists, who, according to Kristeller and those who wrote after him, wrested a temporary position of preeminence in their society precisely through the force and effectiveness of their advocacy – is perfectly acceptable, and carries no risk of coarsening the quality of the enterprise: a succession of Italian Renaissance humanists beginning with Petrarch advocated spiritedly for their program, and one could scarcely argue that their intellectual achievement was cheapened as a result of that advocacy.
And second, that such advocacy is especially successful when it legitimately emphasizes vocational utility and professional applicability, when it advances an argument that one’s field of study leads incontrovertibly to coveted careers and has concrete benefits for the state and for the political and social order. Let us be spirited advocates, therefore, and celebrate the utility of the humanities as one of the justifications for studying them.
Could a similar, and similarly effective, case be made today for the humanistic disciplines? I believe so. In what ways could one argue – reasonably, justifiably, and therefore persuasively – that the humanities have direct professional viability, and that one can therefore envision and countenance studying them not only because of the intrinsic intellectual satisfactions of doing so or merely because their study enhances critical thought or powers of expression in some abstract sense, but also because there is true, clear utility to doing so?
It would not be difficult to inventory a considerable number of coveted professions and enterprises where humanistic training is not only professionally valuable, but indispensable. I offer just a few possibilities here, and the list could easily be extended, I should imagine. (For example, Lino Pertile suggested the importance of humanistic training to careers in the growing nonprofit sector.)
And my argument is that, in our advocacy for the humanities, we should not be at all reluctant to make much fuller and more explicit reference to their career utility.
What would a 21st-century inventory of concrete vocational applications of the humanities look like? For example:
A field that embraces what was once termed bioethics and related areas. When one addresses and attempts to resolve such pressing public-policy issues as stem-cell research, abortion, the availability of health care, female genital mutilation, AIDS, epidemics and pandemics, and many others, a satisfactory resolution of the problems encountered will depend not solely on scientific and medical expertise, but also on a command of the time-honored questions of the ancient discipline of philosophy: notions of justice (for example, determining how to justly distribute a limited resource like health care); morality; and ethics. These are urgent matters that require a humanist’s expertise and the philosophers’ millennia of experience in analyzing such vexing issues. The career possibilities in international health organizations, government agencies, non-government organizations, and think tanks seem promising. The indispensability of the humanities to the successful practice of this field is such that it is now often termed the medical humanities.
Architecture and urban planning. The architect and urban planner creates the built environment (an urgent and practical enterprise, in that human beings require spaces in which to live and work), and in doing so, he or she functions at the nexus of the political-economic, the social, and the aesthetic; the architect and urban planner is equal parts humanist (who deploys aesthetic sensibilities in the design work) and sensitive reader of the practical social, political, and economic contexts within which he or she necessarily operates. Enlightened city planning offices welcome colleagues with such sensibilities.
Foreign service and diplomacy. Never before has there been a more urgent need for skilled readers of cultural difference. A sensitive humanistic understanding of other cultures, acquired above all through the rigorous study of foreign languages (and literatures), will be indispensable in coming to terms with such developments as the encounter of Islam and the European and Europeanized worlds. The repercussions for so practical a consideration as American national security are obvious, and one can imagine many outlets for such skills in government service.
Various modes of public discourse (or “writing in action,” as my former Tulane colleague Molly Rothenberg has termed it). By this I mean the effective use of language in the public arena, such as journalism (both print and broadcast, and, increasingly, digital) or television and motion-picture screenwriting. But it could also be extended to embrace advertising (increasingly web-based, which entails yet another humanistic skill, the aesthetic sense required in the visual and aural material that now invariably complements text); web-page design (which, once more, will entail a fusion of the visual, aural, and textual); and related enterprises. The humanist’s command of the aesthetic complexities of text and language, visual image, and aural material, and their simultaneous deployment will be indispensable. Indeed, the digital technologies of the 20th and 21st centuries are so powerful, and the full reach of the transition currently under way so difficult to apprehend, that one can only speculate as to what shape human communication will take when the shift to a new paradigm is more or less complete. (Indeed, humanistic sensibilities may prove to have a salutary, tempering influence on the effects of digital technologies.) The skillful fusion of still and moving images, aural material, and text will determine the effectiveness of MOOCs, which will depend as much on humanistic skills as scientific and technical.
Rhetoric and oratory. This element is related to the previous one. The electronic information technologies that emerged beginning with the invention of the telegraph in the 19th century have a characteristic that makes them unlike manuscript copying and print: they “dematerialize” information and make it possible for it to be disseminated with lightning speed across vast distances. And the invention of radio, film, and television added the elements of the aural and moving visual to those that had characterized the medium of print (and manuscript copying before it): written text and still image. These newer technologies replicate “live” human experience much more closely than print, which freezes discourse and alters its character. As a technology, print (and the media associated with it) has been giving way to electronic technologies, with their capacity for the full integration of written and spoken language, still and moving image, and sound (music and other aural material), and for the dematerialization and dissemination of such information. The implication for colleges and universities is as follows: we have invested admirably in initiatives designed to train our students to write well and read texts critically and perceptively. But given the power of the new technologies, there is a case to be made for a return to greater instruction in rhetoric and oratory, to an equal command of the spoken word, which can be captured on audio- or videotape or broadcast over radio, television, and the computer (via Skype), in a guise that print has never demanded. The development of electronic communication technologies that permit us to communicate extemporaneously over vast distances in a conversational tone and manner suggests that we might well retool our educational system to feature once again the time-honored humanistic practice of effective oratory and refine our students’ facility in the spoken word.
One need only consider the example of Barack Obama’s skilled oratory (or Franklin Roosevelt’s, or Ronald Reagan’s, or John Kennedy’s) to appreciate the importance of this venerable humanistic skill to the political order; notably, these are all political figures who postdate the development of electronic technologies. Columnist George F. Will has observed that the American presidency is “an office whose constitutional powers are weak but whose rhetorical potential is great.”
By no means do the new electronic information technologies obviate the need for continuing skill in other, more traditional and familiar humanistic modes of communication – the kind of careful, comprehensive, subtle argument that written text affords – and the close, sensitive reading and command of existing texts that inform the authorship of new texts. Henry Riecken suggested that “[t]he text of the Federalist Papers was put into machine-readable form in order to carry out an analysis that resolved questions of disputed authorship of some of the papers; but the new format did not replace the bound volumes for readers who want to absorb the thoughts and reflect on the aspirations of this stately document.”
Art conservation, and its relationship to the political economy. Nations with an exceptional legacy of monuments in the visual arts (Italy being a well-known example) face a particular challenge with respect to maintaining the condition of that legacy. And in Italy’s case, the relationship of the condition of that legacy to the economy is obvious: given the central place of tourism in the Italian economy, it is vital that the nation’s artistic patrimony be satisfactorily conserved. Sensitive art conservation is at the intersection of the humanistic (the aesthetic), the scientific and technological (an understanding of the nature of surfactants and the effects of environmental conditions), and the political-economic (the need to balance the claims of conserving the artistic patrimony acceptably against other claims on public resources).
What is interesting about this list is how closely its elements are aligned with the Italian Renaissance humanist’s earlier construction of the studia humanitatis. The kind of ethical reasoning demanded in successful practice of the medical humanities is, in its way, a modern iteration of the Renaissance humanist’s moral philosophy; 21st-century applications of writing, rhetoric, and oratory are, in their way, contemporary versions of the Renaissance humanist’s grammar, poetry, and rhetoric; the understanding of foreign cultures and languages required for effective foreign service in today’s bewilderingly complex and interdependent world is, in its way, the modern expression of the Renaissance humanist’s practice of history. The foundational elements of the core humanistic program have perhaps not changed so very much.
What is different is the explicitness with which the Renaissance humanists advocated – persuasively, compellingly, successfully – for the professional utility of their disciplines, which permitted them to secure a place of considerable prestige and authority in their world. There is warrant for their 21st-century successors’ advancing a similar argument: that one undertake the study and practice of the humanistic disciplines not only within the confines of the academic world (as intrinsically worthwhile, in a fundamental intellectual sense) but outside them as well (as critical to the successful execution of one’s expressly professional and vocational responsibilities).
Specifically, I propose that we self-consciously reframe the presentation and delivery of the humanistic offerings of the modern-day college and university to make much more explicit reference to their potential applicability: that we foreground this kind of argument for their virtues. Some of what is now being done within the university is being done absentmindedly, so to speak, without a sufficiently self-conscious articulation of why we do what we do. Were we to reframe our offerings in this way – reposition the humanities and articulate their virtues differently – we might find that the national trend away from them could be halted and perhaps even reversed.
My sense is that many students rather naturally hunger for the humanistic disciplines and are driven to make other curricular choices in part because of concerns about career viability. Were such concerns addressed – legitimately, effectively, persuasively – we might find some such students willing to study what their hearts prompt them to study. In our curriculums, were we to foreground explicit, purposeful reference to the ways in which the humanities are indispensable to the successful practice of some of the esteemed and rewarding professions identified above (rewarding in several senses of that word), we might succeed in alleviating student (and parental) anxiety about the practicality of studying such supposedly “impractical” subjects.
Only by such means, I believe, will the humanities truly be able to re-secure the place they once enjoyed, and still deserve, in the collective cultural imagination and in the great public arena. And by no means should we be hesitant about advancing such an argument, since we have the example of the Italian Renaissance before us: it would be difficult to argue that energetic advocacy on grounds of vocational viability compromised the artistic and intellectual integrity of the achievements of Petrarch and his venerated successors.
Anthony M. Cummings is professor of music and coordinator of Italian studies (and former provost and dean of the faculty) at Lafayette College.
Submitted by Jeff Rice on October 7, 2013 - 3:00am
Not all academics eat well. Often, I have found myself among a group of friends at the end of a conference, hungry for dinner, and, by some unknown force, our movement is directed toward an overpriced, chain steak house or fast food restaurant.
Conference hotels often house a Starbucks; each morning of my field’s main conference, a line 30 people deep can be found before the day’s proceedings begin. Publisher-sponsored affairs are always a big hit. Cold shrimp served with ketchup-based sauce. Small cheese-stuffed pastry hors d'oeuvres. Toast with tomatoes on top. Cheese and crackers. Hummus. Crudités. Free food. Conference lunches can be less generous as colleagues grab day-old sandwiches -- made in some unknown factory -- at Starbucks. Or they push coins into a machine and grab a Milky Way.
Office microwaves are often messy with the remnants of frozen pizza, ramen noodles, or reheated hot dogs. Sometimes, when I am walking from the parking garage to my campus office, I spot colleagues at 8 in the morning leaving the nearby McDonald's with bags of fried something-or-other. One of my more astute colleagues, who works extensively in cultural criticism, has stood more than once across from me in our building’s elevator, a bag of Chick-fil-A in his hand. Academic cocktail parties, at the university or at a conference, usually offer $6 bottles of Bud and Miller Lite. The $7 Sam Adams is labeled “import.”
Our department meetings take place every other Tuesday during lunch time. It is not uncommon for me to eat a sandwich or salad during the meeting. Being at work makes me hungry no matter how large or small a breakfast I’ve had. When I taught community college night classes almost 20 years ago, I ate a salad before class started. Sometimes, I pack hard-boiled eggs in my salad so that the sulfuric odors permeate the room during meetings.
No matter where I’ve worked, campus catering coffee is always bad. Order a vegetarian meal for an event, and campus catering makes something heavy in starch (pasta drenched in a bland, unseasoned red sauce) or portobello mushrooms (grilled or raw). Across the street from our campus is a restaurant with the word “ass” in its title (“huge” is another word in its name). Across the street, one can also dine at a Korean restaurant, an African restaurant, a local pizza place with a good beer list, a Middle Eastern restaurant, a regional taco chain with the word “local” in its title, a fast food restaurant that specializes in chicken fingers, and a McDonald’s. I entered the student union the other day and saw a line 30 people deep at the Subway.
While my wife and I are members of the local co-op, not all of our colleagues know that it is located three miles south of campus. Recently, I bought local paw paw at the co-op and posted a picture to my Facebook profile; some people mistook it for rotten avocado. The possible outsourcing of campus dining to a private company has raised faculty and student concerns that the university’s spending of almost $800,000 per year on local food will vanish. During a tour of the campus dining food warehouse last year, I was informed that when the university purchases local cattle, the chefs sneak ox tail into stews served in student housing. While most, if not all, of us housed in the humanities support same-sex marriage, the elevator in our building reeks of Chick-fil-A (whose owner opposes such marriages) on any given day. Purchasing a University of Kentucky Dining Plan allows a student to eat at Chick-fil-A and Subway in addition to campus dining facilities.
Every October, regardless of what I am teaching, I share with students my hatred of candy. "I work all year," I say to them, "to keep candy away from my kids, and two hours of walking around the neighborhood on October 31 ruins my hard work."
In the living-learning community I co-direct, we leave Tootsie Rolls and Milky Ways out in a bowl for students to snack on when they come in for academic or life advice. In the residential hall where the living-learning community is housed, for our weekly coffee chats with members of the local community or university, we provide factory-made cookies from the Kroger supermarket chain and Cheetos. The best way to get faculty or students to attend a meeting or event, common advice goes, is to serve pizza.
Several times I’ve taught a course with the word “Eating” in its title. When I was at the University of Missouri at Columbia, the course was called “Eating Missouri.” When I took a job at the University of Kentucky, the course became "Eating Kentucky." After reading Anthony Bourdain, Calvin Trillin, a profile of Whole Foods CEO John Mackey, and notable food critics and discussions, including exposés of the fast food industry and mass-produced food, students still came to class with chicken McNuggets, defrosted frozen pizzas, high-fructose corn syrup sweets, and Krispy Kreme donuts. At the four different universities that have employed me, a Subway has always been within walking distance.
Many colleagues drink 32-ounce sodas in the morning. Because of my reputation as someone who enjoys craft beer, when I’m visiting a campus for an invited talk, colleagues feel obligated to take me to a place that serves good beer. When I was on my campus visit at the University of Missouri seven years ago, colleagues took me to the local brewpub for dinner. After our last main conference in my field, I fretted over the long flight home from Las Vegas (beginning early in the morning and ending at night) and worried that I would not have time during the layover to purchase something to eat. To ease my fear of future hunger, I bought a vegetarian sandwich in the casino Subway.
I used to get excited about attending the dinners for guest speakers or job candidates. Free food. Free food at expensive restaurants. I’ve since grown tired of menus that offer only heavy meat dishes, overcooked lamb chops, bacon in everything, or scallops. The Chick-fil-A in our campus food court is "proudly" closed on Sundays. One Friday a month, the agriculture college hosts a food-related discussion for faculty and members of the community at 7:30 a.m. Local food is served for breakfast. Participants are encouraged to bring their own coffee mugs. I once gave a talk entitled "Menu Literacy" for the discussion.
I sometimes say that my casual conversational skills are limited to discussions of kids, food, and beer. My attempts to recruit job candidates often involve telling them how great the local farmer’s market is and what kind of beer they will be able to buy if they move here. For some reason, I can host a catered event with local vendors in one building on campus, but in the building next door, I must use campus catering. In my previous job, because of budget cuts, the office I directed was no longer allowed to order $15 worth of cookies from a local bakery for board meetings that took place twice a month. In my previous job, I angered campus catering by complaining about the low-quality food they served during a "Writing Across the Curriculum" event I hosted. Campus catering at my current university won’t allow me to invite a local Mexican food truck for a small event that would take place outside of the living-learning community residential hall.
I know I sound like a grump or food snob with these random observations. And I probably am as much of a food snob as I am a critic snob or rhetoric snob or teaching snob or snob of any other part of my professional life. Snobbery can simply mean valuing one thing over another to a significant, and sometimes hyperbolic, degree. I value eating.
Snobbery is not alien to academic discussion; we place value on any number of things we admire or teach. I’ve often wondered why cultural snobbery, often expressed by colleagues in regard to art, literature, music, or film, does not extend to gastronomy. I’ve often wondered why astute cultural critics or critics of the university can be poor food critics. By that, I don’t mean that we must decode every food representation we encounter in order to better understand ideology or power in the food industry. Instead, I wonder why, in our practices of everyday life, we succumb so easily to fast food, high-fructose corn syrup, chains, and other items instead of merely trying to eat outside of these problematic practices.
Pleasure, of course, is a powerful agent. Pleasure, of course, is at the heart of bad eating habits. And food writers such as Michael Pollan have demonstrated the ways fat, sugar, and salt compose and encourage a specific system of food pleasure, one encouraged by much of the fast food industry. None of us are beyond such pleasure, but that does not mean we must succumb to every instance that calls out to us.
Calvin Trillin’s best effort at food critique was to declare, "I wouldn’t throw rocks at it." My food pleasure is not another’s food pleasure, I realize. And I have no desire to preach health-conscious food habits or mindful eating to my academic colleagues; I have no overall argument to make regarding what academics should or should not eat, and no agenda to push. My observations merely prompt me to ask: Why don’t some academics eat well?
In asking that question, I am sketching some observations that include me, too. Among these observations I highlight, I also note that I support the local food movement known as "Kentucky Proud," and my wife and I try to buy most of our produce and meat from the Lexington Farmer’s Market. But when on the road or on campus without coffee, we succumb to Starbucks, too. Among our food purchases, we buy for our kids Arthur Pasta, dehydrated cheese and pasta shaped like the popular PBS character. We are not beyond the commercialization of food either.
Bruno Latour has warned of "purification narratives," stories that try to portray some event, movement, or way of thinking as pure or without contradiction. Roland Barthes once noted that every text is made up of contradictions, what he referred to as the pleasure of the text. That I have ordered a coffee at Starbucks or bought a box of pasta named for a cartoon character might seem to be minor contradictions of my interest in local food or my series of somewhat critical observations. Minor or major, the contradictions no doubt reveal a larger crack in any kind of purification narrative of food I might want to portray. I’m sure there are more or larger cracks in my ideological stance. After all, even after he carefully decodes the industrial meat industry in his New York Times essay “Power Steer,” Michael Pollan confesses to not caring for grass-fed beef.
My point is only to trace a type of academic eating, a series of habits and practices that run counter, at times, to our professional practices and beliefs, and that suggest an untapped pleasure of the text even as we build purification narratives elsewhere regarding culture or texts. For good or for bad, many academic eating practices follow similar trajectories to one another as the banal and bland overpower the local and flavorful. Professionally, we are great critics: MOOCs, corporatization of education, adjunct labor, global conflict, a fiscal crisis here or there. What about bad eating?
One type of pleasure of the text might be the relentless critic who finds fault in every representation outside of the bag of Chick-fil-A in his hand. One might surmise from this lack of critical parallelism a lack, or crack, in the overall project of critique. French fries or diet soda, it seems, may be outside of critique, the behavior change that critique is meant to promote, or even basic awareness. Such an assumption goes far beyond my simple observations of eating in the university. I can only speculate in the meantime how critical practices might better shape food practices. "Do you know what you are?" Frank Zappa asked. "You are what you is," he responded. Or, as the popular health saying goes, you are what you eat. Either way, not all academics eat well.
Jeff Rice is Martha B. Reynolds Professor of Writing, Rhetoric, and Digital Studies at the University of Kentucky.