Departures are stressful affairs. In 1904, James Joyce, an Irish Modernist writer, and Nora Barnacle, his girlfriend, began their lifelong pilgrimage through Europe. They had just met, a few months before — she a hotel maid from Galway, he a Jesuit-educated young man with poor eyesight and an ambition to become a famous writer. Joyce didn’t deceive Nora when he predicted the discomfort of their upcoming elopement and their life in exile. He confessed that he could not “enter the social order except as a vagabond.” Propelled by the desire to encounter the new, they both left the familiar constraints of home behind.
And in the midst of the debate about the so-called crisis of the humanities, I want my entire academic field to draw inspiration from authors like Joyce. Without dismissing the very real financial crisis in humanities departments, I want to address another kind of crisis — not entirely unrelated to funding — the widely professed crisis of identity.
Can Joyce’s life and writing give us some direction to re-envision the humanities as a field? A sense of personal crisis and disillusionment compelled him and many other expat Modernists away from home. Joyce rejected formalized religion and the insular culture of turn-of-the-century Dublin. Yet he remained saturated with both religion and Dublin and explored them in his writing until he died.
Voluntary exile furnished him with inspiration and necessary distance from the familiar, a detachment that many creative writers consider invaluable in capturing the complexities of fictional settings. But Joyce wrote about his homeland with a great deal of warmth, not just criticism. In his fiction, he goes back to Dublin streets again and again, and he goes back to the West of Ireland, where his beloved Nora came from. The last paragraph of “The Dead” is the most touching description of a native land by a self-exiled writer.
James Joyce — a voluntary exile, a wanderer, a seeker — always came home. This master of experimental writing and irreverent violator of tradition returns home whenever he alludes to Odysseus’s wandering and whenever he lets us encounter his Irish equivalent of Odysseus, Leopold Bloom — an Irishman, a Jew, and a cuckold, an alienated character, an “ancient mariner.” As we plow through Ulysses, we read about Stephen Dedalus’s snot, Leopold Bloom’s erections and bowel movements, and Molly Bloom’s menstruation, and we’re not quite sure where we’re heading. Yet, in all this apparent directionlessness, we learn a great deal about suffering, betrayal, desire, and compassion. We know the characters intimately, and we want to reach out and touch them, cry with them, walk with them. We sympathize with Bloom, who responds to violence by proposing that the answer to “force, hatred, history, all that” is “Love.”
Exile and nomadism, those unsettling symptoms of Modernist physical and spiritual displacement, can furnish us with love — love for discovery, love for learning, love for the other. It is through leaving the comfort of home and encountering alien people and ideas that the humanities classroom thrives. We expect our students to enter the world of the unknown with courage, but we are often hesitant to do it ourselves. We should have the courage to face the new — collectively as a discipline.
Let’s look at the “crisis of the humanities” as an opportunity to re-envision the field, to send it off on a great adventure away from home. Let’s not treat the humanities as a field with a calcified identity, entrenched in the past. Lest you misunderstand me: This is not a call to forget about the past, to abandon Confucius and Aristotle, Beowulf and Dante, Voltaire and Tolstoy.
I want the humanities to remember home, but to be comfortable with change, to embrace new opportunities, to feel the excitement of letting their identity be molded by movement, not to be threatened by changing or porous boundaries. If we do not initiate new adventures and if we do not embrace an itinerant mode of exploration as potentially educational and formative, we will be forced to change anyway.
And the difference between choosing exile and being forced into a refugee status is profound. Joyce, for example, was never barred from returning to Dublin. He maintained his ties with Ireland and, if he chose to, he could always return home — through his experimental fiction and political essays or by visiting Ireland himself. Refugees facing real violence have no luxury of returning home.
Underfunded and disrespected humanities are the refugees of academe. In the last decade alone, whole departments have fallen victim to the corporate takeover of learning. So without dismissing the value of staying home, I want to suggest that we explore new ways of scholarship and that we travel to other disciplines — yes, including computer science and STEM — to enrich our thinking about our disciplines. Being homesick without being homeless, conversing with the past while imagining new beginnings — all this is potentially generative and exciting.
The writers we study in literature classrooms and the teachers who assign their texts put “home” in conversation with the tradition in order to other it. These writers often speak with each other across the boundaries of time and space. They leave home to drop in on distant relatives or total strangers. Colm Tóibín’s The Testament of Mary responds to the New Testament and allows Mary to voice her dismay over the idol-worship surrounding her son and, eventually, her anguish over his death. Carol Ann Duffy revisits Greek and Roman mythologies to give voice to the women rendered mute by the original storytellers.
This is the essence of the humanities: embracing the nomadic state of not knowing and not belonging and, at the same time, living in the text and conversing with it freely; being rooted in tradition and challenging it; respecting the canon and revising it as we begin to understand who has been silenced; retaining our reverence for the printed book and letting ourselves feel excited about new modes of writing, publishing, and discussing literature.
Our disciplines are grounded in printed text or painted canvas, but they should also explore the new technologies that democratize people’s access to knowledge and allow the difficult conversation with tradition to happen instead of hiding behind a paywall. We should use these technologies with excitement and criticize them where they fail to deliver.
In the nomadic future of the humanities, scholars of sub-Saharan literature collaborate freely with visual artists and computer science experts on projects that would attract students and the general public. In the nomadic future of the humanities, business owners, nurses, and local artists join college students in poetry slams and book clubs. Our brilliant philosophers of gender, race, and class leave the campus regularly to engage middle-schoolers and high-schoolers in the life of the mind, leading discussions about the issues that affect them. In the nomadic future of the humanities, we prove that literature is not only for the elite few, that the beauty of the written and spoken word can move everyone, and everyone can try to articulate why.
To accomplish all this, the humanities will have to open up and venture out without the fear that we’re undermining some primeval principle of what it is we should be doing as scholars and teachers. Pretentious, intentionally obscure, and insular humanities will soon face decline. I do not dismiss the beauty and importance of navigating the world of ideas without any stated utilitarian purpose. But the humanities should be in flux, inviting others to join in their nomadism, open to other disciplines, learning from them and teaching them, too.
Like James Joyce and other Modernists who left home in both literal and metaphorical ways when they abandoned the comfort of established modalities of expression, the humanities — as well as their teachers and students — should be encouraged to redefine themselves as they cross borders and encounter alien worlds. If the humanities could repeat Stephen Dedalus’s call “Away! Away!,” with equal enthusiasm but with less arrogance, perhaps we wouldn’t be talking about their “crisis.”
If we acknowledge the importance of the formative origins of the field and continue exploring them unapologetically and with passion, but in a way that includes those unfamiliar with the prohibitive jargon of most academic papers, we could capture the interest in ancient philosophy, medieval morality plays, or postmodern theater among people who are not affiliated with academe but who enjoy the life of the mind. We could avoid the charge of being locked up in the Ivory Tower, waiting for our slow death as the masses outside rage against us. If we admit that revamping and energizing the field will take resources, creativity, and courage, and if we reward the courage to leave “home” in search of discovery, the humanities classrooms will again be filled with students.
We’re already doing a lot of great work on campuses across the nation: tweeting about philosophy, transforming theories of public engagement into practice in local communities, or sending students to professional conferences, writers’ workshops, and exhibitions. But it would take a more systemic shift to make all this possible on a larger scale.
First, many of these creative ways of approaching the humanities are time-consuming and costly, and grants for humanities scholars and teachers, always unimpressive, are becoming even rarer as the National Endowment for the Humanities and Fulbright funds are being drastically cut. Second, we should start rewarding public engagement with the humanities in tangible ways. A series of compelling, clear blog posts about an obscure 17th-century poet should count toward tenure and promotion, alongside the required well-researched papers published in specialized, peer-reviewed journals. Both forms of engagement with our subjects are important and valid, and they should be complementary as well as rewarded.
Publishing in traditional academic journals tests new ideas on the forum of narrowly specialized scholars and adds new knowledge to the field. Explaining our research to the general public in clear, accessible prose could make it possible for us to continue testing new ideas in a narrowly specialized forum. If popularizing the humanities, the hard work of bringing them out in the open, is derided as a job of a traveling salesman, the humanities will lose public support, and along with it, the resources necessary to thrive.
So let us together see the humanities take a stroll into uncharted territories but always remember home, like Leopold Bloom who — after walking through Dublin for many hours — returns in a chapter called “Ithaca” to his unfaithful wife’s bed and kisses “the plump mellow yellow smellow melons of her rump.” Voluntary exile from Ithaca, from the Blooms’ jingling bed, from Ireland, from Aristotle and Shakespeare, from a printed book and a lecture hall, will help us look at home upon our return in a new way, influenced by encountering the alien.
The humanities that boldly leave home — and yet always remember home — the humanities that are not afraid to take a risky detour, the humanities that are not too aloof to leave the campus and engage pressing issues with clarity and empathy — this is a field that will survive any crisis of confidence.
Agata Szczeszak-Brewer is associate professor and chair of English at Wabash College.
Some years ago I met a woman who owned a large calico cat bearing a certain resemblance to Queen Victoria: stout, regal, disapproving. She had enjoyed her mistress’s undivided attention as a kitten; the second cat, joining them a few years later, proved easy to dominate. But the large male primate who began coming to the apartment on some evenings was another matter. I appeared incapable of taking a hint, and she was not amused.
Before long I was spending most evenings there. Oaf though I may have been, I did get a feeling of being disdained, at best, and could imagine the older cat taking the woman aside to say, through unhappy looks and feline telepathy, “This guy has got to go.” If things ever reached that point, the woman held her ground – and reader, I married her. (The cat with less seniority had in the meantime grown fond of me, which may have helped.)
The situation de-escalated before reaching the stage depicted in Octave Tassaert’s Die eifersüchtige Katze, one of the paintings reproduced in Jealousy (Yale University Press) by Peter Toohey, a professor of classics at the University of Calgary. The author roams across several cultures, media, and disciplines in his investigation of the green-eyed passion. In literature, jealousy tends to resemble a kind of madness, and it usually becomes part of the daily news only after escalating into lethal violence. But Tassaert’s canvas presents the emotion in one of its more comic expressions.
Painted circa 1860, “The Jealous Cat” depicts a love triangle of sorts. We see a woman sprawled on her bed in dishabille -- with her lover in, let’s say, close proximity, still clothed for the most part but with his pants below his knees. (A coat hangs on a nearby chair, not draped over the back but thrown on it at an angle suggesting haste.) At first glance he appears to be standing. But the angle of his legs and the way one arm seems to be swinging upward -- and the startled expression on his face as he looks over his shoulder -- all suggest he has just bolted upright. Just behind him, and a little lower, you see the creature giving the painting its title: a jealous cat, stretching up to sink both claws into the man’s exposed buttocks.
“They’re obviously more tempting than the uninspiring ball of string left by the chair,” Toohey deadpans. The painting itself is humorous but it raises perennial questions about emotion: Do animals have them, or is that just anthropomorphizing? And if they do experience feelings that in a human would be understood as emotion, how similar to ours are they?
Animals’ inability to self-report their own mental states makes any answer more or less unverifiable, and we are in the same position regarding the emotions of the human child in its first few years. What we have with both nonverbal animals and preverbal infants is behavior that looks and sounds like what we associate with happiness, excitement, fear, and perhaps one or two other emotions. But is jealousy among them? The experience of it can be raw and overwhelming, but it responds to a situation that is fairly complex. “The foundation stone of jealousy,” writes Toohey, “is triangular”: the product of a situation “usually [involving] two people and some form of possession, animate or inanimate.” The classic form – “the clichéd sine qua non of the jealous situation,” as the author puts it – is the romantic triangle: the jealous party’s claim on the significant other is violated, or at least menaced, by a rival.
Whether or not its brain can process all the elements in play, the jealous feline in Octave Tassaert’s painting has at least determined the fastest and most efficient way to disrupt the situation. A desire to hurt the rival may not be noble, but it’s understandable and reasonably straightforward, especially when the rival is standing right there.
Human beings are prone to making things more complicated. The desire for retribution can target the beloved as well as the rival, and even become more intense – sometimes to really horrifying extremes. The author cites one case that sounds like the brainchild of an exploitation-movie director trying to outdo the competition: A British man who spent a week beating, strangling, and threatening his girlfriend also tried to fill her ears and eyes with quick-sealing putty. In handing down a prison sentence, the magistrate told him: “You are almost insanely jealous.” Almost?!
Explosions of jealousy -- even of sexual jealousy, by all counts the most excruciating sort -- usually stop short of mayhem. Toohey notes that there is just enough of a stigma around jealousy to limit how openly we feel comfortable expressing it. At the same time, jealousy is a persistent enough force to make subduing it hellishly difficult, and also irresistible as raw material for art and literature. In Othello, the work most indelibly identified with the experience of jealousy, Shakespeare treats it as a passion that, once ignited, feeds itself, with imagination as the fuel -- even when the grounds for it are entirely false.
Toohey writes of the moment when an individual sees or hears something that ignites the emotion. Even when based in rock-solid fact – with no Iago whispering baseless insinuations – the suffering of the jealous person comes mostly from scenes and conversations running in an obsessive loop within the mind. One of the most interesting chapters of Jealousy considers how literary and artistic works present our eyes and ears as the organs that make us vulnerable to the suspicion then elaborated upon within the brain’s theater.
Perhaps that accounts for the bizarre revenge taken by the “almost insanely jealous” man mentioned earlier. And perhaps imagination is the factor distinguishing human jealousy from whatever it is animals feel when faced with rivalry. Our motives are more complex, and our memories are longer. That gives us an evolutionary advantage. But it also opens up wide vistas of potential misery, where the jealous mind is condemned to wander in circles.
For a rising generation of administrators in higher education, the heart of education is innovative technology -- and faculty get in the way.
In a recent speech, the new president of Carnegie Mellon University, Subra Suresh, intimated his administrative philosophy, remarking that “the French politician Georges Clemenceau once said that, ‘War is too important to be left to the generals.’ Some would argue learning is too important to be left to professors and teachers.”
The speech opened the inaugural meeting of the Global Learning Council (GLC), held at Carnegie Mellon in September. The GLC brings together a group of high-level university administrators, government officials, and corporate executives who aspire to be an at-large advisory group, akin to the National Research Council, for higher education.
Suresh could have used the help of an English professor to unpack the analogy. Presidents and provosts would be generals, not faculty, who are the soldiers in the trenches, so the fitting parallel would actually be “education is too important to be left to administrators.”
On that count, I agree.
Suresh’s phrasing was not a slip but a frank statement — for him, faculty have little place in decision-making. And I think that it captures the leaning of many current initiatives touting innovation and technology.
The classic definition of the university is that it represents the corporate body of the faculty. Like the protagonist of Flannery O’Connor’s Wise Blood, who wants to establish the Church of Christ without Christ, the New Leaders of higher education want to establish education without educators. Or more precisely, they want to call the shots and have faculty do what they’re told, like proper employees. To wit, at the conference there were few regular faculty members in attendance (even if some of the administrators had started as or occasionally did guest spots as professors, it’s probably been a while since they devoted much of their work time to that realm), and there was certainly no social or cultural critic of higher education scheduled to speak. Rather than engaging much criticism or debate — which, after all, is a mission of the university, testing ideas — it had the character more of an infomercial.
The focus of the conference was to install technology in higher education as fast as possible, and the speakers included high-level figures from Google, Kaplan, edX, and various other companies with a financial interest in the changeover.
The only speaker who raised doubts about technology was a military person, Frank C. DiGiovanni, director of force readiness and training in the U.S. Office of the Undersecretary of Defense. In his talk he said that he found that, to be effective, education needs to “stimulate the five senses,” which does not happen with devices. In fact, he noted that there was a “loss of humanity” with them. He added in subsequent discussion: “I worry about technology taking over. The center of gravity is the human mind.”
It seemed a little ironic to me that the only person reminding us of a humanistic perspective was the military man, though it was clear that DiGiovanni had a good deal of experience with how people actually learned and that he cared about it.
The innovation mantra has been most prominently expressed by the business guru Clayton Christensen, who coined the phrase “disruptive innovation.” It has been the credo especially of tech companies, which come out with ever-new products each year. The theory is that businesses like those in the American steel industry have failed because they were set in their ways, doing things that were successful before. Instead, even if successful, they should disrupt what they’re doing. Hence, while Apple was making very good laptops, it went to the iPhone. Then to the iPad. Then to the Apple Watch.
Christensen has extended his theory to academe, in articles and his 2011 book, The Innovative University: Changing the DNA of Higher Education from the Inside Out (co-written with Henry Eyring). He basically sees higher education as set in its ways (hence the DNA metaphor) and ripe for a takeover by technology, and he holds up universities such as BYU-Idaho and the for-profit DeVry University as models for the future. He admits that Harvard University is still top of the line, but not everyone can go to Harvard, so, in cheery rhetoric (some of which is taken from the promotional literature of the colleges themselves), he sees these other schools doing what Walmart did to retail.
Christensen’s theory of innovation has been rebutted by Jill Lepore in a recent piece in The New Yorker, “The Disruption Machine.” She points out that most companies succeed because of sustaining innovation, not disruptive innovation. Apple, after all, still makes laptops, and US Steel is still the largest steel company in the US. In addition, she goes on to demonstrate that a good deal of Christensen’s evidence is thin, not to mention that many of his examples of success have gone belly-up.
Besides holes in the general theory, it’s also questionable whether the kind of innovation that applies to technological or commodity products is readily translatable to education. Cognitivists have shown that education largely works affectively, through empathy, which requires live people in front of you. One learns by imaginatively inhabiting another’s point of view.
Moreover, most institutions of higher education have a different role than businesses — more like churches, which in fact is the analogy that helped establish their independent legal status in the 1819 Dartmouth decision of the U.S. Supreme Court. Something other than consuming goes on at universities, which gets lost in the commercial model of higher ed.
Think of it this way: while I like to shop at Macy’s and hope it stays in business, I would not donate any money to it, whereas I have to universities and churches. Of course universities should use best business practices, but if they act primarily as a business, with a saleable product and positioning students as customers, then they abnegate this other role. This is an inherent contradiction that vexes the push to commercialize higher education.
This is not to say that there is no use for technology. The Open Learning Initiative, a project studying statistics pedagogy at Carnegie Mellon, shows that some online segments work better than large lecture sessions. But, if you read its reports, it’s clear that the experiment essentially offers a flipped classroom, and in fact students probably gain more faculty contact than in the lecture model. It’s more like a return to a tutorial model. Who knew students do better with professors?
What the rush for innovation is really about, as Christopher Newfield, a leading critic of higher education, has pointed out, is not a better theory of change but a theory of governance. As Newfield puts it, “it isn’t about what people actually do to innovate better, faster, and cheaper, but about what executives must do to control innovative institutions.” It’s all about top-down plans of action, with the executive issuing a plan to disrupt what you’re doing, and subordinates to carry it out. Hence Suresh’s brushing aside those pesky faculty, who traditionally decide the way that education should be. That might be O.K. for a corporation, but it violates any standard idea of shared governance and academic freedom, which holds that faculty decide the direction of education.
It’s also about politics. The vision of higher education that the New Leaders of higher education would like to install is not a traditional horizontal institution, in which faculty are generally of equal power. (For instance, I’m a professor at Carnegie Mellon like Suresh, so technically I have the same faculty rights and determine the content of my courses and research, not him — and fortunately I have tenure, so he can’t fire me for writing this, which he could if it were a regular corporation.) Rather, it has become an oligarchical institution, reliant on business deals and donations. Business corporations, after all, are not democracies but oligarchies, with decisions running from the owners and executives downhill.
The oligarchical leaning of the New Leadership became clear to me in a talk by Luis von Ahn, a young computer scientist at Carnegie Mellon and MacArthur Award winner. Von Ahn was animated and funny, bringing fresh energy to the proceedings. He evidently had made a killing in developing CAPTCHAs, those difficult-to-decipher wavy letters to verify you’re a human and not a bot online (in his PowerPoint he showed a picture of a man lying in a bed of money, which drew a lot of laughs).
Since then, he has developed and is CEO of Duolingo, a nonprofit designed to bring language training to people for free (or more precisely for their labor). It’s all online, and it’s self-funding: Duolingo sells crowdsourced translations from students to CNN or other businesses in need of them, and the money keeps the company going.
Von Ahn had several tenets of education, the first of which was that “the best education money can buy should be free.” I was with him on that, but I was not so sure about the rest.
Another was that the best education should “be in your pocket, not in some building.” Again, if education relies on social contact and empathy, then we need a place for it other than the shallow contact of a screen. Think of it from the bottom up: children learn from the synesthesia of sociality, and those who are regularly read to by parents learn to read the soonest. What would a child be like if you locked him or her in a room with a device?
Moreover, while a program like Duolingo might be good for picking up a reading knowledge of a foreign language, I wonder about its transposition to speaking. While von Ahn attests to good testing results online, languages, after all, are not formulae but social. Anyone who has learned a foreign language knows that it’s a much different experience when you’re there, in front of live people.
Still, Duolingo seems like a good thing and an exemplary use of online learning. However, von Ahn had another tenet: that learning should be through a corporation, not through a government. He said that you cannot trust governments (most “suck” and “other people’s funding usually comes with other people’s ideas and influences”), which he drew from personal experience as an immigrant from Guatemala. That might be understandable in his individual case, but it is deeply troubling to anyone who has a Jeffersonian sense of higher education and believes that it should be a public right and should cultivate citizens.
It boggles the mind to think that corporations would be better. What are the guarantees that they would be more free from “other people’s ideas and influences,” particularly of just a few people?
Perhaps if von Ahn is running them. (And still, he sold his previous project to Google, and one might question Google’s proprietorial policies, which we have little recourse to alter.) Governments presumably are based on the will of the people, whereas corporations are based on the will of their owners, boards, and executives, oriented toward gaining the most advantage for themselves. A poor government might fail to represent the will of its people, but the problem then is the lack of democracy. By definition, corporations represent a small, self-interested group.
While von Ahn seems like an admirable person and has put some of his money into good causes, his statement was the credo of plutocracy: the rich and powerful should rule, and their good effects might trickle down. But I don’t trust corporations as much as he does, particularly since they have brought us our current world of severe inequality.
American higher education was conceived as a remedy to inequality in the period after World War II, with policy documents like the 1947 Truman Commission Report setting out a plan to fight inequality “in so fundamental a right as education,” spurring state and federal funding to expand high-quality public colleges and universities and allow a greater number of citizens to attend them for minimal tuition.
The new technology reinstalls inequality, with the wealthy (and a few high-scoring poor) receiving bespoke higher education at elite schools, but most of the rest getting theirs on a screen — with great graphics! like a game!