Toward the end of one summer — 1994, to be precise — I arrived at St. Lawrence University as an 18-year-old freshman, excited yet nervous to begin my college career. I had a vague notion that I wanted to be a writer someday, though I didn’t really have an idea of what that would entail or how difficult it would be. I wasn’t particularly anxious about the classes I would be taking — though in hindsight, judging by my grades that first semester, I probably should have been.
No, my concerns were more social in nature. Would I like my roommate? Who would become my friends? Would the people who promised in my high school yearbook that we would be "friends forever" still matter to me, and I to them, by the time we saw each other again at Thanksgiving? Would I finally have sex? The answers to these questions were: Not particularly, a bunch of people, some, and no.
The last answer was the most devastating, to the freshman me, but all in all, that first year of college was a good experience. I read King Lear. I learned from my new female friends that feminists were not, as I had been led to believe, castrating man-haters. I saw my first Kurosawa film. I attended several meetings of the Black Student Union — for the first time, I experienced what it’s like to be the only white person in a room. I was in a play. I perfected my impressions of both R.E.M.’s Michael Stipe and the B-52’s Fred Schneider, in order to entertain my friends on Friday nights fueled by cheap beer and Boone’s Farm "wine products." I read memoirs and essays by the likes of Tobias Wolff, Piri Thomas, and Maxine Hong Kingston that created and nourished my interest in creative nonfiction forms.
As that first year came to a close, I was a little stressed by final exams and papers, and somewhat concerned that I’d never get a girlfriend. Mostly, though, I thought college was an exciting, intellectually challenging, and fun place to be, and I knew I didn’t ever want to leave. So, with the exception of a short break due to some health issues, I really didn’t — I went to grad school, eventually earned a Ph.D., and have been employed on college campuses ever since.
I’ve recently returned to my beloved alma mater — which I’ve written about for Inside Higher Ed before — in order to teach creative writing and literature. This one-year visiting position came along at a time when, to be honest, I had been thinking about getting out of the academy altogether. Although I still loved teaching and writing and developing as a scholar and thinker, I had begun to feel, at the very least, like I did not belong — and could not stay — at the college where I had been working since 2008. There were many reasons for this feeling, but the important point is that I realized that I was unhappy where I was — that this was not the job I thought it would be. Worse still, I began to fear that the problem wasn’t that specific location, but rather that I’m not cut out for this line of work. So I returned to the scene of the crime, the place where I first learned to love literature, writing, and the academic life.
In "Once More to the Lake," E.B. White talks of returning to the lake where his father used to take the family on vacations, this time as a grown man with a son of his own. The essay is noteworthy for a variety of reasons, but kind of funny for his insistence that this place is just as he remembered it, even though he gives a list of things that have changed. "I could tell," he notes after observing the fact that the road leading to the camp was now paved, "that it was going to be pretty much the same as it had been before....” Or when talking about the nearby store: "Inside, all was as it had been, except...." Or the waitresses who serve them their pie, who were "the same country girls, there having been no passage of time, only the illusion of it as in a dropped curtain — the waitresses were still fifteen; their hair had been washed, that was the only difference — they had been to the movies and seen the pretty girls with the clean hair."
Different, but the same. Timeless, yet pushed forward in time. I didn’t really understand White’s disorientation until I returned to St. Lawrence. As White returns to the lake as a father, I’ve returned to St. Lawrence as a professor. He feels, at times, his own father next to him — or perhaps within him, as if he has become his father by bringing his son to this place. I teach in "The Shakespeare Room" in Richardson Hall, dedicated to Emeritus Professor of English Thomas L. Berger, my own Shakespeare professor from 15 years ago, whose blown-up photograph hangs on the wall to my left as I do my best to lead a discussion on Emily Dickinson.
Professor Berger isn’t really beside me, just as White’s father is not with him, yet his presence on that wall reminds me of what type of professor I want to be — erudite, funny, and maybe a little bit intimidating to students who haven’t done the reading.
On days when it’s not too cold — and here in New York’s North Country, those days can be few and far between this time of year — I like to walk around campus. I made a point of showing my wife the dorm I lived in freshman year, where I met the friend who would later ask me to be the godfather to her son. I walked through the building that now houses the theater and fine arts department, but that used to be the student union, where we would occasionally get pizza or burgers at the Northstar Pub, which stopped selling beer after my freshman year but was still called "The Pub" when I graduated. The new student union — located in a more centralized area of campus — houses the Northstar Café, but the students still call it "The Pub" for reasons that are probably a complete mystery to them.
As I was walking home from a poetry reading on campus one night last semester, a student smoking in front of his dorm called out "Dr. Bradley!" and walked toward me in order to talk about class. I haven’t had a cigarette in years, but I almost asked him for one. It seemed like the thing to do. Smoke a cigarette, talk about what you’d been reading. How many times did I do just that with my friends? Those actors and singers and painters and writers who were all so into this world they were just discovering. How many cigarettes did I smoke, talking about Uta Hagen, or Annie Dillard, or Quentin Tarantino? Of course, we smoked inside, back then. It was the '90s. A different era.
White notes that the souvenir counters at the store offer "postcards that showed things looking a little better than they looked," which is sometimes how the past seems when we reflect. If I talk of loving college, I should also tell you that I frequently drove myself crazy, putting the finishing touches on a paper at 4:30 when it was due at 5:00, then running around campus with a disk in hand, trying to find an available printer (again, it was the '90s). There were those times, towards the end of the semester, when — out of money on my meal card — I had to eat sandwiches made of generic white bread and processed cheese slices for every meal. And there were the romantic relationships. They all started out fun, but frequently ended with someone crying.
Still, if the experience was sometimes painful, it was also always educational. I wouldn’t want to trade those experiences or forget those lessons — they’ve shaped the writer, teacher, friend, and husband I am today. And something about this experience of being back on this campus has reminded me — and I’m shocked that I needed to be reminded — that my students are having those very same experiences right now. They’re reading something that’s going to change their lives. They’re falling in love. They’re learning not to send e-mails drunk. They’re listening to the Velvet Underground for the very first time. They’re figuring out who they’re going to be as they begin their adult lives.
So much is different. Everything’s the same.
In my previous Inside Higher Ed column, I talked about remembering my own youthful mistakes when I find myself frustrated with my students. I’m glad to have such perspective — it sometimes saves my sanity — but I’m also glad to remember how awesome it was to be young, to be humbled by the realization that there was so much out there to learn. I had lost some of that enthusiasm in the years since my own undergrad days, but being here, seeing and identifying with these students, has caused me to remember. As a 21st-century academic, it’s awfully easy to get nervous and jaded — it seems like every day, someone from outside of the academy is throwing around words and phrases like "strategic dynamism," "innovative disruption" or "paradigm shift" that don’t really mean anything to me except that the speaker or author doesn’t think very highly of the work we do in the academy, or at least the way we do it. I frequently feel embattled or unappreciated, but this year at my old school has reminded me that I didn’t go to grad school to make politicians or business leaders like me. I went because I wanted to help young people have the same life-changing experience I had.
It’s cold here in Canton right now — one day this week, it didn’t even get above zero — but you wouldn’t know it from all the activity happening on campus. There are informational meetings for students interested in studying abroad in the Czech Republic and Thailand. There’s a screening of the film "Argo." The student organization dedicated to environmental activism is having a vegetarian dinner, open to all interested students. There are athletic events. And, of course, there are classes. I’m not saying that these are activities special to St. Lawrence — I’m sure if you work on a college campus, similar stuff is happening around you. But sometimes, I think, the stress of our jobs causes us to forget what an awesome place a vibrant campus can be.
At the end of White’s essay, he talks of feeling "the chill of death" as he watches his son prepare to swim in the rain, but my recent experience with students at my alma mater has reminded me of how powerful it can be to be surrounded by the warmth of lives that are really just beginning. I don’t know where I’ll be in a few months, but I’m glad to have learned this lesson this year.
William Bradley is visiting assistant professor of English at St. Lawrence University.
After yet another joke on "A Prairie Home Companion" about an English major who studies Dickens and ends up at a fast-food restaurant frying chickens, I couldn’t take it anymore. I had to write.
You and I go way back. I started listening to you during my undergraduate years as an English major in the mid-'80s and continued while in graduate school in English literature, when making a nice dinner and listening to "Prairie Home" was my Saturday night ritual. I get that you’re joking. I get the whole Midwesterner takedown of — and fascination with — cultural sophistication that animates your show. I get that you yourself were an English major. And I get affectionate irony.
I’m afraid, however, that jokes about bitter and unemployed English majors that are already unfortunate in an economy humming along at 4.5 percent unemployment are downright damaging when the unemployment rate is near 8 percent — and some governors, in the name of jobs, are calling for liberal arts heads. Likewise, the most recent annual nationwide survey of the attitudes of college freshmen reported an all-time high in the number of students who said that "to be able to get a better job" (87.9 percent) and "to be able to make more money" (74.6 percent) were "very important" reasons to go to college. Not surprisingly, the same survey reported that the most popular majors were the most directly vocational: business, the health professions, and engineering (biology was also among the most popular).
The truth, however, is that reports of the deadliness of English to a successful career are greatly exaggerated. According to one major study produced by the Georgetown University Center on Education and the Workforce, the median income for English majors with a bachelor’s but no additional degree is $48,000. This figure is just slightly lower than that for bachelor’s degree holders in biology ($50,000), and slightly higher than for those in molecular biology or physiology (both $45,000). It’s the same for students who received their bachelor’s in public policy or criminology (both $48,000), slightly lower than for those who received their bachelor’s in criminal justice and fire protection ($50,000) and slightly higher than for those who received it in psychology ($45,000).
Another study by the same center paints a similar picture with respect to unemployment. In this study, the average unemployment rate for recent B.A. holders (ages 22-26) over the years 2009-10 was 8.9 percent; for English it was 9.2 percent. Both rates are higher than we would wish, but their marginal difference is dwarfed by that between the average for holders of the B.A. and that of high school graduates, whose unemployment rate during the same period was 22.9 percent (also too high).
Of course, majors in engineering and technology, health, and business often have higher salary averages, between $60,000 (for general business) and $120,000 (for petroleum engineering), and marginally lower unemployment rates, especially for newly minted B.A.s. But there’s nothing reckless about majoring in English compared to many other popular majors. Students who love business or engineering, or who are good at them and simply want to earn the highest possible income, make reasonable choices to pursue study in these fields. But students who want to major in English and are good at it should not believe that they are sacrificing a livelihood to pursue their loves. And students who don’t love what they are learning are less likely to be successful.
Because this kind of information is readily available, I have to wonder why you, Garrison — and you’re not alone — continue to dump on English as a major. I think it must be because in the world of Lake Wobegon the English major has cultural pretensions that need to be punished with loneliness and unemployment. Likewise, the Midwesterner in you can’t believe that anyone who gets to do these things that you yourself love so much — revel in the pleasures of language and stories — could also be rewarded with a decent job.
Garrison, when it comes to English majors, let your inner Midwesterner go. You can study English and not be a snob. And you can study English and not fail in the world. I know you know these things; you’ve lived them. So my plea to you, Garrison, is this. Your "Writer’s Almanac" does a terrific job promoting the love of language and the study of English. But in my media market it plays at 6:35 am. Even where it gets better play, it has nowhere near the prominence of "A Prairie Home Companion." Can you find a way on the latter to tell stories about English majors that don’t involve failure? These stories would make a fresh alternative on your show to a joke way past its sell-by date. And they might make a few parents less likely to discourage their kids from studying English.
And here’s my final plea to all former English majors. "A Prairie Home Companion" can help, but English also needs its "CSI" or "Numb3rs." I know some of you are out there now writing for television and film. I admit it will take some creative chops to develop stories about English study that are as glamorous and engaging as crime drama. But you were an English major. I know you can do it. And it’s time to pay it forward.
Chair, English Department
George Mason University
P.S. to all former English majors: Since writing this letter I’ve learned about a new Fox TV show called "The Following" that features an English professor. He’s a serial killer who inspires others to kill. Maybe next time the English professor could be the hero? Thanks.
In an essay first published in 1948, the American folklorist and cultural critic Gershon Legman wrote about the comic book -- then a fairly recent development -- as both a symptom and a carrier of psychosexual pathology. An ardent Freudian, Legman interpreted the tales and images filling the comics’ pages as fantasies fueled by the social repression of normal erotic and aggressive drives. Not that the comics were unusual in that regard: Legman’s wider argument was that most American popular culture was just as riddled with misogyny, sadomasochism, and malevolent narcissism. And to trace the theory back to its founder, Freud had implied in his paper “Creative Writers and Daydreaming” that any work of narrative fiction grows out of a core of fantasy that, if expressed more directly, would prove embarrassing or offensive. While the comic books of Legman’s day might be as bad as Titus Andronicus -- Shakespeare’s play involving incest, rape, murder, mutilation, and cannibalism -- they certainly couldn’t be much worse.
But what troubled Legman apart from the content (manifest and latent, as the psychoanalysts say) of the comics was the fact that the public consumed them so early in life, in such tremendous quantity. “With rare exceptions,” he wrote, “every child who was six years old in 1938 has by now absorbed an absolute minimum of eighteen thousand pictorial beatings, shootings, stranglings, blood-puddles, and torturings-to-death from comic (ha-ha) books alone, identifying himself – unless he is a complete masochist – with the heroic beater, strangler, blood-letter, and/or torturer in every case.”
Today, of course, a kid probably sees all that before the age of six. (In the words of Bart Simpson, instructing his younger sister: “If you don't watch the violence, you'll never get desensitized to it.”) And it is probably for the best that Legman, who died in 1999, is not around to see the endless parade of superhero films from Hollywood over the past few years. For in the likes of Superman, he diagnosed what he called the “virus” of a fascist worldview.
The cosmos of the superheroes was one of “continuous guilty terror,” Legman wrote, “projecting outward in every direction his readers’ paranoid hostility.” After a decade of supplying Superman with sinister characters to defeat and destroy, “comic books have succeeded in giving every American child a complete course in paranoid megalomania such as no German child ever had, a total conviction of the morality of force such as no Nazi could even aspire to.”
A bit of a ranter, then, was Legman. The fury wears on the reader’s nerves. But he was relentless in piling up examples of how Americans entertained themselves with depictions of antisocial behavior and fantasies of the empowered self. The rationale for this (when anyone bothered to offer one) was that the vicarious mayhem was a release valve, a catharsis draining away frustration. Legman saw it as a brutalized mentality feeding on itself -- preparing real horrors through imaginary participation.
Nothing so strident will be found in Jason Dittmer’s Captain America and the Nationalist Superhero: Metaphors, Narratives, and Geopolitics (Temple University Press), which is monographic rather than polemical. It is much more narrowly focused than Legman’s cultural criticism, while at the same time employing a larger theoretical toolkit than his collection of vintage psychoanalytic concepts. Dittmer, a reader in human geography at University College London, draws on Homi Bhabha’s thinking on nationalism as well as various critical perspectives (feminist and postcolonial, mainly) from the field of international relations.
For all that, the book shares Legman’s cultural complaints to a certain degree, although none of his work is cited. But first, it’s important to stress the contrasts, which are, in part, differences of scale. Legman analyzed the superhero as one genre among others appealing to the comic-book audience -- and that audience, in turn, as one sector of the mass-culture public.
Dittmer instead isolates -- or possibly invents, as he suggests in passing -- a subgenre of comic books devoted to what he calls “the nationalist superhero.” This character-type first appears not in 1938, with the first issue of Superman, but in the early months of 1941, when Captain America hits the stands. Similar figures emerged in other countries, such as Captain Britain and (somewhat more imaginatively) Nelvana of the Northern Lights, the Canadian superheroine. What set them apart from the wider superhero population was their especially strong connection with their country. Nelvana, for instance, is the half-human daughter of the Inuit demigod who rules the aurora borealis. (Any relationship with actual First Nations mythology here is tenuous at best, but never mind.)
Since Captain America was the prototype -- and since many of you undoubtedly know as much about him as I did before reading the book, i.e., nothing -- a word about his origins seems in order. Before becoming a superhero, he was a scrawny artist named Steve Rogers who followed the news from Germany and was horrified by the Nazi menace. He tried to join the army well before the U.S. entered World War II but was rejected as physically unfit. Instead, he volunteered to serve as a human guinea pig for a serum that transformed him into an invincible warrior. And so, as Captain America -- outfitted with shield and spandex in the colors of Old Glory -- he went off to fight Red Skull, who was not only a supervillain but a close personal friend of Adolf Hitler.
Now, no one questions Superman’s dedication to “truth, justice, and the American way,” but the fact remains that he was an alien who just happened to land in the United States. His national identity is, in effect, luck of the draw. (I learn from Wikipedia that one alternate-universe narrative of Superman has him growing up on a Ukrainian collective farm as a Soviet patriot, with inevitable consequences for the Cold War balance of power.) By contrast, Dittmer’s nationalist superhero “identifies himself or herself as a representative and defender of a specific nation-state, often through his or her name, uniform, and mission.”
But Dittmer’s point is not that the nationalist superhero is a symbol for the country or a projection of some imagined or desired sense of national character. That much is obvious enough. Rather, narratives involving the nationalist superhero are one part of a larger, ongoing process of working out the relationship between the two entities yoked together in the term “nation-state.”
That hyphen is not an equals sign. Citing feminist international-relations theorists, Dittmer suggests that one prevalent mode of thinking counterposes “the ‘soft,’ feminine nation that is to be protected by the ‘hard,’ masculine state” -- which is also defined, per Max Weber, as claiming a monopoly on the legitimate use of violence. From that perspective, the nationalist superhero occupies the anomalous position of someone who performs a state-like role (protective and sometimes violent) while also trying to express or embody some version of how the nation prefers to understand its own core values.
And because the superhero genre in general tends to be both durable and repetitive (the supervillain is necessarily a master of variations on a theme), the nationalist superhero can change, within limits, over time. During his stint in World War II, Captain America killed plenty of people in combat with plenty of gusto and no qualms. It seems that he was frozen in a block of ice for a good part of the 1950s, but was thawed out somehow during the Johnson administration without lending his services to the Vietnam War effort. (He went to Indochina just a couple of times, to help out friends.) At one point, a writer was on the verge of turning the Captain into an overt pacifist, though the publisher soon put an end to that.
Even my very incomplete rendering of Dittmer’s ideas here will suggest that his analysis is a lot more flexible than Legman’s denunciation of the superhero genre. The book also makes more use of cross-cultural comparisons. Without reading it, I might never have known that there was a Canadian superhero called Captain Canuck, much less the improbable fact that the name is not satirical.
But in the end, Legman and Dittmer share a sense of the genre as using barely conscious feelings and attitudes in more or less propagandistic ways. They echo the concerns of one of the 20th century's definitive issues: the role of the irrational in politics. And that doesn't seem likely to become any less of a problem any time soon.
In his inaugural address, President Obama referred repeatedly to education – but exclusively to education in STEM disciplines, as if only those fields had a defensible public purpose. Sadly, this is no aberration: in December the White House issued a report entitled "Transformation and Opportunity: The Future of the U.S. Research Enterprise," which completely overlooked research in the humanities and social sciences, even in its brief history of the growth of research at American universities.
Such a narrow focus is surprising, as the president himself apparently consults historians (and probably other scholars), and it is counterproductive, whether in strict dollars-and-cents terms or broader ones. Some politicians have gone further, aggressively asserting that various humanities and social science disciplines are useless, and attempting to impose higher tuitions on students who major in them, making it all the more important that those who know better actively affirm the value of teaching and research beyond the STEM fields.
I will focus here on the case for history: it is what I know best, and since history straddles the line between humanities and social sciences, many arguments for its importance apply to various allied fields. One might loosely group these into three categories, ranging from the most social scientific to the most humanistic. The first applies to lessons drawn from circumstances relatively close to our own; the second to learning about times and places we know are quite different. The third applies to research showing that some currently accepted ideas are actually fairly novel, and that people not so different from us did without them; engaging the concepts they used instead may help us see additional possibilities in the world, whether for good or ill.
Examples of the first category underlie almost any sound public policy debate, as well as many private deliberations. Take, for example, the 2009 stimulus bill. By itself, no mathematical calculation could assess the relative accuracy of the more-or-less Keynesian models suggesting that the stimulus would help the economy and the "real business cycle" models, which predicted that it would be an expensive waste. The difference lay in historical research about how various modern economies had responded to historically specific policy initiatives. Other examples abound, though most are less well-known: closest to home in this regard would be evaluating options for STEM investment in light of the vast literature on what has given rise to specific clusters of innovation in the past, and which innovations proved most beneficial. One would also expect development efforts to gain from examining research on past relationships among, say, education, urbanization, birthrates, and investment.
The benefits of research aimed at understanding differences also abound in policy decisions, with special clarity in what we might call "area studies" knowledge – an enormous part of the growth of U.S. research universities after WWII. Surely we could have saved lives and money had policy-makers known more about religious differences within Iraqi society, the political and social history of Afghanistan, or class relations and popular nationalism in Vietnam before military interventions in those places. The same, I would argue, goes for using research into the evolution of Chinese notions of ethnicity, nationality, race, and geopolitics to understand likely governmental and popular reactions to possible American policies on Tibet, trade, the Diaoyu/Senkaku Islands, and so on.
Perhaps less obvious, but equally important, is the usefulness of research that shows that many ideas we may take to be "natural," or at least of very long standing, are actually relatively new. Some of these insights may be "just" a contribution to increased self-understanding, but others bear directly on public issues. Urgent debates over how fixed the concept of "marriage" has been come first to mind, but there are many more actual and potential examples. Recognizing that the term "ethnic group" is barely 75 years old reminds us how mutable are our understandings of the basis and implications of human groupings; that "gross national product" is of roughly the same vintage suggests that maximizing that particular measurement is not inevitably the paramount goal of economic policy.
It hardly seems a stretch to think that a world facing our current challenges might benefit from awareness of other ways that people have thought about the relationship of work, citizenship, adult status, "independence" and dignity, or about consumption, economic growth, leisure and the nature of progress. Or to take some narrower examples, consider the implications of learning how relatively recently life insurance went from seeming like a morally dubious gamble on death to a taken-for-granted tool for managing risk. Or that, while (as Thomas Ricks noted in a recent Atlantic) almost no U.S. generals were removed from their commands for poor performance during Vietnam, Afghanistan or Iraq, many were so removed during World War II – suggesting that the recent situation does not represent an inevitable feature of government, much less of hierarchy generally. Historical knowledge of this kind does not provide lessons as straightforward as “deficit spending can work,” but it can add significantly to our understandings of what is possible, for better or worse, and how things may become, or cease to be, unthinkable.
Research that produces these results, both testing earlier certainties and responding to new questions, thus seems a useful, even necessary complement to research in the STEM fields. Fortunately, most historical research is also relatively cheap, but it does not thrive on complete neglect.
Kenneth Pomeranz is University Professor of History at the University of Chicago and president of the American Historical Association. The views expressed here are his alone.
Everything would have been perfectly ordinary that October morning in my freshman writing course at Stanford University. Bright autumn light reflected up from the Main Quad to our third-floor classroom. Unfed, sleepy-eyed freshmen offered ideas about the assigned reading, which I tracked on the board.
As I often do, I drew a doodle to describe a concept in the reading. This doodle — so I thought — demanded less artistry and complexity than my usual sketches of Thomas Hobbes’s "arrant Wolfe," for which I hash out two mangy-looking wolves squinting at each other, or Immanuel Kant’s famous "crooked timber," for which a bent log suffices to get the idea across. Here, I simply tossed up a rectangle with a triangle inside.
My students gasped.
"What’s wrong?" I asked.
“Um … everything,” they wagered cautiously.
"Well," I tried. "This is just like the one Lockhart shows in his essay." I was referring to a drawing in Paul Lockhart’s famous 2002 "Lament" about the state of mathematics education. Here it is, precisely as it appears in the essay, not the version I drew in class.
"Sorry … no … not really, well … it’s not even close," they ventured, as if not to hurt my feelings.
My students, mostly young aspiring mathematicians, found themselves so ill at ease because their teacher with a humanities doctorate had not bothered to notice that the triangle inside the rectangle touches both corners of the same side and thus forms several other triangles. My doodle — whatever it looked like, I can’t remember — was simply an approximation, a lonely triangloid adrift in a rectangular sea of lopsidedness.
My students had expected greater precision. After all, the course title "Rigorous and Precise Thinking" had suggested as much. Secondly, this was a college writing course, which, as the rumor goes, is supposed to be a smackdown of style, argument and organization, where freshmen quickly learn they must jettison comfortable high school formats and every illusion of their personal literary genius. Expectations for rigor and many other new adventures ran high in this new course, an experimental hybrid college writing/mathematical thinking and proof writing class, one of five liberal arts courses in a new program called Education as Self-Fashioning.
Like the other four ESF classes, this one intended to "engage actively in the types of thinking promoted through these different conceptions of education for life, so as to try those lives on for ourselves ..." and offer students a “chance to shape [their] educational aspirations in dialogue with fellow students and an exciting group of faculty from across a wide range of disciplines — from the humanities and social sciences through the natural sciences and mathematics." I was the writing instructor paired with Professor Ravi Vakil, an American-Canadian mathematician working in algebraic geometry.
Vakil invented the course concept as a rejoinder to C.P. Snow’s "Two Cultures" hypothesis with the hope of showing undergrads, and even the world, that writing in the humanities and writing in math gained force and excellence through similar structures of precise reasoning. Vakil more than delivered on the rigor and precision. His lectures introduced students to proof writing, number theory, set theory, and many other advanced forms of math most academics expect to address only with advanced university students. For my part, I was simply to help students elaborate on the readings from Plato, Descartes, Douglas Hofstadter, Bertrand Russell, Paul Lockhart and many others, while teaching writing.
Tellingly, my imprecise doodle proved to be not my first, second, or even third example of lack of rigor. In fact, the moment seemed to demonstrate the deep divide between Snow’s "two cultures," since I evidently betrayed a lack of familiarity with the basic truths of measurement, mass, or acceleration, pretty much the scientific equivalent of a humanist asking skeptically, "Can you read?" Without a doubt, much of that difference proved disciplinary — the very limit this course hoped to transgress.
Yet, we experienced no ordinary rift between the two cultures. The class had read Snow’s famous 1959 Rede Lecture and chuckled at his description of subverbal grunting mathematicians ruining a young humanist’s dinner party experience. My students saw themselves as beyond what old Stanford lingo designates as the split between "fuzzies" and "techies." Interested equally in learning all things humanist and STEM, e.g., Shakespeare and thermodynamics and beyond, these students insisted that math and math culture far surpassed the cartoonish figures of Snow’s dinner party. Nor (my students believed) were humanists so incorrigibly "fuzzy" as to be unable to reproduce a mathematical doodle — or were they?
Had I inadvertently proven Snow’s point, right before the eyes of my epistemologically optimistic students? In fact, both the students and I discovered that many of the clichés about our respective fields proved instructive. I really do need to be more careful in my doodling — and thinking about my doodling — if I am drawing triangles (with mathematical aspirations) and not wolves (no matter how humanistically inclined).
The awkward doodle moment proved not the existence of two never-the-twain-shall-meet cultures, but rather a need for me to look more closely at the other side. Once I recovered from the initial jolt of difference, I began to realize the opportunity to reconsider my pedagogy. Not having seen a university math professor teach proof writing before, I witnessed several fascinating interactions while attending Vakil’s sections of our course. Most strikingly, when Vakil wrote a problem on the board, the room jumped to life with students calling out and frantically waving their arms. He would ask, "How can you prove the square root of 2 is irrational?" and it was as though he were standing at the board waving a bloody steak at a group of famished tigers. Everyone wanted to offer some solution.
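For readers far from the math classroom, the argument the students were racing to supply is short. Here is a minimal sketch of the classic proof by contradiction (my reconstruction for context, not a transcript of anything said in Vakil’s section). Suppose $\sqrt{2} = p/q$ for integers $p$ and $q$ with no common factor. Squaring both sides gives

$$p^2 = 2q^2,$$

so $p^2$ is even, which forces $p$ itself to be even; write $p = 2k$. Substituting, $4k^2 = 2q^2$, hence

$$q^2 = 2k^2,$$

so $q$ is even as well. Now $p$ and $q$ share the factor 2, contradicting the assumption that the fraction was in lowest terms. No such fraction exists, and $\sqrt{2}$ is irrational.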
Seldom have I been bombarded with solutions or suggestions when I ask students to show me "textual proof" that Sigmund Freud has a Hobbesian view of nature … hint hint … homo homini … wolf sketch … Civilization and Its Discontents, try page number and reference … Freud 1930a [1929], SE 21:111. That special classroom enthusiasm surely arose from Vakil’s charisma and love of his subject, but the response was new to me because the humanities courses I know demand a very different kind of invention. Vakil asked a question and students racked their brains trying to imagine which set of mathematical tools or ideas they might use to solve the problem. Confident that they all shared these tools, or at least knew of such tools, the students seemed to feel much more at ease trying out different approaches.
In humanities courses, previous knowledge certainly helps, especially with literary references, but at the end of the day, a humanist’s tools remain much more contested and may not be applicable in different contexts. For example, students asked me why I requested they not use the first-person plural "we." I told them writing in the humanities differs from math, where one can simply write in a proof “we assume that x=2.” Humanists can neither be sure who that “we” is, nor what to "assume," nor how one can know x. All such terms are permanently available for debate.
In contrast, the mathematicians’ particular disciplinary certainty also revealed a fierce loyalty and love of the subject, which produced a very different discourse than I traditionally hear from humanities students who feel a strong affinity with their work. These math students spoke a Russellian language of awe toward the "cold and austere" "supreme beauty" and "elegance" of math. Perhaps other humanists have encountered students who express an emphatic humility before their subjects, but this was for me as new as the students’ shock at my imprecise drawing. For I learned that day that my students had not yet adopted a humanistic skepticism toward mathematical precision. For them precision is very real, especially in a world of increasing complexity and Gödelian incompleteness.
For humanists, precision lies elsewhere, side by side with ambiguity, and we pursue it with nuance rather than with proofs. My task therefore became one of translation. I understood little of the doodles and equations that Vakil and the students so hotly debated in his sections, but I knew that I had helped my students articulate arguments within the very different confines of humanistic inquiry. Where they were convinced of certain mathematical truths in the landscape of defined terms, they nevertheless arrived in my class with the classic freshman enormity of themes.
Asked to find “precise” topics in math to write about for their research papers, nearly all 29 students first chose grandiose topics like "the definition of intuition," "the connections between art and math" or "math and humanistic knowledge." With such great ambitions in mind, they also fervently believed in math as a liberal art capable of teaching the exact same virtues of critical (self-)reflection as any of the great classical texts I teach, from Greek virtue ethics to Rawls.
Most provocatively, they claimed that by practicing mathematical reasoning they were indeed preparing themselves in the fashion of liberal arts education for ethical citizenship. They claimed with confidence that their rigorous and precise thinking could lead them to ethical reasoning as well as a discussion of Plato’s “Apology” could. For my part, I could not see how debating a triangle or even practicing some form of applied math such as statistics would help me lead the "examined life" in a qualitative fashion.
In class, Vakil often reflected on the limits of mathematical reasoning in a mode reminiscent of Greek virtue ethics; that is, perfecting one’s art, whether mathematical or literary, is surely a virtue, but not one that can replace ethical action. When asked whether excellence in math could prevent one from doing evil, no one doubted the inadequacy of that proposition. History has no shortage of evil uses of math, and the students could quite easily enumerate these. Yet many of the students persisted in their strong claims for math.
One student asserted a mathematical imperative in times of emergency: "Just imagine it’s war or a crisis: you have a moral obligation to shut up and do the math." By which she meant one is ethically compelled to run a statistical analysis to develop a more concrete understanding of actual dangers. Another student expressed less certainty about quantitative methods. "Statistics aren’t bulletproof, you know; what matters ultimately is thinking clearly, and math trains the mind for such emergencies."
Vakil softened these strong claims for both applied and pure math:
I'm less certain that this [mathematical reasoning] in any way replaces the approach to the virtues of critical self-reflection through great philosophical texts. I hope that our students will better appreciate the importance of such texts, because of an appreciation of the problems that earlier thinkers were grappling with (and that we should grapple with today). Similarly, I doubt that this is sufficient to lead them to ethical reasoning, although I would make a milder claim that thinking clearly in this way can assist in carrying out ethical reasoning.
Vakil also elaborated ways in which math could serve ethics, both by providing empirical data and asking Socratic questions about knowledge and decision-making. In the end, we hoped the students finished the course knowing a bit more about practices of rigorous thinking in our respective disciplines, and that they would see these as equally essential and complementary. Could this sprawling, seven-unit course provide a model for future courses? We’re not sure, but are happy to share our data and materials.
Ruth Starkman writes on higher education and teaches college writing, biomedical ethics and social media at Stanford University.