The first distinguished speaker at the recent forum on "Justifying the Humanities" followed a recent trend by asserting that the humanities were invented in the American university of the 1930s as an organizational convenience. The second distinguished speaker explained that in their current "somewhat dated" form the humanities are a product of the Cold War, developed in the 1950s through courses in the Great Books and Western Civilization. By the time the final distinguished speaker began his remarks I feared that we would be told the humanities were invented yesterday in sudden meta-post-Postmodernist fabrication.
First, the good news. It is true that the familiar triadic American curricular structure of liberal education (natural science, social science and the humanities) is relatively recent. Hence, the form of humanistic studies is not chiseled in ancient marble, but has changed and can and should continue to change in response to new circumstances.
The bad news is that recent history is only a small part of the story. The foreshortening perspective on the humanities comes at a price. It’s not just that it overlooks a tradition that reaches back to the Stoic philosophers of ancient Greece, Cicero in ancient Rome, Petrarch and Boccaccio in Italy and the amazing scholars of the Renaissance. Nor is it just that we deprive ourselves of the benefits of breakthroughs in contemporary scholarship. It’s that we risk losing sight of what motivated the great era of humanism.
Renaissance humanists, such as Joseph Justus Scaliger, Marsilio Ficino and Lorenzo Valla, applied immense energy and learning to establishing reliable texts of ancient authors, commenting on them, making them accessible through translations, and teaching them in a way that created an understanding of human beings and moral agency not restricted by the dictates of medieval theology. Philosophy, literature, history and the visual arts were transformed by such humanism. Soon universities were transformed as well.
When I asked Paul Grendler, a professor of history emeritus at the University of Toronto and an expert on education in the Renaissance, about this transition, he reminded me that this change was revolutionary. "A group of 15th-century Italian scholars decided that the best way to train men (and a few women) to be learned, eloquent, and morally responsible leaders of society was to introduce them to the great authors and texts of ancient Greece and Rome.… They coined the phrase studia humanitatis (humanistic studies) for this new, revolutionary school curriculum." This transformative sense of purpose accounts, I believe, for the energy and enduring excitement of their work.
At the university level great changes began around 1425 when humanists began teaching in Italian universities such as Bologna, Florence and Padua. They taught rhetoric, poetry and what they sometimes called humanitas, meaning more or less what Cicero had meant by it, "the knowledge of how to live as a cultivated, educated member of society," as Grendler phrases it. In general these humanists connected this goal to the studia humanitatis – we would say classical studies broadly conceived. That terminology spread from Italy to the British Isles where, for example, the Scotstarvit chair of humanity was established at the University of St. Andrews in 1620. By 1800 literae humaniores were part of examinations at Oxford. The pattern was revised in the mid-19th century into the famous "Greats" program, which later provided the model for "Modern Greats," that is, Oxford’s degree program in Philosophy, Politics and Economics. Humanism, it turns out, is not only adaptable to modern circumstances; it can be infectious.
The term "humanities" did not, then, drop out of the sky into the unknowing laps of American academic bureaucrats. Leaders of colleges and universities in the early 20th century consciously and deliberately evoked the tradition of Renaissance humanism in an effort to develop some equivalent amid mass education in the modern world. We may argue about how successful they were, but they saw the challenge.
It's still the challenge today, almost a century later. In responding to it, we can still learn from those Renaissance scholars. If we neglect them, we overlook an important part of the background to contemporary humanistic studies, but we also risk replicating, validating, and promulgating one of the gravest failings of the humanities as currently practiced – "presentism," that is, an exclusionary focus on the most highly modernized societies of the contemporary world, and the uncritical judging of the past by today’s interests and standards. In so doing one severs contact with what so motivated and energized these great humanist scholars and with the perspective on human life and conduct that they opened up.
If this root of the humanities is severed by ignorance, neglect or hostility, it will not be surprising if humane learning begins to look a little withered, and if students find what they have learned soon wilts and leaves them without the perspective and depth of understanding that a rigorous and wide-ranging education in the humanities should provide.
W. Robert Connor is senior advisor at the Teagle Foundation.
If you didn’t know any better, you might think that the main thing conservatives learn in college English classes is how to complain about college English classes.
Shortly before his recent and untimely death, conservative entrepreneur Andrew Breitbart reminisced about the English and American studies classes he took as an undergraduate in the late '80s and early '90s. Those classes, he said in an interview with the National Review Online, were representative of "any humanities department, USA." Going into them, he had expected to read "the Founding Fathers" along with authors like "Mark Twain": stuff he assumed would offer a "benign approach to the American experience." But he was shocked, shocked, that his courses did not offer that "benign approach."
Now, it’s not clear what parts of Mark Twain Breitbart expected to be "benign." His ruthless critiques of the slave system in Pudd’nhead Wilson? His disgust with organized religion in A Connecticut Yankee in King Arthur’s Court? His vocal opposition to American foreign policy as vice president of the Anti-Imperialist League? It seems possible Breitbart accidentally confused Mark Twain the guy with Mark Twain the boat, at Disneyland. (Which is, admittedly, a pretty benign pleasure cruise.)
But the real problem, Breitbart continued, was that he "was hearing the words 'deconstruction' and 'semiotics' a bit too many times." He eventually concluded English classes weren’t really English classes. Instead, they were classes in — ominous pause — "cultural Marxist theory."
Dissatisfaction with college English classes has long held a special place in the conservative imagination. In perhaps the earliest modern example, William F. Buckley Jr. devoted his first book to criticizing liberal orthodoxy at Yale University in the late 1940s. Because "the field of English literature and poetry" was one in which “values are heavily involved,” he declared, the English classroom was especially prone to “value-inculcation.” (Needless to say, he didn’t seem to think it inculcated good values.)
Since then, a number of right-leaning writers haven’t been able to resist the urge to generalize, to speculate about what goes on in English classes. Rebecca Newberger Goldstein has claimed that in English classes, the "study of literature as an art form" has been entirely replaced by "Theory," presumably of the "cultural Marxist" variety. Michael Ellsberg thinks that the type of writing taught in an English class is so "formulaic" that "passable versions of it can be produced automatically by a computer program." Similarly, Judith Halberstam, an English professor who wouldn’t otherwise agree with Breitbart about anything, has declared that English professors aren’t really "doing" English anymore. Bruce Fleming also thinks that college English classes aren’t really teaching English, just a parade of isms -- "structuralism, deconstruction, Foucauldianism, and multiculturalism" — that distract from a different, and somehow more authentic, program of study.
The more theatrical of these Cassandras try to link what they assume is the content of an English class to an overall decline in English as a college major. William M. Chace, former president of Emory and Wesleyan, attributes a decline in the number of English majors to changes in the content of the English classroom. "No sense of duty remains toward works of English or American literature," he declares. Like Breitbart, Chace thinks the problem is that English professors just aren’t doing English anymore. Instead of "English or American literature," he writes, classes are now filled with non-literary subjects like "comic books or studies of trauma among soldiers or survivors of the Holocaust." (Because everyone knows survivors of the Holocaust never wrote literature. Take that, Elie Wiesel, you cheap hack!)
It's true that the percentage of undergraduates majoring in English has declined over the last 40 years. Chace makes much of the fact that English majors composed 7.6 percent of the undergraduate population in 1971, and only 3.9 percent in 2004 — a decline of nearly 50 percent. But in fact, English’s current popularity is actually closer to its historical norm now than it was 30 years ago. It’s the early 1970s that were the anomaly, not today.
All American universities expanded dramatically in the aftermath of World War II. From 1945 to 1975, undergraduate enrollments increased by almost 500 percent. This growth was unprecedented, but also unrepeatable; in the grand scheme of things, it was a temporary blip. Degrees in all of the liberal arts — not just English — declined from 1900 to 1945, then grew from 1945 to about 1973, and then began to decline again.
So it’s simply not true that curricular changes caused a decline in the popularity of English as a major. After all, the percentage of mathematics degrees fell over the same period by about 66 percent. And they don’t teach much "cultural Marxist theory" in math courses. (Probably.)
But what’s interesting about this latest flock of lit-crit Chicken Littles is not that they’re saying anything that’s actually true, but that people think it could be true.
There are at least two things going on here. The conservative dislike for English classes depends on ill-informed generalizations about the day-to-day business of the English classroom. But it also depends on an assumption that the day-to-day business of the profession of English is somehow not the business of the members of that profession.
To put it another way, a guy like Andrew Breitbart would never have thought to assume he knows what happens in all of the hundreds of classes nationwide in, say, cardiac surgery. And he would never have tried to inform a professor of surgery about the proper content of a course in surgery. Yet he had, and others have, no problem confidently stating that they know what happens in all English classes, and that they also know what should happen.
The first of these critiques — that all English classes, everywhere, are doing a particular thing uniformly — is the hardest to rebut, mostly because it's such an insane critique in the first place. There are nearly 2,400 four-year colleges and universities in America, and the vast majority of them offer some sort of education in English language and literature. That means there are tens of thousands of English courses every year. I don’t know what goes on in all those classes any more than Breitbart did or anyone else does. But I will say that their generalizations don’t ring true in the slightest.
Every English literature class I have ever taken, taught, or observed has spent the vast majority of its time on exactly what all these writers claim is missing: the study of literature. In my experience, English classes do pretty much what they’ve always done. Students read literature closely, and then talk about how it works and what it means. The courses I teach in American literature today contain pretty much the same authors you would have expected 20 or 30 years ago: Twain, Emerson, Dickinson, Douglass, Melville, Wharton, and so on. Of course, people teach some newer authors, too (Toni Morrison and Don DeLillo tend to show up often), and some authors are not taught quite as frequently (D.H. Lawrence, for instance), but English departments are not eternal guardians of a frozen literary heritage. They change a little over time, sure, but they still do what you’d expect.
For that matter, Breitbart’s English departments did pretty much what you’d expect, too. He had to take two semesters of American literature at Tulane University, and as Mark Howard and Alexander Zaitchik have reported, students in those courses were assigned to read Emerson, Thoreau, Twain, Hawthorne, Stowe, and so on. Not much "cultural Marxist theory," in other words.
I'd venture to guess that this basic classroom focus on individual literary works is far more common than not. If you like, you can check for yourself. Just type "English literature syllabus site:.edu" into Google, without the quotation marks, and see what people around the country are teaching in their English classes.
But this still leaves the other problem, about who determines the proper content of a college classroom. Even if English professors were to stop teaching Shakespeare, or start teaching nothing but “cultural Marxist theory” (whatever that is), those changes would nevertheless be a result of the normal practices of the profession of English. If such a thing were ever to happen, it would be because new knowledge had been formed, new books and articles written, new best practices established. This does not mean that English is an intellectually bankrupt profession, but rather simply that it is a profession, and a profession is defined by the decisions of its members.
When conservatives declare that English classes don’t teach literature anymore, what they’re really trying to do is deprofessionalize the profession of college-level English. In a profession, such as law or medicine or academia, the members of that profession have the ultimate verdict on its practices, credentialing process, and knowledge base. Cardiac surgeons — and not conservative commentators, journalists, or the government — have the final say over the norms and practices of cardiac surgery. Finance professors have the final say over the norms and practices of the academic study of finance.
And English professors have the final say about the content of a college English class. For someone to pretend otherwise, even if that someone is an English professor, is to suggest that an English class somehow exists apart from the practices of the English professors teaching that class. You can't complain about English professors not teaching real English because what English professors teach is real English — even if you think it shouldn’t be.
So people don’t need to be worried that we’re witnessing the "death of English" as a result of English professors not teaching literature. First of all, it's not true. English majors still read Shakespeare, Twain, Emerson, Austen, Milton, and all the rest.
But even if such a thing were true, the sky still wouldn’t be falling, simply because of basic professional norms. After all, no one worries that professors of medicine aren’t teaching students how to use leeches like they did in the Middle Ages.
But one thing these sorts of attacks on the content of the English classroom tell us is that English, far from being irrelevant, apparently matters a lot to conservatives. Decades after the "culture wars" of the 1980s, conservatives and commentators of all stripes are still using the English classroom as a convenient shorthand for American universities as a whole.
And there’s nothing wrong with that. But it’s important to do so honestly. English departments still teach all the stuff you remember them teaching. But English departments are also composed of members of the profession of academic English, and professions are, like it or not, self-regulating.
After all, if someone like Andrew Breitbart can’t tell a dentist he should pull your teeth, then he also shouldn’t tell English professors they should teach The Fountainhead.
Stephen J. Mexal is assistant professor in the department of English, comparative literature, and linguistics at California State University at Fullerton.
Recently, the University of Iowa Press announced a new book series: Humanities and Public Life. Within 58 minutes, we had our first inquiry. In the days after the announcement, we had enthusiastic inquiries from historians, philosophers, architects, museum curators and humanities councils. Our first formal proposal arrived in less than a week.
My co-editor, Anne Valk; the acquisitions editor at the press, Catherine Cocks; and I find ourselves using phrases like "pent-up demand" to explain the outpouring of interest. We are all the more encouraged because neither the term "public" nor "humanities" offers much clarity these days. Not everyone connects the dots among these sometimes overlapping, sometimes alienated cultures. That's what we hope this series will do.
Faculty members and students participate in public life in innumerable ways -- from community clean-up days to Rotary talks -- in activities conventionally referred to as volunteerism and outreach. While we admire what are often called public intellectuals, who weigh in on serious issues, "the world" is their forum. Our series will focus on publicly engaged scholarship — deep, meaningful collaborations in which scholars and artists from institutions of higher learning (which could include cultural organizations such as museums and libraries) are working with rather than for communities.
Such publicly engaged humanities projects grow from reciprocal relationships that can include faculty members, academic staff, students, community leaders, nonprofit organizations, neighborhoods, museums, K-12 schools, and a host of other local, national or global partners. The series will therefore be likely to include books co-authored by project directors, who often form the crucial connection between colleges and universities and the communities in which they are located. That way the series can explore what does and doesn’t work from multiple points of view.
Many engaged projects have come to light in recent years through the organization Imagining America: Artists and Scholars in Public Life. Our home institutions — the University of Iowa and Brown University — are members, and our two centers are collaborating with the University of Iowa Press on this series. The mission of Imagining America has strongly influenced our objectives for the series. We want to capture the rigorous and radical working relationships that evolve during the reciprocal, mutually beneficial and mutually transformative process that characterizes the best publicly engaged scholarship. The series will also try to embrace the full scope of "projects."
These tend to sprawl across the categories of research, teaching and service and to spill out into public policy, community activities, classes, and documentation, of which published scholarship may be only one of many outcomes. Imagining America's report on engaged scholarship, including advice for tenure committees, richly describes and illustrates such projects, as does the "Engaged Scholarship Toolkit" created by TRUCEN (The Research University Civic Engagement Network), a wing of Campus Compact. Our most challenging objective? We want books in the series to convince skeptical colleagues that scholarship in humanities disciplines can sometimes be made more rigorous, provocative, and insightful through public engagement.
More precisely, we want to document projects that show how scholars in architecture and design, classics, history, law, languages, literatures, museum studies, philosophy, religious studies, visual and performing arts, humanistic social sciences like anthropology and archaeology, hybrid fields like law and literature or the medical humanities, and more are working with public partners and, in the process, enriching both our communities and their disciplines. The artists and scholars who share their work at Imagining America’s annual conference believe that their knowledge about lives represented in art, literature, history, and ethics is enlivened by the experience of community partners — shelter directors, neighborhood school teachers, human rights and health care workers, librarians, environmentalists. Furthermore, they believe these collaborations nudge us toward a more just and generous public culture. We know. We know. That sounds outrageously idealistic. It is. We are. Yet every year at Imagining America we learn of another innovative project that edges participants not only toward empathy but also action.
Evidence suggests that many colleagues inside and outside the academic world share our vision. At the University of Iowa, political science professor David Redlawsk and I co-founded the Obermann Graduate Institute on Engagement and the Academy six years ago. More than 75 competitively selected graduate students have participated. Those from the humanities are especially eager to use their passion for literature, art, and a reflective, interpretive approach to address the larger world's complexities.
They develop oral history projects with local neighborhood centers, mural collaborations that link troubled teenagers in Iowa and Burundi, ways to negotiate social conflict through literature. One student recruited an entire Iowa town to help map cancer and track family stories about the disease for what became an award-winning dissertation that traversed geography and public health. Another student developed a film cooperative with the senior center. The result? The partnership has fueled documentaries, theories of visualization, and an intergenerational art scene for almost a decade.
We’re not alone. Our institute was inspired by similar activities at the University of Washington’s Simpson Center for the Humanities that have evolved into a certificate program. I am struck by how many of the affiliated faculty mentors are assistant professors. Like many graduate students, a significant number of junior faculty members long for more collaborative, connected forms of scholarship in the humanities.
Encouragement to expand the ways we conduct and "count" work in the humanities is also coming from professional associations. While the individual, text-based, finely argued analysis in monograph form that has long been the hallmark of the humanities remains central to our disciplines, our associations also see value in extending the reach of academic humanities through experimental, engaged practices. The Public Philosophy Network has almost 750 members. That group and the American Philosophical Association’s Committee on Public Philosophy encourage philosophers to find ways to "use" philosophy, for example by working with public policy organizations.
The president of the American Historical Association, William Cronon, urged historians to avoid the threat of "professional boredom" by "not talking only with each other. By welcoming into our community anyone and everyone who shares our passion for the past and who cherishes good history." Organizations like Campus Compact and the Campus-Community Partnerships for Health provide resources and support for partnerships among individuals, schools and communities.
We want our series to make clear that engaged scholarship does not "dumb down" the disciplines. Discussing concepts, practices, and difficult texts in accessible terms certainly pressures the humanities. If that pressure pushes scholars and students in new, more public directions, their work as cultural interpreters is likely to be intensified, not diminished.
When students in a traditional class write analytical essays about Charles Dickens' Bleak House or Emily Brontë's Wuthering Heights, I urge them to answer the question "so what?" — why should this topic matter and to whom? — and to advocate for their answer in every line. Publicly engaged scholars and teachers are driven by that question to seek sites and partners who turn out to value the resources of the humanities in ways we humanities scholars sometimes forget to do.
We know from our own experience that public projects are alive, ongoing, and constantly evolving. We want the series to offer authors and readers a way to wrestle with and make sense of the intellectual, scholarly, ethical, methodological, pedagogical and political complications that challenge and enrich public humanities work. We picture the books as hybrid texts — part exhibition, part analysis, part documentation. We anticipate working with authors, the press, and colleagues in the digital humanities to create books rich with images and archives that live past publication through an interactive companion Web presence.
Many engaged artists and scholars are so committed to developing projects with their communities that they do so even when their institutions dismiss complex, intellectually rich, sustained versions of the public humanities as "service." The University of Iowa Press can illuminate projects that produce innovative humanities scholarship while also connecting scholars and students to communities through collaborative work. Our hope is that the series will help tenure committees as well as fellow engaged scholars and community partners understand and evaluate arts and humanities scholarship.
Finally, like all series editors, we seek authors and collaborators with intelligence and vision. For the Humanities and Public Life series we also look forward to working with colleagues whose wisdom, daring, and civic commitment have inspired them to reach across the boundaries of disciplines, campuses, organizations and communities in shared pursuit of intellectual and civic knowledge and change.
Teresa Mangum is the director of the Obermann Center for Advanced Studies and associate professor of English at the University of Iowa.
I recently spent four days at the AWP Carnival at the Chicago Hilton; there were, according to various reports, anywhere from 9,300 to 10,000 in attendance, and I saw most of those attendees standing ahead of me in line at Starbucks or waiting for a seat at Kitty O’Shea’s Pub. This was the annual convention of the Association of Writers and Writing Programs, where “writers, editors, and publishers come together.” And like most carnivals, it dealt in dreams.
There were 450 panels to choose from — all holding the promise of some magical connection, some dim and dimly borrowed light. This last was sometimes the literal case: a session on writing for radio involved the audience sitting in the dark and listening to the panelists’ favorite segments. Their advice: storytelling is key (well, yes) and audience members should feel free to look up any of the panelists online.
Interestingly, the session audiences’ biggest applause seemed to be reserved not for resume line-items involving publishing coups (such as one, two or even three memoirs -- that particular author deserved a round of applause for the sheer stamina involved not only in the life she lived but also her determination to write -- and write -- about it) but for announcements by panelists regarding tenure. At one session, a mystery writer announced that her recent MFA in playwriting had led to a tenure-track appointment; at another, the crowd literally went wild when a poet panelist announced that she had just received tenure. The irony of the fact that she was part of a panel promising to reveal what sort of work outside academia could bring MFA graduates, if not fame and fortune, then at least enough money to pay off their loans, went largely unnoticed. As for that session, the lead presenter was absent, and so the others valiantly soldiered on. It turned out that for these panelists, at least, “outside academia” meant working on the edges of academia. The advice included:
Hold creative writing salons in your home.
Be fortunate enough to have a thesis adviser who is selected to be Poet Laureate; then work as an intern for him/her.
Go back to school! Specifically, go back to school for an MLS degree. (Libraries are among the first to be hit in recessions. A master's in library science will only qualify graduates to attend future sessions entitled “What to Do with Your Library Degree.”)
No one mentioned going back to school for classes in business or info tech or community planning. No one mentioned that you can be an accountant (or a health care worker or a plumber) and still write. The single poet most responsible for changing poetry in the 21st century was a doctor who made house calls. But there was no recognition of William Carlos Williams or of any other physician writer. Nor did anyone mention Wallace Stevens, who combined a career in life insurance with a life of poetry. No one mentioned the missing panelist, who has admirably combined a life of business and poetry and who served as Chairman of the National Endowment for the Arts. No one mentioned that there are, in fact, plenty of paying writing jobs available. Or that a one-time prize of $1,000 or a free trip to a writers’ conference isn’t enough, in the long run, to sustain a life. Or that one might apply imagination and creativity to finding or creating a job. Yes, poetry is the news that stays new. But you can do something else and still write poetry. And someone should have told you that before you started your MFA program.
Of the 9,300 to 10,000 attendees, one third, according to AWP executive director David Fenza, were graduate students. Of these 3,000+ individuals, a handful seemed to be interested in nonfiction (or at least the memoir category of nonfiction) or playwriting (playwriting! Why not, at least, screenplay writing?); a number were engaged in fiction writing, but the vast majority were poets. The final (and recently tenured) panelist suggested volunteer work and offered a twofold rationale: that volunteer work might lead to (academic) connections and that poets already receive nothing for their work, so why not consider doing more work for nothing? This line received the most laughter that I heard in two days, and was far more amusing, albeit in a grim existential sort of way, than the ones I heard at a session titled “How to Tell a Joke.”
Of course, if you’re a poet or a jokester, you didn’t even have to buy a conference pass; you could skip the panels and just cruise the hotel lobby. Or go straight to the bars. Or you could, on the last day of the conference, hang out for free at the midway, the literally underground portion of the event — the book fair with its more than 550 exhibitors’ booths located in the basement of the Hilton. Here a few big-name academic publishers (whose displays featured textbooks about writing for teachers of writing) and venerable publishing houses shared space with many more small presses, small literary magazines, several individuals selling their single works, and reps for MFA programs. The atmosphere, like that of any other carnival, was crowded and noisy, with hawkers pushing their wares and onlookers seeking the lucky chance. Most attendees that I observed followed a similar pattern: upon first arriving, attentive perusal of each table, to be replaced, by the fifth row, by a sort of quick jog down the middle of the aisles.
There were some striking moments. Donovan Hohn, author of Moby-Duck: The True Story of 28,800 Bath Toys Lost at Sea..., delivered one of the best conference presentations that I have ever heard. Derek Alger and his panel of writers talking about memoir writing were funny and frank. Esmeralda Santiago and Jesmyn Ward read and spoke powerfully and beautifully.
The two most interesting people that I met during my time in Chicago were Margaret Atwood, the famous Canadian author who delivered the keynote address, and Cindy, the cab driver who drove me to and from the hotel. “Met,” in the case of Atwood, is a slight exaggeration; along with 139 other devotees, I had won a lottery for the book signing. By the time I approached her at the signing table, she looked so exhausted that I contemplated jumping the velvet guide rope and running away. As the woman waiting next to me on the line said, “My God, do you think we’re killing her?”
Atwood’s speech, listed in the program for an hour-and-a-half slot, ran about 25 minutes. This meant, if I added up the registration fee, the plane fare, the hotel bill, the bar bill, and Cindy’s rides to and from the airport, that I had actually paid about $75.00 per minute to sit in her presence. But it was, after all, Atwood, and it was worth it to see her and to hear her — wryly brilliant as ever — deliver a speech that began with her remarking that when she started writing, there were no organizations like AWP — it was just her, writing and then tearing up drafts and then writing again.
As for Cindy, she’s been driving a cab for 18 years, or nearly all of her adult life. She’s looking, however, to get out of the business, and so she’s going back to school next year. Someday, she told me, she’s going to write about her life as a cab driver. In the meantime, she’s signed up for a community-college program -- in radiology.
Carolyn Foster Segal left a full-time tenured position in Dec. 2011. She currently works as an adjunct at Muhlenberg College and as a book-group facilitator for the Pennsylvania Humanities Council. She has had over 25 other jobs, including waitressing, sitting as an artist’s model, and working on the assembly line in a pickle factory.
The Modern Language Association has now issued its official, authoritative, and precisely calibrated guidelines for citing tweets – a matter left unaddressed in the seventh edition of the MLA Handbook (2009). The blogs have been -- you probably see this one coming -- all a-Twitter. The announcement was unexpected, eliciting comments that range from “this is really exciting to me and i don’t know why” to "holy moly i hate the world read a damn book." (Expressions of an old-school humanistic sensibility are all the more poignant sans punctuation.) Somewhere in between, there’s this: "when academia and the internet collide, i am almost always amused."
Yet the real surprise here is that anyone is surprised. The MLA is getting into the game fairly late. The American Psychological Association has had a format for citing both Twitter and Facebook since 2009. Last summer, the American Medical Association announced its citation style after carefully considering "whether Twitter constituted a standard citable source or was more in the realm of ‘personal communications’ (such as e-mail),” finally deciding that tweets are public discourse rather than private expression.
The AMA Style Insider noted that a standard format for Twitter references should “help avoid citations sounding like a cacophony of Angry Birds.”
How long had the possibility of an MLA citation format been under consideration? Was it a response to MLA members needing and demanding a way to cite tweets in a bibliography, or rather an effort to anticipate future needs? Rosemary Feal, the organization’s executive director, was the obvious person to ask.
"The release of the tweet citation style,” she said by e-mail, “came in response to repeated requests from teachers, students, and scholars (most of them received, perhaps unsurprisingly, over Twitter). We debated the particulars on staff for some weeks. We're certain that the format we've announced is just a first step; user needs will change over time, as will technologies.”
Having exact, authoritatively formulated rules is clearly an urgent, even an anxiety-inducing matter for the MLA’s constituency. “Every time people asked me on Twitter about citing tweets,” Feal said, “I told them MLA style was flexible. Just adapt the format.” And as a matter of fact, the current MLA Handbook does have a format for citing blog entries – which would seem to apply, given that Twitter is a microblog.
“But because people wanted something very specific,” Feal said, “I asked staff to think about it…. Our hope is to remain nimble enough to respond to circumstances as they develop.” In that case, it might be time to start brainstorming how to cite Facebook exchanges, which can certainly be recondite enough, if the right people are involved. At least the Twitter citation format will be part of the eighth edition of the MLA Handbook -- though Feal indicated it would take at least another year to finish it.
Directing scholarly attention to the incessant flow of 140-character Twitter texts can yield far more substantial results than you might imagine, as explained in this column almost two years ago. Often this involves gathering tweets by the thousands and squeezing them hard, via software, to extract raw data, like so much juice from a vat of grapes. Add the yeast of statistical methodology, and it then ferments into the fine wine of an analogy that’s already gone on far too long.
So let’s try that again. Social scientists have ways of charting trends and finding correlations in tweets en masse. Fair enough. But recent work by Kaitlin L. Costello and Jason Priem points in a different direction: toward Twitter’s role in the more narrowly channeled discussions taking place within scholarly networks.
Costello and Priem, who are graduate students in information and library science at the University of North Carolina at Chapel Hill, have been gathering and analyzing information about academics who tweet. Their findings suggest that Twitter has become a distinct and useful -- if exceedingly concentrated -- mode of serious intellectual exchange.
In one study, they examined the departmental web pages at five universities in the United States and Britain, compiling “a list of all the scholars (defined as full-time faculty, postdocs, and doctoral students) at each one, yielding a sample of 8,826.” Through a process of elimination, they were able to generate a pool of 230 scholars with active Twitter accounts. Out of the initial pool, then, they found one scholar in 40 using Twitter – not a lot, although it is certainly an underestimate. Some in the pool were removed because Costello and Priem could not establish a link between faculty listing and Twitter profile beyond any doubt. (In the case of people with extremely common names, they didn’t even try.)
The most striking finding is that the scholars who used Twitter were almost indistinguishable from those who didn’t. Status as faculty or nonfaculty made no difference. Natural scientists, social scientists, and humanists were represented among the Twitterati at rates nearly identical to their share of the non-tweeting academic population. Scholars in the formal sciences (math, logic, comp sci, etc.) proved less likely to use Twitter than their colleagues – though only slightly.
A majority of tweets by academics, about 60 percent, were of a non-scholarly nature. A given tweet by a faculty member was about twice as likely to have some scholarly relevance as one by a nonfaculty person. While the share of traffic devoted to strictly scholarly matters is not enormous, its importance shouldn’t be underestimated – especially since a significant portion of it involves the exchange of links to new publications.
In an earlier study (archived here) Costello and Priem conducted interviews with 28 scholars – seven scientists, seven humanists, and 14 social scientists – as well as harvesting more than 46,000 of their tweets. For each subject, they created a set of the 100 most recent tweets containing links that were still active. (A few didn’t reach the 100 mark, but their data was still useful.)
Six percent of the tweets containing hyperlinks fell into the category of what Priem and Costello call “Twitter citations” of peer-reviewed scholarly articles available online. One of their subjects compared linking to a scholarly article via Twitter to citing it in a classroom or seminar setting: “It’s about pointing people in the direction of things they would find interesting, rather than using it as evidence for something.”
At the same time, tweeting plays a role in disseminating new work in particular: 39 percent of the links were to articles less than a week old -- with 15 percent being to things published the same day.
The researchers divided citation tweets into two categories of roughly equal size: direct links to an article, and links to blog entries or other intermediary pages that discuss an article (usually with a link to it). Not surprisingly, 56 percent of direct links led to open-access sources. About three-quarters of the indirect links went to material behind a paywall. “As long as intermediary webpages provide even an abstract-level description,” write Costello and Priem, "our participants often viewed them as equivalent.”
One scholar told them: “I don’t have time to look at everything. But I trust [the people I follow] and they trust me to contribute to the conversation of what to pay attention to. So yes, Twitter definitely helps filter the literature.” Another said, “It’s like I have a stream of lit review going.”
At this level, Twitter, or rather its users, create a quasi-public arena for the distribution of scholarship – and, to some degree, even for its evaluation. Costello and Priem suggest that harvesting and analyzing these citations could yield “faster, broader, and more nuanced metrics of scholarly communication to supplement traditional citation analysis,” as well as strengthening “real-time article recommendation engines.”
On October 16, one scholar made one of her papers available through the UCL online repository. Two people downloaded it. She tweeted and blogged about it on a Friday, whereupon it was downloaded 140 times in short order; she re-tweeted it on Monday, with the same effect. “I have no idea what happened on the 24th October,” she writes. “Someone must have linked to it? Posted it on a blog? Then there were a further 80 downloads. Then the traditional long tail, then it all goes quiet.”
In all, more than 800 people added the article to their to-read collections in a couple of months – which, for a two-year-old paper called "Digital Curiosities: Resource Creation Via Amateur Digitisation," from the journal Literary and Linguistic Computing, is not bad at all.
That may be another reason why citation formats for Twitter are necessary. One day, and it might be soon, an intellectual historian narrating the development of a theory or argument may have to discuss someone’s extremely influential tweet. Stranger things have happened.
Sometimes I get a little fancy in the final comment of a student paper. Usually my comments are pretty direct: two or three things I like about the paper, two or three things I think need revision, and two or three remarks about style or correctness. But once in a while, out of boredom or inspiration, I grasp for a simile or a metaphor. Recently I found myself writing, "Successfully rebutting counter-arguments is not unlike slaying a hydra.”
I started with great confidence, but suddenly I wasn’t so sure I knew what a hydra is: a multiheaded creature? Yes. But how many heads? And can I use the word generically or do I have to capitalize it? Would “slaying the Hydra” be the correct expression?
Since I have no Internet connectivity at home, never have, and don’t miss it, I grabbed my Webster’s Seventh New Collegiate Dictionary from 1965 — the kind of dictionary you can get for free at the dump or from a curbside box of discarded books — and looked up hydra. On my way to hydra, however, I got hung up on horse, startled by a picture of a horse busily covered with numbers. I knew a horse has a face, a forehead, a mouth. A nose, ears, nostrils, a neck. A mane, hooves, a tail.
Pressed for more parts, I might have guessed that a horse had a lower jaw, a forelock (which I would have described as a tuft of hair between the ears), cheeks, ribs, a breast, haunches, buttocks, knees, a belly.
I don’t think I would have guessed flank, loin, thighs, and shoulders, words I associate with other animals, humans, or cuts of meat. I know I wouldn’t have guessed forearm or elbow.
What I’d thought of as an animal with a head, a mane, a tail, hooves, and a body has 36 separate parts, it seems, all enumerated in a simple design on page 401 of my dictionary. Had I not forgotten the precise definition of a hydra, I might never have learned that a horse also has a poll, withers, a croup, a gaskin, a stifle, fetlocks, coronets, pasterns, and cannons. (The withers are the ridge between a horse’s shoulder bones.)
Hoof is defined and illustrated on the page opposite the horse, an alphabetical coincidence. That picture too caught my eye, now that I was in an equine frame of mind. For the moment, I wanted to learn everything I could about the horse. The unshod hoof, it turns out, has a wall with four parts — the toe, the sidewalls, quarters, and buttresses — a white line, bars, a sole, and a frog, behind which lie the bulbs.
Eventually I returned to my original search. A Hydra with a capital H is a nine-headed monster of Greek mythology whose power lies in its regenerative abilities: if one head is cut off, two will grow in its place unless the wound is cauterized. With a lower case h, the word stands for a multifarious evil that cannot be overcome by a single effort. After all this dictionary work, I’m not sure hydra is the word I want.
I've been thinking about dictionaries lately. The writing center at Smith College, where I work, is transitioning from paper schedules to an online appointment system, and yesterday we spent part of the morning moving furniture around trying to create room for a new computer station dedicated to scheduling. One of my younger colleagues suggested getting rid of the dictionary stand, which, he said, "nobody uses." I bristled. It’s a beautiful thing, the dictionary, an oversize third edition of the American Heritage Dictionary, just a hair over 2,000 pages. For more than a dozen years it’s resided in a cozy nook on a well-lit lectern below a framed poster publicizing the 1994 Annual Katharine Ashen Engel Lecture by Murray Kiteley, then Sophia Smith Professor of Philosophy. The poster was chosen as much for its elegance as for the lecture’s title: "Parts of Speech, Parts of the World: A Match Made in Heaven? Or Just Athens?"
For years I had an office across from the dictionary and never used it myself, preferring the handiness of my taped-up 1958 American College Dictionary by Random House. The American Heritage is too massive. It takes me too long to find a word, and I get easily distracted by illustrations and unusual words. I continue to find my college dictionary completely adequate for my purposes. I’ve never needed a word that I couldn’t find in it.
Another colleague within earshot spoke up for the American Heritage, claiming he used it once in a while. "Maybe," I thought. More likely, he didn’t want to contemplate the loss of the big dictionary while he still mourned the loss of the blue paper schedules. The dictionary stayed: words, that’s what a writing center is about, and the dictionary is where they live.
I cannot remember the last time I saw one of my students using a paper dictionary, much less carrying one around, not even an international student. Have today’s students ever instinctively pulled out a paper dictionary and used it to look up a word or check its spelling? Is a paper dictionary as quaint as a typewriter? Have things changed that much? I wonder. Is it partly my fault? It’s been many years, after all, since I’ve listed "a college dictionary" among the required texts for my writing course.
I doubt my students use dictionaries much, of whatever kind. You have to care about words to reach for the dictionary, and I don’t think they care very much about words. At their age, I probably didn’t either, though I think I did care more about right and wrong. I was embarrassed when I used the wrong word or misspelled a word. I still remember the embarrassment of spelling sophisticated with an f in a college paper, something a modern spell checker doesn’t allow. But it does allow "discreet categories" for "discrete categories," another unforgettably embarrassing error — this one in graduate school!
My students appear cheerfully to accept whatever the spell checker suggests, or whatever word sounds like the one they want, especially if they’re in roughly the same semantic domain. They are positively proud to confess that they’re bad spellers — who among them isn’t? — and really don’t seem to care much that they have used the wrong word. Words don’t appear to be things you choose anymore. They’re things that pop up: in autocorrect, in spell checkers, in synonym menus. They are not things you ponder; they are things you click, or worse, things your laptop clicks for you.
When I meet with a student about her paper, we always work with a paper copy. Even so, more often than not I still have to remind her to take a pencil so she can annotate her draft as we discuss it. Toward the end of our meetings, we talk about word choice and the exchange often goes like this:
"Is this the word you want?"
"I think so."
"I think here you might have meant to say blah."
"Oh, yeah, that’s right" and out comes the pencil — scratch this, scribble that, lest it affect her final grade. No consideration, no embarrassment. I used to pull out the dictionary "to inculcate good habits," but no more. In the presence of today’s students, pulling out a dictionary feels as remote as pulling out a typewriter or playing a record.
Sometimes the situation is not so clear-cut. The student might, for example, write a word like security in a context where it makes a bit of sense, but after some gentle prodding and, yes, a few pointed suggestions, she might decide that what she really means is privacy. Out comes the pencil again. Scratch "security," scribble "privacy." What she really means, I think, is safety, but I let it go. If I push too hard, she’ll stop thinking I'm being helpful and begin to think I have a problem: "What a nitpicker! The man’s obsessed with words!" I imagine her complaining to her friends. "But it matters! It matters!" goes the imaginary dialogue. "What precisely were the opponents of the ERA arguing, that it would violate security, invade privacy, or threaten safety?"
I have used the online Webster's on occasion, of course, and recognize the advantages of online dictionaries: They can be kept up-to-date more easily, they can give us access to more words than a standard portable dictionary, they can be accessed anywhere at any time, they take up no shelf space, etc. I'm not prejudiced against online reference tools. In fact, unlike many of my colleagues, I'm a great fan of online encyclopedias and a lover of Wikipedia. Online dictionaries leave me cold, though. They should fill me with awe the way Wikipedia sometimes does, but they don't. I marvel at the invention of the dictionary every time I look up a word in my paper copy; at the brilliant evolutionary step of such a book; at the effort of generations of scholars, professionals and lay people that led to such a comprehensive compendium of words; at how much information — and not just word meanings — it puts at my fingertips; at how much I still have to learn; and at how much my education could still be enhanced if I read my college dictionary cover to cover.
I think of The Autobiography of Malcolm X, in which the author makes a powerful statement about the dictionary as a pedagogical tool. Frustrated with his inarticulateness in writing while in prison and his inability to take charge of a conversation like his fellow inmate Bimbi, Malcolm X came to the conclusion that what he needed was "to get hold of a dictionary — to study, to learn some words." The experience was a revelation: "I’d never realized so many words existed!" He started at the beginning and read on, learning not just words but also history — about people, places, and events. "Actually the dictionary is like a miniature encyclopedia," he noted. The dictionary was the start of his "homemade education."
Online all I get is a quick definition of the word I want, and I’m done. On paper I get the definition plus something akin to a small education along the way. The experience is not unlike that of slaying the Hydra: For every word I look up, I see two others whose meaning I don’t know. If I were Hercules I could put an end to the battle once and for all, but I’m not, and glad I’m not. The battle is far too delicious. But how to convince my students?
Julio Alves is the director of the Jacobson Center for Writing, Teaching and Learning at Smith College.