A familiar story about the modern university goes something like this: Once upon a time, the freshman arrived already knowing at least the basic mechanics of writing: what a paragraph was, how punctuation marks worked, the existence of nouns and verbs (and the obligation that they agree in a given sentence), that sort of thing. But the expansion of higher education throughout the 20th century, and especially in its second half, meant that a steadily growing portion of the student body needed basic training in such things.
The job naturally fell to professors of English, even though composition stood in relation to the study of literature roughly as long division did to algebraic topology, over in the math department. Still, it was necessary. Teaching this basic (even remedial) course helped justify offering the more advanced sections in literature. As the demand for writing instruction grew, it ceased being one task among others that the English faculty performed. It became a function planned and administered separately from the courses on literature, and sometimes it even broke off from the department entirely, to do its own thing.
And that is why there is now a writing center on campus, probably in a basement somewhere, largely staffed by graduate students. There are faculty who specialize in composition studies, every single one of whom remembers that Cary Nelson, the president of the American Association of University Professors, once called them “comp droids” pursuing an activity devoid of any real intellectual content. That was more than 10 years ago, and in the original context it was a critique of literary scholars' attitudes. But the comp people still quote it sometimes, with bitterness, as if they had made the slogan of the militant online group Anonymous their own: “We are legion. We do not forgive. We do not forget.”
There are various problems with this narrative, including the fact that happy compositionists do exist. (I have met them.) But the most important is the fable about the golden age when secondary education produced literate students, so that the English faculty could keep its attention focused on higher things. In The Managerial Unconscious in the History of Composition Studies, published by Southern Illinois University Press, Donna Strickland quotes various exasperated statements issuing from Harvard University in the 1890s. Professors were obliged to instruct “bearded men [in] the rudiments of their native tongue,” so that “a large corps of teachers have to be engaged and paid from the College treasury to do that which should have been done before the student presented himself for admission.”
Teaching composition was the manual labor of the mind: “In quantity,” said a committee appointed by the Harvard Board of Overseers in its report from 1892, “this work is calculated to excite dismay; while the performance of it involves not only unremitted industry, but mental drudgery of the most exhausting nature.” And keep in mind that the students in question -- the raw material to be processed in the sweatshops of the Harvard English department -- were typically the product of prep schools, in an era when the only distracting form of electronic communication was the telephone.
’Twas ever thus, in other words. But demolishing the belief that basic writing instruction at the college level reflects some recent dysfunction in secondary education (especially public schools) is a fairly minor element in Strickland’s argument.
The author, an assistant professor of English at the University of Missouri at Columbia, reconfigures the history of composition studies, rejecting the commonplace view that the field took shape on the margins of another discipline -- a humble (but all too necessary) pedagogical supplement to literary studies. The Managerial Unconscious is a remarkably compact book, its points made with great concentration. Reading it more than once seems like a good idea. Here is a brief survey, offered with all due trepidation by someone who has been through it just once.
The title might be a good way in. It alludes to Fredric Jameson’s The Political Unconscious (1981), which offered “always historicize” as a slogan for cultural analysis.
Strickland follows this injunction by stressing an important thing about the emergence of university-level composition courses in the U.S. at the close of the 19th century: it coincided with a rapidly growing market for white-collar labor. As companies expanded, their internal structures became more complex. Mechanization and the division of labor in manufacture increased productivity, but coordinating manufacture and distribution required new layers of managerial staff, able to turn out reports, memos, press releases, and the like.
The real issue was not how prepared those bearded men at university were to compose a theme on Keats, but whether they could write coherently at all. To occupy a slot in the corporate chain of command, they had to be able to put pen intelligibly to paper. One member of the Harvard committee that scrutinized undergraduate writing in the 1890s was the chairman of the Massachusetts Railroad Commission -- an early expert on what would soon be called systematic or scientific approaches to management. The committee’s stress on the drudgery involved in handling thousands of student compositions per semester echoes the managerial theme that work can and should be organized for greater efficiency.
“Whether the ideas of systematic management were employed consciously or unconsciously,” writes Strickland, “articulating the correct divisions of labor in the teaching of English was clearly the burden of the committee’s report.” To produce enough skilled labor to manage American business, the university itself needed to retool.
So by Strickland’s account, English professors did not shove composition out of literary studies like an unwelcome stepchild. Another dynamic was at work. Writing instruction became a discipline in Foucault’s sense – a way of inculcating both skills and the capacity to perform in a corporate workplace. Can the student rework a paper to the prof’s satisfaction, as a mid-management person might be called upon to revise a handbook? “Writing programs,” says Strickland, “… were made possible not by the devaluing of student writing in the university but by its central function in an institution that depended on writing as a tool for surveillance and assessment.”
The quest for managerial efficiency just happened to reinforce other power relationships: “The teaching of required writing [became], in the process of being divided from the English department in the name of efficiency, sometimes an entry-level position, more frequently in recent decades a position completely outside the tenure track. Because more stable, better paying faculty positions tended to be awarded to men, women often had little choice but to take on low-paying instructorships in composition.”
And by the last third of the 20th century, the consolidation of composition studies as a distinct field (with its own journals, graduate programs, academic organizations, and book series) had an odd effect. In keeping with Strickland’s title, the specialty behaves like one of Freud’s patients -- running away from “the managerial unconscious,” only to find it returning, just ahead. Comp studies established itself as an intellectual discipline. But one career track in it leads to supervising the labor of adjuncts and graduate students, preparing a syllabus that others will follow, and trying to keep the writing center’s costs down and statistics up. Still, thinking of the field as having a managerial component meets resistance, given the “negative connotations for traditional humanist intellectuals,” Strickland writes, “who have tended over the decades to distrust management as, at best, nonintellectual and, at worst, soul-murdering.” Management is where you land after doing a really good job at Pizza Hut for a couple of years.
But if the shoe fits.... "Once organizations of any kind are organized hierarchically," writes Strickland, "with a class of experts structuring and overseeing the work of a group of nonexperts, management happens. Professionalism calls for control and systematization of knowledge, and management is the group of people who reinforce that." Much of the book is devoted to how the evasion of its managerial function has played itself out over the years, even after the Council of Writing Program Administrators was established in the late 1970s. Strickland’s tone is never harsh. But when she writes that “almost from the beginning of the organization, the WPA discourse showed an aversion toward so-called managerial tasks,” somebody’s ox is being gored.
Strickland's argument implies consequences – but only implies them. Greater lucidity about the managerial legacy of composition studies is, in her view, the prerequisite for creating better working conditions; she also suggests that it will help make writing instruction a way to develop students' critical intelligence. But just how any of that will happen is left unaddressed. The Managerial Unconscious feels like the first volume of something, rather than the last word. If its implications are hard to read, that is because they remain to be drafted.
The University of North Carolina at Chapel Hill, like many other colleges, sponsors a "Summer Reading Program" for incoming students. Participating students all read the same book and, in the days before classes begin, meet with faculty members to discuss it in small seminars. Each year the university asks for recommendations, and each year I’ve suggested a book. Actually, I’ve suggested the same book every year: Tolstoy’s late novella, Hadji Murat. Four years on, the choice looks ludicrous, but the first year I suggested it, some combination of arrogance and naïveté convinced me it would be picked. The book was supposed to be "intellectually stimulating," "enjoyable," capable of provoking "interesting discussion," and "appropriate for the level of incoming students," and to "address a theme applicable to the students themselves." In my submission proposal, I don't think I even made a case for the first four. Intellectually stimulating? It's by Tolstoy. Enjoyable? Capable of provoking interesting discussion? No problems there. Appropriate to the level of incoming students? I took this as code for "not too long," and, in my experience, Hadji Murat is about the briskest hundred-or-so pages I've ever read.
But what about the last criterion – "a theme applicable to the students themselves"? That, too, was easily met. Hadji Murat is, after all, about a morally suspect empire’s attempt to suppress a guerrilla campaign waged by besieged Muslim Chechens. This alignment of forces was eerily contemporary. It was especially so in 2007-8, a year the fathers and mothers of some students might have spent waging a counterinsurgency campaign against a Muslim enemy in a land not so distant from the Caucasus – Iraq. And if not their parents, then perhaps their high-school peers, particularly in a state with a rich military tradition like North Carolina. Hadji Murat is about dying for and against empire. That seemed "applicable" to the students themselves, who could participate in a summer reading program because someone else was waging a counterinsurgency campaign on their behalf.
But I still had an ace in the hole. I had heard enough complaining from colleagues about how poorly students fared with fiction, and I had witnessed quite a lot of it myself. I figured that the selection committee must have experienced much the same, and would fall over itself to find a novel that fit the bill. Here was a chance to teach fresh, unformed minds about fiction’s difficult riches. I honestly thought the book was a lock.
It wasn’t picked. The winner that year was Covering: The Hidden Assault on Our Civil Rights, by Kenji Yoshino. I was disappointed – not because my selection lost (I’m not that competitive), but because this just didn’t seem like a very good choice. The selection committee wrote that "Kenji Yoshino’s book forces readers to confront important issues relating to what we mean by equality and social justice, important themes indeed during a time when many mistakenly believe we live in the 'post-civil rights era.' It is both rigorously put and beautifully rendered. This book offers an excellent introduction to what rigorous critical inquiry is like at the university level. And the central topics treated – identity and self-expression – are central to most 18- and 19-year-olds."
I am sure that this is all true, but I couldn’t help feeling that the students had been done a disservice in being asked to read a work of nonfiction. Nonfiction was what they would be reading for the next four years and, though the students needed to learn about "rigorous critical inquiry" at the university level, all indications were that they needed far more to learn about the rigorous critical inquiry of fiction. Yoshino’s argument, I have no doubt, was subtle. But protecting civil rights is not an argument many students would find either surprising or objectionable. So for all the rhetoric about "confronting important issues," I wonder if a book like this doesn’t confront them in too cozy a manner. I cannot say for sure. But should it really be the first book an undergraduate meets? Just look at the description above, which notes that its central topics are "identity" and "self-expression." The selection committee was right – those are central to most 18-year-olds. So why give them exactly what they already know, in exactly the nonfictional form with which they are most familiar?
That question has nothing to do with Yoshino’s politics, about which I know nothing for certain, though I suspect they’re vaguely liberal. And this is where last year’s report from the National Association of Scholars, which also took issue with summer-reading programs, misses the mark. The report concluded, disapprovingly, that the "preponderance of reading assignments promotes liberal social causes and liberal sensibilities." Only 3 books of the 180 surveyed promoted a "conservative sensibility" and none promoted "conservative political causes." There is something methodologically dodgy about this sort of accounting, and I actually suspect the balance is much more equitable than this lets on. But it only occludes a more serious issue. The NAS’s challenge could be met, I take it, by substituting a conservative defense of civil rights (or limited government or what have you). But why would this be any better, since students would still be denied the intellectual and affective exercise that comes with clambering around the rocky terrain of dense, difficult, and distant fiction? The NAS has no interest, so far as I can tell, in fiction as such. (Or just not a very good eye for fiction – it described The Adventures of Huckleberry Finn as a "not very challenging text," which makes me wish the NAS would reread Huck Finn next summer.)
Enough colleges and universities now run a first-year reading program that Princeton University Press has a section of its catalog specifically dedicated to books that fit the bill. I probably should have looked at it before suggesting Hadji Murat. Had I done so, I would have known it never stood a chance.
Here are four representative selections:
Diane Coyle’s The Soulful Science: What Economists Really Do and Why It Matters; George Akerlof and Robert Shiller’s Animal Spirits: How Human Psychology Drives the Economy, and Why It Matters for Global Capitalism; Peter Leeson’s The Invisible Hook: The Hidden Economics of Pirates; Joel Waldfogel’s Scroogenomics: Why You Shouldn’t Buy Presents; Lawrence Weinstein and John Adam’s Guesstimation: Solving the World’s Problems on the Back of a Cocktail Napkin.
Also included are books on the Tea Party, UFOs, and an edited volume of Lincoln’s writings on race and slavery. There is not a single novel. One novelist, Amos Oz, is represented, but instead of a novel, the catalog suggests his essays on the Israeli-Palestinian conflict, How to Cure a Fanatic.
The books form a distinct but difficult-to-define family. Begin with the titles. Many of them have that format that publishers are keen on nowadays – a made-up word (Scroogenomics, Superfreakonomics, Guesstimation) or a pun (Souled Out, Cop in the Hood), followed by a colon and, depending on the title, something appropriately sober or comically gee-whiz. More importantly, they are almost all nonfiction, are mostly concerned with contemporary American political and economic life, and, suitably compressed, would not be out of place on the opinion pages of the Sunday New York Times. Most earned warm reviews, and their authors are responsible for some wonderful writing (and some Nobel Prizes). But there remains something distastefully topical about them all: the Tea Party, piracy, the religious right, how soulful economists really are. (For all the NAS’s griping about the absence of "conservative" titles, it might take some comfort in the fact that the "market" is warmly represented.) And, again, not a single work of fiction on the list. Princeton is not primarily a publisher of fictional titles, but, in its extensive back catalog, could it not find one bit of fiction worth suggesting?
I do not know how many universities have turned to this list to stock their first-year reading programs. But a quick search shows what universities have turned to this past year: Elizabeth Kolbert’s Field Notes From a Catastrophe: Man, Nature and Climate Change; William Kamkwamba and Bryan Mealer’s The Boy Who Harnessed the Wind: Creating Currents of Electricity and Hope; Colin Beavan’s No Impact Man; Steven Levitt and Stephen Dubner’s Superfreakonomics; Reichen Lehmkuhl’s Here’s What We’ll Say: Growing Up, Coming Out, and the U.S. Air Force; and Moustafa Bayoumi’s How Does It Feel To Be a Problem: Being Young and Arab in America. This reads like a somewhat less distinguished version of the Princeton list. Again, topicality is the order of the day – climate change, alternative energy, economics-as-the-solution-to-everything. Close behind are stories of identity and self-expression, topics so dear to 18-year-olds.
All of which is difficult to say without seeming to cast aspersions on the quality of these books. That could not be further from my mind. I haven’t read most of them (though my own garden-variety left-wing blog reading makes me think I’ve got the gist of just about every one). I’m sure some are excellent.
But do incoming students really need to be told that the world is facing a climatic Armageddon? Probably not. Do they need to be told that economics is the master discipline, ready to solve the world’s problems? Probably not. Do they need to be told that it is difficult to be gay and that gay rights should be protected? Again, probably not. Of course some students need to hear all of this – the last especially – and I hope that they leave college less bigoted and more humane than they arrived.
But I hope more ardently that they don’t leave thinking that these topics are at the center of a collegiate education. A collegiate education can, of course, be taken up with thoughts of the timely – climate change, the Tea Party, financial markets, and piracy. But there would be something defective about that, precisely because it ignores the way an education must be about the disinterested pursuit of the permanently untimely. And that is what these books, and these first-year reading programs, miss so egregiously. College becomes a kind of intensified continuation of blog- or opinion-page reading. Worse, it becomes training for a life in thrall to the market. (A point forcefully made by Martha Nussbaum’s Not For Profit: Why Democracy Needs the Humanities. It is a cruel irony that her book is also included in Princeton’s first-year catalog.) How else to understand the preponderance of glowing titles about economics? The idea that something utterly, irremediably foreign ought to confront the students seems nowhere in sight. That is part of the reason why I suggested Hadji Murat, and part of the reason, I think, that students have such difficulty with imaginative literature. They simply do not confront much that they have not already encountered.
I didn’t sign up to lead a seminar on Yoshino’s book. My participation in the university’s first-year reading program was limited to my hopeless suggestion of Hadji Murat, my version of a write-in vote for Nader. And I had informally vowed not to participate until the committee selected a novel. I had given up on Hadji Murat. (Maybe it will be selected to commemorate the 25th anniversary of the Sunni Awakening). But then, this year, the committee selected Jonathan Safran Foer’s Eating Animals. It, like Yoshino’s book, violates the strictures that had kept me from participating: nonfiction, topical, and, suitably compressed, something that wouldn’t be out of place in the Atlantic. But I happened to be college friends with Safran Foer, and the bonds of friendship were sufficiently strong that I got my copy of the book, and signed up to lead a seminar.
In the course of reading Eating Animals I reflected upon my earlier reluctance to participate in the program. And I began to doubt that it was as well-grounded as I thought. I am a vegetarian, and I find Safran Foer’s book – despite his disavowals – a resoundingly clear case against eating meat. And I was secretly quite thrilled that a few thousand students had been invited by the university to read this book – secretly thrilled, that is, that a few thousand students would be confronted with a powerful case for stopping the slaughter of billions of animals. “Would it have been better if they were reading Hadji Murat?,” I asked myself. I wasn’t so sure. And, during the positively gut-wrenching pages of Safran Foer’s book, pages filled with unspeakable acts of cruelty, I was absolutely certain that I couldn’t care less if no one ever read Hadji Murat again. Twenty vegetarians – or, really, five – were better than two converts to Tolstoy. If I wanted to stop the slaughter of animals for food – a slaughter that Safran Foer shows is more gruesome than I ever imagined – what could be better?
And I thought differently about all those other books I castigated for their overly parochial concerns – with climate change, the Tea Party, civil rights, or gay identity. As strongly as I felt about the slaughter of animals, weren’t these books nominated, and selected, because their supporters believed just as fervently in their causes? I’m certain they were. I was able to see, in a way I had not before, that these books were selected because someone thought the threats to our civil rights, to the environment, or from the Tea Party were as urgent as I found the threat to animal life. My own parochialism had been shown up, and suitably shamed. I was sorry for my arrogance and ignorance.
As it happens, Eating Animals seems not to have made much of an impact on the students. None of the 20 who came to the seminar became a vegetarian. The best I got were vague professions about more ethical eating. "I’ll only eat free-range," said one student. "I’m only eating chicken from now on," said another. These sound like good outcomes, consistent with the aims of a program designed to get students to "think more deeply" about the topic at hand. But they are incoherent things to say after reading Safran Foer’s book, which memorably demolishes the meaningless moniker "free range." And if there were one animal you would not want to eat after reading the book, it is the chicken. That chicken’s miserably short life was spent in a warehouse with thousands of equally miserable, equally doomed birds. It moved only a few feet in the entirety of its unnatural life, and then only to escape the aggression of its crazed neighbors. Its last moments were spent in a slaughterhouse, caked in excrement and dirt, where its neck was severed by a machine of truly medieval cruelty. How had they missed those pages?
I've spent the last few weeks trying to understand why so many students got the book so wrong. There were all sorts of good reasons. Maybe they read it in the beginning of the summer and had forgotten the details. Or maybe they just didn’t read it. Either of which would have been understandable and, for first-year students who spent the summer on the beach, forgivable. But with one or two exceptions, all professed to have read the book. They just had no idea what Safran Foer had written.
And so the more I thought about the conversation the more difficult it became to resist the conclusion that students simply aren’t very good readers. This is not news. Becoming a good reader, after all, is one of the things that happens in college, not before it. But until this seminar, I had assumed that their difficulties were limited to fiction. I assumed that they all knew how to follow the argument of a piece of nonfiction, especially one as linear as Safran Foer’s. There are some subtle arguments in the book, but the overriding claim simply cannot be missed. And miss it is just what these students had done.
I was disappointed that no one had become a vegetarian – just as, I suppose, the recommenders of Yoshino’s book would have been disappointed had someone left the discussion convinced that our civil rights were safe. But I was more disappointed for the students, that they were so inexperienced in the ways of reading that they were lost even in a book like this. I tried to see their inexperience as a reason for excitement. Just think, I tried to tell myself, how much room they have to grow as readers. But I didn’t really believe this. Sure, they would become marginally better readers. But the kind of reader I wanted them to become – the kind of reader I myself want to be – is the kind of reader one becomes only after years of reading fiction.
Others have made the case for fiction more persuasively than I ever could. And if these first-year reading programs contained a smattering of fiction and nonfiction, there wouldn’t be much reason to gripe. Fiction some years, nonfiction others is a good enough outcome. But the paucity of fiction in these programs is stunning. My university has chosen fiction once (Jhumpa Lahiri’s The Namesake) in 12 years, a ratio that seems fairly typical of colleges nationwide. (It also bravely selected Approaching the Quran in 2003. Though obviously not a work of fiction in the sense I have in mind, it poses interpretive challenges not unlike the best imaginative literature). I was heartened to see Knopf issue its own catalog of first-year titles. Although it mostly resembles Princeton’s in its emphasis on economics and gee-whiz science, it includes nearly 20 novels, some of them, like Joseph O’Neill’s Netherland, supremely beautiful and difficult. Sadly, I couldn’t find a college that chose it.
Tolstoy makes a brief appearance in Safran Foer’s book, which has a laugh at the Russian’s fatuous suggestion that the end of slaughterhouses would mean the end of war. This just goes to show that reading fiction doesn’t inoculate you against bad ideas. It doesn’t make you a more moral person either, many "save-the-humanities" cases notwithstanding. But reading fiction makes you a better reader. Somehow, administrators of these programs seem to have lost sight of that. And so the next time some college thinks about selecting Safran Foer, I’d ask them to think about the famous Russian vegetarian instead. I’ll be sorry to take away the sales from my friend, and sorry that his book won’t be read. But I’m sorrier to see his book read poorly, and, as the author of some wonderfully imaginative fiction himself, perhaps he won’t mind losing to Tolstoy.
Brendan Boyle teaches in the classics department at the University of North Carolina at Chapel Hill.
We hear these days of the "crisis of the humanities." Majors, jobs, and student interest in these subjects are all declining. The Boston Globe offered one report on the worries of the humanities in an article last year about the new Mandel Center at Brandeis University. The Globe asserted, "At college campuses around the world, the humanities are hurting. Students are flocking to majors more closely linked to their career ambitions. Grant money and philanthropy are flowing to the sciences. And university presidents are worried about the future of subjects once at the heart of a liberal arts education."
Such gloom must be placed in context. Doubts about the humanities have been around at least since Aristophanes wrote The Clouds. The playwright claimed that if a man engaged in the "new" Socratic form of teaching and questioning, he could wind up with big genitals (apparently seen as a negative side effect) due to a loss of self-control. But the Socratic humanities survived, in spite of the execution of their founder, through the schools of his intellectual son and grandson -- the Academy of Plato and the Lyceum of Aristotle.
I don't think that the humanities are really in a crisis, though perhaps they have a chronic illness. Bachelor's degrees in the humanities have held relatively steady since 1994 at roughly 12-13 percent of all majors. Such figures demonstrate that the health of the humanities is not robust, as measured in terms of student preferences. In contrast, the number of undergraduate business majors is steadily increasing.
So what has been the response of university and college leaders to the ill health of the humanities?
It has been to declare to applicants, students, faculty, and the public that these subjects are important. It has included more investments in the humanities, from new buildings like the Mandel Center at Brandeis, to, in some cases, hiring more faculty and publicizing the humanities energetically. Dartmouth College's president, Jim Yong Kim, recently offered the hortatory remark that "Literature and the arts should not only be for kids who go to cotillion balls to make polite conversation at parties."
I couldn't agree more with the idea that the humanities are important. But this type of approach is what I call the "eat it, it's good for you" response to the curricular doldrums of humanities. That never worked with my children when it came to eating broccoli and it is even less likely to help increase humanities enrollments nationally today.
The dual-horned dilemma of higher education is the erosion of the number of majors in the humanities on the one hand and the long-feared "closing of the American mind" on the other, produced in part by the growing number of students taking what some regard as easy business majors. Yet these problems can only be solved by harnessing the power of culture, by understanding the ethno-axiological soup from which curriculums evolve and find their sustenance. Jerome Bruner has long urged educators to connect with culture, to recognize that the environment in which we operate is a value-laden behemoth whose course changes usually consume decades, a creature that won't be ignored.
It is also vital that we of the humanities not overplay our hands and claim for ourselves a uniqueness that we do not have. For example, it has become nearly a truism to say that the humanities teach "critical thinking skills." This is often correct of humanities instruction (though certainly not universally so). But critical thinking is unique neither to the humanities nor to the arts and sciences more generally. A good business education, for example, teaches critical thinking in management, marketing, accounting, finance, and other courses. More realistically and humbly, what we can say is that the humanities and sciences provide complementary contexts for reasoning and cultural knowledge that are crucial to functioning at a high level in the enveloping society.
Thus, admitting that critical thinking can also be developed in professional schools, we realize that it is enhanced and further developed when the thinker learns to develop analytical skills in history, different languages, philosophy, mathematics, and other contexts. The humanities offer a distinct set of problems that hone thinking skills, even if they are not the only critical thinking game in town. At my institution, Bentley University, and other institutions where most students major in professional fields, for example, English develops vocabulary and clarity of expression while, say, marketing builds on and contributes to these. Science requires empirical verification and consideration of alternatives. Accountancy builds on and contributes to these. Science and English make better business students as business courses improve thinking in the humanities and sciences.
If, like me, you believe that the humanities do have problems to solve, I hope you agree that they are not going to be solved by lamenting the change in culture and exhorting folks to get back on course. That's like holding your finger up to stop a tidal wave. Thinking like this could mean that new buildings dedicated to the humanities will wind up as mausoleums for the mighty dead rather than as centers of engagement with modern culture and the building of futures in contemporary society.
So what is there to do? How do we harness the power of culture to revive and heal the influence of the humanities on future generations? Remember, Popeye didn't eat his spinach only because it was good for him. He ate his spinach because he believed that it was a vital part of his ability to defend himself from the dangers and vicissitudes of life, personified in Bluto. And because he believed that it would give him a good life, represented by Olive Oyl.
Recently, an alumnus of Bentley told me over dinner, "You need business skills to get a job at our firm. But you need the arts and sciences to advance." Now, that is the kind of skyhook that the friends of the humanities need in order to strengthen their numbers, perception, and impact.
While I was considering the offer to come to Bentley as its next dean of arts and sciences, Brown University and another institution were considering me for professorial positions. Although I felt honored, I did not want to polish my own lamp when I felt that much in the humanities and elsewhere in higher education risks becoming a Ponzi scheme, which Wikipedia defines accurately as an "...operation that pays returns to separate investors, not from any actual profit earned by the organization, but from their own money or money paid by subsequent investors."
I wanted to make my small contribution to solving this problem, so I withdrew from consideration for these appointments to become an administrator and face the issue on the front line. And Bentley sounded like exactly the place to be, based on its pioneering efforts to integrate the humanities and sciences into professional education -- such as our innovative liberal studies major, in which business majors complete a series of courses, reflections, and a capstone project emerging from their individual integration of humanities, sciences, and business.
Programs that take in students without proper concern for their future or provision for post-graduate opportunities -- how they can use what they have learned in meaningful work -- need to think about the ethics of their situation. Students no longer come mainly from the leisured classes that were prominent at the founding of higher education. Today they need to find gainful employment in which to apply all the substantive things they learn in college. Majors that give no thought to that small detail seem to assume that since the humanities are good for you, the financial commitment and apprenticeship between student and teacher are fully justified. But in these cases, the numbers of students arguably benefit the faculty and particular programs more than they benefit the students themselves. This is a Ponzi scheme. Q.E.D.
The cultural zeitgeist requires of education that it be intellectually well-balanced and focused but also useful. Providing all of these and more is not the commercialization of higher education. Rather, the combination of professional education and the humanities and sciences is an opportunity to at once (re-)engage students in the humanities and to realize Dewey's pragmatic goal of transforming education by coupling concrete objectives with abstract ideas, general knowledge, and theory.
I have labeled this call for a closer connection between the humanities and professional education the "Crucial Educational Fusion." Others have recognized this need, as examples in the new Carnegie Foundation for the Advancement of Teaching book Rethinking Undergraduate Business Education: Liberal Learning for the Profession illustrate. This crucial educational fusion is one solution to the lethargy of the humanities -- breaking down academic silos, building the humanities into professional curriculums, and creating a need for the humanities. Enhancing their flavor like cheese on broccoli.
Daniel L. Everett
Daniel L. Everett is dean of arts and sciences at Bentley University.
I’ll play Marc Antony. I have not come to praise large conferences, but to bury them. It is my opinion that mega humanities conferences are way past their sell-by date. For senior faculty the only reason to go is to schmooze with old friends; for junior faculty they are an onerous duty, and for graduate students they are a rip-off for which professional organizations ought to be collectively ashamed.
First codicil: I speak exclusively of humanities conferences, as they are the only ones I know firsthand. Friends in computing and the sciences tell me that collaborative efforts arise from their conferences. I’m willing to believe them. Maybe it’s a cultural thing. Most humanities people find it so hard to collaborate that their wills stipulate that their notes go with them to the grave.
Second codicil: I have only myself to blame for recent travails. I didn't need to go to my unnamed conference, but I got it into my head that it would be fun. I was wrong. It serves me right for violating my principles.
Five years ago I concluded that humanities conferences were out of touch with the times and vowed to attend only smaller regional meetings with less cachet, but more satisfaction. But I didn't listen to me. Instead I spent four days and a considerable wad of cash jostling among a throng of over three thousand. I returned home feeling more akin to Ponce de Leon, who sought the Fountain of Youth and found mostly dismal swampland. Sound harsh? See if any of these observations resonate with your own.
Problem One: Outmoded Presentations
We live in the communications age, but the memo apparently never circulated among those studying the liberal arts. For reasons arcane and mysterious, humanities scholars still read papers. That’s tedious enough at a small conference where one might attend six three-paper presentations. At my recent conference, sessions commenced at 8 a.m. and ran past 10 p.m. One could have conceivably attended 30 sessions and heard 90 or more papers, though the only ones with the stamina to attend more than six or seven sessions were either posturing or desperate.
I wanted my four-day sojourn to introduce me to new ideas, concepts, and teaching modules, but the reality of such a grueling schedule is that I was running on fumes by the end of day one. It would have helped if presenters took advantage of new technology, but things seldom got flashier than PowerPoint, a program that, alas, seems to encourage more reading. Let me reiterate something I've said for years: the death penalty should apply to those who read anything from a PowerPoint slide other than a direct quote. It's an academic conference, for crying out loud; assume your audience is reasonably proficient at reading! Seriously, does anyone need to fly across the country to listen to a paper? Why not do as science conferences have done for years: post papers online and gather to have a serious discussion of those papers?
The mind-numbing tedium of being read to for four days is exacerbated by the fact that many humanities scholars have little idea about the differences between hearing and reading. If you construct a paper that’s so highly nuanced that understanding it rests upon subtle turns of phrase or complicated linguistic shifts, do not look up from your paper with a wan smile indicating you are enamored of your own cleverness; go back to your room and rewrite the damn thing. Audience, clarity, and coherence are pretty much the Big Three for speech and composition, unless one's audience is the International Mindreaders' Society. By the way, is there something wrong with using a map, providing a chart, or summarizing a work that few in the room are likely to have read? And do bother to tell me why your paper matters.
I actually heard several very exciting papers, but most of the offerings were dreadful. Note to young scholars: stop relying on the Internet and check out journals that predate 1995 before you proclaim a “discovery.” And if you really want to stand out, work on your shtick. Guess which papers I remember? Yep -- those in which the presenter did more than read to me.
Critical note to young scholars: Want to turn off everyone in the room? Be the person who doesn’t think that the 20-minute limit applies to you. Nothing says "non-collegial" more clearly.
Problem Two: Expense
Another reason to rethink conferences is that they cost an arm and a leg to attend. I had partial funding from my university because I was presenting -- and no, I bloody well did not read my paper -- but I was still out of pocket for quite a hunk of cash. If you attend a humanities conference and want to stay anywhere near the actual site of the event, plan on $150 per night for lodging in a soulless franchise hotel with windowless conference rooms and quirky technology, $20 per day for Internet access, another $200 for conference fees, roughly $500 for airfare, at least $50 for taxis to and from the airport -- almost no U.S. city has a convenient shuttle service anymore -- and money for whatever you plan on eating.
Budget plenty for the latter if your conference is in what is glibly called a Destination City. That’s shorthand for a theme area marketing itself as unique, though it’s actually a slice of Generica surrounded by shops and restaurants identical to those found in suburban malls in every way except one: captive audiences equal higher prices. (One small example: the Starbucks inside the pedestrian precinct at my hotel charged a buck more per cup than the one on the street 100 yards away.) Do the math and you can see that you can easily drop a few grand on a megaconference. (That’s what some adjuncts are paid per course!)
An immediate cost-saving adjustment would be to confine conferences to airline hub cities such as New York, Chicago, Los Angeles, Atlanta, and Houston. The moment the conference locates to a (not my term) "second-tier" city, allot another few hundred dollars for "connecting flights," a term used by the airline industry because it sounds nicer than saying you’ll spend six hours waiting in a hub, after you’ve sprinted through the airport like Usain Bolt for your next flight, found the gate closed, and retreated to the rebooking counter.
Problem Three: Victimized Grad Students
I'm a parsimonious Scot who resents spending money on boring hotels and lousy food, but I can afford it when I have to. Grad students can’t. A major way in which megaconferences have changed in the past several decades is that there’s considerably less balance between senior scholars, junior colleagues, and graduate students. (Senior scholars used to accompany the latter two in a mentor capacity.) Now there is just a smattering of senior and junior scholars, and they’re often holed up in hotel suites conducting interviews. Whenever they can, search committee members flee the conference and rendezvous with old friends. They might attend a session or two. Unless they have to be there, there aren’t many junior colleagues in attendance at all because they're busy getting material into publication and they can meet presentation expectations at cheaper regional meetings, or save their dough and go to prestigious (-sounding) international gatherings.
So who’s left? Graduate students. Lots of graduate students. So many that conservationists would recommend culling the herd if they were wild mustangs. Grad students have always gone to conferences in hopes of making their mark, attracting attention, and meeting people who can help them advance. That was the way it was done -- 20 years ago. Now network opportunities are slimmer. Whom do they meet? Mostly other grad students, often those massed outside of interview rooms.
Of all the antiquated things about large conferences, the "cattle call" interview is the most perverse. These were barbaric back in the days in which there were jobs; now they’re simply cruel. At least a third of attendees at my conference were grad students from a single discipline: English. As has been discussed many times on this site, most of them shouldn't be in grad school in the first place. How many of the thousand-plus English grad students can realistically hope to get an academic job of any sort?
The Modern Language Association predicts that only 900 English jobs will come open for all of 2011. That’s 900 in all specialties of English, the bulk of which will be in writing and rhetoric, not Austen and Proust. Will a fifth of those at the conference get a job? The odds are long. It's probably more like half of that, and if we're talking about a good job, slice it in half once more. So why ask strapped grad students to attend expensive conferences for 15-minute preliminary interviews? Do a telephone interview, for heaven’s sake; it’s kinder on both grad students and search committees.
As I did as a grad student, many young hopefuls pooled resources and economized where they could, but the sad truth is that the vast majority of attendees spent a small fortune on a gamble whose odds aren't much greater than buying lottery tickets. Are associations playing the role of enabler to grad student delusions? Yes. Here’s another thought: Instead of holding a big conference, sponsor a teleconference. Charge a fee for uploads, but give speakers one-year access to the URL, which they can make available to potential employers. Use the savings to the association to lobby for more tenure-track faculty.
Problem Four: No-Shows
You spend lots of money, you sit through desultory talks, and you head off to the one or two sessions that made you want to attend the conference in the first place. What do you find? It's been canceled because only one of the presenters showed up, and that paper was combined with papers from several other sessions that suffered the same fate. Didn't you see the 3x5 card tacked to the conference bulletin board?
As noted above, I'm in favor of putting large conferences to rest. But if we insist on having them, let's at least make sure they're as advertised. O.K., things do happen, but in most cases missing presenters are simply AWOL. I know it smacks of McCarthyism, but I've come to support the idea of a data bank of no-shows that employers, conference planners, and deans can check.
Problem Five: Urban Sprawl
What’s the point of a conference that’s so big it’s inaccessible? I walked between two different hotels to attend sessions and pored over a Britannica-sized program to locate them. Conference attendees were housed in four "official" hotels and untold numbers of others. With round-the-clock sessions and decentralization, the few networking opportunities that did exist were logistically difficult. It took me two entire days to find my old friends, let alone new folks I wanted to engage. I met two interesting people at the airport. I never saw them again.
In Praise of Small Conferences
There are other problems I’ll leave for now, including the gnawing suspicion that some big conferences have become sinecures for "insiders" who have become "players" within associations. Let’s just say that there is a serious disconnect between how the big conferences operate and what makes sense in the changing world of academe.
Teleconferences with real-time discussion groups and online forums would be one good starting point for reform; providing more resources for regional and local conferences would be another. Small gatherings have issues of their own -- no-shows, sparsely attended sessions, overreliance on volunteers -- but they compensate by offering intimacy, good value, face-to-face feedback, and easier opportunities to network. It's time to give these the cachet they deserve. The big conference is like a one-size-fits-all t-shirt; it simply doesn't fit most people. I'm done. For real. Unless I get funding for an exotic overseas meeting. (Just kidding!)
Rob Weir, who writes the "Instant Mentor" column for Inside Higher Ed's Career Advice section, has published six books and numerous articles on social and cultural history, and has been cited for excellence in teaching on numerous occasions during his 20 years in the college classroom.
I was pleased to see this session in the conference program, organized around a topic to which I’ve dedicated much of my professional life, and I think you presenters have done a wonderful job to an extent. I think we all know what a labor of love organizing a conference session can be, especially when it is on a topic that is fairly complicated – a topic, perhaps, that only a handful of scholars have truly engaged and perhaps upon which only one or two have done any truly definitive work. The panel organizers might have thought to invite a central figure in this field to anchor the session, someone who has covered much of this ground already and is acknowledged to have done the first and still the best work on this question, but I’m told the organizers wanted fresh (as opposed to what, I don’t know!) voices and they invited some, well, emerging scholars to contribute. I think we’d all agree they did a fine job after a fashion; we hardly missed the usual contributors that often present papers on this topic.
But to return to my question – I promise there’s a question in here! – as I sat, rapt, listening to these fine presentations, I started wondering if the panelists were perhaps giving short shrift to some of the definitive findings on this topic that have proved quite sound and durable for almost two decades; I’m sure everyone in the room can tick off the titles of the groundbreaking publications that helped define this field – and, with all due respect, I began to suspect that some of the presenters were taking a rather … cavalier … direction, given the enduring centrality of those seminal works of scholarship with which all of us are familiar. So as I listened I began to formulate a response — we can’t all help but notice that a panel this good often cries out for a respondent, a prominent scholar to draw all the presentations together under the existing — and still quite valid — paradigm.
Thus my query, which should be prefaced with a reminder that in our discipline conference panels like this one ought to be informed by a thorough understanding of, if not respect for, the earlier work that created the very conditions that allow for the continued study of this issue. I hesitate to say "standing on the shoulders of giants," but I would hazard that some of the panel participants have failed to accommodate, much less cite -- yes, I said cite -- the key sources, which are as relevant today as they were when first published. At the risk of detracting from all this freshness, I can't help but note that the papers I heard today can do no more than elaborate upon disciplinary principles already well established -- footnotes to Plato and all that. And yet, certain experts went unmentioned. Certain still-relevant and available authorities could have spoken today, had one been invited to this session. My question, at last:
Don't you know who I am?
Daniel J. Ennis
Daniel J. Ennis is a professor of English at Coastal Carolina University.
It’s not always easy for professors to embrace technology. We find ourselves questioning everything from whether the transition to cyber-learning is really worth it to wondering how and when we will "master" working with computers. In a tenured profession, some think it’s better to stay in our safe, traditional worlds of literacy. Nevertheless, most of us realize that we cannot avoid our new century and the new pedagogical challenges created by its technological advances.
Having made that decision, more questions arise. If we change our courses, do we risk lowering the quality of our teaching because we are simultaneously learning how to teach with computers while also learning how to evaluate their effects? And even as we multi-task, writing e-mails to students while we surf the Web for the perfect text to teach tomorrow, can we ever keep up?
As we begin to realize the benefits and drawbacks of computerized writing, it may seem like there is an endless road of new learning challenges. As academics, we don’t often move as quickly as technology, and yet we also know that analyzing and reflecting on innovation is our ethical responsibility. Are professors today doing enough to use, improve, reflect and criticize our use of computers as tools to teach writing and support learning across the curriculum?
Despite the fears and uncertainties conveyed by these questions, it is beyond doubt that our students' learning and literacies have changed because of the use of computers. We must understand and adapt to computers and hypertext, and ultimately learn as much as possible about our disciplines' experiences in cyberspace, because our students demand it of us, and because language is always changing. Essentially, we must move to the point where we focus on ways to fuse academic discourse with our students' "netspeak."
Since the 1980s, writing teachers have increasingly focused on the need for their teaching to reflect the ways technology and traditional literacies are converging. In her 1998 keynote address to the Conference on College Composition and Communication, Cynthia Selfe, an early adopter and leading authority on teaching writing with computers, remarked that we "have, as a culture, watched the twin strands of technology and literacy become woven into the fabric of our lives."
All college teachers, and particularly writing teachers, must now learn to avoid overly narrow, official, 20th century versions of literacy practices or skills if we are going to effectively reach our students as readers, writers, and thinkers. Whether we teach mathematics, biology, or literature, we all know that literacy skills are really the responsibility of all educators. It is no longer acceptable for professors to claim ignorance of using computers and the Internet while claiming to be literacy teachers in the 21st century.
Innovative thinkers like Selfe not only get at why we need to teach with technology, they make it clear that we need to offer concrete ways to use hypertext. Even "Web in a can" programs like Blackboard, Blogger and WebCT can be used effectively to get students and teachers reading, writing and thinking critically about all the literacies that are used in any college classroom. Through a blend of theory and practice, we want to encourage readers of Inside Higher Ed to realize that just as our profession's news has begun to move from paper to screen, we professors must make the same move too.
Computer technology has swiftly become our key writing tool, but it's too easy to imagine that everyone "gets it." Just as we take writing samples to learn about the literacy levels of our students when we teach composition, we need to determine what computing skills our students bring to our classes, because we need to teach students how to fuse traditional and online writing skills. For example, instructors of first-year college writing typically work with students to teach them how to do academic research, and it has become increasingly clear that it no longer makes sense to shun the Internet for the "safe" confines of the library. However, research on the net means much more than typing a few words into Google.
A more sophisticated approach to teaching students how to do Internet research involves showing students some of the ways online searches use Boolean logic, and this is simply accomplished by visiting the Google Guide.
This self-teaching Google tutorial will sharpen awareness of how the search engine works, and it will also help students in their library research. In addition, by using checklists and guides, we can help students critically evaluate sources on the Internet -- not just accepting what is written as an alternate form of "the gospel." A good example of this can be seen in the checklists created by Jan Alexander and Marsha Tate, which can easily be used with any research-based college assignment. These checklists ask students to classify and validate Web sites, and help them think more carefully about the qualities of the information those sites present.
Of course it isn't only students who have to think carefully and critically about computers, the way they convey information, and the way they are perceived as learning tools. Teachers have to do the same thing, because computers have changed our writing and learning worlds, and as educators we can never take these changes for granted. Andrea Beaudin, a professor at Southern Connecticut State University who teaches in a wireless, laptop-equipped classroom, is constantly amused at the ways her students perceive the technology in the classroom. Recently, she recalled being surprised during a writing lab when she asked students to take out paper to start jotting down ideas, and a student said, "Now it's a real English class."
Andrea’s story reminds us that there will be times and places for other types of literate activities in a computerized classroom. Just as Andrea asked her students to use the “old” technology of paper and pen to do the work of learning because she practices hybrid notions of literacy, we will continue to work at what Cindy Selfe calls “multilayered literacy” -- a literate practice where people “function literately within computer-supported communication environments” by layering “conventions of the page and conventions of the screen.”
However, conventions of page and screen certainly converge more smoothly in theory than they ever do in practice. Something as simple as deciding to create a Web page, choosing Web-authoring software, and learning it can be a huge step for most teachers. Both of us have experienced working through steep learning curves while learning to use new Web technologies. Four years ago, Chris moved from using raw HTML code to working with Adobe GoLive and Photoshop. At times, he wanted to shot-put his monitor through any open window. However, the end result was a personal Web page that looks better, contains more useful information for students, and is much easier to update. Right now Will is working through learning Dreamweaver, and he has already started to see new possibilities for his page.
Our point here is that even techie teachers get technological blues. However, once we begin to figure a few things out, then interesting and good things begin to happen. We learn a new skill, our students get better Web resources, and both teacher and students have yet another new technology to think through practically and critically.
Professors and students both need to think critically about technology. As a key part of that critical thinking, teachers need to focus on pedagogy and how it is affected and changed by computers. Maybe some things haven't changed -- traditional, academic literacy has always converged with new ways to use language -- though it's fair to say that computers certainly seem to speed up an exciting convergence of language uses. We educators are working at an exciting point in literacy development, and we can be more mindful of why and how to use our computers. As the traditional classroom adds cyberspace, we must work closely with students and teachers to ensure that we enter new learning spaces with critical awareness and pedagogical wisdom.
Will Hochman and Chris Dean
Will Hochman is an associate professor of English and Christopher Dean is an assistant professor of English at Southern Connecticut State University.