A warning: This week’s column will be miscellaneous, not to say meandering. It updates earlier stories on Wikipedia, Upton Sinclair, and the Henry Louis Gates method of barbershop peer-review. It also provides a tip on where to score some bootleg Derrida.
Next week, I’ll recap some of my talk from the session on “Publicity in the Digital Age” at the annual conference of the Association of American University Presses, covered here last week. The audience consisted of publicists and other university-press staff members. But some of the points covered might be of interest to readers and writers of academic books, as well as those who publish them.
For now, though, time to link up some loose ends....
One blogger noted that the comments following my column on Wikipedia were rather less vituperative than usual. Agreed -- and an encouraging sign, I think. The problems with open-source encyclopedism are real enough. Yet so are the opportunities it creates for collaborative and public-spirited activity. It could be a matter of time before debate over Wikipedia turns into the usual indulgence in primal-scream therapy we call "the culture wars." But for now, anyway, there’s a bit of communicative rationality taking place. (The Wikipedia entry on "communicative rationality" is pretty impressive, by the way.)
A few days after that column appeared, The New York Times ran a front-page article on Wikipedia. The reporter quoted one Wikipedian’s comment that, at first, “everything is edited mercilessly by idiots who do stupid and weird things to it.” Over time, though, each entry improves. The laissez-faire attitude toward editing is slowly giving way to quality control. The Times noted that administrators are taking steps to reduce the amount of “drive-by nonsense.”
The summer issue of the Journal of American History includes a thorough and judicious paper on Wikipedia by Roy Rosenzweig, a professor of history and new media at George Mason University. Should professional historians join amateurs in contributing to Wikipedia? “My own tentative answer,” he writes, “is yes.”
Rosenzweig qualifies that judgment with all the necessary caveats. But overall, he finds that the benefits outweigh the irritations. “If Wikipedia is becoming the family encyclopedia for the twenty-first century,” he says, “historians probably have a professional obligation to make it as good as possible. And if every member of the Organization of American Historians devoted just one day to improving the entries in her or his areas of expertise, it would not only significantly raise the quality of Wikipedia, it would also enhance popular historical literacy.”
The article should be interesting and useful to scholars in other fields. It is now available online here.
This year marks the centennial of Upton Sinclair’s classic muckraking novel, The Jungle, or rather, of its appearance in book form, since it first ran as a serial in 1905. In April of last year, I interviewed Christopher Phelps, the editor of a new edition of the novel, for this column.
Most of Sinclair’s other writings have fallen by the wayside. Yet he is making a sort of comeback. Paul Thomas Anderson, the director of Boogie Nights and Magnolia, is adapting Sinclair’s novel Oil! for the screen; it should appear next year under the title There Will Be Blood. (Like The Jungle, the later novel, published in 1927, is a tale of corruption and radicalism, this time set in the petroleum industry.) And Al Gore has lately put one of Sinclair's pithier remarks into wide circulation in his new film: “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”
That sentiment seems appropriate as a comment on a recent miniature controversy over The Jungle. As mentioned here one year ago, a small publisher called See Sharp Press claims that the standard edition of Sinclair’s text is actually a censored version and a travesty of the author’s radical intentions. See Sharp offers what it calls an “unexpurgated” edition of the book -- the version that “Sinclair very badly wanted to be the standard edition,” as the catalog text puts it.
An article by Phelps appearing this week on the History News Network Web site takes a careful look at the available evidence on the book’s publishing history and Sinclair’s own decisions about the text, and it debunks the See Sharp claims beyond a reasonable doubt.
In short, Sinclair had many opportunities to reprint the serialized version of his text, which he trimmed in preparing it for book form. He never did so. He fully endorsed the version now in common use, and made no effort to reprint the "unexpurgated" text as it first appeared in the pages of a newspaper.
It is not difficult to see why. Perhaps the most telling statement on this matter comes from Anthony Arthur, a professor of English at California State University at Northridge, whose biography Radical Innocent: Upton Sinclair has just been published by Random House. While Arthur cites the “unexpurgated” edition in his notes, he doesn’t comment on the claims for its definitive status. But he does characterize the serialized version of the novel as “essentially a rough draft of the version that readers know today, 30,000 words longer and showing the haste with which it was written.”
A representative of See Sharp has accused me of lying about the merits of the so-called unexpurgated edition. Indeed, it appears that I am part of the conspiracy against it. (This is very exciting to learn.) And yet -- restraining my instinct for villainy, just for a second -- let me also point you to a statement at the See Sharp website explaining why the version of The Jungle that Sinclair himself published is a cruel violation of his own intentions.
Memo to the academy: Why isn’t there a variorum edition of The Jungle? There was a time when it would have been a very labor-intensive project -- one somebody might have gotten tenure for doing. Nowadays it would take a fraction of the effort. The career benefits might be commensurately smaller, alas. But it seems like a worthy enterprise. What’s the hold-up?
In February 2005, I attended a conference on Jacques Derrida held at the Cardozo Law School in New York, covering it in two columns: here and here. A good bit of new material by “Jackie” (as his posse called him) has appeared in English since then, with more on the way this fall. Next month, Continuum is publishing both a biography of Derrida and a volume described as “a personal and philosophical meditation written within two months of Derrida’s death.”
Bet you didn’t know there was going to be a race, did you?
In the meantime, I’ve heard about a new translation, available online, of one of Derrida’s late-period writings. It is part of his engagement with the figure of Abraham, the founding phallogocentric patriarch of the three great monotheistic religions. The translator, Adam Kotsko, is a graduate student at the Chicago Theological Seminary. (See this item on the translation from his blog.)
The potential for “open source” translation may yet open more cans of worms than any team of intellectual-property lawyers can handle. I’ll throw this out as a request to anyone who has thoughts on the matter: If you’ve committed them to paper (or disk) please drop me a line at the address given below.
And finally, a return to the intriguing case of Emma Dunham Kelley-Hawkins -- the most important African-American writer who was not actually an African-American writer.
In a column last spring, I reported on the effort to figure out how the author of some rather dull, pious novels had become a sort of cottage industry for critical scholarship in the 1990s. After a couple of days of digging, I felt pretty confident in saying that nobody had thought to categorize Kelley-Hawkins as anything but a white, middle-class New England novelist before 1955.
That was the year a bibliographer included her in a listing of novels by African-American writers -- though without explaining why. And for a long time after that, the scholarship on Kelley-Hawkins was not exactly abundant. Indeed, it seemed that the most interesting thing you could say about her fiction was that all of the characters appeared to be white. Kelley-Hawkins did make a very few references to race, but they were perfectly typical of white prejudice at its most casually cruel.
Only after Henry Louis Gates included her work in a series of reprints by African-American women writers did critics begin noticing all the subtle -- the very, very subtle -- signs of irony and resistance and whatnot. Why, the very absence of racial difference marked the presence of cultural subversion! Or something.
So much ingenuity, in such a bad cause.... Subsequent research suggests that Kelley-Hawkins was Caucasian, by even the most stringent “one drop” standards of white racial paranoia in her day.
A recent item by Caleb McDaniel discusses the most recent work on Kelley-Hawkins. The puzzle now is how the initial re-categorization of her ever took place. Evidently that bibliography from 1955 remains the earliest indication that she might have been African-American. (A second puzzle would be how anyone ever managed to finish reading one of her novels, let alone embroidering it with nuance. They can be recommended to insomniacs.)
McDaniel also quotes something I’d forgotten: the statement by Henry Louis Gates that, if he had put up a photograph of Kelley-Hawkins in his barbershop, “I guarantee the vote would be to make her a sister."
You tend to expect a famous scholar to be familiar with the concept of the sepia tone. Evidently not. Here, again, is where Wikipedia might come in handy.
One of the first things graduate students in the humanities and “softer” social sciences learn is that communication is rarely simple. Words carry latent values and vestigial biases, they are told, and over time the consequences of a word’s usage exceed its ostensible meaning. Post-bac training begins with that distinction, and students advance by attuning themselves to the tacit and the subtextual. “Language is not transparent,” announces the favorite T-shirt of a colleague, and to interpret statements accordingly isn’t just common wisdom. It’s a professional duty.
I’ve felt its pull many times, once while watching a debate on television around 1991 when the campus had become a central theater of the culture wars. Catharine Stimpson, Stanley Fish, and two others took on John Silber, William Buckley, Dinesh D’Souza, and Glenn Loury, with the canon, speech codes, and political correctness as the topics. At one point, when Silber asserted the silliness of substituting the title “chair” for “chairman” -- women “calling themselves furniture,” as he put it -- Fish replied with a point about the “deep culture of the language.” Often, he argued, “linguistic assumptions can be so deeply assumed that the society that uses them is not aware of them,” and when scholars and teachers unveil them, people feel threatened and confused. It’s a common premise, and it makes it easy to cast the academics as tenured meddlers going against common sense. The academics, in turn, feel that the more figures such as D’Souza resist, the more they know they’re on to something. That some of these expressions carry discriminatory baggage sharpens the analytic radar and adds a moral imperative to the labor. Indeed, no mandate has granted literary scholars so strong a sense of mission in the last 25 years.
It certainly touched me, and I recall judging Buckley et al. as obtuse anti-intellectuals and cheap-shot artists pitiably ignorant of advanced arguments. With a fresh Ph.D. in hand, and infused with Heidegger and Derrida, I believed fervently in the interpretative calling, disdaining what phenomenologists called the “natural attitude,” the outlook that takes things at face value. Added to that, I claimed language and literature as a professional subject, which meant that my livelihood depended upon the under- or other side of words, and that it took a special acumen to access it.
Fifteen years later, though, after countless written and spoken readings that lifted the political sediment out of ordinary and extraordinary language, the practice sounds pedestrian and predictable. In some cases, the search for “linguistic assumptions” exposed sexist and racist attitudes underlying different discourses, invisible but operative -- for instance, Gilbert and Gubar’s analysis in The Madwoman in the Attic of patriarchal motifs in critical discussions of creativity -- and it also reflected handily upon the institutional circumstances of those discourses. But when it ascended into a theoretical premise, and soon after settled into a professional habit, the conclusions it drew lapsed into routine. Indeed, much queer theory has involved the extraction of queer subtexts from canonical texts and popular culture, influentially enough that an assertion such as that of a lesbian undercurrent in "Laverne and Shirley," as one book offered several years ago, produces the effect of either whimsical curiosity or a rolling of the eyes.
The theory provided no guidelines as to where it did and did not apply, and so it was stretched too thin. It provided no means of distinguishing content that was invisible from content that actually wasn’t there. The professors saw implicit meaning everywhere, much of it political or identity-oriented. Persons outside the academy looked at the whole of their exchanges and found most of them uncomplicated and transitory. The surface was all. To that audience, conservatives such as Silber had a better grasp of the nature of “linguistic assumptions” than the professors did. And it didn’t help that so many professors shared Theodor Adorno’s belief in “the stupidity of common sense.” That, indeed, may explain why conservative intellectuals routed the professors in public settings over the years -- not because they lacked nuance, played on irrational fears, or traded in simplistic but telegenic gibes. Rather, they understood better when to analyze and when to assert, when to dismantle and when to affirm.
Both camps would agree, however, that the disclosure of assumptions and biases in language does apply to certain contexts, especially those in which an institution weighs heavily upon the utterances. When the protocols of communication are strict, when a statement reflects a speaker’s knowledge and legitimacy, when misstatements violate a group’s sense of mission, when entry into the discourse requires a long and regulated preparation by the entrant -- such settings are “overdetermined,” and they need detailed analysis and thick description. The terms are loaded and the topics authorized. Statements impart norms as well as ideas, mores as well as referents. The expressions licensed there reinforce the institution and echo its rationale. The subtext is dynamic, and if we don’t analyze it, then we do, indeed, break our promise to critique.
For this reason, it has been astonishing to watch the professors respond to indictments leveled recently by conservative, libertarian, and First Amendment figures against academic practice and politics. These figures cited voter registrations, campaign contributions, and occasional acts of oppression, but most of the time the first exhibit of bias and illiberalism was a sample of institutional language. Scholarly articles such as a 2003 study of the “conservative personality” that found fear and aggression at the heart of conservatism (“Political Conservatism as Motivated Social Cognition,” Psychological Bulletin, May 2003); course descriptions such as those gathered by the American Council of Trustees and Alumni in a report issued last month; speech codes targeted by the Foundation for Individual Rights in Education; paper titles culled by Frederick Hess and Laura LoGerfo from the last meeting of the American Educational Research Association ... these formed the evidence. They served well because of their patent absurdity, or because of their offense to public taste, or their adversarial dogma (anti-American, anti-capitalist, etc.).
But while the manifest content had an immediate impact, sometimes entering national circulation as a reviled token (e.g., “little Eichmanns”), many claimed a deeper meaning for them. In a word, they were offered as symptomatic expressions, an index of the values, norms, biases, and interests of academics. Conservatives and others presented them as precisely the kind of language packed with “linguistic assumptions,” performing subtextual feats, and ripe for socio-political analysis.
And yet, how have the professors responded? Not by taking up the critical challenge and carrying out the analysis. Not by bouncing the samples off of the institution in which they appeared. Instead, they shot the messenger. They declared the samples isolated and un-representative, or they denied to them the symptoms alleged by the critics. The course description wasn’t a fair stand-in for the course itself, they protested. Ward Churchill’s post-9/11 rant was an aberration. The conference paper title was just a way to garner an audience, so let’s not confuse it with the real substance of the paper. In sum, they put the most benign construction on the samples. That turned the allegations back upon the people who cited them, David Horowitz, Anne Neal, and the rest, who were cast as sinister crazies pushing a vile political agenda.
One can understand the professors’ defensiveness, but to let it squelch the exercise of a practice that they have at other times wielded so boldly is a breach of their own ideals. Have they lived so long and so close to “social justice,” “social change,” “queer,” “whiteness,” and “gender equality” that they do not recognize them as loaded terms? Have they imbibed the political currents of the campus so thoroughly that they regard a polemical phrasing in a course description as merely a lively description? By their own instruction, we should regard the widespread attention to race, gender, and their social construction as emanating from a world view and signaling an ideological commitment. When Ward Churchill’s notorious speech made headlines, the professors were correct to cite his First Amendment rights and reprove those calling for his job. But as more information came to light, and his political attitudes seemed to bear a closer relation to his scholarship, academic doctrine demanded that the institution that rewarded him be reviewed. Roger Bowen, general secretary of the American Association of University Professors, has assured the Commission on the Future of Higher Education that “Faculty members are accountable for their work in many ways,” including peer review of scholarship and grant applications and annual departmental review for salary and promotion. What, then, is the relationship between Churchill’s high ascent in the profession and his discredited writings? Humanities and social science professors work backward from institutional statements to the culture of the institution itself all the time. Why exempt academic language from the process?
The academic defense comes down to this: conservatives and libertarians read too much into bits and pieces of language -- an ironic turnabout, given that they used to make the same charge against literary theorists 20 years ago. Tim Burke, responding to the ACTA report, chooses the term “Eurocentric” as a case in point. While ACTA’s report selected a course description containing the term as an instance of bias, Burke replied, “I’ll let them in on a little secret: it can also be just a plain-old technical term for historiographical models that argue that modern world history has primarily been determined by factors that are endogamous to Europe itself.” So it can, but even if we accept that as one meaning of Eurocentric, it doesn’t erase the occasions when, as Burke concedes, “the term is also used as a fairly dumb epithet by nitwitted activists.” That is precisely one of the dangers of loaded terms. They can function neutrally or tendentiously, and when pressed the users can always fall back upon claims of innocence.
The question rests upon the frequency of biased meanings, “the existence of telling linguistic patterns,” as Erin O’Connor puts it while commenting on the issue. When a call for papers foregrounds “anti-union corporatist practices,” is that a tendentious usage, or are the libertarian commentators who cite it being oversensitive? The answer largely depends upon one’s relation to the institutional setting. When a libertarian delivers a talk at a symposium sponsored by Reason magazine, the mention of government will have over- and undertones different from those the word carries at a meeting of social justice advocates. From my perspective in 1991, I regarded Eurocentric, theory, patriarchy, and even the blank terms race and gender as descriptive ones. Yes, they had a political thrust, but essentially they were justified because they were accurate names for real phenomena in history and society. Indeed, it was the other discourse that was politicized, the one from which race etc. were absent. Now, having watched those terms in action, I see them as more often tendentious than not. In the majority of cases, their “institutional meaning” overshadows their denotative meaning.
That’s my experience, and maybe it’s too partial to count. But we can’t know for certain so long as leading academics remain so quick to deny the possibility that a narrow political agenda underlies academic discourse. Apart from the wall it erects against further inquiry, the reflex draws them into a vulnerable position. First of all, it results in overt intellectual blunders. For example, in the article cited above on the conservative personality, the authors define “conservatism” as, at heart, “opposition to change,” a simplistic and sweeping characterization that allows them to conclude, “One is justified in referring to Hitler, Mussolini, Reagan, and Limbaugh as right-wing conservatives ... because they all preached a return to an idealized past.” (They also add Stalin, Khrushchev, and Castro to the list of political conservatives.)
A second and more damaging problem in neutralizing their own terminology is the double standard it represents. Academics recognize the tension in terms such as race and sexuality, but they attribute its source to the resistances of others, persons who can’t give up their own biases and anxieties. That tactic will only work behind the campus walls. Try it in an outside setting and the arrogance comes across immediately. The hypocrisy shows, too, as academics fail their own standard. They present themselves as hard-headed, clear-sighted analysts, but in this case they prove selective in their labor. People outside the campus recognize that academia is just the kind of Establishment that calls out for ideological and social criticism, and its language is one place to begin. Academics already have a credibility problem when discussing their own practices, and if they wish to face down their many critics, they need to start extending those criticisms to themselves. Public observers realize, however reluctantly, that the best people to conduct that examination are the professors themselves, if only they will stop acting so proprietary. If academics don’t assume the lead, then they will find their credibility falling still further, having revised one of their favorite dicta to their own advantage -- “a ruthless criticism of everything existing,” everything, that is, but their own.
Mark Bauerlein is professor of English at Emory University.
Recently, a colleague asked me if I thought students were writing more poorly than in the past. Each time I hear this question, my heart sinks. Professors get worn down and frustrated in all kinds of ways, and it’s most obviously demonstrated in their cries of despair about the sorry state of students today. And when it comes to complaints about student writing, there’s no group more outraged than the faculty.
“What in the world did they learn in high school?”
“How did they get into college in the first place?”
“You won’t believe how bad my students are!”
“It must be that damned instant messaging that’s screwing them up!”
It would be difficult to discover if students write worse now than students did decades ago. Which students are we talking about? What evidence would we examine? What does it mean to write well? What constitutes a writing error? What standards should we apply? But my inquisitor must have assumed that I would be able to answer her question because I teach college writing. Because I have been keeping track. Or because English faculty members have years and years of yellowing bluebooks in a closet somewhere that would reveal the sorry truth. Maybe our scholarly journals were filled with evidence of this nasty decline.
And perhaps she assumed that we English types must feel the same frustration. That we were in the business of constantly despairing about students’ writing. (What a sad career that must be!) Or maybe she was just looking for a colleague to share her misery. “You won’t believe how awful they write! Ain’t it a shame we have to put up with these numbskulls? Let’s go get a pint.” (By the way, no one has ever asked me if I thought students read worse than they used to. But that would be hard to prove, too.)
It would also be hard to prove how many of our colleagues are in their cups about what their students don’t know or can’t do, but it depresses the hell out of me when I hear professors ask questions like these because they sometimes seem more interested in bashing students than getting at solutions. (See my recent “Pitching Writing” for how teachers contribute to student writing problems.)
Still I did respond to her by saying that I think the main difference between students then and now exists mostly in our heads, since in many cases what we are really doing is contrasting our students' experiences with our experiences in school. By that I mean, our expectations are pretty out of whack if we expect our students to be the kind of students we once were, because once upon a time we were the kind of students who went on to graduate school and became scholars in a particular discipline. Most of our college classmates didn't. And that's who most of our students are. And quite a few other folks besides.
Given that reality, we shouldn’t be surprised when students don’t always rise to expectations built on false nostalgia. But the earlier we discover what students can do in writing (and in reading, too), the better off we’ll all be. That’s why we should get writing samples early in the term, rather than gnash our teeth and waste red ink when reading those final term papers. In other words, instead of assuming we can apply the same old syllabi, lesson plans, and assignments semester after semester, year after year, we need to study our students and then adjust our instruction as necessary. A stitch in time.
Still, I also could have pointed my colleague toward some empirical research on college students’ writing errors that shows they don’t write any worse than previous generations and, in some cases, don’t write any worse than writers they’re asked to emulate.
In a 1986 study described in College Composition and Communication under the title “Frequency of Formal Errors in Current College Writing, or Ma and Pa Kettle Do Research,” Robert J. Connors and Andrea A. Lunsford discovered that “college students are not making more formal errors in writing than they used to." They compared error patterns identified by researchers in 1917 and 1930 and found that though the length of paper assignments had consistently increased over nearly 80 years, “the formal skills of students have not declined precipitously."
Further, they claim, “[i]n spite of open admissions, in spite of radical shifts in demographics of college students, in spite of the huge escalation in population percentage as well as in sheer numbers of people attending American colleges, freshmen are still committing approximately the same number of formal errors per 100 words they were before World War I."
It may be that this 20-year-old study is dated and student writing has gotten worse since then, but subsequent studies of student error are absent from our scholarship -- even though complaints about alleged errors continue. However, it’s also possible that student writing has actually improved over that period. In the last two decades, word processors, campus computer labs, university writing tutors, and spelling and grammar checkers have become commonplace and have helped students better understand writing as a complex process of planning, drafting, revising, and editing.
In another College Composition and Communication article, published in 1990 and titled “Frequency of Errors in Essays by College Freshmen and by Professional Writers,” Gary Sloan both confirmed the Connors and Lunsford study and discovered that even though professional writers are often served up as models for student writers, their writing may contribute to student confusion about correctness because their essays contain almost as many errors as first-year themes. Sloan selected 20 published essays from a college composition reader and 20 student essays composed during the last week of an introductory writing course. He then analyzed these two samples using an error analysis technique derived from a grammar handbook commonly used in college writing courses.
His conclusion? “Connors and Lunsford found 9.53 errors per essay or 2.26 errors per 100 words; my figures for the same are 9.60 and 2.04. The professionals have 8.55 errors per writer and 1.82 per 100 words." Further, given that misspelling was the most common error in student writing but absent in professional writing, the student error count would actually have been lower than the professional average had students simply spellchecked their essays -- again, an editing technology not available to many students in 1990.
Interesting stuff, but these studies may not affect the deeply ingrained attitudes some faculty hold about student writing. In the long run, it shouldn’t matter to us whether students write worse, or better, or just about the same as they always have. Or whether they were raised on a diet of instant messaging.
Our responsibility is to find out where they are as early as we can and to discover the best methods for getting them where they need to be -- even if that means mandatory treks to the writing center. And if they don’t succeed (or won’t play ball), then we shouldn’t pretend that they did. As always, students should be held responsible, too. There’s no question about that.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.
Why do narratives of decline have such perennial appeal in the liberal arts, especially in the humanities? Why is it that, year after year, meeting after meeting, we hear laments about the good old days and predictions of ever worse days to come? Why is such talk especially common in elite institutions where, by many indicators, liberal education is doing quite well, thank you very much? I think I know why. The opportunity is just too ripe for the prophets of doom and gloom to pass up.
There is a certain warmth and comfort in being inside the “last bastion of the liberal arts,” as B.A. Scott characterized prestigious colleges and research universities in his collection of essays The Liberal Arts in a Time of Crisis (New York: Praeger, 1990). The weather outside may be frightful, but inside the elite institutions, if not “delightful,” it’s perfectly tolerable, and likely to remain so until retirement time.
Narratives of decline have also been very useful to philanthropy, but in a negative way. As Tyler Cowen recently noted in The New York Times, “many donors … wish to be a part of large and successful organizations -- the ‘winning team’ so to speak.” They are not eager to pour out their funds in order to fill a moat or build a wall protecting some isolated “last bastion.” Narratives of decline provide a powerful reason not to reach for the checkbook. Most of us in the foundation world, like most other people, prefer to back winners rather than losers. Since there are plenty of potential winners out there, in areas of pressing need, foundation dollars have tended to flow away from higher education in general, and from liberal education in particular.
But at the campus level there’s another reason for the appeal of the narrative of decline, a genuinely insidious one. If something goes wrong, the narrative of decline of the liberal arts always provides an excuse. If course enrollments decline, well, it’s just part of the trend. If students don’t like the course, well, the younger generation just doesn’t appreciate such material. If the department loses majors, again, how can it hope to swim upstream when the cultural currents are so strong? Believe in a narrative of decline and you’re home free; you never have to take responsibility, individual or collective, for anything having to do with liberal education.
There’s just one problem. The narrative of decline is about one generation out of date and applies now only in very limited circumstances. It’s true that in 1890, degrees in the liberal arts and sciences accounted for about 75 percent of all bachelor’s degrees awarded; today the number is about 39 percent, as Patricia J. Gumport and John D. Jennings noted in “Toward the Development of Liberal Arts Indicators” (American Academy of Arts and Sciences, 2005). But most of that decline had taken place by 1956, when the liberal arts and sciences had 40 percent of the degrees.
Since then the numbers have gone up and down, rising to 50 percent by 1970, falling to 33 percent by 1990, and then rising close to the 1956 levels by 2001, the last year for which the data have been analyzed. Anecdotal evidence, and some statistics, suggest that the numbers continue to rise, especially in Research I universities.
For example, in the same AAA&S report ("Tracking Changes in the Humanities") from which these figures have been derived, Donald Summer examines the University of Washington (“Prospects for the Humanities as Public Research Universities Privatize their Finances”) and finds that majors in the humanities have been increasing over the last few years and course demand is strong.
The stability of liberal education over the past half century seems to me an amazing story, far more compelling than a narrative of decline, especially when one recognizes the astonishing changes that have taken place over that time: the vast increase in numbers of students enrolled in colleges and universities, major demographic changes, the establishment of new institutions, the proliferation of knowledge, the emergence of important new disciplines, often in the applied sciences and engineering, and, especially in recent years, the financial pressures that have pushed many institutions into offering majors designed to prepare students for entry-level jobs in parks and recreation, criminal justice, and now homeland security studies. And, underlying many of these changes, transformations of the American economy.
The Other, Untold Story
How, given all these changes, and many others too, have the traditional disciplines of the arts and sciences done as well as they have? That would be an interesting chapter in the history of American higher education. More pressing, however, is the consideration of one important consequence of narratives of decline of the liberal arts.
This is the “last bastion” mentality, signs of which are constantly in evidence when liberal education is under discussion. If liberal education can survive only within the protective walls of elite institutions, it doesn’t really make sense to worry about other places. Graduate programs, then, will send the message that success means teaching at a well-heeled college or university, without any hint that with some creativity and determination liberal education can flourish in less prestigious places, and that teaching there can be as satisfying as it is demanding.
Here’s one example of what I mean. In 2000, as part of a larger initiative to strengthen undergraduate liberal education, Grand Valley State University, a growing regional public institution in western Michigan, decided to establish a classics department. Through committed teaching, imaginative curriculum design, and with strong support from the administration, the department has grown to six tenured and tenure-track positions with about 50 majors on the books at any given moment. Most of these are first-generation college students from blue-collar backgrounds who had no intention of majoring in classics when they arrived at Grand Valley State, but many have an interest in mythology or in ancient history that has filtered down through popular culture and high school curricula. The department taps into this interest through entry-level service courses, which are taught by regular faculty members, not part-timers or graduate students.
That’s a very American story, but the story of liberal education is increasingly a global one as well. New liberal arts colleges and universities are springing up in many countries, especially those of the former Soviet Union.
I don’t mean that the spread of liberal education comes easily, in the United States or elsewhere. It’s swimming upstream. Cultural values, economic anxieties, and all too often institutional practices (staffing levels, salaries, leave policies and research facilities) all exert their downward pressure. It takes determination and devotion to press ahead. And those who do rarely get the recognition or credit they deserve.
But breaking out of the protective bastion of the elite institutions is vital for the continued flourishing of liberal education. One doesn’t have to read a lot of military history to know what happens to last bastions. They get surrounded; they eventually capitulate, often because those inside the walls squabble among themselves rather than devising an effective breakout strategy. We can see that squabbling at work every time humanists treat with contempt the quantitative methods of their scientific colleagues and when scientists contend that the reason we are producing so few scientists is that too many students are majoring in other fields of the liberal arts.
The last bastion mentality discourages breakout strategies. Even talking to colleagues in business or environmental studies can be seen as collaborating with the enemy rather than as a step toward broadening and enriching the education of students majoring in these fields. The last bastion mentality, like the widespread narratives of decline, injects the insidious language of purity into our thinking about student learning, hinting that any move beyond the cordon sanitaire is somehow foul or polluting and likely to result in the corruption of high academic standards.
All right, what if one takes this professed concern for high standards seriously? What standards, exactly, do we really care about and wish to see maintained? If it’s a high level of student engagement and learning, then let’s say so, and be forthright in the claim that liberal education is reaching that standard, or at least can reach that standard if given half a chance. That entails, of course, backing up the claim with some systematic form of assessment.
That provides one way to break out of the last bastion mentality. One reason that liberal education remains so vital is that when properly presented it contributes so much to personal and cognitive growth. The subject matter of the liberal arts and sciences provides some of the best ways of helping students achieve goals such as analytical thinking, clarity of written and oral expression, problem solving, and alertness to moral complexity, unexpected consequences, and cultural difference. These goals command wide assent outside academia, not least among employers concerned about the quality of their workforces. They are, moreover, readily attainable through liberal education provided proper attention is paid to “transference.” “High standards” in liberal education require progress toward these cognitive capacities.
Is it not time, then, for those concerned with the vitality of liberal education to abandon the defensive strategies that derive from the last bastion mentality, and adopt a new and much more forthright stance? Liberal education cares about high standards of student engagement and learning, and it cares about them for all students regardless of their social status or the institution in which they are enrolled.
There is, of course, a corollary. Liberal education can’t just make the claim that it is committed to such standards, still less insist that others demonstrate their effectiveness in reaching them, unless those of us in the various fields of the arts and sciences are willing to put ourselves on the line. In today’s climate we have to be prepared to back up the claim that we are meeting those standards. Ways to make such assessments are now at hand, still incomplete and imperfect, but good enough to provide an opportunity for the liberal arts and sciences to show what they can do.
That story, I am convinced, is far more compelling than any narrative of decline.
"But I was not a good reader. Merely bookish, I lacked a point of view. I vacuumed books for epigrams, scraps of information, ideas, themes -- anything to fill the hollow within me and make me feel educated.” --Richard Rodriguez, Hunger of Memory
Some years ago, I was walking down a crowded hallway to class and almost stumbled over a student sitting on the floor against the wall. She sat cross-legged with a book in her lap and a yellow highlighter in her hand. On the floor next to her was a copy of the same book. It looked to me to be one of those massive science textbooks, biology maybe or chemistry. As I recovered my footing, I turned again to look down at her, and saw that she was copying the highlighted text from the book on the floor onto the pages in her lap.
For the past several years, I have been studying how first-year students at my university visualize what happens when they read. This research began with my interest in the learning relationships students develop with writing and reading in college. Initially, I studied how students’ attitudes toward writing interfered with or contributed to their chances for success in first-year composition. More recently, I’ve investigated how students depict their reading habits through drawing.
My preliminary research revealed that students who had high ACT scores in reading, who self-reported positive attitudes toward reading, and who earned high grades in their composition classes tended to represent their reading habits metaphorically. They drew pictures that symbolized their feelings or the ways reading affected them. One of these drawings (shown at right) depicts an open book with a reader poised to dive into the pages.
Students who had lower ACT scores, who reported negative attitudes, and who earned lower grades tended to represent their reading habits realistically. Common among these were self-portrait stick figures falling asleep in bed or sitting at a desk distracted by noise from another person or a television.
As I continued to examine these drawings, especially those including imaginative representations of reading, I began to investigate the various ways reading is analogized and to make a list of these metaphors.
Here is a sample of 20 from my ever-expanding collection.
1. Reading is grafting, and the reader connects new text to another text read.
2. Reading is dancing, and the reader follows the lead and steps of the text, including its rhythm, music, lyric, genre, and flow.
3. Reading is sorting, and the reader puts knowledge and experience and dramatic elements of text into categories.
4. Reading is surveying, and the reader examines the territory of the book, its surface, size, structure, scope, distinguishing features, divisions, boundaries, etc.
5. Reading is integrating, and the reader incorporates new knowledge into other knowledge; blending and kneading together.
6. Reading is counting, and the reader is concerned with the number of pages in the text or how many pages are left until they can escape the text (envision the image of a prisoner marking off days on a calendar).
7. Reading is soaking up, and the reader absorbs the text like a sponge.
8. Reading is a vehicle, and the reader travels to another place.
9. Reading is eating, and the reader consumes and is nourished (or poisoned) by the text.
10. Reading is a mirror, and the reader sees reflection in text.
11. Reading is a machine, and the reader feeds the text through a mechanical process.
12. Reading is a transaction, and the reader and text exchange value: the reader receives knowledge and experience, the text receives meaning, and the newly produced response is the receipt or proof of the transaction.
13. Reading is exercise, and the reader gains intellectual agility and strength.
14. Reading is mining, and the reader digs into the text for answers.
15. Reading is a good investment, and the reader’s efforts pay off.
16. Reading is planting, and the reader receives seeds of knowledge that grow into new understanding.
17. Reading is unwrapping, and the reader opens the text to reveal a hidden message.
18. Reading is translating, and the reader moves the meaning from one language to another.
19. Reading is a friend, and the reader enjoys the companionship of the text.
20. Reading is wrestling, and the reader struggles with the text.
In Metaphors We Live By, Lakoff and Johnson write, “Our ordinary conceptual system, in terms of which we both think and act, is fundamentally metaphorical in nature.” In addition, they argue that these metaphorically determinate conceptual frameworks operate below conscious awareness; that is, we reason and engage automatically without understanding the powerful metaphors shaping our interactions with each other and the world around us.
Thus, metaphorical concepts also impact students’ relationships with texts. My research so far suggests that many students have not developed adequate reading habits because they bring with them incapacitating conceptions or analogies of reading. They see it as torture or a lullaby. They also assign human agency to the text. They blame it for being hard to understand, when in fact they lack the understanding to engage the text successfully.
Rather than positioning themselves to become the reader the text wants them to be -- to go out and find the knowledge the text assumes the reader already owns -- students lash out at the unresponsive novel, poem, play, essay, or textbook chapter. They also sometimes see reading assignments as lifeless information to be transferred from one place to another (like the student described in the anecdote above copying highlighted words from one book to another), or as Paulo Freire analogizes in Pedagogy of the Oppressed, they see knowledge as temporary commodities to be banked in their memories until withdrawn by an instructor at test time.
But these faulty conceptions of reading didn’t magically appear out of thin air. Students learned them somewhere, and many certainly learned them, implicitly or explicitly, in school. That so many of our students have come to hate reading (and writing, of course, too) is a cultural disgrace. Therefore, we need specific counter-cultural methods of instruction to respond adequately to the inappropriate metaphors of reading students bring to the classroom.
For my part, I want to discover which of my students have, knowingly or not, embraced these self-defeating notions of reading and then provide them the means to replace those conceptual roadblocks with more effective and empowering metaphors.
I also propose that professors across the curriculum actively identify and more effectively deploy metaphors of reading. In other words, rather than assume our students already know what it means to read in their disciplines, we should reflect on the kinds of reading we expect our students to practice, examine the metaphorical concepts at the heart of those reading strategies, and then present those metaphors in the classroom.
For example, what are the metaphors that might help students better visualize and practice the reading logics of comprehension, application, analysis, synthesis, and evaluation? How might these metaphors help us model more effectively for our students what it means to read in our fields? However we picture and present them, metaphors we read by should be highlighted and paraded down the crowded hallways of learning.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.
Whether or not your college or university offers a course in public speaking probably has escaped your notice. Nevertheless, it might be worthwhile to give the matter a minute or two of consideration. You might find that the availability or unavailability of this course says something about how diligently a college meets its students' needs, and also about how robust are its humanities offerings.
At first glance, public speaking is an unassuming course of study -- not apparently a canary in a coal mine. Taught in many places by grad students with teaching stipends, or by last-minute, part-time hires, public speaking is no glamour queen, and has less prestige than even college composition. Writing in 1970, in Language Is Sermonic, Richard Weaver noted that whereas once intellectual giants, men of subtle reasoning and wit, taught rhetoric, now it is taught by "beginners, part-time teachers, graduate students, faculty wives, and various fringe people...." Being a fringe sort of person myself -- a former administrator, adjunct, and perpetual faculty wife -- I can see his point. But it was not always so.
Up until the beginning of the 20th century, rhetoric was the most important course of study for young men who wanted to get ahead in the world. In Classical Greece, it was the only one. In the agora, if you found yourself a good sophist, you were a made man. So what if being rhetorically trained and well-spoken disqualified you from becoming Plato's philosopher-king? Plato was telling a morally edifying fairy tale for a mundus imaginalis, while the sophists were teaching Athenians to communicate effectively with fellow citizens in the real world.
But at top universities, Plato's view of rhetoric has won out, and not simply as a result of a kind of puritanical suspicion of smooth talking. About rhetoric's fall from grace, Weaver argues that the elevation of science as a mode of thought is significant. It would seem that rhetoric, with its focus on probability, has been the victim of the irresistible charm and glamour of the scientific method. Weaver also argues that in our relations with other human beings, to appeal only to logic, as science would have us do, is to appeal to only part of a human being. Placing such a limit on intellectual inquiry and communication ignores important complexities. He points out that the rhetorician addresses "historical man," a person experiencing the stream of history and the political and moral exigencies history presents and the choices these exigencies require.
Literature, of course, does the same thing, but in a more attenuated way. In teaching a person how to communicate with other persons, practical rhetoric inculcates along with appreciation of human complexity those devoutly worshiped "critical thinking skills." A discipline steeped in human complexity and teaching the skills to deal with convoluted layers of human experience would seem to fit very well within the traditional province of the humanities.
Unfortunately, as Weaver has pointed out, appreciating human complexity means exploring human emotion. And this kind of exploration has been a problem for an academy wed to science. So, along with its unusually modest goal of deliberative probability instead of scientific certainty, rhetoric's teaching of emotional appeals along with logical and ethical ones has seriously undermined academic confidence in the discipline. This rejection of emotion in persuasion by the academic top-tier is probably priggish and short-sighted. Anyone who uses language to persuade knows that it is impossible to fully engage others in an argument without using emotion. Considering emotional appeals to be simply matters of superficial style rather than of argumentative substance is to fail to appreciate rhetoric as a fully humane discipline.
Given the humanity and practicality of rhetoric, it is interesting to observe how the discipline has fared vis-à-vis literary studies. The downward trajectory of rhetoric's academic standing is the exact opposite of the fate of its academic cousin -- "literary studies" -- which in rhetoric's heyday was, as Weaver points out, the domain of intellectual plebeians, those faculty wives and other marginal types. Now literature departments are so intellectually lofty that to offer completely non-instrumental instruction is a badge of honor, while to teach something for use in the marketplace, something not solely for the sake of pure, inapplicable knowledge, is to be intellectually despoiled.
It is no wonder, then, that Harvard does not teach public speaking. As Emily Nelson writing in The Harvard Crimson notes, "A quick browse through the Courses of Instruction will yield classes on topics as specific as medieval Welsh literature and the theory of the individual in Chinese literary culture. However, even a thorough search would not reveal the words 'Public Speaking' in any course title."
Granted, the Harvard College Committee on Curricular Review recommended in 2004 that the college's writing program be subjected to review and that those supervising instruction in college majors ensure that "instruction and feedback on written and oral communication [are] an integral part of the concentration program." If the committee reviewing instruction in oral and written communication finds a need for public speaking at Harvard, and if the college then does not ignore the recommendation, Harvard would stand alone in the Ivy League in offering a separate course in public speaking to liberal arts students.
Engineering programs, on the other hand, do widely offer and require that students demonstrate competence in oral argument. This is the case in the Ivy League and in the top tier of public universities. While such universities as Michigan, Virginia, and Berkeley do not offer courses in rhetoric to their liberal arts majors, their engineering and business students generally are required to take courses in rhetoric, discrete offerings available only to engineering and business students.
Why are liberal arts students denied this resource? Provosts explain that liberal arts majors receive ample opportunity to hone skills in oral reasoning by means of class discussion. However, given class sizes at public universities, and the fact that not every student is inclined to speak out in class, this rationale seems overly optimistic.
So, Harvard and Berkeley (which, oddly, has its own Department of Rhetoric) do not teach liberal arts majors public speaking skills. On the other hand, rhetoric -- in its most common form, public speaking -- is taught all over the country. You would be hard-pressed to find a land grant university or community college that did not offer public speaking to its students, who enroll in these courses in large numbers.
Top-tier rejection of rhetorical instruction, especially in the form of public speaking, seems to be about fundamental failures of undergraduate education in general and about failures of the humanities in particular. It is especially curious that in the face of calls for accountability in regard to student learning, public universities have opted out of providing students with some very useful knowledge, while also failing to recognize the value of the discipline to humane studies.
The Association of American Universities may call for "reinvigorating the humanities," and the joint conference of the American Council of Learned Societies and the AAU may express the intention "to develop a shared agenda for raising the profile of the humanities inside and outside of academia," but criticism of the status quo is stifled by reassuring boilerplate about the "vigor" of the humanities in today's higher education. Case Western Reserve University's then-president, Edward Hundert announced at the conference that the humanities are in great shape except "when it comes to funding, when it comes to new ways of harnessing information technology for new kinds of research and new collaborative paradigms for that research, and in communicating a more coherent message so that the humanities might gain more visibility, public support, prestige, and funding both within the university and society at large." Perhaps before issuing reports and convening conferences about the status of the humanities, someone should pick up a copy of Aristotle's Rhetoric.
Margaret Gutman Klosko formerly taught public speaking at the University of Virginia and at Piedmont Virginia Community College. She is a freelance writer based in Charlottesville.
As I write this essay, I am without my computer, a laptop I use every day for work and countless other activities. It belongs solely to me, and is my own responsibility. I adore the machine in a way that might be viewed as sinful by some. Certainly, I have an attachment, as Buddhists might say, to this appliance. My life and work are to a certain extent invested in this device, and suddenly it began to fail: screen flickering, hard drive whirring and clacking. As it did so, I too began to flicker and whir in mounting panic, "ohnoohnoohno" escaping my lips. All my files flashed before my eyes, and the work on which I had been progressing steadily, if slowly -- an edition I'm preparing, essays short and long, everything I've written in four years, and all my teaching files -- seemed in a dire predicament. Hyperventilating, I staved off panic long enough to burn a CD when the computer stopped its antics briefly. I worked with the same speed I'd used to get to the basement with my cats and dogs one year when I lived in Kansas and the tornado sirens blared at 2 a.m. Panic can be a swift motivator. Just as the CD finished, the screen went black as the drive continued to clatter. Oh no.
The computer did not die, but after several queries, I learned I would have to return my precious laptop to the mothership for repair. I would be without it for a week or more. Only a week, but my work was progressing so well. I'd just achieved a helpful rhythm in my daily schedule. I feared the torpor a week might trigger, particularly a week filled with worry and self-pity, cut off from my habitual writing process. The computer was how I wrote! My attachment was dangerous, as it now threatened to derail my progress on numerous projects, not the least of which was a manuscript I hoped to finish editing by the end of August. Losing time seemed impossible. I had to push through and figure out a way to adapt, to change the way I was working. I had to challenge my fixed thinking about my writing.
As my initial panic subsided, I saw what I could do alone, independent of my favorite technology. I printed my edition manuscript and pulled out pencils. I found the notebooks I'd kept several years ago and reviewed what I had written by hand using the old technology I reserve for writing in my journals: my Waterman fountain pen. It came as a shock to find that I had forgotten how effective writing this way could be. Used to the exciting tools on the computer -- the thesaurus, the word count -- I'd forgotten that I had written the entire preface of the edition in longhand well before transferring the draft to my computer. It's not that I never used my pen or the notebooks I cached. I have, in fact, delighted in the sensual pleasures of the flowing ink and the lovely Japanese paper that fills the notebooks. That paper does provide, as the cover proclaims, "most advanced quality" and "gives best writing features." I love the way the ink works with this paper, but I usually reserve that pleasure for my journal writing, preferring the illusion of speed in my other work. "Work" proceeds more effectively on the computer, or so I told myself.
In my break from computer assistance, I discovered a new truth: writing by hand can make my thinking go faster. This was a jolt to my fixed ideas indeed. As I developed my working life, my writing process, and my consciousness around my adored computer, I had ignored several strategies that worked as well or better to enhance my work. Though I was forced to adapt because of my loss, the change in my own perceptions of my writing process was dramatic and refreshing. Instead of stagnating, I transformed my thinking. Instead of falling into inertia, I pursued my work, developing new energies as I did so. I discovered I could write 500 words in one hour.
Perhaps my computer had become more of a taskmaster than I imagined. Unlike the singular relationship pen has to paper, my computer holds all my tasks, so when I open the desktop's folders, my attention remains divided among the projects I must sort through before starting on the one I choose. Putting pen to paper isolates the task at hand to the plain work of putting words on paper. Plain like Jane Eyre, without adornment, straightforward. My computer had become Blanche Ingram, right down to her alabaster skin.
What unnerved me at the onset of my computer failure was not only the loss of files and the dread of expensive repairs, but also the fear that my writing could only proceed in one way: through the habits I had become used to. Of course, those habits weren't working particularly well, but they were familiar, comfortable, and had worked well enough before. At the moment of change, my first response was to seize the fleeing past. I heard echoed in my panic the same excuses, reasons, and rationales I've heard from first-year students on the cusp of change. Desperate to use what had worked before, they stand on the shifting terrain of their transition to college life, hoping that what they knew will work again in these different circumstances, afraid that it won't but not knowing what might. This time, I had to say to myself "I am asking you to work differently." I needed to release my attachment not only to the computer, but also to the idea of my habitual routine, to unfix my thinking. The computer is, after all, a tool of the mind, not the mind itself, at least not yet.
I did press on, and I accomplished much in the days my computer was gone, working with change, and developing new strategies for writing. It has come home now, recovered, repaired, working normally, but our relationship has changed. When the box was delivered, I did not rush to unpack it. I was in the midst of writing this essay, after all, and the computer could wait. Once I reached a stopping place, I opened the computer and began reclaiming my territory. The hard drive had been wiped clean, so I began reinstalling software. Everything that made the machine my personal computer had vanished in the repair, from the notification sounds to the desktop photo. Small things were amiss, like default fonts and colors. It will take time to rebuild what was important and leave what was gone behind. And, of course, I did eventually type my handwritten essay into the computer. I'm not so foolish as to think I won't grow dependent again on this wonderful device, but at least I've learned that a change of habit can trigger more than panic, and may yet lead to new discoveries.
As a teacher of writing and literature at Salem State College, I hear a lot of stories. My students, although they may never have ventured more than 20 miles from where they were born, bring hard lessons of endurance to the classroom that seem more profound than any I'd had at their age. For years I've believed that they bring a certain wisdom to the class, a wisdom that doesn't score on the SAT or other standardized tests. The old teaching cliché -- I learn from my students -- feels true, but it is hard to explain. I'm not particularly naïve. I know that life can be difficult. So it is not that my students initiate me into the world of sorrow. It is that they often bring their sorrows, and their struggles, to the material, and when they do, it makes life and literature seem so entwined as to be inseparable.
This past year, for the first time, I taught African American literature: two sections each semester of a yearlong sequence, around 22 students per section. The first semester we began with Phillis Wheatley and ended with the Harlem Renaissance. The second semester we started with Zora Neale Hurston and Richard Wright and ended with Percival Everett's satire, Erasure, published early in the new millennium.
The students in these classes weren't the ones I typically had in my writing classes. About half were white, and the other half were black, Latino, or Asian. They were generally uninterested or inexperienced in reading, simply trying to satisfy the college's literature requirement. One day before spring break I was assigning the class a hundred pages from Toni Morrison's Sula, and one student looked aghast. "We have to read during vacation?" he sputtered. I learned from them the whole year.
In the fall semester, I was teaching W. E. B. Du Bois's The Souls of Black Folk. As classes go, it had been fairly dull. Du Bois's essays didn't have the compelling story line of the slave narratives that we had read earlier in the semester. We had just begun examining Du Bois's idea of "double consciousness." It is a complicated notion that an African American, at least around 1900 when Du Bois was writing, had "no true self-consciousness" because he was "always looking at one's self through the eyes of others ... measuring one's soul by the tape of a world that looks on in amused contempt and pity." In class, I read this definition, paraphrased it, then asked, "Does this make sense to you?"
There was the usual pause after I asked a question and then, from Omar, a large, seemingly lethargic African American, came a soulful, deep-throated "yeah." The word reverberated in the haphazard circle of desks as we registered the depths from which he had spoken. The room's silence after his "yeah" was not the bored silence that had preceded it. The air was charged. Someone had actually meant something he had said. Someone was talking about his own life, even if it was only one word.
I followed up: "So what do you do about this feeling? How do you deal with it?"
Everyone was staring at Omar, but he didn't seem to notice. He looked at me a second, then put his head down and shook it, slowly, as if seeing and thinking were too much for him. "I don't know, man. I don't know."
The rest of the heads in class dropped down, too, and students began reviewing the passage, which was no longer just a bunch of incomprehensible words by some long-dead guy with too many initials.
With every book that we studied after that day, some student would bring up double consciousness, incorporating it smartly into our discussion. Omar had branded the concept into everyone's minds, including mine.
One idea that arises from double consciousness is that, without "true self-consciousness," you risk giving in and accepting society's definitions of yourself, becoming what society tells you that you are. Such a capitulation may be what happens to Bigger Thomas, the protagonist of Richard Wright's Native Son, a novel we read during the second semester. Native Son is a brutal book. Bigger, a poor African American from the Chicago ghetto, shows little regret after he murders two women. His first victim is Mary, the daughter of a wealthy white family for whom Bigger works as a driver. After Bigger carries a drunk, semiconscious Mary up to her room, he accidentally suffocates her with a pillow while trying to keep her quiet so his presence won't be discovered. Realizing what he has done, he hacks up her body and throws it in the furnace. Emboldened rather than horrified, he writes a ransom note to the family and eventually kills his girlfriend, Bessie, whom he drags into the scheme. In the end, he's found out, and, after Chicago is thrown into a hysterical, racially charged panic, he's caught, brought to trial -- a very long trial that contains a communist lawyer's exhaustive defense of Bigger that is an indictment of capitalism and racism -- and sentenced to death.
Readers, to this day, are not sure what to make of Bigger. Is he to be pitied? Is he a warning? A symbol? A product of American racism?
During the second week of teaching Native Son, I was walking through the college's athletic facility when I heard my name, "Mr. Scrimgeour. Mr. Scrimgeour..."
I turn and it is Keith, an African American from the class. "Hey, I wanted to tell you, I'm sorry."
"Sorry?" He has missed a few classes, but no more than most students. Maybe he hasn't turned in his last response paper.
"Yeah, I'm going to talk in class more." I nod. He looks at me as if I'm not following. "Like Bigger, I don't know.... I don't like it." His white baseball cap casts a shadow over his face so that I can barely see his eyes.
"What don't you like?"
"He's, like," Keith grimaces, as if he isn't sure that he should say what he is about to say. "He's like a stereotype -- he's like what people -- some people -- say about us."
On "us," he points to his chest, takes a step back, and gives a pained half grin, his teeth a bright contrast to his dark, nearly black skin.
"Yeah," I say. "That's understandable. You should bring that up in the next class. We'll see what other people think."
He nods. "And I'm sorry," he says, taking another step back, "It's just that...." He taps his chest again, "I'm shy."
Keith has trouble forming complete sentences when he writes. I don't doubt that my fourth-grade son can write with fewer grammatical errors. Yet he had identified the criticism of Wright's book made by such writers as James Baldwin and David Bradley, whose essays on Native Son we would read after we finished the novel. And he knew something serious was at stake -- his life -- that chest, and what was inside it, that he'd tapped so expressively. Was Bigger what Baldwin identified as the "inverse" of the saccharine Uncle Tom stereotype? Was Wright denying Bigger humanity? And, if so, should we be reading the book?
To begin answering these questions required an understanding of Bigger. For me, such an understanding would come not just from the text, but from my students' own lives.
That Keith apologized for his lack of participation in class is not surprising. My students are generally apologetic. "I'm so ashamed," one student said to me, explaining why she didn't get a phone message I'd left her. "I live in a shelter with my daughter." Many of them feel a sense of guilt for who they are, a sense that whatever went wrong must be their fault. These feelings, while often debilitating, enable my students, even Keith, to understand Bigger, perhaps better than most critics. Keith, who -- at my prompting -- spoke in class about being pulled over by the police, understood the accumulation of guilt that makes you certain that what you are doing, and what you will do, is wrong. Bigger says he knew he was going to murder someone long before he actually does, that it was as if he had already murdered.
Unlike his critics, Richard Wright had an unrelentingly negative upbringing. As he details in his autobiography, Black Boy, Wright was raised in poverty by a family that discouraged books in the violently racist South. There was little, if anything, that was sustaining or nurturing. Perhaps one has to have this sense of worthlessness ground into one's life to conceive of a character like Bigger. Like my students, one must be told often enough that one isn't much, so that it is not simply an insult, but a seemingly intractable truth.
"I'm sorry," Keith had said. It was something Bigger could never really bring himself to say, and in this sense the Salem State students were much different from Bigger. Their response to society's intimidation isn't Bigger's rebelliousness. Wright documents Bigger's sense of discomfort in most social interactions, particularly when speaking with whites, during which he is rendered virtually mute, stumbling through "yes, sirs" and loathing both himself and the whites while doing so.
Although my students weren't violent, they identified with Bigger's discomfort -- they'd experienced similar, less extreme discomforts talking to teachers, policemen, and other authority figures. As a way into discussing Bigger, I'd asked them to write for a few minutes in class about a time in which they felt uncomfortable and how they had responded to the situation. I joined them in the exercise. Here's what I wrote:
As a teenager, after school, I would go with a few other guys and smoke pot in the parking lot of the local supermarket, then go into the market's foyer and play video games stoned. While I felt uncomfortable about smoking pot in the parking lot, I didn't really do much. I tried to urge the guys I was with to leave the car and go inside and play the video games, but it wouldn't mean the same thing: to just go in and play the games would be childish, uncool, but to do it after smoking pot made it OK -- and once I was in the foyer, it was OK; I wouldn't get in trouble. But mostly I did nothing to stop us. I toked, like everyone else. I got quiet. I didn't really hear the jokes, but forced laughter anyway. I was very attentive to my surroundings -- was that lady walking out with the grocery cart looking at us? Afterward, when we went in and manipulated those electronic pulses of light and laughed at our failures, we weren't just laughing at our failures, we were laughing at what we had gotten away with.
After they had worked in groups, comparing their own experiences to Bigger's, I shared my own writing with the class. Of course, there were smiles, as well as a few looks of astonishment and approbation. I had weighed whether to confess to my "crime," and determined that it might lead to learning, as self-disclosure can sometimes do, and so here I was, hanging my former self out on a laundry line for their inspection.
What came of the discussion was, first of all, how noticeable the differences were between my experience and Bigger's. I was a middle-class white boy who assumed he would be going to college. I believed I had a lot to lose from being caught, while Bigger, trapped in a life of poverty, may not have felt such risks. Also, the discomfort I was feeling came from peer pressure, rather than from the dominant power structure. Indeed, my discomfort arose from the fact that I was breaking the rules, whereas Bigger's arose from trying to follow the rules -- how he was supposed to act around whites.
But there was also a curious similarity between my experience and Bigger's. Playing those video games would have meant something different had we not smoked pot beforehand. The joy of wasting an afternoon dropping quarters into Galaga was about knowing that we had put one over on the authorities; it was about the thrill of getting away with something, of believing, for at least a brief time, that we were immune to society's rules. Like me after I was safely in the supermarket, Bigger, upon seeing that he could get away with killing Mary, felt "a queer sense of power," and believed that he was "living, truly and deeply." In a powerless life, Bigger had finally tasted the possibility of power.
My students know Bigger moderately well. They don't have his violent streak; they don't know his feelings of being an outsider, estranged from family and community despite hanging out with his cronies in the pool hall and being wept over by his mother.
What they understand is his sense of powerlessness. They have never been told that they can be players on the world stage, and, mostly, their lives tell them that they can't, whether it's the boss who (they think) won't give them one night off a semester to go to a poetry reading, or the anonymous authority of the educational bureaucracy that tells them that due to a missed payment, or deadline, they are no longer enrolled. As one student writes in his midterm: "Bigger is an African American man living in a world where who he is and what he does doesn't matter, and in his mind never will."
I went to a talk recently by an elderly man who had worked for the CIA for 30 years, an engineer involved with nuclear submarines who engaged in the cloak-and-dagger of the cold war. The layers of secrecy astonish. How much was going on under the surface! -- the trailing and salvaging of nuclear subs; the alerts in which cities and nations were held over the abyss in the trembling fingers of men as lost as the rest of us, though they generally did not realize it.
During the questions afterward, someone asked about the massive buildup of nuclear arsenals. "Didn't anyone look at these thousands of nuclear warheads we were making and say 'This is crazy?' "
The speaker nodded, his bald freckled head moving slowly. He took a deep breath. "It was crazy, but when you are in the middle of it, it is hard to see. No one said anything."
After the talk, I fell into conversation with the speaker's son, a psychologist in training. I was noting how tremendously distant this world of espionage was from the world of my students, how alien it was. And I said that the stories of near nuclear annihilation frightened me a lot more than they would frighten them. In essence, my students saw their lives like Bigger's: The great world of money and power was uninterested in them and moved in its ways regardless of what they did. Like Bigger, they would never fly the airplanes that he, who had once dreamed of being a pilot, watches passing over the Chicago ghetto.
"It's too bad they feel so disempowered," the son said, and it is. Yet there is something valuable in their psychology, too. It is liberating to let that world -- money and power -- go, to be able to see the outlines of your existence, so that you can begin to observe, and know, and ultimately make an acceptable marriage with your life. Some might say it is the first step to becoming a writer.
After September 11, 2001, a surprising number of students didn't exhibit the depth of horror that I had witnessed others display on television. "I'm sorry if I sound cold," one student said, "but that has nothing to do with me." One of my most talented students even wrote in an essay, "The war has nothing to do with my life. I mean the blood and the death disgusts me, but I'm sorry -- I just don't care."
And then I watched them realize how it did indeed have to do with them. It meant that they lost their jobs at the airport, or they got called up and sent to Afghanistan or Iraq. The world doesn't let you escape that easily. Bigger got the chair.
It has been two months since we finished Native Son. The school year is ending, and I rush to class, a bit late, trying to decide whether to cancel it so that I can have lunch with a job candidate -- we're hiring someone in multicultural literature, and I'm on the search committee. As I make my way over, I feel the tug of obligation -- my students would benefit from a discussion of the ending of Percival Everett's Erasure, even though, or perhaps especially because, almost none of them have read it. Yet it's a fine spring day, a Friday, and they will not be interested in being in class, regardless of what I pull out of my teaching bag of tricks. I weigh the options -- dull class for everyone or the guilt of canceling a class (despite the department chair's suggestion that I cancel it). Before I enter the room, I'm still not quite sure, but I'm leaning toward canceling. I take a deep breath and then breathe out, exhaling my guilt into the tiled hallway.
I open the door; the students are mostly there, sitting in a circle, as usual. Only a few are talking. I walk toward the board, and -- I freeze -- scrawled across it is:
Why are we even here for? You already gave us the final. It's not like you're going to help us answer it.
Looking at it now, I think the underline was a nice touch, but at that moment, for a rage-filled second, I think, "We're going to have class, dammit! Make them suffer." I stand with my back to them, slowing my breath, my options zipping through my mind while sorrow (despair?) and anger bubble in me and pop, pop into the afternoon's clear light.
So much for learning. Were our conversations simply for grades? Was that the real story of this year?
When we discussed Native Son, we talked about how easy it was to transfer feelings of guilt to rage at those who make you feel guilty. Bigger's hatred of whites stems from how they make him feel. He pulls a gun and threatens Mary's boyfriend, Jan, when Jan is trying to help him, because Jan has made him feel he has done wrong. In the book, Wright suggests that white society loathes blacks because they are reminders of the great sin of slavery. Is my rage from guilt -- guilt that we haven't really accomplished much this year, guilt that I was willing to cancel a class because I didn't want to endure 45 minutes of bored faces? Pop ... pop.
I dismiss the class and stroll over to the dining commons to collect my free lunch.
Erasure is a brilliant satire, one that contains an entire novella by the book's protagonist, a frustrated African American writer, Monk Ellison, who has been told one too many times by editors that his writings aren't "black enough." The novel within a novel lifts the plot of Native Son almost completely, and it presents a main character, Van Go Jenkins, as the worst stereotype of African American culture, someone without morals, whose only interests are sex and violence. At one point, Van Go slaps one of his sons around -- he has four children by four different women -- because the mentally handicapped three-year-old spilled juice on Van Go's new shirt.
It's clear that Erasure's narrator, Monk, is appalled by the book he writes, and that he's appalled by Native Son and the attitudes about race and writing the novel has fostered. When we do discuss the book in class, I point to a snippet of dialogue that Monk imagines:
D.W. GRIFFITH: I like your book very much.
RICHARD WRIGHT: Thank you.
"So this is a real question Erasure raises," I say. My pulse quickens. I can sense them listening, waiting. "Is this book right about Richard Wright? Is this book fair to him? To Native Son? Has the creation of Bigger Thomas been a disaster for African Americans? Has it skewed the country's view of race in a harmful way?" I pause, content. Even if no one raises a hand, even if no discussion ensues -- and certainly some discussion will erupt -- I can see the question worming into their minds, a question that they might even try to answer themselves.
La Sauna, the student who never lets me get away with anything, raises her hand: "What do you think?"
What do I think? I wasn't ready for that. What do I think?
What I think, I realize, has been altered by what they think, and what they have taught me about the book, about the world.
There are no definite answers, but my students had helped identify the questions, and had pointed toward possible replies. After we had finished reading Native Son, I asked the class, "How many of you want Bigger to get away, even after he bashes in Bessie's head?" A good third of the class raised their hands, and, like the class itself, those who wanted this double murderer to escape were a mix of men and women, blacks and whites. There are several ways to interpret this, but I don't think it is a sign of callousness, the residue of playing too much Grand Theft Auto. They wanted Bigger to escape because Wright had gotten into Bigger's consciousness deeply and believably enough that he became real, more than a symbol or a stereotype.
I tell them this, how their response to Bigger has influenced my reading. I don't tell them Gina's story.
Gina was one of the students who read the books. She loved Tea Cake and Sula, was torn between Martin Luther King Jr. and Malcolm X. She even visited me in my office once or twice to seek advice about problems with a roommate, or a professor. An African American student from a rough neighborhood, she ended up leaving the college after the semester ended, unable to afford housing costs.
Sometime in March of that semester, Gina came to my office. She had missed class and wanted to turn in her response paper on Native Son. The class had read the essays by Baldwin and Bradley criticizing the novel, and had been asked to evaluate them. Baldwin, Gina tells me, was difficult, "but he was such a good writer."
Does she agree with Baldwin, I ask? Is Bigger denied humanity by Wright? How does she feel toward Bigger?
"I think he needs help," she says, "but I felt sorry for him. I wanted him to be able to understand his life--" I cut in, offering some teacherish observation about how Bigger shows glimmers of understanding in the last part of the book, but her mind is far ahead of me, just waiting for me to stop. I do.
"The book reminded me of the guy who killed my uncle. You probably saw it -- the trial was all over the TV last week."
I shake my head.
The man and an accomplice had murdered her uncle, a local storeowner, three years ago, and the previous week had been sentenced to life without parole. The two had been friends of the uncle's family, had played pool with the uncle the night before, planning to rob and kill him the next day.
"When I saw him sitting there, with his head down, looking all sad, I don't know, I felt sorry for him. I wanted to give him a copy of Native Son. I wanted to walk up to him and put it in his lap. It might help him to understand his life."
She looks at me, her brown face just a few shades darker than mine. She's 19. Her hair is pinned back, and some strands float loose. Her eyes are as wide as half dollars, as if she's asking me something. Without thinking, I nod slowly, trying to hold her gaze. On the shelves surrounding us are the papers and books of my profession, the giant horde that will pursue me until I die.
"My family wants him to suffer -- hard. But I want to talk to him. Do you think that's bad? I want to know why he did it, what happened. I wonder how he'd react if he saw me -- what he'd do if I gave him the book."
I imagined Native Son in the man's lap. The glossy, purple, green, and black cover bright against the courtroom's muted wood, the man's trousers. His hand, smooth with youth, holds its spine. His thumb blots out part of the eerie full-lipped face on the front. As the words of the court fall about him, the book rises and falls ever so slightly, as if breathing.
J.D. Scrimgeour coordinates the creative program at Salem State College and is the author of the poetry collection The Last Miles. This essay is part of his new collection, Themes for English B: A Professor's Education In and Out of Class, which is being released today by the University of Georgia Press and is reprinted here with permission.
Those of us in the humanities were reminded recently of our place in the universe. Here's the deal: When space was handed out, we were out having coffee and lost our place in line to ... wandering cognitive scientists. But the coffee was good and gave us a chance to ponder yet again what we thought were the very serious questions: Was Heidegger a Nazi? Was Manet an Impressionist or was that Monet? Is the universe -- oops, the university -- in ruins? We learned on August 24, however, that a decision of importance to those interested in knowledge in general was made without our input and that -- on top of it all -- this decision involved shrinking the available space in the university -- oops, the universe -- allotted to humanistic endeavors. Is this gerrymandering? You bet. And Pluto's out. We're down from nine to eight in our naming rights, and that's what humanists do -- we name things.
I was prepared to research this decision. Before going to Belize last summer, I had taken my daughter, Lucy, to the Kennedy Space Center and we had bought a book about space. Find the Constellations was written by H.A. Rey and first published in 1954. You might remember that H.A. Rey was the illustrator of the Curious George series, which his wife, Margret Rey, penned. In fact, one of the Curious George stories has George blast off into space ( Curious George Gets a Medal). H.A. was an amateur astronomer, so he wrote and illustrated this guide to the wandering planets for children. Here's what it says in the index, under "Pluto": "Planet Pluto discovered as recently as 1930." Here is how the planet is described: "Ninth, and so far, last of the planets; 3,700 million miles from Sun; only about 1,400 miles across. One moon. His trip around Sun takes almost 250 Earth-years. Don't go there unless you are equipped to stand a cold of about 400 degrees below." You don't have to be a literary critic to see that Rey was promising ("so far") that even more planets would eventually be discovered.
I needed to check more sources, so I consulted Lucy's bookshelf. Here's what 1,001 Facts About Space, published in 2002, has to say about Pluto: "The most distant of all the planets, Pluto is the least understood." And here's what Dogs in Space (1993), by Nancy Coffelt, has to say: "There is very little light on Pluto. Dogs in space are far from the Sun. They are very near the edge of the Solar System, where it is cold and dark and lonely." Coffelt claims that dogs like it in space because there are no cats there -- but the dogs cramped on Pluto don't look too happy. I had learned that Pluto was small, dark, cold, lonely, and misunderstood. Was this why scientists were cavalierly jettisoning it?
I turned to the Internet and found that the body of scientists responsible for the momentous decision is called the International Astronomical Union. This organization has 8,858 members. The Modern Language Association, by contrast, has "over" 30,000 members (notice which number is more precise). Clearly, democracy was not at work. I read the IAU resolutions that were passed in August. They apparently come from what is known as the "Planet Definition Committee" (I'm not kidding). Resolution 6A creates a "new class of objects for which Pluto is the prototype" and which are called (see Resolution 6B) "plutonian objects." We are told that astronomers chose the term "plutonian" instead of "pluton" after checking with geologists. I also found out that "plutonians" are really just a sub-category of "trans-Neptunian objects." In a footnote, we read that "An IAU process will be established to assign borderline objects into either dwarf planets and [sic?] other categories." How can I get elected to the borderline objects committee?
Just when you, as a humanist, think you are finished playing with language for the day, you discover another doozy. The IAU states that one of the main reasons for no longer considering Pluto a "classical" planet but a "dwarf" planet is that Pluto "has not cleared the neighborhood of its orbit." WOW. Neighborhood -- are we talking community here? What does Pluto need to clear its neighborhood of, exactly? I see racial overtones looming.
My colleagues in Classics are pissed, with Latinists especially up in arms. You've got the Sun at one end of the solar system -- is anyone going to mess with the designation of "the" Sun as a sun? -- so it was proper, even poetic, to have H-E-double-toothpicks at the other end, where it's really dark and lonely and cold (pace the idea of burning in Hades). Now we have, hmmm ... the Sun (let's rename it Apollo) at one end and Uranus?, Neptune? -- who can really keep them straight? -- at the other. What's next? Should we replace all the names of planets or pieces of rock out there with numbers and rely on the mathematicians to keep track of them? Are the nice stories of lions and tigers and bears going to Pluto in a hand-basket also? On that note, should we pretend there is really no hell and that our parents invented it so we would do our homework?
When I was a graduate student in the humanities I had a boyfriend who was in physics. I was pretty proud to be dating a "theorist," a word that all of us would-be literary theorists liked to say as often as possible. He was odd in a good way with some odd-in-a-bad-way friends, and even though it didn't work out I've always had a crush on the discipline of physics. I can report, however, that he once told me very seriously that his professors believed they were the ones answering "the big questions" and that he had bought into this. In other words, anyone else's questions just weren't as big. Even history's. Even philosophy's. Being in a humanistic discipline that doesn't aspire to such heights, I marveled at the chutzpah. In any event, I think this goes a long way toward explaining why Pluto has suddenly been cut down to size: physicists and astronomers don't just want to reserve the big questions (Where do we come from? What are we? What's going on? -- to quote Gauguin, or maybe Joyce Carol Oates) for themselves; they also want to demote celestial bodies. The universe is a big chess game and someone's got to move the pieces, they imagine. (You have probably noticed that all physicists play chess.)
Finally, as a baby-boomer -- and therefore as a tenured radical -- I bring, along with my humanities baby-boomer colleagues, a perspective on Pluto's demise that may be traced to German Romanticism and all that crying over the ruins of Greece and Rome: I loved Pluto. I loved having nine planets because I could then divide them into threes. This was not only a good mnemonic device, it looked pretty. Dividing eight into fours or twos does not come naturally. I also liked the recognition of the outsider, the little guy, the underdog. As children, we liked the fact that Pluto was always dark and always cold, like the spooky closet in our rooms. No matter how many times we mixed up Jupiter and Neptune and Mercury and Saturn, we knew that Pluto was there, at the end of the line, the caboose of the solar system. I know many people of my generation who would much rather have seen a man walk on Pluto than on the Moon, even if it took him 2,000 light years to get there and even if he never came back.
Other recent decisions in the scientific community have also been pushed through committees without the input of the humanities. As everyone knows, any bona fide humanist reads The New Yorker. The bona fide among you will recall a recent article in that magazine on the "Fields Medal," the big shot medal in mathematics (we thought it was the Nobel Prize -- wrong again). According to The New Yorker, this Fields Medal business could lead to increased global warming, as Russian and Chinese scholars duke it out. (By the way, the Russian guy, who lives with his mother and has no friends, sounds suspiciously like a humanist). I am not saying that if someone from, say, modern languages and literatures had been on the committee that world peace would be ensured; I am saying that that person could have communicated in the native tongues to help sort out misunderstandings -- translation is, after all, just another way of naming things.
There's another science decision that has a human aspect, but about which we have been, again, not consulted. I refer to President Bush's insane desire to get a man back on the Moon by the end of the decade and (presumably) a different guy on Mars by the end of some other decade. I know a bit about this controversy and here's what I've been able to gather: Bush is a humanist; most scientists aren't. Hmmm ... make that Bush is a media hog, most scientists aren't. I've read a lot about the history of humans going into space and I know that the friction between scientists who want to do science in space and guys who want to do road trips there has been around at least since Eisenhower. Scientists, in other words, want to learn about space; the other guys want to go there. It's kind of like Galileo and Newton debating Lewis and Clark. Now, if I truly believed that sending a guy to the Moon and to Mars would actually yield something -- say, the discovery of a lost Munch painting or the Holy Grail (to get Dan Brown off our humanist backs) -- then I might be all for it. What we do know scientists will find there, however, is in the end excruciatingly boring: sand, dust, rocks, evidence that a bazillion years ago there was water, rocks, Jesus' face on the side of a cliff, more rocks. And although some of the snippets thought up by the Apollo astronauts to describe their experiences on the Moon could be termed poetic -- "It was so empty, man" -- most showed no sign of poetic impulse, or even a poetic pulse -- "My wonker stings, too, man." If they'd send humanists to the Moon it might be a different story, but they won't. They haven't even sent a woman or a person of color. When NASA had the chance to send an old person to space they sent Glenn and he had already been there! Hello? Or should I say Hell-o?
I'd like to end with Georges Méliès, who started the whole "film the Moon" craze. Méliès was a wonderful silent film director and he was French. That gave him all kinds of license. He made two short films that are of interest here: A Trip to the Moon (1902) and The Eclipse (1907). In the latter film, the Sun (a woman) and the Moon (a man) flirt with each other to the point of undergoing some kind of climax, that is, eclipse. It's pretty racy. In A Trip to the Moon, a fat rocket catapults into the cheesy Man in the Moon, and this is a good scene for teaching students the phrase "phallic symbol." W.E.B. Du Bois is famous for having written in the early 1900s that the question of the century would be the color line; Méliès revealed the second major question, the goings-on on the Moon. Some would have it that in the 21st century we are past the color line; they are, unfortunately, wrong. Others would like to believe we are done with the Moon; they are, unfortunately, wrong. But we do seem to be done with the nine-planet consortium.
Returning to nomenclature, I wonder the following: Can we take the name Pluto and give it to the Moon? Other planets' moons have names -- why can't ours? Or how about Charon? That was the name of Pluto's moon, but since Pluto is no longer a planet Charon has been recategorized as a "satellite" of Pluto. Can I get on the committee that decides these things? Who's on the committee on committees for the IAU? Will this count as "professional service"? Will I get a boost in salary?
Not in this universe -- oops, university.
Fleur LaDouleur is the pseudonym of a professor of humanities at a Midwestern university.
Submitted by Emily Toth on September 22, 2006 - 4:00am
My students like to take it on the road.
At Penn State, they used to take it to the laundromat. While their undies thrashed about and everyone around them was struggling with Principles of Accounting and Introduction to Physics, my students would be turning page after page -- laughing, crying, panting.
They still do, though now most of my Louisiana State University students also have full-time jobs. They take it with them to offices, beauty parlors, and fast food joints -- and they leave their greasy thumbprints on the best pages.
They even take it home and read it in front of their parents -- something my generation never did.
We knew we weren't supposed to read Peyton Place in front of our parents.
Grace Metalious's novel, 50 years old this month, is still a byword for lusty secrets. When South Carolina Congressman Lindsey Graham opened the impeachment hearings against Bill Clinton in 1998, he demanded, "Is this Watergate or Peyton Place?"
My students ask about Watergate -- but they know about Peyton Place.
My generation was the last one that could be ruined by a book, I tell them. Peyton Place was supposed to be the agent of our corruption. From the moment we heard that it had cost the author's husband his job ("Teacher Fired for Wife's Book") -- we were on it. Teens hungered for it. Once it came out in paperback -- not routine in those days -- we hot-footed it down to the drugstore by ourselves and purchased the small book with the black and yellow cover.
When I talked about it on "20/20" in 1981, for the book's 25th anniversary, the cameraman told me he'd sewn a special little pocket inside his leather jacket just to hold his copy of Peyton Place.
Imagine young men doing that now.
Peyton Place has always been in schools, of course. It was confiscated from school lockers everywhere. We passed it around in study hall, snickering and puzzling over the backyard scene in which a naked couple gets amorous, and his head disappears between her legs. (Peyton Place was 17 by the time Monica Lewinsky was born.)
Grace Metalious had spent some 17 years writing her succès de scandale, plugging away night after night, with her typewriter on her lap because she was too poor to buy a table. She was 32, a New Hampshire housewife with an unemployed husband and three children, when the book was published. Her life was never the same.
Though it had been an international best seller for years, Peyton Place went unmentioned in August 1977, when Elvis Presley died. I wondered what had happened to Grace Metalious, that other icon of teen lust during the repressive 1950s -- and I found out that she was gone, too. Unable to cope with success, hounded by sharks and exploiters, she'd drunk herself to death by 1964. She was not yet 40.
I hunted up a dog-eared old copy of Peyton Place -- in a church rummage sale, for 25 cents -- and I became obsessed with researching the author's life. My Inside Peyton Place: The Life of Grace Metalious was published in 1981 (Doubleday), then reissued with new material in 2000 (University Press of Mississippi). Mine is still the only Metalious biography, no doubt because such a famously lurid book cannot be considered "literature." I was writing before Cultural Studies made everything acceptable in English departments, and upholders of the Great Tradition snarled that I was "studying trash." But Penn State's School of Journalism welcomed me, to teach a new course on "Cultural Aspects of Mass Media."
My students eyed Peyton Place warily.
Some with very liberal parents had been allowed to catch "Peyton Place" when it was TV's first notorious night-time drama (1964-69), precursor to "Dynasty" and "Desperate Housewives." My students hoped the book would be racy, but they'd been fooled before. In high school they'd all read The Scarlet Letter -- a book about adultery! -- and it had been a soul-curdling bore.
Peyton Place, though, delivered the goods. It begins floridly: "Indian summer is like a woman. Ripe, hotly passionate, but fickle...." By page 15 we've met the town drunk, heard about the faithless wife who drove him to it, and seen the spinster schoolteacher, the awkward boy who's later dismissed from the army for being a "psycho-neurotic," and the rude playboy whose comeuppance is one of the novel's thrills.
My students enjoyed Peyton Place's earthy language ("green pecker," p. 10), its intertwined plots and its familiar secrets: births out of wedlock, religious hypocrisies, sneaky sexual encounters. Students noted that Peyton Place has everything a pop novel needs: strong plot, abundant action, clear characterization, and traditional values -- meaning the good people win. Besides being a wild ride through the sinful underbelly of a small New England town, Peyton Place is very satisfying. Justice wins in the end.
But it's the middle that excited the world and got the book banned in Fort Wayne, parts of Rhode Island, and all of Canada. (This year the book's anniversary, on Sunday, falls during Banned Books Week.) As media professionals-in-training, my students were eager to learn what gets a book banned and what makes a best seller -- and there was no better textbook than Peyton Place.
Still, there's more to it. My students could see that Grace Metalious had written a very feminist book. Her best women characters aren't satisfied to be spiteful housewives. (As Betty Friedan showed seven years later in The Feminine Mystique, that's what happens when women don't have outlets for their energies.) Metalious's best female characters are businesswomen, teachers, and would-be novelists -- who protect, cherish, and mentor each other.
One story still stands out. Teenaged Selena Cross, sexually abused by her stepfather, kills him in self-defense and buries his body in the sheep pen, the only place where the winter ground isn't frozen. Eventually she's found out and put on trial -- whereupon the town doctor makes a sensational confession. Selena's stepfather had gotten her pregnant, and Dr. Swain had performed an abortion. His speech is a ringing defense of women's right to control their own bodies -- at a time when abortion was illegal everywhere in the United States.
For the book's first young readers in 1956, it may have been the first time they ever heard of abortion. For my Pennsylvania students in 1981, it added to underground knowledge they'd grown up with. Dr. Robert Spencer of Ashland, Pennsylvania, in the anthracite coal country, had been performing safe, cheap, and illegal abortions for more than 40 years. Everyone knew about Dr. Spencer, who died four years before Roe v. Wade legalized abortion in 1973. Everyone (including me) knew teens who'd gone to him; and when he was put on trial three times, the townspeople -- all of whom knew exactly what he was doing -- acquitted him.
Did Grace Metalious know a Dr. Spencer type in real life? My students wondered. Well, her Dr. Swain was like him, and so was her own physician, Dr. Slovack, whom I interviewed. "Their names all begin with S," my students noticed. They knew that old writer's trick for drawing on real people by using their initials. "So, was Dr. Slovack an abortionist?" they asked.
I didn't know; I hadn't thought to find out.
Our students do teach us about the gaps in our research.
But what about Peyton Place as a dirty book?
My students enjoyed the same "good parts" that the first readers found titillating. Even my 25-cent church copy opened to them. When I was researching my Metalious biography, I'd ask people what they remembered, and both sexes could recite their favorites, which were always the same. So let me now encourage you, my readers, to guess which was the girls' favorite and which was the boys'.
Rod, the rich teen playboy, takes Allison (a "nice" girl) to the school party, but goes out for a lusty make-out session with mill girl Betty in his car -- until Betty says, "Is it up, Rod? Is it up good and hard? Then go shove it into Allison McKenzie!"
Connie, the sexually repressed dress shop owner (Allison's mother) goes for a midnight swim with Tom, the massive, virile high school principal, who tells her, "Untie the top of your bathing suit. I want to feel your breasts against me when I kiss you."
Students in the early 1980s had no trouble guessing which had been whose favorite in the 1950s. But when I taught Peyton Place again, in 2002, the students were much less sure.
This time it was an English department course on "Images of Women," which I subtitled "Women's Secrets." I had moved to Louisiana State University, and most of my students had been born in the 1980s -- except for one older man who was delighted to read Peyton Place "at last." He announced that he and his wife would read the good parts aloud to each other, and try to recapture that old-time delicious sense of sin.
Meanwhile, my younger students complained that their parents made off with their Peyton Place copies -- not to censor, but to read the forbidden book at last. ("Don't tell your grandmother.") Students and parents talked about the book, too, in that open, we-have-no-boundaries way that's become so much more common. In the years since Peyton Place was first published, divorce has become routine; children know that teachers have sex; unwed motherhood has little stigma; bisexuality is commonplace; and young people know more about casual sex than about yearning for love.
My 2002 students enjoyed the jolly, strange drunken behavior in Peyton Place, but also recognized some deadly patterns. Selena's stepfather, for instance, is a classic wife beater: charming when sober, violent when drunk, and insistent on isolating his wife from anyone who might notice her bruises. "Why doesn't she go to a battered women's shelter?" my students asked -- and were appalled that no such places existed in the 1950s. Even the terms "battered women" and "domestic violence" did not exist, and what we now condemn as terrible behavior was just "married life."
The characters' eccentricities also inspired my students to try their own.
One male student, fascinated by the "psycho-neurotic" boy in the book, came to class in an orange dress and wig and wanted to be called "Bernice." The other students critiqued his wardrobe ("visible panty line") and then ignored him. We are, after all, in Mardi Gras country -- and today's young people pride themselves on being cool.
What did get them going, though, were the favorite lines from the past. Which was the girls' favorite, and which the boys'? I asked. The students' votes were split almost evenly.
The "Rod" quote, the one most beloved by young men in the 1950s and 1980s as a celebration of male power, struck half my recent students as a turn-on for women, not men. "Betty's in charge," said one student. "She tells him where he can go."
The "Untie" quote, once most beloved by young women readers who wanted to feel desirable, now struck some young men as tender and appreciative -- but many women hated it. "Domineering pig," said one. "Bordering on date rape" -- another concept unnamed and unknown in Grace Metalious's day.
Peyton Place tells us how sex roles have changed.
In an interview after her book had become a best seller, Grace Metalious was asked if Peyton Place would "last," and whether people would be reading it 25 years later.
"Absolutely not," she predicted, and she was absolutely wrong.
Though Peyton Place was rejected for the Penguin Classics series, my Metalious biography has been bought by Sandra Bullock for a feature film, to be called "Grace." Bullock, born five months after Grace Metalious died, sees a strong character whose ideas put her out of step, way ahead of her time.
For Grace Metalious was not just a trashy writer. She felt passionately protective toward the weak, the poor, and the young, and she gave Allison McKenzie her own astonishing drive to be a writer. I saw a lot of myself in Allison, in that I was always a dreamy scribbler. Grace Metalious told me to keep at it, and I wasn't alone.
A few years ago, in the Philadelphia airport, I ran into one of the Penn State students who'd read Peyton Place with me in 1981. Grace Migliaccio could still recite both sets of favorite lines, and said she often thought about Grace Metalious as a lonely feminist pioneer. And so, when she contemplated getting married, Grace Migliaccio -- now a writer and a human relations professional -- never considered changing her name. "I want to be Grace M. forever."
We could do worse, for as the original Grace M. said: "If I'm a lousy writer, then a hell of a lot of people have lousy taste."