English

'This Is Not the Ivy League'

New memoir offers no-holds-barred look at life at a rural public college.

Will's Quill or Not Will's Quill …

… that is the question, but scholars are arguing over whether they should ask it.

Resetting Priorities

Beyond looking for jobs, writes Monica F. Jacobe, new Ph.D.s need to ask themselves tough questions about what really matters.

NEH Grant Proposal #1095702H

The 19th-century Welsh novelist Henry Clairidge (1832-74) stood firmly in the British eccentric tradition, publishing only two novels during his lifetime, __________ and [              ], each consisting of 200 blank pages. A posthumously published volume was put out by his sister, Ethel, in 1876: “******,” a heavily annotated work of 200 pages, also blank.

These three books constitute the Clairidge oeuvre and his claim to literary posterity. Apart from a few contemporary reviews in The Gleaners’ Gazette, Clairidge remains mostly a tabula rasa. No critic has adequately addressed this master of Victorian minimalism, who so clearly anticipated the work of the Parisian livre vide movement in the 1890s and the pared-down appearance of late Beckett some decades later.

Occasionally, commentators have projected their own preconceptions on Clairidge’s admittedly scanty plots. New Critics had a field day filling in the gaps and differentiating between hiatuses and lacunae.

Barthes proposed 53 distinct readings of page 100 in [             ], whereas Derrida declared, “There is nothing inside the text.” Greenblatt links the genesis of Clairidge’s corpus to a blank diary found among the effects of a drowned sailor from Bristol in 1835. Several attempts by white studies scholars to claim Clairidge’s pages as an oppressed majoritarian cri de coeur have been largely ignored by multiculturalists.

These previous approaches miss the mark. Clairidge’s grand emptiness, prefiguring the existential void of the 1950s, mirrors life itself -- or at least the life of Clairidge, who spent his last 20 years at the ancestral estate in Ffwokenffodde, staring gormlessly at the hay ricks. His sister, Ethel, who doubled as his amanuensis and nurse, would occasionally turn him toward a prospect of furze, but the shift seems not to have affected his subject or style.

I contend that Clairidge’s hard-won nullity is temperamentally different from nihilism, which is to say that believing nothing is not the same as Belief in Nothing. Moreover, if Clairidge’s art takes the blankness of life as its premise, its slow-building conclusions represent a sort of après vie. Though reconstructing a writer’s faith from his art is a dicey business (and Ethel burned her brother’s blank notebooks after his death), one of the few remaining personal effects sold at a charity auction in 1876 is a hay-strewn, slightly warped Ouija board. In short, this project involves the unacknowledged fourth estate of the race, gender, and class trinity: creed. Any committee members in sympathy with the current political administration, please take note.

Nothing is familiar to me. As a blocked but tenured faculty member for the past 14 years, I can attest to the power of the blank page. The study I propose would be as infinitely suggestive as Clairidge’s own work. Having already compiled over 150 blank pages of my own, I estimate that I am about halfway through a first draft.

My spurious timeline, suggested by my university’s internal grant board to indicate progress, is as follows: chapter one by March, chapter two by April, chapter three by May, and so on. More specifically, I hope to have the large autobiographical or “life” section done by May, so I can go on vacation with my family, and the “after-life” section should be done before my department chair calls me in to discuss that tiresome annual faculty activity report.

I already have papers and books strewn impressively around my office, as well as a graduate assistant to help me sort through them. An NEH grant at this stage would not only help to renovate our breakfast room, but also answer the querulous looks that the dean of liberal arts has been giving me at public gatherings. Considering the projects you people have been funding lately, I -- but as with Henry Clairidge, words fail me. As Wittgenstein concluded in his Tractatus, “Whereof one cannot speak, thereof one must be silent.”

*****

Selected Bibliography

Clairidge, Ethel. The Selected Letters of Ethel Clairidge to Her Brother, The Corresponding Grunts of Henry Clairidge to His Sister. Eds. Renée Clairidge and Friend. Metuchen, N.J.: Methuen, 1965.
 
Clairidge, Henry. __________. Oxford: Clarendon P, 1872.

---. [              ]. Oxford: Clarendon P, 1873.

---. “******.” Oxford: Clarendon P, 1876.

Galef, David. "Notes on Blank Space." Cimarron Review 98 (1992): 95-100.

Paige, M. T. “Their Eyes Were Blank: Zora Neale Hurston and Her Homage to Clairidge.” JLI [Journal of Literary Influence] 25.2 (1972): 10-20.

Zaire, Nottall. “Pulling a Blank: On Nullity and Art.” Hypno-Aesthetics 1.1 (2002). 30 Feb. 2002 <http://www.hypnoA.org>.

Author/s: 
David Galef
Author's email: 
galef@olemiss.edu

David Galef is a professor of English and administrator of the M.F.A. program in creative writing at the University of Mississippi. His latest book is the short story collection Laugh Track (2002).

Students Read Less. Should We Care?

A new survey of literary reading in America by the National Endowment for the Arts, "Reading at Risk," has once again raised the alarm about the cultural decline of America. This one provides the news that we read much less literature, defined as fiction and poetry, than we did some 20 years ago. Indeed, the decline is substantial (10 percent), accelerating, and especially worrisome because the malady of literature non-reading particularly afflicts the younger members of society, that critical 18- to 24-year-old group (which shows a 28 percent decline in this survey).

Academicians rushed in to analyze, comment on, and explain this decline, but some of the commentary, both in the report itself and in the academic discussion it provoked, seemed to miss the mark. The predictable villains of the visual media, the electronic media and the Internet all came in for blame.

Truth is, I am not sure that the data represent a cause for alarm.

I know I should worry. I am a historian, after all, and if people will not read fiction, surely they will read less history. And I'm a teacher, and like everyone else in the humanities, I know students just do not read the way they used to.

The trouble is, I am not sure the changes in our cultural context are necessarily a bad thing. I read many airplane novels, and I have to say that if the younger generation is doing something else with their time, not much is lost. I read New Yorker fiction when I feel the need to be literarily virtuous, but the pieces tend to be mostly depressing stories about lives that do not work out in rather low-level ways.  

Then I go online. Here I find a complicated world filled with the good, the bad, and the ugly. Alive and constantly changing, engaged and engaging, requiring my constant decisions about what is worth reading or seeing and what is not. From the lowest pornography to tours of the treasures of the Library of Congress, from the stupidest blogs of the radical fringes, to the most sophisticated discussions of the decline of America's reading habits, everything is there.

What is missing, of course, is the prescriptive, gate-keeping censorship of the academic and other cultural mandarins, sorting out what is good for me and what is not. The college students who now show up in my classroom come with an informational sophistication unimaginable in my generation. They find what they want, they use what they find, and they discard immense amounts of information made available to them.

Are they naïve about authority, methodology, logic and accuracy in these endless streams of information? Sure, they are. Who should teach them how to sort this stuff? We academics, sophisticated readers ourselves, who all too frequently escape into trendy obscurantism rather than engage the real-world information flow that constitutes the actual cultural context of our time.

We, the literate part of the American population, need to reconnect with the actual cultural context, rather than fight micro-academic battles of almost no interest to people outside the elite tiers of the academy. We need a better metric than reading print books, stories and poems to define the active imagination and the creative industries of our time. Why is a trashy airplane best seller a more valuable cultural artifact than the telenovelas watched with enthusiasm and discussed in endless analytical detail by the large and growing Spanish-speaking part of America? Why do we assume depressing short stories or over-hyped formulaic bestseller novels represent more significant cultural artifacts than the film version of The Lord of the Rings, the Star Wars series, or the computer game community's imaginative products?

The decline in reading may well reflect the decline in formal study of the humanities in American universities. However, the problem is not the students but the material we teach, the sectarian nature of our controversies, and our general reluctance to put the humanities in the center of our culture rather than relegating them to fragmented enclaves along the partisan byways of academic enthusiasms.

We lose influence on campus to the sciences on one side because they appear and act as if they know exactly what they are doing, how they do it, and for what purpose they do it. We lose influence on campus to the professionally oriented disciplines on the other side because they have a purpose and a method anchored directly in the center of the real world their disciplines address.  

We in the humanities, and very frequently as well in the social sciences, often do not know and do not agree on what we think we are doing. We have few common standards, and we ask little of our students, who are left with ample time for non-academic campus activities. We wonder why our voices carry so little weight when our culture seems to need us so desperately to sort out fundamental issues of values and judgment.

Our weakness on campus as humanists and social scientists reflects our frequent disconnect from the major issues that drive our culture and society. We know a lot about many topics and issues. We have complex and specialized languages that define our place in political and intellectual sectarian spaces. While the best among us teach interesting courses to many students, most of us publish and build our prestige in the academy in mostly unreadable prose, using terms of art opaque to any but specialists.

Although our scientific colleagues are often even more incomprehensible than we are, they have found ways to demonstrate the utility of their work so that a whole industry translates their science into terms ordinary citizens can understand. Some of our humanistic and social scientific colleagues find audiences outside the academy, but many people find it hard to distinguish between the opinionated rant of an e-zine commentator and the reasoned logic and well-researched judgment of a humanistic scholar. Often the rant is easier to read and more accessible than the reasoned argument.

What to do? I am not sure, but the first thing would be to pay close attention to what people are reading, what they are seeing, and how they do engage the common culture. The message of "Reading at Risk" is that something other than literature in print form engages more and more of our fellow citizens, and we might want to try to learn how to speak to them in the voices they want to hear.

Where better to learn how to do this than with our 18- to 24-year-old undergrads?

Author/s: 
John V. Lombardi
Author's email: 
lombardi@umass.edu

Designed to Please

If intelligent design gets taught in the college classroom, here are some other propositions we can look forward to:

Was Shakespeare the author of all those plays? Competing theories suggest that the Earl of Oxford, Francis Bacon, or even Queen Elizabeth herself penned those immortal lines. You be the judge. Henceforth, the prefaces to all those editions by “William Shakespeare” should be rewritten to give equal time to the alternate-authorship idea.

Does oxygen actually support that flickering candle flame, or is an invisible, weightless substance called phlogiston at work? First suggested by J. J. Becher near the end of the 17th century, the existence of phlogiston was eventually pooh-poohed by supporters of the oxygen hypothesis, but, as they say in the legal profession, the jury’s still out on this one.

Drop a candy bar on the sidewalk, and come back to find ants swarming all over it. Or put a piece of rotten meat in a cup and later find maggots in it, having come out of nowhere! This is called spontaneous generation. Biologists eventually decided that airborne spores, like little men on parachutes, wafted onto the food and set up shop there, but does that make any sense to you?

In the morning, the sun rises over the tree line, and by noon it’s directly overhead. At night, as the popular song has it, “I hate to see that evening sun go down.” Then why do so many people think that the earth moves instead of the sun? Could this be a grand conspiracy coincident with the rise of that Italian renegade Galileo, four centuries ago? Go out and look at the sunset! As they say, seeing is believing.

Proper grammar, the correct way of speaking, the expository essay model -- how rigid and prescriptive! There are as many ways to talk as there are people on this good, green earth, and language is a living organism. Or like jazz, an endless symphony of improvisation. No speech is wrong, just different, and anyone who says otherwise is just showing an ugly bias that supports white hegemony.

“History is bunk,” declared the famous industrialist and great American Henry Ford. All those names and dates -- why learn any of that when not even the so-called experts can agree on exactly what happened? Besides, most of those historical figures are dead by now, so what’s the point? From now on, all history departments must issue disclaimers, and anything presented as a narrative will be taught in the creative writing program.

Speaking of which, creative writing itself has long been controlled by a bunch of poets and fiction writers who determine who wins what in the world of letters. But who really knows whether the latest Nobel Prize winner is any better than, say, that last Tom Clancy novel you read? It all boils down to a matter of taste, doesn’t it?

Or what about that "Shakespeare"? Was he/she/it really any better than the Farrelly brothers? Let’s all take a vote on this, okay?

Author/s: 
David Galef
Author's email: 
david.galef@olemiss.edu

David Galef is a professor of English and administrator of the M.F.A. program in creative writing at the University of Mississippi. His latest book is the short story collection Laugh Track (2002).

No Field, No Future

I gave a paper recently as part of a colloquium at George Washington University whose general title was "Futures of the Field." The tension in that plural "s" -- "Futures" -- carried the weight of much of what I had to say about the current state of literary study.

My audience and I were seated around a seminar table in what has long been called, and continues to be called, an "English" department. The name "English," I pointed out, designates a primary activity involving the reading and interpreting of literary texts in English. (This would include foreign literature translated into English.) If we want primarily to involve ourselves with historical texts, we go over to the history department; philosophical, the philosophy department, and so forth. What distinguishes our department, as Judith Halberstam wrote in her essay, is the "appraisal of aesthetic complexity through close readings." Not philosophical or historical, but aesthetic complexity.

This model of the English department, and the carefully chosen canon of great aesthetic works that comprised its content, has in most colleges and universities collapsed. The value and nature of our reading -- that is, when English departments feature reading at all, film, television, music, and material culture courses having displaced written texts to some extent in many schools -- has radically changed, with the inclusion of cheap detective novels and poorly written political essays, for instance, now routine in departments that used to disdain prose exhibiting little aesthetic complexity or stylistic distinction.

On the other end, there's also now the inclusion of notoriously over-complex -- to the point of unintelligibility, never mind stylistic ugliness -- advanced critical texts in our courses. A character in Don DeLillo's White Noise says of his university's English department, "There are full professors in this place who do nothing but read cereal box tops." But there are as many professors there who read nothing but the densest, most arcane, and most poorly written critical theory.

All of which is to say that there is no "field," so there can't be any "future" or even "futures." That "s" in our GW lecture series title is trying to reassure us that instead of a profession-killing chaos what we have now is a profession-enhancing variety, richness, flexibility, ferment, inclusiveness, choose your reassuring adjective. Yet when there's not even a broadly conceived field of valuable objects around which we all agree our intellectual and pedagogical activity should revolve, there's no discipline of any kind.

Instead, there's a strong tendency, as Louis Menand puts it, toward "a predictable and aimless eclecticism." A young English professor who has a column under the name Thomas Hart Benton in The Chronicle of Higher Education puts it this way: "I can't even figure out what 'English' is anymore, after ten years of graduate school and five years on the tenure track. I can't understand eighty percent of PMLA, the discipline's major journal. I can't talk to most people in my own profession, not that we have anything to say to each other. We don't even buy one another's books; apparently they are not worth reading. We complain about how awful everything is, how there's no point to continuing, but nobody has any idea what to do next."

The English department mainly survives as a utilitarian administrative conceit, while the English profession operates largely as a hiring and credentialing extension of that conceit.

If we wish to say that we've retained disciplinary integrity based on our continued close attention to texts of all kinds -- aesthetic and non-aesthetic -- that sharpen our ideological clarity about the world (or, as Menand puts it, texts that allow us to "examine the political implications of culture through the study of representations"), then we have already conceded the death of the English department, as Halberstam rightly notes. Indeed, since highly complex aesthetic texts tend to be concerned with personal, moral, and spiritual, rather than political, matters, we shouldn't be surprised to find in Halberstam an outright hostility to precisely the imaginative written texts in English that have more or less from the outset comprised the English department's objects of value and communal study.

Menand notes that the "crisis of rationale" I'm describing here has had serious negative consequences. Among a number of humanities departments that are losing disciplinary definition, English, he says, is the most advanced in this direction: "English has become almost completely post-disciplinary." (Menand has earlier pointed out the inaccuracy of another reassuring word -- interdisciplinary: "The collapse of disciplines must mean the collapse of interdisciplinarity as well; for interdisciplinarity is the ratification of the logic of disciplinarity. The very term implies respect for the discrete perspectives of different disciplines.") The absence of disciplines means the "collapse of consensus about the humanities curriculum," and this at a time of rapidly escalating outside scrutiny of the intellectual organization and justification of the expensive American university.

Further, "the discipline acts as a community that judges the merit of its members' work by community standards." When there's no self-defining and self-justifying community, English departments, Menand continues, become easy marks for downsizing administrators. "Administrators would love to melt down the disciplines, since this would allow them to deploy faculty more efficiently -- and the claim that disciplinarity represents a factitious organization of knowledge is as good an excuse as any. Why support separate medievalists in your history department, your English department, your French department, and your art history department, none of them probably attracting huge enrollments, when you can hire one interdisciplinary super-medievalist and install her in a Medieval Studies program whose survival can be made to depend on its ability to attract outside funding?"

Halberstam acknowledges these effects and proposes that we "update our field before it is updated by some administrations wishing to downsize the humanities." By "update," though, she means provide a decent burial: "The discipline is dead, we willingly killed it," and we must "now decide what should replace it." In place of the "elitism" inherent in close readings of aesthetically complex works, Halberstam proposes an education in "plot summary," a better skill for making sense of our current reactionary political moment (as Halberstam sees it).

Indeed, throughout her essay, Halberstam attacks religious Americans, conflating religious seriousness with politically reactionary positions.

Now, a huge amount of Western culture's high literature involves religious seriousness. If, like Halberstam, you regard contemporary America as a fundamentalist nightmare, and if your very definition of the American university is that it is, as she writes, "the last place in this increasingly conservative and religious country to invest in critical and counter-hegemonic discourse," then you have a problem. You either want to steer your students away from much of this literature, since, though perhaps not fundamentalist, it assumes a world permeated with religious belief (or, as in much literary modernism of Kafka's sort, suffering from an absence of belief), or you want to present this literature in a way that undermines, to the extent possible, its own status as a document that takes religion seriously.

It's just this sort of cognitive dissonance relative to the very body of knowledge that, as an English professor, Halberstam has been trained to teach, that in part accounts for the death of English. Halberstam's primary motive as a university professor is political and social - she has situated herself in an American university because that location is our last best hope for changing the politics of the country. Indeed, if there is a "consensus" about anything in many English departments, it lies here, in the shared conviction, articulated by Halberstam, that focusing upon and changing immediate political arrangements in this country is our primary function as teachers and scholars.

One assumes, that is, a socially utilitarian attitude toward what one teaches.

There was nothing inevitable about this turn outward to the immediate exigencies of the political and social world, by the way. As Theodor Adorno writes in Minima Moralia, the intellectual is, more often than not, "cut off from practical life; revulsion from it has driven him to concern himself with so-called things of the mind." But this withdrawal also drives the intellectual's critical power: "Only someone who keeps himself in some measure pure has hatred, nerves, freedom, and mobility enough to oppose the world."

No one's arguing here that we return to a very narrow canon, to uncritical piety in regard to the literature of our culture, and to monastic withdrawal from the world. Instead, what I'd like to suggest is that we return to the one discrete thing that our discipline used to do, and still, in certain departments, does.

A few years back, in The New York Review of Books, Andrew Delbanco, an English professor at Columbia University, announced "the sad news… that teachers of literature have lost faith in their subject and themselves…. English today exhibits the contradictory attributes of a religion in its late phase -- a certain desperation to attract converts, combined with an evident lack of convinced belief in its own scriptures and traditions."

Delbanco continues: "The even sadder news is that although students continue to come to the university with the human craving for contact with works of art that somehow register one's own longings and yet exceed what one has been able to articulate by and for oneself, this craving now, more often than not, goes unfulfilled, because the teachers of these students have lost faith." In similar language, Robert Scholes writes, "As our Romantic faith in the spiritual value of literary texts has waned, we have found ourselves more and more requiring knowledge about texts instead of encouraging the direct experience of these texts."

Notice the language here: direct experience, contact. The political and more broadly theoretical abstractions that have been thrown over the artwork from the outset, as it's often presented in class, block precisely this complex, essentially aesthetic experience. This experience, triggered by a patient engagement of some duration with challenging and beautiful language, by entry into a thickly layered world which gives shape and substance to one's own inchoate "cravings" and "longings," is the very heart, the glory, of the literary. Students -- some students -- arrive at the university with precisely these powerful ontological energies. Certain novels, poems, and plays, if students let them, can surprise these students, both with their anticipation of particularly acute states of consciousness and with their placement of those consciousnesses within formally ordered literary structures.

One of the noblest and most disciplinarily discrete things we can do in the classroom is to take those ontological drives seriously, to suggest ways in which great works of art repeatedly honor and clarify them as they animate them through character, style, and point of view.

One of the least noble and most self-defeating things we can do is avert our students' eyes from the peculiar, delicate, and enlightening transaction I'm trying to describe here. When we dismiss this transaction as merely "moral" -- or as proto-religious -- rather than political, when we rush our students forward to formulated political beliefs, we fail them and we fail literature. Humanistic education is a slow process of assimilation, without any clear real-world point to it. We should trust our students enough to guide them lightly as they work their way toward the complex truths literature discloses.

Author/s: 
Margaret Soltan
Author's email: 
info@insidehighered.com

Margaret Soltan's blog, University Diaries, chronicles all aspects of contemporary American university life. Her essay "Don DeLillo and Loyalty to Reality" appears in the MLA's forthcoming Approaches to Teaching White Noise. She and Jennifer Green-Lewis are completing a manuscript titled The Promise of Happiness: The Return of Beauty to Literary Studies.

Last Bastion of Liberal Education?

Why do narratives of decline have such perennial appeal in the liberal arts, especially in the humanities? Why is it that, year after year, meeting after meeting, we hear laments about the good old days and predictions of ever worse days to come? Why is such talk especially common in elite institutions where, by many indicators, liberal education is doing quite well, thank you very much? I think I know why. The opportunity is just too ripe for the prophets of doom and gloom to pass up.

There is a certain warmth and comfort in being inside the “last bastion of the liberal arts,” as B.A. Scott characterized prestigious colleges and research universities in his collection of essays The Liberal Arts in a Time of Crisis (New York: Praeger, 1990). The weather outside may be frightful, but inside the elite institutions, if not “delightful,” it’s perfectly tolerable, and likely to remain so until retirement time.

Narratives of decline have also been very useful to philanthropy, but in a negative way. As Tyler Cowen recently noted in The New York Times, “many donors … wish to be a part of large and successful organizations -- the ‘winning team’ so to speak.” They are not eager to pour out their funds in order to fill a moat or build a wall protecting some isolated “last bastion.” Narratives of decline provide a powerful reason not to reach for the checkbook. Most of us in the foundation world, like most other people, prefer to back winners rather than losers. Since there are plenty of potential winners out there, in areas of pressing need, foundation dollars have tended to flow away from higher education in general, and from liberal education in particular.

But at the campus level there’s another reason for the appeal of the narrative of decline, a genuinely insidious one. If something goes wrong, the narrative of decline of the liberal arts always provides an excuse. If course enrollments decline, well, it’s just part of the trend. If students don’t like the course, well, the younger generation just doesn’t appreciate such material. If the department loses majors, again, how can it hope to swim upstream when the cultural currents are so strong? Believe in a narrative of decline and you’re home free; you never have to take responsibility, individual or collective, for anything having to do with liberal education.

There’s just one problem. The narrative of decline is about one generation out of date and applies now only in very limited circumstances. It’s true that in 1890, degrees in the liberal arts and sciences accounted for about 75 percent of all bachelor’s degrees awarded; today the number is about 39 percent, as Patricia J. Gumport and John D. Jennings noted in “Toward the Development of Liberal Arts Indicators” (American Academy of Arts and Sciences, 2005). But most of that decline had taken place by 1956, when the liberal arts and sciences had 40 percent of the degrees.

Since then the numbers have gone up and down, rising to 50 percent by 1970, falling to 33 percent by 1990, and then rising close to the 1956 levels by 2001, the last year for which the data have been analyzed. Anecdotal evidence, and some statistics, suggest that the numbers continue to rise, especially in Research I universities.

For example, in the same AAA&S report (“Tracking Changes in the Humanities”) from which these figures have been derived, Donald Summer examines the University of Washington (“Prospects for the Humanities as Public Research Universities Privatize their Finances”) and finds that majors in the humanities have been increasing over the last few years and course demand is strong.

The stability of liberal education over the past half century seems to me an amazing story, far more compelling than a narrative of decline, especially when one recognizes the astonishing changes that have taken place over that time: the vast increase in numbers of students enrolled in colleges and universities, major demographic changes, the establishment of new institutions, the proliferation of knowledge, the emergence of important new disciplines, often in the applied sciences and engineering, and, especially in recent years, the financial pressures that have pushed many institutions into offering majors designed to prepare students for entry-level jobs in parks and recreation, criminal justice, and now homeland security studies. And, underlying many of these changes, transformations of the American economy.

The Other, Untold Story

How, given all these changes, and many others too, have the traditional disciplines of the arts and sciences done as well as they have? That would be an interesting chapter in the history of American higher education. More pressing, however, is the consideration of one important consequence of narratives of decline of the liberal arts.

This is the “last bastion” mentality, signs of which are constantly in evidence when liberal education is under discussion. If liberal education can survive only within the protective walls of elite institutions, it doesn’t really make sense to worry about other places. Graduate programs, then, will send the message that success means teaching at a well-heeled college or university, without any hint that with some creativity and determination liberal education can flourish in less prestigious places, and that teaching there can be as satisfying as it is demanding.

Here’s one example of what I mean. In 2000, as part of a larger initiative to strengthen undergraduate liberal education,  Grand Valley State University, a growing regional public institution in western Michigan, decided to establish a classics department. Through committed teaching, imaginative curriculum design, and with strong support from the administration, the department has grown to six tenured and tenure track positions with about 50 majors on the books at any given moment. Most of these are first-generation college students from blue-collar backgrounds who had no intention of majoring in classics when they arrived at Grand Valley State, but many have an interest in mythology or in ancient history that has filtered down through popular culture and high school curricula. The department taps into this interest through entry-level service courses, which are taught by regular faculty members, not part timers or graduate students.

That’s a very American story, but the story of liberal education is increasingly a global one as well.  New colleges and universities in the liberal arts are springing up in many countries, especially those of the former Soviet Union.

I don’t mean that the spread of liberal education comes easily, in the United States or elsewhere. It’s swimming upstream. Cultural values, economic anxieties, and all too often institutional practices (staffing levels, salaries, leave policies and research facilities) all exert their downward pressure. It takes determination and devotion to press ahead. And those who do rarely get the recognition or credit they deserve.

But breaking out of the protective bastion of the elite institutions is vital for the continued flourishing of liberal education. One doesn’t have to read a lot of military history to know what happens to last bastions. They get surrounded; they eventually capitulate, often because those inside the walls squabble among themselves rather than devising an effective breakout strategy. We can see that squabbling at work every time humanists treat with contempt the quantitative methods of their scientific colleagues and when scientists contend that the reason we are producing so few scientists is that too many students are majoring in other fields of the liberal arts.  

The last bastion mentality discourages breakout strategies. Even talking to colleagues in business or environmental studies can be seen as collaborating with the enemy rather than as a step toward broadening and enriching the education of students majoring in these fields. The last bastion mentality, like the widespread narratives of decline, injects the insidious language of purity into our thinking about student learning, hinting that any move beyond the cordon sanitaire is somehow foul or polluting and likely to result in the corruption of high academic standards.

All right, what if one takes this professed concern for high standards seriously? What standards, exactly, do we really care about and wish to see maintained? If it’s a high level of student engagement and learning, then let’s say so, and be forthright in the claim that liberal education is reaching that standard, or at least can reach that standard if given half a chance. That entails, of course, backing up the claim with some systematic form of assessment.

That provides one way to break out of the last bastion mentality. One reason that liberal education remains so vital is that when properly presented it contributes so much to personal and cognitive growth. The subject matter of the liberal arts and sciences provides some of the best ways of helping students achieve goals such as analytical thinking, clarity of written and oral expression, problem solving, and alertness to moral complexity, unexpected consequences and cultural difference. These goals command wide assent outside academia, not least among employers concerned about the quality of their work forces. They are, moreover, readily attainable through liberal education, provided proper attention is paid to “transference.” “High standards” in liberal education require progress toward these cognitive capacities.

Is it not time, then, for those concerned with the vitality of liberal education to abandon the defensive strategies that derive from the last bastion mentality, and adopt a new and much more forthright stance? Liberal education cares about high standards of student engagement and learning, and it cares about them for all students regardless of their social status or the institution in which they are enrolled.

There is, of course, a corollary. Liberal education can’t just make the claim that it is committed to such standards, still less insist that others demonstrate their effectiveness in reaching them, unless those of us in the various fields of the arts and sciences are willing to put ourselves on the line. In today’s climate we have to be prepared to back up the claim that we are meeting those standards. Ways to make such assessments are now at hand, still incomplete and imperfect, but good enough to provide an opportunity for the liberal arts and sciences to show what they can do.

That story, I am convinced, is far more compelling than any narrative of decline.

Author/s: 
W. Robert Connor
Author's email: 
newsroom@insidehighered.com

W. Robert Connor is president of the Teagle Foundation and blogs frequently about liberal education.

A New Form of Academic Engagement

In her president’s column in the spring 2006 Modern Language Association newsletter, Marjorie Perloff focuses on the expansion of Ph.D. programs in creative writing (including doctorates in English that allow for a creative dissertation). Perloff argues that the growth of creative-writing doctorates was a reaction to politicization and specialization within the English discipline: “An examination of the catalogues of recently established Ph.D. programs in creative writing suggests that, in our moment, creative writing is perhaps best understood as the revenge of literature on the increasingly sociological, political, and anthropological emphasis of English studies.”  

She also cites recent job advertisements in English calling for candidates specializing in a range of theoretical approaches, which relegate the teaching of literature to “a kind of afterthought, a footnote to the fashionable methodologies of the day.”

Perloff is right on both counts: These are central factors that have led to the growth of creative writing Ph.D.s. But she also misses an important element, one that grows out of, but also underlies, the others. It is that people want what they think and write to matter, not just to their colleagues, but also to the world at large. Creative work, and the doctorate in creative writing, holds out this hope.

The doctorate in creative writing comes in various forms, but most are very similar to literary studies doctorates. I myself am a doctoral candidate in English at the University of Denver, writing a creative dissertation -- a book of poetry, accompanied by a critical introduction. As a graduate student, I’ve fulfilled the same coursework requirements as my literary studies peers, with the addition of four writing workshops. I’ve taken comprehensive exams in the same format as my literary studies peers. I’ve taught more or less the same courses as my literary studies peers. The only significant difference between my doctoral work and that of my literary studies colleagues is in the dissertation.

Sometimes, in fact, it strikes me as a bit comic to be doing the creative dissertation, but then I think about the fate of my work. I want my work to find its audience, though I realize that poetry has lost more readers, perhaps, than scholarship has over the last 50 years. Yet I believe that creative writing holds out more hope of finding readers, and of gaining readers back, than scholarship does. Hundreds or thousands of poetry books are published each year, and are more likely to find their way onto the shelves of bookstores than are scholarly studies. For fiction writers, the prospects are even better -- after all, there’s still a market for novels and short fiction.

However, it’s not just for readerly recognition that I want to do this creative work. It is because literature matters to how people live their lives, not just emotionally but intellectually. I speak here specifically of literature, but I think the principle holds true for any kind of creative work, even works we wouldn’t ordinarily think of as artistic, such as historical or psychological or anthropological studies.

Just a few days ago I was talking with a good friend of mine, a fellow graduate student working on her dissertation. My friend’s enthusiasm for the work and the discoveries that she is making, her eloquence on her subject, and her physical animation in talking about it were obvious, even if some of the nuance of her project was lost on me. But then she stopped herself and said, “Of course, nobody really cares about this.”

She described the frustration of talking about her project with non-academic friends and family members, how it takes too long to explain the work she is doing to people outside her specialty area, how their faces fall blank as she goes on too long in explaining the foundations of the debate in which she is involved. She laughed and said archly, “It’s not so bad once you get used to the idea that no one is ever going to read your dissertation except your committee.”  

I have had similar conversations with other friends working on dissertations, not just in English, but across the humanities, though the sense of writing into the void is particularly marked among those in my discipline. Let me say here that I don’t want to challenge the value of discipline-specific, specialized scholarship -- after all, it would be foolish to say that the intellectual work of teaching and writing does not require specialist knowledge, or that the ideas formulated in scholarly work don’t find their way to non-specialists through good teaching or through popularizers or public intellectuals, though we could stand a few more of them. Those academics who write for an extra-disciplinary audience, as Mark Oppenheimer pointed out in a recent essay in The Chronicle of Higher Education, play an important part in connecting the academy with the non-academic world, and shaping common conceptions of disciplines such as history. He wrote: “They have the influence that comes with writing for journals at the intersection of academe and the culture at large. They interpret scholarship for people who prefer to read journalism, and their opinions reverberate and multiply, if in ways that we cannot measure.”  

This is not a plea for greater “accessibility” or for a return to a “generalist” approach to English. Nor will I rehearse the yearly mocking that the titles of papers at the MLA convention get in major newspapers across the country.  But I do think that the sense that nobody’s listening or reading the work of scholars outside their specialized communities points to a real problem of the contemporary humanities department: the loss of audience, and with it, the loss of a sense that the work should matter to a larger, educated, non-academic audience.  

There’s no doubt that scholars have produced critical work in humanities subjects that does matter. I think of Raymond Williams, never the easiest of writers, but one who rewards the effort made in engaging with his work and who, perhaps because of his quasi-academic status, writes in such a way that his ideas could be understood outside the academy. I also think of John Berger, Susan Sontag and Fredric Jameson.  These are writers who can be exciting for readers coming from outside the academy, and who can influence the way readers experience texts, and even life.  

However, with the increasing professionalization of the university, the potential audience for scholarly work has diminished as scholarly writing has become more specialized and jargon-ridden. None of what I say is news, I know. But the creative doctorate as an approach to making scholarly research and thinking matter in the world is news, and very good news.  

I think it is important that what we do, literary scholars and creative writers both, makes a difference to how people outside academy walls think. In the history of rhetorical theory, there is a recurring, commonplace idea that the person trained in rhetoric will, through the ethical training in that discipline, constitutionally be able to contribute only to good actions or ideas that improve the state or the community. Cicero put the idea most succinctly, and most famously, in his definition of the ideal orator/citizen as “the good man speaking well.” Learning to “speak well,” however, required years of intense training in the minutiae of the discipline, a close study of the history of oratory. 

While this ideal resolutely -- and somewhat courageously -- ignores what we know about human behavior, I do think that as an ideal it offers an important model to live up to. I see the Ph.D. in creative writing as an opportunity to undertake the same kind of close study of literature and writing as ancient rhetoricians would have undergone in their study of oratory, and as a way to position myself to bring that knowledge and experience into both my writing and the classroom without having to give up, or shelve for a long period, my creative work. In fact, it was in support of my creative work that I took up doctoral study.  

Cicero’s formulation of the ideal citizen leads me back to my own ideals about the creative dissertation. The creative writer makes a difference not by telling people how to vote, or by engaging in the public sphere with anti-government screeds. Rather, the way literature can matter is by offering a model of the world as it is, in the hope that readers will be moved to action in the real world. Literature is a form of epideictic rhetoric, perhaps the form par excellence of the epideictic: a poem or a novel or a film argues for the values that its authors believe are important to the way we live our lives.

For example, Bertolt Brecht, in his essay “The Modern Theater is the Epic Theater,” makes a list of what it is that epic theater does.  According to Brecht’s list, the epic theater:

turns the spectator into an observer, but
arouses his capacity for action
forces him to take decisions
[provides him with] a picture of the world
he is made to face something…   
brought to the point of recognition

I read Brecht’s description of epic theater’s functions as a modernist reworking of the ideal orator tradition, the tradition of the artist offering his readers more than polemic -- offering his readers an experience from which they can learn about their own lives.    

The creative Ph.D. is vital to making this possible, if it is possible, because literature (any art, in fact) does not come from nowhere. Or, more importantly, it should not come from nowhere.  Good writing comes from intense study and reading, the kind of reading that people don’t typically have time for in the frenetic world of contemporary business or the professions. Moreover, what I would call good writing, the kind of writing that, regardless of genre, has something in common with Brecht’s epic theater, requires its author to have a sense of its location between the past and the present.  

The Ph.D. in creative writing gives writers the time and training to explore their fields that they may not get in M.F.A. programs, no longer get as undergraduates, and certainly do not get in high school. At the very least, doctoral work exposes writers and artists to a liberal education that prepares them for analyzing, framing and being in the world in any number of different ways. Doctoral-level reading, doctoral-level thinking, doctoral-level writing will make possible the art that creative Ph.D.s will produce. I think here of Flaubert’s quip that, in preparation for Bouvard and Pécuchet, he had to read 300 books to write one (though the reference might cut both ways, as Julian Barnes has described that book as challenging in being “a vomitorium of pre-digested book learning”). I could call on Matthew Arnold and T.S. Eliot as well, were I eager to lay myself open to misguided charges of cultural conservatism.

But the human need for learning through art goes beyond liberal or conservative approaches to writing and teaching. The experience of literary study at the highest level gives writers the cognizance of literary history they need to produce the epic theater, the epideictic, of our time -- to be good men and women speaking well, writing well, leading and teaching.  

The issue for creative writing is that of quality. The value of the creative doctorate is in the opportunity it offers to unite the best elements of the scholarly study of literature or art with the best elements of the study of craft. The writers and artists who come out of creative Ph.D. programs will not only be better guardians of our various and multiform cultural heritage, but they will be better teachers, better thinkers, better innovators. Their research and learning, in the form of creative and critical work, will matter both in the academy and beyond.

In her column, Perloff poses the rhetorical question of where the doctorate in creative writing leaves the idea of the doctorate as such. “Hasn’t the doctorate always been a research degree?” her concerned professor asks in the face of invading creative writers. Yes, it has been, and for creative writers, it remains vitally so.

Author/s: 
David Gruber
Author's email: 
doug.lederman@insidehighered.com

David Gruber is assistant to the director of the University Writing Program and a graduate teaching assistant in English at the University of Denver.

Digital Masonry

Jacques-Alain Miller has delivered unto us his thoughts on Google. In case the name does not signify, Jacques-Alain Miller is the son-in-law of the late Jacques Lacan and editor of his posthumously published works. He is not a Google enthusiast. The search engine follows “a totalitarian maxim,” he says. It is the new Big Brother. “It puts everything in its place,” Miller declares, “turning you into the sum of your clicks until the end of time.”

Powerful, then. And yet – hélas! – Google is also “stupid.” It can “scan all the books, plunder all the archives [of] cinema, television, the press, and beyond,” thereby subjecting the universe to “an omniscient gaze, traversing the world, lusting after every little last piece of information about everyone.” But it “is able to codify, but not to decode. It is the word in its brute materiality that it records.” (Read the whole thing here. And for another French complaint about Google, see this earlier column.)

When Miller pontificates, it is, verily, as a pontiff. Besides control of the enigmatic theorist’s literary estate, Miller has inherited Lacan’s mantle as leader of one international current in psychoanalysis. His influence spans several continents. Within the Lacanian movement, he is, so to speak, the analyst of the analysts’ analysts.

He was once also a student of Louis Althusser, whose seminar in Paris during the early 1960s taught apprentice Marxist philosophers not so much to analyze concepts as to “produce” them. Miller was the central figure in a moment of high drama during the era of high structuralism. During Althusser’s seminar, Miller complained that he had been busy producing something he called “metonymic causality” when another student stole it. He wanted his concept returned. (However this conflict was resolved, the real winner had to be any bemused bystander.)

Miller is, then, the past master of a certain mode of intellectual authority – one that has been deeply shaped by (and is ultimately inseparable from) tightly restricted fields of communication and exchange.

Someone once compared the Lacanian movement to a Masonic lodge. There were unpublished texts by the founder that remained more than usually esoteric: they were available in typescript editions of just a few copies, and then only to high-grade initiates.

It is hard to imagine a greater contrast to that digital flatland of relatively porous discursive borders about which Miller complains now. As well he might. (Resorting to Orwellian overkill is, in this context, probably a symptom of anxiety. There are plenty of reasons to worry and complain about Google, of course. But when you picture a cursor clicking a human face forever, it lacks something in the totalitarian-terror department.)

Yet closer examination of Miller’s pronouncement suggests another possibility. It isn’t just a document in which hierarchical intellectual authority comes to terms with the Web's numbskulled leveling. For the way Miller writes about the experience of using Google is quite revealing -- though not about the search engine itself.

“Our query is without syntax,” declares Miller, “minimal to the extreme; one click ... and bingo! It is a cascade -- the stark white of the query page is suddenly covered in words. The void flips into plenitude, concision to verbosity.... Finding the result that makes sense for you is therefore like looking for a needle in a haystack. Google would be intelligent if it could compute significations. But it can’t.”

In other words, Jacques-Alain Miller has no clue that algorithms determine the sequence of hits you get back from a search. (However intelligent Google might or might not be, the people behind it are quite clearly trying to “compute significations.”) He doesn’t grasp that you can shape a query – give it a syntax – to narrow its focus and heighten its precision. Miller’s complaints are a slightly more sophisticated version of someone typing “Whatever happened to Uncle Fred?” into Google and then feeling bewildered that the printout does not provide an answer.
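To make the contrast concrete: here, purely as illustration, are a few of the refinements Google’s query syntax has long supported (the search terms themselves are my own invented examples):

    "jacques-alain miller" site:nytimes.com     an exact phrase, restricted to one site
    lacan seminar -google                       require both words, exclude a third
    intitle:psychoanalysis filetype:pdf         match page titles only, one file format

Each operator narrows the haystack before the ranking algorithms ever sort a single hit.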

For an informed contrast to Jacques-Alain Miller’s befuddled indignation, you might turn to Digital History Hacks, a very smart and rewarding blog maintained by William J. Turkel, an assistant professor of history at the University of Western Ontario. (As it happens, I first read about Miller in Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution by one Sherry Turkle. The coincidence is marred by a slip of the signifier: they spell their names differently.)

The mandarin complaint about the new digital order is that it lacks history and substance, existing in a chaotic eternal present – one with no memory and precious little attention span. But a bibliographical guide that Turkel posted in January demonstrates that there is now an extensive enough literature to speak of a field of digital history.

The term has a nice ambiguity to it – one that is worth thinking about. On the one hand, it can refer to the ways historians may use new media to do things they’ve always done – prepare archives, publish historiography, and so on. Daniel J. Cohen and Roy Rosenzweig’s Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web (University of Pennsylvania Press, 2006) is the one handbook that ought to be known to scholars even outside the field of history itself. The full text of it is available for free online from the Center for History and New Media at George Mason University, which also hosts a useful selection of essays on digital history.

But as some of the material gathered there shows, digitalization itself creates opportunities for new kinds of history – and new problems, especially when documents exist in formats that have fallen out of use.

Furthermore, as various forms of information technology become more and more pervasive, it makes sense to begin thinking of another kind of digital history: the history of digitality.

Impressed by the bibliography that Turkel had prepared – and by the point that it now represented a body of work one would need to master in order to do graduate-level work in digital history – I contacted him by e-mail to get more of his thoughts on the field.

“Digital history begins,” he says, “with traditional historical sources represented in digital form on a computer, and with 'born-digital' sources like e-mail, text messages, computer code, video games and digital video. Once you have the proper equipment, these digital sources can be duplicated, stored, accessed, manipulated and transmitted at almost no cost. A box of archival documents can be stored in only one location, has to be consulted in person, can be used by only a few people at a time, and suffers wear as it is used. It is relatively vulnerable to various kinds of disaster. Digital copies of those documents, once created, aren't subject to any of those limitations. For some purposes you really need the originals (e.g., a chemical analysis of ink or paper). For many or most other purposes, you can use digital representations instead. And note that once the chemical analysis is completed, it too becomes a digital representation.”

But that’s just the initial phase, or foundation level, of digital history – the scanning substratum, in effect, in which documents become more readily available. A much more complex set of questions comes up as historians face the deeper changes in their work made possible by a wholly different sort of archival space – what Roy Rosenzweig calls the "culture of abundance" created by digitality.

“He asks us to consider what it would mean to try and write history with an essentially complete archival record,” Turkel told me. “I think that his question is quite deep because up until now we haven't really emphasized the degree to which our discipline has been shaped by information costs. It costs something (in terms of time, money, resources) to learn a language, read a book, visit an archive, take some notes, track down confirming evidence, etc. Not surprisingly, historians have tended to frame projects so that they could actually be completed in a reasonable amount of time, using the availability and accessibility of sources to set limits.”

Reducing information costs in turn changes the whole economy of research – especially during the first phase, when one is framing questions and trying to figure out if they are worth pursuing.

“If you're writing about a relatively famous person,” as Turkel put it, “other historians will expect you to be familiar with what that person wrote, and probably with their correspondence. Obviously, you should also know some of the secondary literature. But if you have access to a complete archival record, you can learn things that might have been almost impossible to discover before. How did your famous person figure in people's dreams, for example? People sometimes write about their dreams in diaries and letters, or even keep dream journals. But say you wanted to know how Darwin figured in the dreams of African people in the late 19th century. You couldn't read one diary at a time, hoping someone had had a dream about him and written it down. With a complete digital archive, you could easily do a keyword search for something like "Darwin NEAR dream" and then filter your results.”
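A rough sense of how little machinery such a query requires: the sketch below is my own illustration, not anything from Turkel’s toolkit; the "archive" folder of plain-text transcriptions and the ten-word window are assumptions. It hunts for "darwin" near "dream" across a corpus in a few lines of Python:

    import re
    from pathlib import Path

    def near(text, a, b, window=10):
        # True if any word starting with `a` falls within `window`
        # words of a word starting with `b`.
        words = re.findall(r"[a-z']+", text.lower())
        pos_a = [i for i, w in enumerate(words) if w.startswith(a)]
        pos_b = [i for i, w in enumerate(words) if w.startswith(b)]
        return any(abs(i - j) <= window for i in pos_a for j in pos_b)

    # Print every transcription in which the two terms occur together.
    for path in Path("archive").rglob("*.txt"):
        if near(path.read_text(errors="ignore"), "darwin", "dream"):
            print(path)

The filtering Turkel mentions – by date, language, or region – would be a few more lines on top of this.
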
As it happens, I conducted this interview a few weeks before coming across Jacques-Alain Miller’s comments on Google. It seems like synchronicity that Turkel would mention the possibility of digital historians getting involved in the interpretation of dreams (normally a psychoanalyst’s preserve). But for now, it sounds as if most historians are only slightly more savvy about digitality than the Lacanian Freemasons.

“All professional historians have a very clear idea about how to make use of archival and library sources,” Turkel says, “and many work with material culture, too. But I think far fewer have much sense of how search engines work or how to construct queries. Few are familiar with the range of online sources and tools. Very few are able to do things like write scrapers, parsers or spiders.”

(It pays to increase your word power. For a quick look at scraping and parsing, start here. For the role of spiders on the Web, have a look at this.)
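To give those terms some flesh: here is a minimal, purely illustrative scraper in Python, standard library only, with a placeholder URL. The parser is the part that picks structure out of the raw HTML; a spider is essentially the same code wrapped in a loop that keeps following the links it harvests:

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkScraper(HTMLParser):
        # Collect the target of every <a href="..."> on a page.
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    # Placeholder URL; substitute any page you are permitted to fetch.
    page = urlopen("http://example.org/").read().decode("utf-8", "replace")
    scraper = LinkScraper()
    scraper.feed(page)
    print(scraper.links)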

“I believe that these kinds of techniques will be increasingly important,” says Turkel, “and someday will be taken for granted. I guess I would consider digital history to have arrived as a field when most departments have at least one person who can (and does) offer a course in the subject. Right now, many departments are lucky to have someone who knows how to digitize paper sources or put up web pages.”

Author/s: 
Scott McLemee
Author's email: 
info@insidehighered.com

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.
