
Liberal Arts II: The Economy Requires Them

Many of us committed to the liberal arts have been defensive for as long as we can remember.

We have all cringed when we have heard a version of the following joke: The graduate with a science degree asks, “Why does it work?”; the graduate with an engineering degree asks, “How does it work?”; the graduate with a liberal arts degree asks, “Do you want fries with that?”

We have responded to such mockery by proclaiming the value of the liberal arts in the abstract: they create a well-rounded person, are good for democracy, and develop the life of the mind. All of this is certainly true, but each claim somehow misses the point that the joke drives home. Today’s college students and their families want to see a tangible financial return on the large investment that American higher education has become. That doesn’t make them anti-intellectual; it makes them realists. Outside of home ownership, a college degree may be the largest single purchase many Americans ever make.

There is a disconnect: parents and students worry about economic outcomes while too many of us talk about lofty ideals. More families are questioning both the sticker price of schools and the value of whole fields of study. It is natural in this environment to feel defensive. It is time, however, for those of us in the liberal arts to understand this new environment and, rather than merely react to it, to engage it proactively. To many Americans the liberal arts are a luxury they feel they must give up to make a living -- nice but impractical. We need to speak more concretely to the economic as well as the intellectual value of a liberal arts degree.

The liberal arts have long situated graduates on the road to success. More Fortune 500 CEOs have held liberal arts B.A.s than professional degrees. The same is true of doctors and lawyers. And we know the road to research science most often runs through a liberal arts experience. Now more than ever, as employment patterns shift, we need to engage the public on the value of a liberal arts degree in a more forceful and deliberate way.

We are witnessing an economic shift that may be every bit as profound as the shift from farm to factory. Today estimates are that over 25 percent of the American population is working as contingent labor -- freelancers, day laborers, consultants, micropreneurs.

Sitting where we do, it is easy to dismiss this number because we assume it comes from day laborers and the working class, i.e., the non-college-educated. But just look at higher education's use of adjuncts and you see the trend. The fastest-growing sector of this shift is the white-collar world our students aspire to. The number has been rising steadily and is projected to keep climbing. We are living in a world where 9-to-5 jobs are declining, careers spent with one company are uncommon, and economic risk has shifted from large institutions to individuals. Our students will know a world that is far more unstable and fluid than the one of a mere generation ago.

We have known for many years that younger workers (i.e., recent college graduates) move from firm to firm, job to job and even career to career over their lifetimes. What we are seeing now, however, is different: many Americans are hustling from gig to gig, too. These workers, many of them our former students, may never know economic security, but they may know success. For many new-economy workers, success is measured by more than just money; freedom, flexibility and creativity count too.

If this is the new economy our students are going to inherit, we as college and university administrators, faculty and staff need to take stock of the programs we offer (curricular as well as extracurricular) to ensure that we serve our students' needs and set them on a successful course for the future. The skills they will need may be different from those of their predecessors. Colleges and universities with a true culture of assessment already are making the necessary strategic adjustments.

In 1956, the noted sociologist William Whyte wrote The Organization Man to name the shift in work then developing for that generation. Whyte recognized that white-collar workers traded independence for stability and security. What got them ahead in the then-new economy was the ability to fit in (socialization) and a deep set of narrow vocational skills. Firms at the time developed career ladders, and successful junior executives who honed their skills and got along advanced up the food chain.

Today, no such career ladder exists. And narrow sets of skills may not be the ticket they once were. We are witnessing a new way of working developing before our eyes. Today, breadth, cultural knowledge and sensitivity, flexibility, the ability to continually learn, grow and reinvent, technical skills, as well as drive and passion, define the road to success. And liberal arts institutions should take note, because this is exactly what we do best.

For liberal arts educators, this economic shift creates a useful moment to step out of the shadows. We no longer need to be defensive because what we have to offer is now more visibly useful in the world. Many of the skills needed to survive and thrive in the new economy are exactly those a well-rounded liberal arts education has always provided: depth, breadth, knowledge in context and motion, and the search for deeper understanding.

It will not be easy to explain to future students and their parents that a liberal arts degree may not lead to a particular “job” per se, because jobs in the traditional sense are disappearing. But, we can make a better case about how a liberal arts education leads to both a meaningful life and a successful career.

In this fluid world, arts and sciences graduates may have an advantage. They can seek out new opportunities and strike quickly. They are innovative and nimble. They think across platforms, understand society and culture, and see technology as a tool rather than an end in itself. In short, liberal arts graduates have the tools to make the best of the new economy. And, above all, we need to do a better job of identifying our successes, our alumni, and presenting them to the public. We need to ensure that the public knows a liberal arts degree is still, and always has been, a ticket to success.

This could be a moment for the rebirth of the liberal arts. For starters, we are witnessing exciting new research about the economy that is situating the discussion more squarely within the liberal arts orbit, and in the process blurring disciplinary boundaries. These scholars are doing what the American studies scholar Andrew Ross has called “scholarly reporting,” a blend of investigative reporting, social science and ethnography, as a way to understand the shift to the new economy. Scholars such as the sociologists Dalton Conley and Sharon Zukin and the historian Bryant Simon offer new models of engaged scholarship that explain the cultural parameters of the new economy. We need to recognize and support this research because increasingly we will need to teach it as the best way to ensure our students understand the moment.

We also need to be less territorial and recognize that the professional schools are not the enemy. They have a lot to offer our students. Strategic partnerships between professional schools and the arts and sciences enrich both and offer liberal arts students important professional opportunities long closed off to them. We also need to find ways to be good neighbors to the growing micropreneurial class, whether by providing space, Wi-Fi, or interns. Some schools have created successful incubators, which can jump-start small businesses and give their students important ground-floor exposure to the emerging economy.

Today’s liberal arts graduates will need to function in an economy that is in some ways smaller. Most will work for small firms and many will simply work on their own. They will need to multitask as well as blend work and family. And, since there will be little budget or time for entry-level training, we need to ensure that all our students understand the basics of business even if they are in the arts. We also might consider preparing our graduates as if they were all going to become small business owners, because in a sense many of them are going to be micropreneurs.

Richard A. Greenwald

Richard A. Greenwald is dean of the Caspersen School of Graduate Studies, director of university partnerships, and professor of history at Drew University in Madison, N.J. His next book is entitled The Micropreneurial Age: The Permanent Freelancer and the New American (Work)Life.

Liberal Arts I: They Keep Chugging Along

When the economy goes down, one expects the liberal arts -- especially the humanities -- to wither, and laments about their death to go up. That’s no surprise since these fields have often defined themselves as unsullied by practical application. This notion provides little comfort to students -- and parents -- who are anxious about their post-college prospects; getting a good job -- in dire times, any job -- is of utmost importance. (According to CIRP’s 2009 Freshman Survey, 56.5 percent of students -- the highest since 1983 -- said that “graduates getting good jobs” was an important factor when choosing where to go to college.)

One expects students, then, to rush to courses and majors that promise plenty of entry-level jobs. Anticipating this, college administrators would cut back or eliminate programs that are not “employment friendly,” as well as those that generate little research revenue. Exit fields like classics, comparative literature, foreign languages and literatures, philosophy, religion, and enter only those that are preprofessional in orientation. Colleges preserving a commitment to the liberal arts would see a decline in enrollment; in some cases, the institution itself would disappear.

So runs the widespread narrative of decline and fall. Everyone has an anecdote or two to support this story, but does it hold in general and can we learn something from a closer examination of the facts?

The National Center for Education Statistics reports that the number of bachelor's degrees in “employment friendly” fields has been on the rise since 1970. Undergraduate business degrees -- the go-to “employment friendly” major -- increased from 115,400 conferred in 1970-71 to 335,250 in 2007-08. In a parallel development, institutions graduated seven times more communications and journalism majors in 2007-08 than in 1970-71. And while the numbers are small, there has been exponential growth in “parks, recreation, leisure, and fitness studies,” “security and protective services,” and “transportation and materials moving” degrees. Computer science, on the other hand, peaked in the mid-80s, dropped in the mid-90s, peaked again in the mid-2000s, and dropped again in the last five years.

What has students’ turn to such degrees meant for the humanities and social sciences? A mapping of bachelor's degrees conferred in the humanities from 1966 to 2007 by the Humanities Indicator Project shows that the percentage of such majors was highest in the late 1960s (17-18 percent of all degrees conferred), low in the mid-1980s (6-7 percent), and more or less level since the early 1990s (8-9 percent). Trends, of course, vary from discipline to discipline.

Degrees awarded in English dropped from a high of 64,627 in 1970-71 to half that number in the early 1980s, before rising to 55,000 in the early 1990s and staying at that level since. The social sciences and history were hit with a similar decline in majors in the 1970s and 1980s, but they have recovered nicely in the years since and now have more majors than they did in 1970. The numbers of foreign language, philosophy, religious studies, and area studies majors have been stable since 1970. IPEDS data pick up where the Humanities Indicator Project leaves off and show that in 2008 and 2009 the number of students graduating with bachelor's degrees in English, foreign languages and literatures, history, and philosophy and religion remained at the same level.

What’s surprising about this bird’s-eye view of undergraduate education is not the increase in the number of majors in programs that should lead directly to a job after graduation, but that the number of degrees earned in the humanities and related fields has not been adversely affected by the financial troubles that have come and gone over the last two decades.

Of course, macro-level statistics reveal only part of the story. What do things look like at the ground level? How are departments faring? Course enrollments? Majors? Since the study of the Greek and Roman classics tends to be a bellwether for trends in the humanities and related fields (with departments that are small and often vulnerable), it seemed reasonable to ask Adam Blistein of the American Philological Association whether classics departments were being dropped at a significant number of places. “Not really” was his answer; while the classics major at Michigan State was cut, and a few other departments were in difficulty, there was no widespread damage to the field -- at least not yet.

Big declines in classics enrollments? Again, the answer seems to be, “Not really.” Many institutions report a steady gain in the number of majors over the past decade. Princeton’s classics department, for example, announced 17 graduating seniors this past spring, roughly twice the number of three decades ago. And the strength is not just in elite institutions. Charles Pazdernik at Grand Valley State University in hard-hit Michigan reported that his department has 50+ majors on the books and strong enrollments in language courses.

If classics seems to be faring surprisingly well, what about the modern languages? There are dire reports about German and Russian, and the Romance languages seem increasingly to be programs in Spanish, with a little French and Italian tossed in. The Modern Language Association reported in fall 2006 -- well before the current downturn -- a 12.9 percent gain in language study since 2002. This translates into 180,557 more enrollments. Every language except Biblical Hebrew showed increases, some exponential -- Arabic (126.5 percent), Chinese (51 percent), and Korean (37.1 percent) -- while others less so -- French (2.2 percent), German (3.5 percent), and Russian (3.9 percent). (Back to the ancient world for a moment: Latin saw a 7.9 percent increase, and ancient Greek 12.1 percent). The study of foreign languages, in other words, seems not to be disappearing; the mix is simply changing.

Theoretical and ideological issues have troubled and fragmented literature departments in recent years, but a spring 2010 conference on literary studies at the National Humanities Center suggests that the field is enjoying a revitalization. The mood was eloquent, upbeat, innovative; no doom and gloom, even though many participants were from institutions where painful budget cuts had recently been made.

A similar mood was evident at the National Forum on the Future of Liberal Education, a gathering of some highly regarded assistant professors in the humanities and social sciences this past February. They were well aware that times were tough, the job market for Ph.D.s miserable, and tenure prospects uncertain. Yet their response was to get on with the work of strengthening liberal education, rather than bemoan its decline and fall. Energy was high, and with it the conviction that the best way to move liberal education forward was to achieve demonstrable improvements in student learning.

It’s true that these young faculty members are from top-flight universities. What about smaller, less well-endowed institutions? Richard Ekman of the Council of Independent Colleges reports that while a few of the colleges in his consortium are indeed in trouble, most are doing quite well, increasing enrollments and becoming more selective.

And what about state universities and land grant institutions, where most students go to college? Were they scuttling the liberal arts and sciences because of fierce cutbacks? David Shulenburger of the Association of Public and Land-grant Universities says that while budget cuts have resulted in strategic “consolidation of programs and sometimes the elimination of low-enrollment majors,” he does not “know of any public universities weakening their liberal education requirements.”

Mark Twain once remarked that reports of his death were greatly exaggerated. The liberal arts disciplines, it seems, can say the same thing. The on-the-ground stories back up the statistics and reinforce the idea that the liberal arts are not dying, despite the soft job market and the recent recession. Majors are steady, enrollments are up in particular fields, and students -- and institutions -- aren’t turning their backs on disciplines that don’t have obvious utility for the workplace. The liberal arts seem to have a particular endurance and resilience, even when we expect them to decline and fall.

One could imagine any number of reasons why this is the case -- the inherent conservatism of colleges and universities is one -- but maybe something much more dynamic is at work. Perhaps the stamina of the liberal arts in today’s environment draws in part from the vital role they play in providing students with a robust liberal education, that is, a kind of education that develops their knowledge in a range of disciplinary fields and, importantly, their cognitive skills and personal competencies. The liberal arts continue to -- and likely always will -- give students an education that delves into the intricate language of Shakespeare or Woolf, or the complex historical details of the Peloponnesian War or the French Revolution. That is a given.

But what the liberal arts also provide is a rich site for students to think critically, to write analytically and expressively, to consider questions of moral and ethical importance (as well as those of meaning and value), and to construct a framework for understanding the infinite complexities and uncertainties of human life. This is, as many have argued before, a powerful form of education, and it is a point with which, the statistics and anecdotes show, students agree.

W. Robert Connor and Cheryl Ching

W. Robert Connor is the former president of the Teagle Foundation, to which he is now a senior adviser. Cheryl Ching is a program officer at Teagle.

Putting the 'Humanities' in 'Digital Humanities'

Reflecting on the recent Humanities and Technology conference (THATCamp) in San Francisco, what strikes me most is that digital humanities events consistently tip toward the logic-structured digital side of things; they are less balanced by the humanities side. But what I mean by that has itself been a problem I've been mulling for some time now. What is the missing contribution from the humanities?

I think this digital dominance revolves around two problems.

The first is an old problem. The humanities’ pattern of professional anxiety goes back to the 1800s and stems from pressure to incorporate the methods of science into our disciplines or to develop our own, uniquely humanistic, methods of scholarship. The "digital humanities" rubs salt in these still open wounds by demonstrating what cool things can be done with literature, history, poetry, or philosophy if only we render humanities scholarship compliant with cold, computational logic. Discussions concern how to structure the humanities as data.

The showy and often very visual products built on such data and the ease with which information contained within them is intuitively understood appear, at first blush, to be a triumph of quantitative thinking. The pretty, animated graphs or fluid screen forms belie the fact that boring spreadsheets and databases contain the details. Humanities scholars, too, often recoil from the presumably shallow grasp of a subject that data visualization invites.

For many of us trained in the humanities, to contribute data to such a project feels a bit like chopping up a Picasso into a million pieces and feeding those pieces one by one into a machine that promises to put it all back together, cleaner and prettier than it looked before.

Which leads to the second problem, the difficulty of quantifying an aesthetic experience and — more often — the resistance to doing so. A unique feature of humanities scholarship is that its objects of study evoke an aesthetic response from the reader (or viewer). While a sunset might be beautiful, recognizing its beauty is not critical to studying it scientifically. Failing to appreciate the economy of language in a poem about a sunset, however, is to miss the point.

Literature is more than the sum of its words on a page, just as an artwork is more than the sum of the molecules it comprises. To itemize every word or molecule on a spreadsheet is simply to apply more anesthetizing structure than humanists can bear. And so it seems that the digital humanities is a paradox, trying to combine two incompatible sets of values.

Yet, humanities scholarship is already based on structure: language. "Code," the underlying set of languages that empowers all things digital, is just another language entering the profession. Since the application of digital tools to traditional humanities scholarship can yield fruitful results, perhaps what is often missing from the humanities is a clearer embrace of code.

In fact, "code" is a good example of how something that is more than the sum of its parts emerges from the atomic bits of text that logic demands must be lined up next to each other in just such-and-such a way. When well-structured code is combined with the right software (e.g., a browser, which itself is a product of code), we see William Blake’s illuminated prints, or hear Gertrude Stein reading a poem, or access a world-wide conversation on just what is the digital humanities. As the folks at WordPress say, code is poetry.

I remember 7th-grade homework assignments programming onscreen fireworks explosions in BASIC. Back then, I was willing to patiently decipher code only because of the promise of cool graphics on the other end. When I was older, I realized that I was willing to read patiently through Hegel and Kant because I had learned to see the fireworks in the code itself. To those of us who read literature avidly, the characters of a story come alive, laying bare our own feelings or moral inclinations in the process.

Detecting patterns, interpreting symbolism, and analyzing logical inconsistencies in text are all techniques used in humanities scholarship. Perhaps the digital humanities' greatest gift to the humanities can be the ability to invest a generation of "users" in the techniques and practiced meticulous attention to detail required to become a scholar.

Phillip Barron

Trained in analytic philosophy, Phillip Barron is a digital history developer at the University of California at Davis.

As Others See Us

A genome biologist, Gregory Petsko, has gone to bat for the humanities, in an open letter to the State University of New York at Albany president who recently (and underhandedly) announced significant cuts. (For those who haven’t been paying attention: the departments of theater, Italian, Russian, classics, and French at SUNY-Albany are all going to be eliminated).

If you are in academia, and Petsko’s missive (which appeared on this site Monday) hasn’t appeared on your Facebook wall, it will soon. And here’s the passage that everyone seizes on, evidence that Petsko understands us and has our back (that is, we in the humanities): "The real world is pretty fickle about what it wants. The best way for people to be prepared for the inevitable shock of change is to be as broadly educated as possible, because today's backwater is often tomorrow's hot field. And interdisciplinary research, which is all the rage these days, is only possible if people aren't too narrowly trained."

He's right. And if scientists want to speak up for the humanities, I’m all for it. But Petsko understands us differently than we understand ourselves. Why fund the humanities, even if they don’t bring in grant money or produce patents? Petsko points out "universities aren't just about discovering and capitalizing on new knowledge; they are also about preserving knowledge from being lost over time, and that requires a financial investment."

How many of us willingly embrace that interpretation of what we do? "My interest is not merely antiquarian...." is how we frame the justification for our cutting-edge research. Even as we express our dismay when crucial texts go out of print, any sacred flame that we were tending was blown out when the canon wars were fought to a draw. Why should we resurrect it? Because, says Petsko, "what seems to be archaic today can become vital in the future." His examples are virology and Middle Eastern studies. Mine is 18th-century literature — and with all the imaginative vigor at my disposal, I have trouble discerning the variation on the AIDS scare or 9/11 that would revive interest in my field. That’s OK, though: Petsko has other reasons why the humanities matter:

"Our ability to manipulate the human genome is going to pose some very difficult questions for humanity in the next few decades, including the question of just what it means to be human. That isn't a question for science alone; it's a question that must be answered with input from every sphere of human thought, including -- especially including -- the humanities and arts... If I'm right that what it means to be human is going to be one of the central issues of our time, then universities that are best equipped to deal with it, in all its many facets, will be the most important institutions of higher learning in the future."

Well, that would be great. I have no confidence, though, that we in the humanities are positioned to take advantage of this dawning world, even if our departments escape SUNY-style cost-cutting. How many of us can meaningfully apply what we do to "the question of just what it means to be human" without cringing, or adopting an ironic pose, or immediately distancing ourselves from that very question? How many of us see our real purpose as teaching students to draw the kinds of connections between literature and life that Petsko uses to such clever effect in his diatribe?

Petsko is not necessarily right in his perception of what the humanities are good for, nor are professionals in the humanities necessarily wrong to pursue another vision of what our fields are about. But there is a profound disconnect between how we see ourselves (and how our work is valued and remunerated in the university and how we organize our professional lives to respond to those expectations) and how others see us. If we're going to take comfort in the affirmations of Petsko and those outside of the humanities whom he speaks for, perhaps we need to take seriously how he understands what we do. Perhaps the future is asking something of us that we are not providing — or perhaps we need to do a better job of explaining why anyone other than us should care about what we do.

Kirstin Wilcox

Kirstin Wilcox is senior lecturer in English at the University of Illinois at Urbana-Champaign.

Dollars and Sense

Over the summer I was fortunate to attend a symposium for department chairs. I was glad of the experience and grateful to my institution's leaders for supporting me in this way. The symposium was held at a swanky hotel in a fabulous city (ironically, given the recent economic turmoil). While I was there, it was a pleasure to talk to and hear from other department chairs from across the country. The speakers were generally well-qualified, experienced, and good presenters (although the perspective was what the chief academic officer expects of the chair — and we have other constituencies, after all).

But the thing that concerned me during the experience was the focus on … well, what was it? Fiscal responsibility? I can’t complain about fiscal responsibility — that wouldn’t make sense. I’m not so quixotic that I’d counsel running an institution into the red, but I kept getting the impression that our talks weren’t entirely about fiscal responsibility as much as they were about profit margins.

In one session we were taken through some cost implications of a variety of different class offerings. The underlying idea was that the more we fine-tuned our offerings to maximize seats in classes, the stronger our program would appear to the chief academic officer, and the more likely we were to be well — the cynic in me wonders, "adequately"? — funded. So, instead of offering a couple of upper-level classes that catered to six students each (a reality in a lot of small colleges, and a net loss financially to the institution), if the program could tinker with the class scheduling to offer one class that catered to 12 (or more) students, there would be a net gain for the institution.

In English — like a lot of departments that have a service component — smaller, apparently esoteric, upper-level classes are generally far outweighed by fully enrolled composition classes (and the program thus generates significant income that offsets the occasional money-losing class). Still, I could see the merits of creative scheduling. Obviously, the higher the student yield in each class, the better the financial implications. So the session was thought-provoking and challenging.

We are sometimes guilty of thinking that schedules, rotations, and course offerings are set in stone. Like my students who sometimes think that the first draft of an essay is the best and only draft they need to write, I could see that perhaps I was sometimes guilty of maintaining a status quo of class rotations that occasionally resulted in small upper-level classes where creative planning might create a program that was much more vigorous (though presumably with significantly less choice for students). We end up caught in a conflict between constituencies, between the CAO who would prefer one Shakespeare class with 12, and students (and perhaps faculty), who would probably prefer the choice of a Shakespeare seminar which might end up with six, and a Faulkner seminar with six.

Financially, the answer is straightforward, but the best situation for the institution is another matter, and a follow-up session worried me about the potential ramifications.

We were offered a scenario similar to this: Times are lean. A variety of departments are competing for a single tenure line. The English department has a prominent, and well-loved, Shakespearean scholar retiring. As chair, you must make a pitch for a new hire, and the chief academic officer has a penchant for fully enrolled classes (how unreasonable is that?). The broader context of the hire is that the business department is aggressively expanding its coverage of e-commerce, and the graphic design department wants to add writing certification to its web design classes. Only one hire is going to be authorized because of the economic situation. What is your pitch?

I have to admit that most of the room had little problem with this one. They came up with an English hire with expertise in e-writing and web publishing. The English department would get an interesting new hire: current, engaging, relevant to today’s students. The hire would also foster a dynamic interdisciplinary relationship with other schools, and the potential for cross-listed classes that would cater to a range of majors. The business department was happy, as was graphic design, and English had kept its tenure line. Everyone was happy and the CAO had a grin that rivaled the Cheshire cat. This was easy.

As a general rule, I would have been perfectly content with this kind of hire under different circumstances. I could see the benefits. I was, and am, excited by the current content, the interdisciplinary focus, the potential benefits to a variety of programs. But at what cost? For me, the takeaway message was that Shakespeare had had his day, and I felt that an important line was being crossed.

It seemed to me that the very nature of the university was at stake somehow, and while most of the room was watching the numbers add up in an Excel spreadsheet, something important was being lost in the debit column. What is the role of the university, after all, and the officers of the university from the chief academic officer, to the dean, the chair and the faculty? To what, or to whom, are they answerable? Certainly they are responsible for the economic well-being of the institution. But isn’t there more to it?

Though perhaps it sounds like it, I'm not a snob. I worked for years in the community college system and was, for example, truly excited by the Harley Davidson repair certification program offered at one of my previous institutions. It was a genuinely cool program, and a terrific career opportunity for some students, and the state-of-the-art workshop looked amazing. I can see the merit of strong vocational offerings at the community college and the university, but is that the totality of the mission of the higher education tier? And are numbers the bottom line?

In the session I argued for the replacement of the Shakespearean scholar. It was a significant loss to the department, and one that would hurt the fundamentals of the program. I could see the benefits of the counter-proposal, and was broadly supportive of the rationale that came up with the web writer instead of the literature scholar. I suppose if I were more conciliatory I could have compromised and proposed a strong e-Shakespearean (no doubt they exist). But the implications of my choice were serious.

According to the strict parameters of the exercise, I had probably just lost my department a tenure line. But I had held strong to values that seemed worth it at the time. Perhaps it was an empty gesture, or a foolish one? I thought it was an intellectual position, a cultural role that was important to defend.

In the real world, and away from tricky seminars, the problems are just as profound, though the answers are seldom as obvious. What is certain, however, is that numbers play an increasingly important role in the shape of our departmental offerings. Though I wonder sometimes if the economic background is being used as an excuse to push through certain kinds of institutional reform, this one isn't likely to be going away, and it's our responsibility throughout the various levels of the university to examine our educational, intellectual and cultural responsibilities to our communities within the context of responsible fiscal management.

David Mulry

David Mulry is chair of English and foreign languages at Schreiner University.

The Year in Reading

For this week’s column (the last one until the new year) I asked a number of interesting people what book they’d read in 2010 that left a big impression on them, or filled them with intellectual energy, or made them wish it were better known. If all three, then so much the better. I didn’t specify that it had to be a new book, nor was availability in English a requirement.

My correspondents were enthusiastic about expressing their enthusiasm. One of them was prepared to name 10 books – but that’s making a list, rather than a selection. I drew the line at two titles per person. Here are the results.

Lila Guterman is a senior editor at Chemical and Engineering News, the weekly magazine published by the American Chemical Society. She said it was easier to pick an outstanding title from 2010 than it might have been in previous years: “Not sleeping, thanks to a difficult pregnancy followed by a crazy newborn, makes it almost impossible for me to read!”

She named Rebecca Skloot’s The Immortal Life of Henrietta Lacks, published by Crown in February. She called it an “elegantly balanced account of a heartbreaking situation for one family that simultaneously became one of the most important tools of biology and medicine. It was a fast-paced read driven by an incredible amount of reporting: A really exemplary book about bioethics.”

Neil Jumonville, a professor of history at Florida State University, is editor of The New York Intellectual Reader (Routledge, 2007). A couple of collections of essays he recently read while conducting a graduate seminar on the history of liberal and conservative thought in the United States struck him as timely.

“The first is Gregory Schneider, ed., Conservatives in America Since 1930 (NYU Press, 2003). Here we find a very useful progression of essays from the Old Right, Classical Liberals, Traditional Conservatives, anticommunists, and the various guises of the New Right. The second book is Michael Sandel, Liberalism and Its Critics (NYU Press, 1984). Here, among others, are essays from Isaiah Berlin, John Rawls, Robert Nozick, Alasdair MacIntyre, Michael Walzer, a few communitarians represented by Sandel and others, and important pieces by Peter Berger and Hannah Arendt.”

Reading the books alongside one another, he said, tends to sharpen up one's sense of both the variety of political positions covered by broad labels like “liberal” and “conservative” and to point out how the traditions may converge or blend. “Some people understand this beneficial complexity of political positions,” he told me, “but many do not.”

Michael Yates retired as a professor of economics and labor relations at the University of Pittsburgh at Johnstown in 2001. His most recent book is In and Out of the Working Class, published by Arbeiter Ring in 2009.

He named Wallace Stegner’s The Gathering of Zion: The Story of the Mormon Trail, originally published in 1964. “I am not a Mormon or religious in the slightest degree,” he said, “and I am well aware of the many dastardly deeds done in the name of the angel Moroni, but I cannot read the history of the Mormons without a feeling of wonder, and I cannot look at the sculpture of the hand cart pioneers in Temple Square [in Salt Lake City] without crying. If only I could live my life with the same sense of purpose and devotion…. It is not possible to understand the West without a thorough knowledge of the Mormons. Their footprints are everywhere."

Adam Kotsko is a visiting assistant professor of religion at Kalamazoo College. This year he published Politics of Redemption: The Social Logic of Salvation (Continuum) and Awkwardness (Zero Books).

“My vote," he said, "would be for Sergey Dolgopolsky's What is Talmud? The Art of Disagreement, on all three counts. It puts forth the practices of Talmudic debate as a fundamental challenge to one of the deepest preconceptions of Western thought: that agreement is fundamental and disagreement is only the result of a mistake or other contingent obstacle. The notion that disagreements are to be maintained and sharpened rather than dissolved is a major reversal that I'll be processing for a long time to come. Unfortunately, the book is currently only available as an expensive hardcover.”

Helena Fitzgerald is a contributing editor for The New Inquiry, a website occupying some ambiguous position between a New York salon and an online magazine.

She named Patti Smith’s memoir of her relationship with Robert Mapplethorpe, Just Kids, published by Ecco earlier this year and recently issued in paperback. “I've found Smith to be one of the most invigorating artists in existence ever since I heard ‘Land’ for the first time and subsequently spent about 24 straight hours with it on repeat. She's one of those artists who I've long suspected has all her big secrets hoarded somewhere in her private New York City. This book shares a satisfying number of those secrets and that privately legendary city. Just Kids is like the conversation that Patti Smith albums always made you want to have with Patti Smith.”

Cathy Davidson, a professor of English and interdisciplinary studies at Duke University, was recently nominated by President Obama to serve on the National Council on the Humanities. She, too, named Patti Smith’s memoir as one of the books “that rocked my world this year.” (And here the columnist will interrupt to give a third upturned thumb. Just Kids is a moving and very memorable book.)

Davidson also mentioned rereading Tim Berners-Lee's memoir Weaving the Web, first published by HarperSanFrancisco in 1999. She was “inspired by his honesty in letting us know how, at every turn, the World Wide Web's creation was a surprise, including the astonishing willingness of an international community of coders to contribute their unpaid labor for free in order to create the free and open World Wide Web. Many traditional, conventional scientists had no idea what Berners-Lee was up to or what it could possibly mean and, at times, neither did he. His genius is in admitting that he forged ahead, not fully knowing where he was going….”

Bill Fletcher Jr., a senior scholar at the Institute for Policy Studies, is co-author, with Fernando Gapasin, of Solidarity Divided: The Crisis in Organized Labor and a New Path Toward Social Justice, published by the University of California Press in 2009.

He named Marcus Rediker and Peter Linebaugh’s The Many-Headed Hydra: The Hidden History of the Revolutionary Atlantic (Beacon, 2001), calling it “a fascinating look at the development of capitalism in the North Atlantic. It is about class struggle, the anti-racist struggle, gender, forms of organization, and the methods used by the ruling elites to divide the oppressed. It was a GREAT book.”

Astra Taylor has directed two documentaries, Zizek! and Examined Life. She got hold of the bound galleys for James Miller’s Examined Lives: From Socrates to Nietzsche, out next month from Farrar Straus and Giroux. She called it “a book by the last guy I took a university course with and one I've been eagerly awaiting for years. Like a modern day Diogenes Laertius, Miller presents 12 biographical sketches of philosophers, an exploration of self-knowledge and its limits. As anyone who read his biography of Foucault knows, Miller's a master of this sort of thing. The profiles are full of insight and sometimes hilarious.”

Arthur Goldhammer is a senior affiliate of the Center for European Studies at Harvard University and a prolific translator, and he runs an engaging blog called French Politics.

“I would say that Florence Aubenas' Le Quai de Ouistreham (2010) deserves to be better known,” he told me. “Aubenas is a journalist who was held prisoner in Iraq for many months, but upon returning to France she did not choose to sit behind a desk. Rather, she elected to explore the plight of France's ‘precarious’ workers -- those who accept temporary work contracts to perform unskilled labor for low pay and no job security. The indignities she endures in her months of janitorial work make vivid the abstract concept of a ‘dual labor market.’ Astonishingly, despite her fame, only one person recognized her, in itself evidence of the invisibility of social misery in our ‘advanced’ societies.”

Anne Sarah Rubin is an associate professor of history at the University of Maryland, Baltimore County and project director for Sherman’s March and America: Mapping Memory, an interactive historical website.

The book that made the biggest impression on her this year was Judith Giesberg's Army at Home: Women and the Civil War on the Northern Home Front, published by the University of North Carolina Press in 2009. “Too often,” Rubin told me, “historians ignore the lives of working-class women, arguing that we don't have the sources to get inside their lives, but Giesberg proves us wrong. She tells us about women working in Union armories, about soldiers' wives forced to move into almshouses, and about African Americans protesting segregated streetcars. This book expands our understanding of the Civil War North, and I am telling everyone about it.”

Siva Vaidhyanathan is a professor of media studies and law at the University of Virginia. His next book, The Googlization of Everything: (And Why We Should Worry), will be published by the University of California Press in March.

He thinks there should have been more attention for Carolyn de la Pena's Empty Pleasures: The Story of Artificial Sweeteners from Saccharin to Splenda, published this year by the University of North Carolina Press: “De la Pena (who is a friend and graduate-school colleague) shows artificial sweeteners have had a powerful cultural influence -- one that far exceeds their power to help people lose weight. In fact, as she demonstrates, there is no empirical reason to believe that using artificial sweeteners helps one lose weight. One clear effect, de la Pena shows, is that artificial sweeteners extend the pernicious notion that we Americans can have something for nothing. And we know how that turns out.”

Vaidhyanathan noted a parallel with his own recent research: “de la Pena's critique of our indulgent dependence on Splenda echoes the argument I make about how the speed and simplicity of Google degrades our own abilities to judge and deliberate about knowledge. Google does not help people lose weight either, it turns out.”

Michael Tomasky covers U.S. politics for The Guardian and is editor-in-chief of Democracy: A Journal of Ideas.

“On my beat,” he said, “the best book I read in 2010 was The Spirit Level (Bloomsbury, 2009), by the British social scientists Richard Wilkinson and Kate Pickett, whose message is summed up in the book's subtitle, which is far better than its execrable title: ‘Why Greater Equality Makes Societies Stronger.’ In non-work life, I'm working my way through Vasily Grossman's Life and Fate from 1959; it's centered around the battle of Stalingrad and is often called the War and Peace of the 20th century. I'm just realizing as I type this how sad it is that Stalingrad is my escape from American politics.”

Scott McLemee

Because: a Manifesto

Because the failures of a flawed system are not my personal failures.

Because I am tired of being made to feel like a failure because I have been failed by a flawed system.

Because doing the same thing over and over again and expecting a different result is stupidity.

Because participating in a system that degrades, demeans, and disempowers you is masochism.

Because productivity for productivity’s sake is futility.

Because stupidity, masochism, and futility should not be rewarded.

Because obfuscation, elitism, arrogance, and self-righteousness should not be rewarded.

Because my talents, accomplishments, experiences, and hard work are not acknowledged or rewarded in this system.

Because I am not nurtured, encouraged, or valued in this system.

Because those in a position to change the system do not.

Because I refuse to believe that a system that does not value me is the only one in which I can have worth.

Because I am enduring personal, financial, and professional hardship to no perceivable purpose.

Because I am being limited personally, financially, professionally, and creatively.

Because I already got what I came for -- three advanced degrees and immersion in a subject I love.

Because I want to continue to love it.

Because life is short.

Because sometimes I consider how my light is spent.

Because I don’t want to live here.

Because I am prevented from doing the work I was trained and prepared to do.

Because there are other places where that training and preparation will be rewarded, respected, and used.

Because I am capable of more than I can do here.

Because leaving the system is a reclamation of the dignity and agency it has attempted to take from me…

I am leaving the academy.

Anonymous

This piece was originally published on the blog Paraphernalian and is published here with the author's permission.

Female Complaints

Fiction writers were not yet using the term “stream of consciousness” when Charlotte Perkins Gilman published “The Yellow Wall-Paper” in 1892. The phrase itself first appeared in print that same year, when William James used it while preparing an abridged edition of his Principles of Psychology (1890), where he’d coined a similar expression, “stream of thought.” I do not know if Gilman ever studied James’s work. It’s clear from Helen Lefkowitz Horowitz’s Wild Unrest, published by Oxford University Press, that Gilman was as voracious and au courant a reader as any American thinker of her day. And she certainly took exception to the unflattering portrait of the suffragists drawn by the philosopher’s brother Henry in The Bostonians, which she read as it was being serialized in 1885. (By 1898, Gilman’s internationally famous book Women and Economics made her not just one of the most prominent adherents of that movement, but arguably its most tough-minded public intellectual.)

Either way, Gilman had her own reasons for wanting to convey the flow of awareness in a piece of fiction. Her narrator is a woman who, following childbirth, has been prescribed bed rest and virtual isolation by her husband, who is a physician. Her sister-in-law keeps an eye on the new baby, and she also seems charged with the task of making sure the narrator stays in her room. Even jotting down a few lines in her diary feels like a violation of her husband’s commands. With nothing else to occupy her attention, the narrator stares at the ugly, crumbling wallpaper in her room. Her attention sinks into the pattern of swirls. She begins to notice the image of a woman who is trapped in the design, but who is somehow able to sneak out into the real world without others noticing. Boredom and depression give way to psychosis.

“Every definite image in the mind is steeped and dyed in the free water that flows round it,” writes William James. “With it goes the sense of its relations, near and remote, the dying echo of whence it came to us, the dawning sense of whither it is to lead.” The image Gilman’s narrator finds in the shabby yellow wallpaper is “steeped and dyed” in the well-meaning oppressiveness of her circumstances. Trapped in domesticity and then rendered completely passive, her stream of consciousness turns brackish. But it’s the social norms that are deranged, at least as much as her mind.

The value of Horowitz’s book -- subtitled “Charlotte Perkins Gilman and the Making of ‘The Yellow Wall-Paper’ ” -- is not that it reveals an autobiographical element in the story. The author herself made that clear in an essay from 1913. Gilman indicated that she had been subjected to a similar course of treatment following a period of postpartum depression. In 1887, a doctor gave her “solemn advice to ‘live as domestic a life as far as possible,’ to ‘have but two hours' intellectual life a day,’ and ‘never to touch pen, brush, or pencil again’ as long as I lived.” For a woman who had earned a modest living by painting and writing in her 20s, this must have felt like a kind of death sentence.

But Horowitz, a professor emerita of history at Smith College, has excavated parts of the record that go far beyond Gilman’s account of patriarchal malpractice. The doctor in question was one S. Weir Mitchell, then at the height of his fame; his reputation had been secured during the Civil War when he published a book on the neurological effect of gunshot wounds. He claimed great success in treating what were thought of then as female nervous conditions – though it’s not as if Mitchell made that sharp a distinction between mental health and mental illness with women. Horowitz quotes him commenting on “how near to disorder and how close to misfortune [a woman] is brought by the very peculiarities of her nature.”

So, yes, a sexist pig, pretty much. But Horowitz determines from the available evidence that the treatment Mitchell prescribed for his female patients wasn’t quite the nightmare of sensory deprivation portrayed in “The Yellow Wall-Paper.” Though he was known for his “rest cure,” it didn't involve putting patients under the command of their husbands. Indeed, he wanted his patients to recuperate away from their families, just to get them away from influences that might be wearing them down. Mitchell believed in the therapeutic effects of exercise, and he also encouraged women to open up to him about their unhappiness – a Yankee approximation of the “talking cure” later associated with Vienna.

The feeling of being trapped and helpless evoked by “The Yellow Wall-Paper” must be traced back to other sources, then. Horowitz suggests that the story embodies “its author’s experiences of love, ambition, and depression in her 20s.” They can be reconstructed from both Gilman’s own writings and the extensive diary kept by Charles Walter Stetson, her first husband. (They divorced in the 1890s.)

“In the late 19th century,” Horowitz writes, “a time when roughly 10 percent of American women did not marry, almost half of all women with a B.A. remained single.” While Gilman was largely self-educated, her situation was comparable. She took it as a given that having a career would mean forgoing wedlock. And vice versa, as far as Stetson was concerned. A painter of some promise if no great worldly success, he seems to have thought Charlotte ought to be content with serving as his own personal Pre-Raphaelite muse. Her desire to have any other career baffled him.

The possibility that these two people might make each other happy was not great. But that’s not to say that the husband, rather than the doctor, was the real villain. This isn’t a melodrama. Stetson wasn’t brutal or vicious, just obtuse. In Gilman's autobiography, notes Horowitz, she "lavished praise on her first husband," and seems to have directed any lingering rage at the figure of Dr. Mitchell.

Someone more imaginative and less conventional than Stetson might have made her a good spouse, though New England in the 1880s was not full of such men. Reading about their courtship is like watching a tragedy. You want to intervene and warn her, but it’s too late, of course. The feeling is especially painful as you watch Gilman persuade herself to ignore her own misgivings. The most extreme case comes when, after meeting Stetson and beginning to pitch woo with him, Gilman sat down to read Herbert Spencer’s opus The Data of Ethics. It was Spencer, not Darwin, who coined the phrase “survival of the fittest,” and he remained an immensely influential thinker well into the early 20th century. (From my perspective, here in the 21st, this is quite an enigma, since Spencer's writings often seem like something Thomas Friedman might produce after being hit on the head and deciding that he was Hegel.)

“The instincts and sentiments which so overpoweringly prompt marriage,” wrote Spencer, “and those which find their gratification in the fostering of offspring, work out an immense surplus of benefit after deducting all evils.” Gilman took this to heart, and in an unpublished poem she vowed to follow the Spencerian injunction to marry and so become "a perfect woman / worth the gladness superhuman." It did not work out that way. She ended up like the narrator of “The Yellow Wall-Paper” -- a prisoner of social expectations that left no room for argument.

But Wild Unrest, by contrast, has a happy ending. Gilman managed to escape. She reinvented herself as a writer and speaker. And then, in 1900, she married a man (her cousin George Houghton Gilman) who, Horowitz says, “relished her professional attainments and growing reputation.” I’d like to think that she found in life what Spencer had advertised: “an immense surplus of benefit after deducting all evils.”

Scott McLemee

Sorry

I was a graduate student in the 1980s, during the heyday of the so-called “culture wars” and the curricular attacks on "Western civilization." Those days were punctuated by some Stanford students chanting slogans like "Hey hey, ho ho, Western Civ has got to go," and by fiery debates about Allan Bloom’s book The Closing of the American Mind, which appeared in 1987, toward the end of my years in graduate school. Back then the battle lines seemed clear: conservatives were for Western civilization courses and the traditional literary canon, while liberals and progressives were against those things and for a new, more liberating approach to education.

In retrospect I find that decade and its arguments increasingly difficult to comprehend, even though I experienced them firsthand. I ask myself: What on earth were we thinking? Exactly why was it considered progressive in the 1980s to get rid of courses like Western civilization (courses that frequently included both progressives and conservatives on their reading lists)? And why did supporting a traditional liberal arts education automatically make one a conservative — especially if such an education included philosophers like Jean-Jacques Rousseau and Karl Marx?

A quarter of a century later, with the humanities in crisis across the country and students and parents demanding ever more pragmatic, ever more job-oriented kinds of education, the curricular debates of the 1980s over courses about Western civilization and the canon seem as if they had happened on another planet, with completely different preconceptions and assumptions than the ones that prevail today. We now live in a radically different world, one in which most students are not forced to take courses like Western civilization, or, in most cases, courses in foreign languages or cultures, or even the supposedly more progressive courses that were designed to replace them. And whereas as late as the 1980s English was the most popular major at many colleges and universities, by far the most popular undergraduate major in the country now is business.

The battle between self-identified conservatives and progressives in the 1980s seems increasingly like rearranging the deck chairs on the Titanic. While humanists were busy arguing amongst themselves, American college students and their families were turning in ever-increasing numbers away from the humanities and toward seemingly more pragmatic, more vocational concerns.

And who can really blame them? If humanists themselves could not even agree on the basic value, structure, and content of a liberal arts education — if some saw the tradition of Western civilization as one of oppression and tyranny, while others defended and validated it; if some argued that a humanistic education ought to be devoted to the voices of those previously excluded from "civilized" discussion, such as people of color and women, while others argued that such changes constituted a betrayal of the liberal arts — is it any wonder that students and their families began turning away from the humanities?

After all, economics and business professors did not fight about the basic structure of business or economics majors, even though there were differences among Keynesian and Friedmanite economists, for instance, over monetary policy. And physics professors did not engage in fundamental debates about physics curriculums — which should one teach, quantum mechanics or relativity? — in spite of Einstein’s problems with quantum mechanics ("God does not play dice with the universe"). In the 1980s the humanities as a whole seemed to be the only field where even experts were unable to agree on what constituted the appropriate object of study.

If I go to a doctor’s office and witness doctors and nurses fighting about whether or not I should take a particular medication, I’m likely to go elsewhere for my health care needs. I think something analogous happened to the humanities in the 1980s, and it is continuing to happen today, although by now the humanities are so diminished institutionally that these changes no longer have the overall significance they had in the 1980s. In the 1980s the humanities still constituted the core of most major universities; by now, at most universities, even major ones, the humanities are relatively marginal, far surpassed, in institutional strength, by business, medical, and law schools.

One of the core functions of the humanities for centuries was the passing down of a tradition from one generation to the next. The idea behind Western civilization courses was supposed to be that students needed them in order to understand the origins and development of their own culture. In the 1980s three developments worked against that idea. The first was an educational establishment that was no longer content simply to pass knowledge down from one generation to the next, and that wanted to create new knowledge. The second development, which dovetailed with the first, was the emergence of new approaches to the humanities that examined structures of oppression and domination in traditions previously viewed as unimpeachable. One could examine women's history, for instance, or non-Western cultures. The third development, which dovetailed with the first and second, was the increasing demand for “relevance” in higher education, with "relevance" being understood as present-oriented and pragmatic, i.e. job-related.

The conflation of these three developments led to the widespread perception — and not just among self-proclaimed progressives — that anything traditional or old was also, almost by definition, conservative, fuddy-duddy, and impractical. In essence those three developments have now long since triumphed, and the educational world of today is largely the result of that triumph.

Unfortunately, however, traditions that are not passed on from one generation to the next die. If an entire generation grows up largely unexposed to a particular tradition, then that tradition can in essence be said to be dead, because it is no longer capable of reproducing itself. It does not matter whether the tradition in question is imagined as the Western tradition, the Christian tradition, or the Marxist tradition (and of course both Christianity and Marxism are part of the Western tradition). Traditions are like languages: if they are not passed on, they die. Most traditions, of course, have good and bad elements in them (some might argue for Christianity, some for Marxism, relatively few for both), and what dies when a tradition dies is therefore often both good and bad, no matter what one’s perspective. But what also dies with a tradition is any possibility of self-critique from within the tradition (in the sense that Marxism, for instance, constituted a self-critique from within the Western tradition), since a tradition’s self-critique presupposes the existence of the tradition. Therefore the death of a tradition is not just the death of the oppression and tyranny that might be associated with the tradition, but also the death of progressive and liberating impulses within the tradition.

We all know, of course, that nature abhors a vacuum, and for that reason when a tradition dies, what fills in the vacuum where the tradition used to be is whatever is strongest in the surrounding culture. In our culture we know quite well what that is: the belief in money, in business, in economics, and in popular culture. That is our real religion, and it has largely triumphed over any tradition, either progressive or tyrannical. It is no more a coincidence that business is the most popular major in the United States today than it was that theology was one of the major fields of the 1700s.

As a result of the triumph of relevance and pragmatism over tradition, the ivy-covered walls of academia, which once seemed so separated from what is often called the “real world,” now offer very little protection from it. In fact the so-called "real world" almost entirely dominates the supposedly unreal world of academia. It may have once been true that academia offered at least a temporary sanctuary for American students on their way to being productive, hard-working contributors to a booming economy; now, however, academia offers very little refuge to students on their way into a shaky, shell-shocked economy where even the seemingly rock-solid belief in the “free market” has been thrown into question. In 1987 Allan Bloom wrote: "Education is not sermonizing to children against their instincts and pleasures, but providing a natural continuity between what they feel and what they can and should be. But this is a lost art. Now we have come to exactly the opposite point." Over two decades later, it seems to me that Bloom was right, and that indeed we have come “to exactly the opposite point.” Unfortunately now, neither self-styled conservatives nor self-styled progressives are likely to want to defend a vision of education that even in Bloom’s view was long gone. And sadder still is the fact that few of our students will even realize what has been lost.

And so I think we owe an apology to our students. We humanists inherited a tradition more or less intact, with all its strengths and weaknesses, but it appears highly likely that we will not be able or willing to pass it on to them. That is a signal failure, and it is one for which we will pay dearly. No doubt there is lots of blame to go around, but instead of looking around for people to blame, it would be more constructive to save what we can and pass it along to the next generation. They are waiting, and we have a responsibility.

Author/s: 
Stephen Brockmann
Author's email: 
info@insidehighered.com

Stephen Brockmann is professor of German at Carnegie Mellon University and president of the German Studies Association.

School of Fish

Stanley Fish's latest book is How to Write a Sentence: and How to Read One, published by HarperCollins, and it is doing pretty well. As I write this, it is the 158th best-selling book on Amazon, and ranked number one in sales for reference books in both education and rhetoric. It is also in eighth place among books on political science. This is peculiar, for it seems perfectly innocent of political intention. The title is not playing any games. It is a tutorial on how to recognize and learn from good sentences, the better to be able to fashion one. It could be used in the classroom, though that does not seem its true remit. Fish has pedagogical designs going beyond the university. The “intended reader” (to adopt an expression Fish used during an earlier stage of his work) appears to be someone who received the usual instruction in composition, in secondary school or college, without gaining any confidence about writing, let alone a knack for apt expression. And that describes a lot of people.

His advice to them, if not in so many words, is that they learn to practice Fishian literary criticism. How to Write a Sentence offers a series of lessons in “affective stylistics,” as he called the approach he developed three or four decades ago. This is not an interpretive method but a form of close reading, focusing less on what a given line in a literary work means than on what it does: how it creates its effects in the reader's awareness. This requires taking a sentence slowly – and also taking it apart, to determine how its elements are arranged to place stress on particular words or phrases, or to play them off against one another. (One formulation of Fish's work in affective stylistics is found in this essay, in PDF.)

A fair bit of the book -- roughly half of each chapter, and sometimes more -- amounts to a course in affective stylistics, though happily one conducted without resorting to jargon. Fish examines individual sentences from Edgar Allan Poe, Virginia Woolf, Philip Roth, and dozens of other authors to show how they work. Most are literary figures, though Martin Luther King, Jr. and Antonin Scalia also make the cut. Most of the rest of the book consists of explanations of some basic stylistic modes and how they impose order on (or extract it from) the world. Fish suggests a few exercises intended to encourage readers to experiment with creating sentences that are tightly structured, or loose and rambling, or epigram-like. That is part of getting a feel for the flexibility of one's options in sentence-making, and of becoming comfortable with experimentation. The scrutiny of how a line from Hemingway or Donne functions is made in the service of demonstrating how much force can be generated by the right words in the right order. Imitating them isn't a matter of insufficient originality, but rather a way to absorb some of their power.

The result is a handbook that seems very different from Strunk and White’s Elements of Style, with its list of prescriptions and prohibitions. I don't want to bash Elements; there has been too much of that lately. But Strunk and White's emphases on brevity, clarity, and vigor of diction and syntax, while still having value, tend to imply that good writing is largely a matter of following rules. Fish's book is more open-ended and pluralistic. He shows that there are numerous ways for a piece of writing to be effective -- that there are various registers of expression that are worth learning. And his approach recognizes the element of playfulness and experimentation with language that a writer can cultivate, making it more likely that a precise though unexpected turn of phrase might come to mind. It is not that there are no rules, and you can learn some of them from Strunk and White. But the rules are not the game.

Having now recommended the book, let me quickly register a few concerns, lest this column seem like an unqualified endorsement of Fish™ brand textual goods and services.

How to Write a Sentence is not at all innovative. The guiding principle is an ancient one -- namely, that learning to write requires taking lessons from authors who have demonstrated great skill in their craft. Not in the sense of attending semester-long workshops with them, but through years of concentrating on their work, combined with frequent, shameless pilfering of their techniques. (You read what you can, and steal what you must.) The book can't be faulted for relying on an old, reliable approach, but there's something to be said for acknowledging that it does.

Fish’s account of various modes of sentence-making shows how they express attitude or mood, as well as information. This makes the book a useful introduction to thinking about form. But readers who want to pursue this would do well to go on to Kenneth Burke’s succinct but systematic “Lexicon Rhetoricae,” in his first collection of essays, Counter-Statement (1931). It ought to be at the top of the list of recommended readings at the back of the book -- if How to Write a Sentence had one, which, unaccountably, it doesn’t.

This seems ungenerous, not least to Fish's readers. He may be a one-man institution, but there are limits to self-sufficiency.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com
