Curriculum

'Why Are We Even Here For?'

As a teacher of writing and literature at Salem State College, I hear a lot of stories. My students, although they may never have ventured more than 20 miles from where they were born, bring hard lessons of endurance to the classroom that seem more profound than any I'd had at their age. For years I've believed that they bring a certain wisdom to the class, a wisdom that doesn't score on the SAT or other standardized tests. The old teaching cliché -- I learn from my students -- feels true, but it is hard to explain. I'm not particularly naïve. I know that life can be difficult. So it is not that my students initiate me into the world of sorrow. It is that they often bring their sorrows, and their struggles, to the material, and when they do, it makes life and literature seem so entwined as to be inseparable.

This past year, for the first time, I taught African American literature: two sections each semester of a yearlong sequence, around 22 students per section. The first semester we began with Phillis Wheatley and ended with the Harlem Renaissance. The second semester we started with Zora Neale Hurston and Richard Wright and ended with Percival Everett's satire, Erasure, published early in the new millennium.

The students in these classes weren't the ones I typically had in my writing classes. About half were white, and the other half were black, Latino, or Asian. They were generally uninterested or inexperienced in reading, simply trying to satisfy the college's literature requirement. One day before spring break I was assigning the class a hundred pages from Toni Morrison's Sula, and one student looked aghast. "We have to read during vacation?" he sputtered. I learned from them the whole year.

In the fall semester, I was teaching W. E. B. Du Bois's The Souls of Black Folk. As classes go, it had been fairly dull. Du Bois's essays didn't have the compelling story line of the slave narratives that we had read earlier in the semester. We had just begun examining Du Bois's idea of "double consciousness." It is a complicated notion that an African American, at least around 1900 when Du Bois was writing, had "no true self-consciousness" because he was "always looking at one's self through the eyes of others ... measuring one's soul by the tape of a world that looks on in amused contempt and pity." In class, I read this definition, paraphrased it, then asked, "Does this make sense to you?"

There was the usual pause that follows whenever I ask a question, and then, from Omar, a large, seemingly lethargic African American, came a soulful, deep-throated "yeah." The word reverberated in the haphazard circle of desks as we registered the depths from which he had spoken. The room's silence after his "yeah" was not the bored silence that had preceded it. The air was charged. Someone had actually meant something he had said. Someone was talking about his own life, even if it was only one word.

I followed up: "So what do you do about this feeling? How do you deal with it?"

Everyone was staring at Omar, but he didn't seem to notice. He looked at me a second, then put his head down and shook it, slowly, as if seeing and thinking were too much for him. "I don't know, man. I don't know."

The rest of the heads in class dropped down, too, and students began reviewing the passage, which was no longer just a bunch of incomprehensible words by some long-dead guy with too many initials.

With every book that we studied after that day, some student would bring up double consciousness, incorporating it smartly into our discussion. Omar had branded the concept into everyone's minds, including mine.

One idea that arises from double consciousness is that, without "true self-consciousness," you risk giving in and accepting society's definitions of yourself, becoming what society tells you that you are. Such a capitulation may be what happens to Bigger Thomas, the protagonist of Richard Wright's Native Son, a novel we read during the second semester. Native Son is a brutal book. Bigger, a poor African American from the Chicago ghetto, shows little regret after he murders two women. His first victim is Mary, the daughter of a wealthy white family for whom Bigger works as a driver. After Bigger carries a drunk, semiconscious Mary up to her room, he accidentally suffocates her with a pillow while trying to keep her quiet so his presence won't be discovered. Realizing what he has done, he hacks up her body and throws it in the furnace. Emboldened rather than horrified, he writes a ransom note to the family and eventually kills his girlfriend, Bessie, whom he drags into the scheme. In the end, he's found out, and, after Chicago is thrown into a hysterical, racially charged panic, he's caught, brought to trial -- a very long trial in which a communist lawyer's exhaustive defense of Bigger becomes an indictment of capitalism and racism -- and sentenced to death.

Readers, to this day, are not sure what to make of Bigger. Is he to be pitied? Is he a warning? A symbol? A product of American racism?

During the second week of teaching Native Son, I was walking through the college's athletic facility when I heard my name, "Mr. Scrimgeour. Mr. Scrimgeour..."

I turn and it is Keith, an African American from the class. "Hey, I wanted to tell you, I'm sorry."

"Sorry?" He has missed a few classes, but no more than most students. Maybe he hasn't turned in his last response paper.

"Yeah, I'm going to talk in class more." I nod. He looks at me as if I'm not following. "Like Bigger, I don't know.... I don't like it." His white baseball cap casts a shadow over his face so that I can barely see his eyes.

"What don't you like?"

"He's, like," Keith grimaces, as if he isn't sure that he should say what he is about to say. "He's like a stereotype -- he's like what people -- some people -- say about us."

On "us," he points to his chest, takes a step back, and gives a pained half grin, his teeth a bright contrast to his dark, nearly black skin.

"Yeah," I say. "That's understandable. You should bring that up in the next class. We'll see what other people think."

He nods. "And I'm sorry," he says, taking another step back, "It's just that...." He taps his chest again, "I'm shy."

Keith has trouble forming complete sentences when he writes. I don't doubt that my fourth-grade son can write with fewer grammatical errors. Yet he had identified the criticism of Wright's book made by such writers as James Baldwin and David Bradley, whose essays on Native Son we would read after we finished the novel. And he knew something serious was at stake -- his life -- that chest, and what was inside it, that he'd tapped so expressively. Was Bigger what Baldwin identified as the "inverse" of the saccharine Uncle Tom stereotype? Was Wright denying Bigger humanity? And, if so, should we be reading the book?

To begin answering these questions required an understanding of Bigger. For me, such an understanding would come not just from the text, but from my students' own lives.

That Keith apologized for his lack of participation in class is not surprising. My students are generally apologetic. "I'm so ashamed," one student said to me, explaining why she didn't get a phone message I'd left her. "I live in a shelter with my daughter." Many of them feel a sense of guilt for who they are, a sense that whatever went wrong must be their fault. These feelings, while often debilitating, enable my students, even Keith, to understand Bigger, perhaps better than most critics. Keith, who -- at my prompting -- spoke in class about being pulled over by the police, understood the accumulation of guilt that makes you certain that what you are doing, and what you will do, is wrong. Bigger says he knew he was going to murder someone long before he actually does, that it was as if he had already murdered.

Unlike his critics, Richard Wright had an unrelentingly negative upbringing. As he details in his autobiography, Black Boy, Wright was raised in poverty by a family that discouraged books in the violently racist South. There was little, if anything, that was sustaining or nurturing. Perhaps a person has to have this sense of worthlessness ground into one's life to conceive of a character like Bigger. Like my students, one must have been told often enough that one isn't much for it to register not simply as an insult, but as a seemingly intractable truth.

"I'm sorry," Keith had said. It was something Bigger could never really bring himself to say, and in this sense the Salem State students were much different from Bigger. Their response to society's intimidation isn't Bigger's rebelliousness. Wright documents Bigger's sense of discomfort in most social interactions, particularly when speaking with whites, during which he is rendered virtually mute, stumbling through "yes, sirs" and loathing both himself and the whites while doing so.

Although my students weren't violent, they identified with Bigger's discomfort -- they'd experienced similar, less extreme discomforts talking to teachers, policemen, and other authority figures. As a way into discussing Bigger, I'd asked them to write for a few minutes in class about a time in which they felt uncomfortable and how they had responded to the situation. I joined them in the exercise. Here's what I wrote:

As a teenager, after school, I would go with a few other guys and smoke pot in the parking lot of the local supermarket, then go into the market's foyer and play video games stoned. While I felt uncomfortable about smoking pot in the parking lot, I didn't really do much. I tried to urge the guys I was with to leave the car and go inside and play the video games, but it wouldn't mean the same thing: to just go in and play the games would be childish, uncool, but to do it after smoking pot made it OK -- and once I was in the foyer, it was OK; I wouldn't get in trouble. But mostly I did nothing to stop us. I toked, like everyone else. I got quiet. I didn't really hear the jokes, but forced laughter anyway. I was very attentive to my surroundings -- was that lady walking out with the grocery cart looking at us? Afterward, when we went in and manipulated those electronic pulses of light and laughed at our failures, we weren't just laughing at our failures, we were laughing at what we had gotten away with.

After they had worked in groups, comparing their own experiences to Bigger's, I shared my own writing with the class. Of course, there were smiles, as well as a few looks of astonishment and approbation. I had weighed whether to confess to my "crime," and determined that it might lead to learning, as self-disclosure can sometimes do, and so here I was, hanging my former self out on a laundry line for their inspection.

What came of the discussion was, first of all, how noticeable the differences were between my experience and Bigger's. I was a middle-class white boy who assumed he would be going to college. I believed I had a lot to lose from being caught, while Bigger, trapped in a life of poverty, may not have felt such risks. Also, the discomfort I was feeling was from peer pressure, rather than from the dominant power structure. Indeed, my discomfort arose from the fact that I was breaking the rules, whereas Bigger's arose from trying to follow the rules -- how he was supposed to act around whites.

But there was also a curious similarity between my experience and Bigger's. Playing those video games would have meant something different had we not smoked pot beforehand. The joy of wasting an afternoon dropping quarters into Galaga was about knowing that we had put one over on the authorities; it was about the thrill of getting away with something, of believing, for at least a brief time, that we were immune to society's rules. Like me after I was safely in the supermarket, Bigger, upon seeing that he could get away with killing Mary, felt "a queer sense of power," and believed that he was "living, truly and deeply." In a powerless life, Bigger had finally tasted the possibility of power.

My students know Bigger moderately well. They don't have his violent streak; they don't know his feelings of being an outsider, estranged from family and community despite hanging out with his cronies in the pool hall and being wept over by his mother.

What they understand is his sense of powerlessness. They have never been told that they can be players on the world stage, and, mostly, their lives tell them that they can't, whether it's the boss who (they think) won't give them one night off a semester to go to a poetry reading, or the anonymous authority of the educational bureaucracy that tells them that due to a missed payment, or deadline, they are no longer enrolled. As one student writes in his midterm: "Bigger is an African American man living in a world where who he is and what he does doesn't matter, and in his mind never will."

I went to a talk recently by an elderly man who had worked for the CIA for 30 years, an engineer involved with nuclear submarines who engaged in the cloak-and-dagger of the cold war. The layers of secrecy astonish. How much was going on under the surface! The trailing and salvaging of nuclear subs; the alerts in which cities and nations were held over the abyss in the trembling fingers of men as lost as the rest of us, though they generally did not realize it.

During the questions afterward, someone asked about the massive buildup of nuclear arsenals. "Didn't anyone look at these thousands of nuclear warheads we were making and say 'This is crazy?' "

The speaker nodded, his bald freckled head moving slowly. He took a deep breath. "It was crazy, but when you are in the middle of it, it is hard to see. No one said anything."

After the talk, I fell into conversation with the speaker's son, a psychologist in training. I was noting how tremendously distant this world of espionage was from the world of my students, how alien it was. And I said that the stories of near nuclear annihilation frightened me a lot more than they would frighten them. In essence, my students saw their lives like Bigger's: The great world of money and power was uninterested in them and moved in its ways regardless of what they did. Like Bigger, they would never fly the airplanes that he, who had once dreamed of being a pilot, watches passing over the Chicago ghetto.

"It's too bad they feel so disempowered," the son said, and it is. Yet there is something valuable in their psychology, too. It is liberating to let that world -- money and power -- go, to be able to see the outlines of your existence, so that you can begin to observe, and know, and ultimately make an acceptable marriage with your life. Some might say it is the first step to becoming a writer.

After September 11, 2001, a surprising number of students didn't exhibit the depth of horror that I had witnessed others display on television. "I'm sorry if I sound cold," one student said, "but that has nothing to do with me." One of my most talented students even wrote in an essay, "The war has nothing to do with my life. I mean the blood and the death disgusts me, but I'm sorry -- I just don't care."

And then I watched them realize how it did indeed have to do with them. It meant that they lost their jobs at the airport, or they got called up and sent to Afghanistan or Iraq. The world doesn't let you escape that easily. Bigger got the chair.

It has been two months since we finished Native Son. The school year is ending, and I rush to class, a bit late, trying to decide whether to cancel it so that I can have lunch with a job candidate -- we're hiring someone in multicultural literature, and I'm on the search committee. As I make my way over, I feel the tug of obligation -- my students would benefit from a discussion of the ending of Percival Everett's Erasure, even though, or perhaps especially because, almost none of them have read it. Yet it's a fine spring day, a Friday, and they will not be interested in being in class, regardless of what I pull out of my teaching bag of tricks. I weigh the options -- dull class for everyone or the guilt of canceling a class (despite the department chair's suggestion that I cancel it). Before I enter the room, I'm still not quite sure, but I'm leaning toward canceling. I take a deep breath and then breathe out, exhaling my guilt into the tiled hallway.

I open the door; the students are mostly there, sitting in a circle, as usual. Only a few are talking. I walk toward the board, and -- I freeze -- scrawled across it is:

Why are we even here for?
You already gave us the final.
It's not like you're going to help us answer it.

Looking at it now, I think the underline was a nice touch, but at that moment, for a rage-filled second, I think, "We're going to have class, dammit! Make them suffer." I stand with my back to them, slowing my breath, my options zipping through my mind while sorrow (despair?) and anger bubble in me and pop, pop into the afternoon's clear light.

So much for learning. Were our conversations simply for grades? Was that the real story of this year?

When we discussed Native Son, we talked about how easy it was to transfer feelings of guilt to rage at those who make you feel guilty. Bigger's hatred of whites stems from how they make him feel. He pulls a gun and threatens Mary's boyfriend, Jan, when Jan is trying to help him, because Jan has made him feel he has done wrong. In the book, Wright suggests that white society loathes blacks because they are reminders of the great sin of slavery. Is my rage from guilt -- guilt that we haven't really accomplished much this year, guilt that I was willing to cancel a class because I didn't want to endure 45 minutes of bored faces? Pop ... pop.

I dismiss the class and stroll over to the dining commons to collect my free lunch.

Erasure is a brilliant satire, one that contains an entire novella by the book's protagonist, a frustrated African American writer, Monk Ellison, who has been told one too many times by editors that his writings aren't "black enough." The novel within a novel lifts the plot of Native Son almost completely, and it presents a main character, Van Go Jenkins, as the worst stereotype of African American culture, someone without morals, whose only interests are sex and violence. At one point, Van Go slaps one of his sons around -- he has four children by four different women -- because the mentally handicapped three-year-old spilled juice on Van Go's new shirt.

It's clear that Erasure's narrator, Monk, is appalled by the book he writes, and that he's appalled by Native Son and the attitudes about race and writing the novel has fostered. When we do discuss the book in class, I point to a snippet of dialogue that Monk imagines:

D.W. GRIFFITH: I like your book very much.

RICHARD WRIGHT: Thank you.

"So this is a real question Erasure raises," I say. My pulse quickens. I can sense them listening, waiting. "Is this book right about Richard Wright? Is this book fair to him? To Native Son? Has the creation of Bigger Thomas been a disaster for African Americans? Has it skewered the country's view of race in a harmful way?" I pause, content. Even if no one raises a hand, even if no discussion ensues, -- and certainly some discussion will erupt -- I can see the question worming into their minds, a question that they might even try to answer themselves.

La Sauna, the student who never lets me get away with anything, raises her hand: "What do you think?"

What do I think? I wasn't ready for that. What do I think?

What I think, I realize, has been altered by what they think, and what they have taught me about the book, about the world.

There are no definite answers, but my students had helped identify the questions, and had pointed toward possible replies. After we had finished reading Native Son, I asked the class, "How many of you want Bigger to get away, even after he bashes in Bessie's head?" A good third of the class raised their hands, and, like the class itself, those who wanted this double murderer to escape were a mix of men and women, blacks and whites. There are several ways to interpret this, but I don't think it is a sign of callousness, the residue of playing too much Grand Theft Auto. They wanted Bigger to escape because Wright had gotten into Bigger's consciousness deeply and believably enough that he became real, more than a symbol or a stereotype.

I tell them this, how their response to Bigger has influenced my reading. I don't tell them Gina's story.

Gina was one of the students who read the books. She loved Tea Cake and Sula, was torn between Martin Luther King Jr. and Malcolm X. She even visited me in my office once or twice to seek advice about problems with a roommate, or a professor. An African American student from a rough neighborhood, she ended up leaving the college after the semester ended, unable to afford housing costs.

Sometime in March of that semester, Gina came to my office. She had missed class and wanted to turn in her response paper on Native Son. The class had read the essays by Baldwin and Bradley criticizing the novel, and had been asked to evaluate them. Baldwin, Gina tells me, was difficult, "but he was such a good writer."

Did she agree with Baldwin, I ask? Was Bigger denied humanity by Wright? How does she feel toward Bigger?

"I think he needs help," she says, "but I felt sorry for him. I wanted him to be able to understand his life--" I cut in, offering some teacherish observation about how Bigger shows glimmers of understanding in the last part of the book, but her mind is far ahead of me, just waiting for me to stop. I do.

"The book reminded me of the guy who killed my uncle. You probably saw it -- the trial was all over the TV last week."

I shake my head.

The man and an accomplice had murdered her uncle, a local storeowner, three years ago, and the previous week had been sentenced to life without parole. The two had been friends of the uncle's family, had played pool with the uncle the night before, planning to rob and kill him the next day.

"When I saw him sitting there, with his head down, looking all sad, I don't know, I felt sorry for him. I wanted to give him a copy of Native Son. I wanted to walk up to him and put it in his lap. It might help him to understand his life.

She looks at me, her brown face just a few shades darker than mine. She's 19. Her hair is pinned back, and some strands float loose. Her eyes are as wide as half dollars, as if she's asking me something. Without thinking, I nod slowly, trying to hold her gaze. On the shelves surrounding us are the papers and books of my profession, the giant horde that will pursue me until I die.

"My family wants him to suffer -- hard. But I want to talk to him. Do you think that's bad? I want to know why he did it, what happened. I wonder how he'd react if he saw me -- what he'd do if I gave him the book."

I imagine Native Son in the man's lap. The glossy, purple, green, and black cover bright against the courtroom's muted wood, the man's trousers. His hand, smooth with youth, holds its spine. His thumb blots out part of the eerie full-lipped face on the front. As the words of the court fall about him, the book rises and falls ever so slightly, as if breathing.


J.D. Scrimgeour coordinates the creative writing program at Salem State College and is the author of the poetry collection The Last Miles. This essay is part of his new collection, Themes for English B: A Professor's Education In and Out of Class, which is being released today by the University of Georgia Press and is reprinted here with permission.

The Standardization of College Teaching

As an undergraduate at a state university, I read the schedule of classes long before I had to register. I scanned instructors' names first. Next I considered courses, and finally I would take the action that would decide my class schedule -- I went to the university bookstore and looked at the textbooks each professor required.

Scanning the stacks, I was overwhelmed by the number of textbooks on the bookshelves. Every two or three books represented a semester's worth of learning. And for 16 weeks, I would be married to that book. I looked at how the textbooks were written, the amount of reading necessary, and the different tools offered to help a student understand a concept. I knew myself. I knew my learning style. And after flipping through a few shelves of textbooks at the university bookstore, I was making choices that would give me a better chance -- not only of passing the courses -- but of actually learning and carrying that knowledge with me into later courses.

When I became an instructor myself, I marveled at the autonomy of the job. To some degree I could make my own hours. As long as I aligned my courses with the course objectives set up by my department, I would receive positive peer evaluations and approval by the administration. At the campuses where I first taught, I chose a textbook from the list provided by the department's textbook committee. At two of those campuses, the department chair told me that textbooks not on the list were often approved by the committee chair quickly enough that they could be used that semester.

When I moved to teach at a large urban community college, I faced something that looked like too much freedom. For one freshman composition course, I was given a list of 59 textbooks to choose from. At the next level of composition, the list of approved textbooks was 104 titles long. Dazed, I contacted trusted colleagues and skimmed their textbooks.

Finally, I reverted to my old undergraduate habits and visited the college bookstore. This time, however, I was making a bigger decision. I now had to commit to a textbook that would serve three sections of a particular composition class. That meant that 99 students of varying academic abilities would have to live with my decision. And even though I could change the text the next semester if I needed to, there would be 16 long weeks with a book that did not serve our needs as well as it should.

Finally I would make my choices -- and start the laborious process of ordering desk copies and passing paperwork on to my department chair. It was exhausting, but tremendously rewarding. After all, I was able to choose a text that, for the most part, aligned with my own beliefs. I would be challenged to teach some new material and learn some new teaching techniques with this choice -- and my students would benefit.

In contrast, this week, at the university where I am on contract to teach full-time, my supervisor told a roomful of composition faculty which textbook they will be using for Fall 2007. To stunned silence, he held up three textbooks that he had chosen for what he called a "one year experiment." One text was to be used for incoming freshmen taking composition; the next semester's instructors would have the choice of one of the remaining two textbooks. Refusing any discussion, he indicated that part of the reason for this change was the administration's edict that freshmen be given a "uniform experience" in our composition courses.

There was not a sound as more than 30 professors left the room. It was not until the next day that I first heard their collective unbridled response. One professor who had worked at this university for over a decade stopped our director in the copy room and said, "So, since you're choosing the textbook, are you going to give us standardized lesson plans, too?" When his supervisor did not respond, the professor made one last attempt to communicate his disappointment: "Hey, why don't you just come in and teach my courses for me?"

"It's just the beginning," another professor told me. "This university has bought into the idea that education is a business." Sighing, he said, "The next step will be classes of a thousand with PowerPoint presentations instead of lessons." At the time I thought he was just being sarcastic and reactionary; yet I later wondered if he was on to something.

"Student as consumer" has become a driving force at many colleges. In the last few decades, a number of provosts, presidents, and chancellors have buckled under pressure from students, local businesspeople, and voting citizens to think of education as a simple equation -- quickly deliver the students information, get money. In some cases, accreditation boards have tried to hold the line; in other cases, they seem to be in collusion with this move toward efficiency at all costs. In any case, the art of teaching has been relegated to a much lower status -- or in some cases, completely disregarded.

Slowly and quietly, the freedoms that made teaching not only enjoyable but effective are being taken away by an administration that is more interested in uniformity -- as if education were a drive-through fast-food product. Perhaps they've forgotten that even the drive-through provides choices: a hamburger or cheeseburger, a chicken sandwich, a fish sandwich, chicken fingers, fries, curly fries, a few salad choices, a baked potato -- the list may be too big to fit on one menu board. Yet in something as important as education, some are thinking, "the fewer choices, the better."

I'm not sure if they're really thinking primarily of the students. True -- students would be reading the same material across the board. But perhaps this is the answer that receives less resistance from those concerned. Perhaps administrators have other motives as well. It would be less work for their secretaries if they only had to order one book. And standardization often makes it easier to assess students' learning. That means increased claims of success -- and a better shot at funding. And, of course, departments would be easier to manage with fewer variables. Whether they are pressed into service or welcome the chance, administrators need to spend time fund raising, informing the public, creating events that will reflect well in press releases, shaking hands at groundbreaking ceremonies, and attending mayoral functions; why not scale back in an area that already causes them concern?

After all, with the massification of education, universities are serving more "consumers." Many feel it is better to get as many students as they can in and out of the educational system quickly and show our culture that we have "produced" the workers we had promised. Yet in the short seven years I've been teaching, every professor I've come in contact with has expressed concerns: 1) that we are stooping to lowered standards; 2) that many students infer that memorizing facts and spilling them back to a proctor is enough; and 3) that certificates and programs are being created not because of student demand -- but because of pressure from those who fund the campus. These worries, among others, have made many professors aware that the pressure to provide a quality educational experience falls almost solely to them. And when they have an administration that does not support this goal, it becomes almost impossible to attain.

Last semester, administrators at my university told English composition faculty that they are going to implement a standardized syllabus in the near future; perhaps lesson plans will be faculty's own -- yet policies previously set by faculty will soon be dictated by the administration. This semester, my department chair dictated the textbooks that faculty will use in 2007. Will the next move be lesson plans created by administrators? Standardized testing? By removing the creativity and style that individual instructors provide, couldn't we, in effect, move to a system simply supervised by proctors? Some administrators will wince at this suggestion; I guarantee a few will actually gaze up at their office ceilings and think about it -- if only for a moment. And even if administrators do not instigate this kind of plan, moving toward overt control (or even elimination) of faculty can be a piecemeal business.

Yet in many disciplines, the use of standardized materials and lesson plans is already problematic. Because the materials were not developed for a particular course or student population, many students feel detached from the material. In liberal arts, especially, faculty must be in place to personalize the learning experience for students -- otherwise students feel as if their input is worthless. Faculty are of infinite value here; taking away their ability to teach well cannot be a recipe for a successful educational experience.

While writing this, I admit that I am feeling reactionary. In a few weeks or months, I may be less angry -- yet in that time, won't these edicts still be in place at my university? Yes. And my ability to teach in the way that I think works best will be curtailed more and more. I will feel less like an instructor and more like the employee of a corporate machine. I am reminded that I spent a decade in Silicon Valley purchasing semiconductors, and another decade in advertising, writing copy for products and services that I didn't care about. In these positions I felt useless -- and at times, degraded.

I moved to higher education because I believed in the service we provided. And I remembered the choices I was allowed to make as an undergraduate -- not only in major, but also in courses within that major. Those courses were represented by a textbook and, most importantly, a faculty "face" that helped me interpret that textbook and often inspired me to go beyond the classroom.

Perhaps I am an anomaly. I did not go to college under pressure from family or even the society that surrounded me. The pressure was from within. For me, college was not a "product" to be bought -- something to ensure my business future with promotions and 50-cent-an-hour raises. It was a challenge that I needed to meet to find out what I was capable of.

And, of course, I was interested in learning. In a way, not much has changed. I am still interested in learning -- not only my students' learning, but my own. And the freedom to choose my own textbook, create my own assignments, and pace my courses is a form of learning. I am constantly evaluating my teaching methods and pressing for improvement. I make notes on my course outline about each lesson: "good, students applied knowledge from last assignment," "met with little discussion -- rewrite or discard," or "connected to writing topic -- keep." I attend conferences in my discipline and read articles and books in and out of my discipline. I talk to colleagues and post to an online discussion board, constantly rooting around to find different ways to teach the objectives that my department has set out for me to achieve. It's my hope that in an age of diminished expectations, campuses will leave the art of teaching to those best suited to perform this challenging and unwieldy task: faculty.


Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.

YouTube and the Cultural Studies Classroom

"I saw a small iridescent sphere of almost unbearable brightness. At first I thought it was spinning; then I realized that the movement was an illusion produced by the dizzying spectacles inside it."
                  --Jorge Luis Borges, "The Aleph"

On December 17, 2005, “Saturday Night Live” ran a skit by Chris Parnell and Andy Samberg called "Lazy Sunday," a rap video about going out on a "lazy Sunday" to see The Chronicles of Narnia and procuring some cupcakes with "bomb frostings" from the Magnolia Bakery in New York City. The rap touches on the logistics of getting to the theater on the Upper West Side: "Let's hit up Yahoo Maps to find the dopest route./ I prefer Mapquest!/ That's a good one too./ Google Maps is the best!/ True that! Double true!/ 68th and Broadway./ Step on it, sucka!"

Parnell and Samberg make it to the Magnolia for their cupcakes, go to a deli for more treats, and hide their junk food in a backpack for smuggling past movie security. They complain about the high movie prices at the box office ("You can call us Aaron Burr from the way we're dropping Hamiltons") and brag about participating in the pre-movie trivia quiz. Doesn't seem like much if you've never seen it, but for pure joie de vivre, and white suburban dorkiness, "Lazy Sunday" just can't be beat. What makes "Lazy Sunday" special, however, is how its original airing coincided with the birth of Internet video-sharing, enabling the two-minute clip to be viewed millions of times on YouTube, a free service that hosts videos posted by users. In fact, the popularity of the clip on YouTube was so great that NBC forced the site to remove it several months later, citing copyright infringement. The prospect of its programming being net-jacked by Internet geeks and magnified through YouTube's powerful interface was just too much for NBC.

I bring up "Lazy Sunday" to foreground my discussion of the pedagogical uses of YouTube because it sums up its spirit and helps us define the genre of video with which YouTube is most associated. Although YouTube is awash in clips from television and film, the sui generis YouTube video is the product of collaborative "lazy Sunday" moments when pals film each other or perform for the camera doing inane things like dancing, lip synching or making bottles of Diet Coke become volcanic after dropping Mentos candies in them.

Parnell and Samberg's references to Internet tools and movie trivia, as well as their parody of rap, perfectly capture a zeitgeist in which all pleasures can be recreated, reinvented and repeated ad nauseam through the magic of the Web. As Sam Anderson describes it in Slate, YouTube is "an incoherent, totally chaotic accretion of amateurism -- pure webcam footage of the collective unconscious." Whatever you're looking for (except porn) can be found in this Borgesian hall of mirrors: videos of puppies, UFO footage, ghosts on film, musical memento mori about recently deceased celebrities, movie and documentary clips, real and faux video diaries, virtuoso guitar picking performances and all kinds of amateur films. In my case, the video that sold me on YouTube was "Where the Hell is Matt Harding Dancing Now?" -- a strangely uplifting video of a guy called Matt Harding who traveled around the world and danced in front of landmarks such as Machu Picchu in Peru, Area 51 in the U.S., the head-shaped monoliths of Easter Island, and the Great Wall of China, among many others.

OK, that's all nice, but what can YouTube do for professors, apart from giving them something to look at during their lunch breaks? Inside Higher Ed has reported on the ways in which YouTube is causing consternation among academics because it is being used by students to stage moments of guerilla theater in the classroom, record lectures without permission and ridicule their professors. Indeed, a search on YouTube for videos of professors can bring up disquieting clips of faculty behaving strangely in front of their students, like the professor who coolly walks over to a student who answers a ringing cell phone in class, politely asks for the device, and then violently smashes it on the floor before continuing on with his lecture as if nothing had happened. It could be staged (authenticity is more often than not a fiction on YouTube) but it is still disturbing.

But I would like to argue for an altogether different take on YouTube, one centered on the ways in which this medium can enrich the learning experience of college students by providing video realia to accompany their textbooks, in-class documentaries and course lectures. Although I can't speak to the applicability of YouTube to every discipline, in what follows I make a case for how the service can be harnessed by professors in the humanities and social sciences.

As a professor of Latin American literature and culture, I often teach an introductory, third-year course called Latin American Culture and Civilization in which students study history, literature and any other media that the instructor wishes to include in the course, such as music, film, comics and the visual arts. My version of the course emphasizes student engagement with foundational documents and writings that span all periods of Latin American history and that I have annotated for student use. One of the figures we study is President Hugo Chávez of Venezuela, whose outsized political persona has made him a YouTube star. Apart from having my students watch an excerpt of his "Bush as sulfurous devil" speech at the United Nations, I assigned a series of animated cartoons prepared by the Venezuelan state to educate children about the Bolivarian constitution championed by Chávez. These cartoons allow students to see the ways in which the legacy of the 19th-century Venezuelan Liberator, Simón Bolívar, remains alive today.

The textual richness of these cartoons invites students to visually experience Bolivarian nationalism in a way that cannot be otherwise recreated in the classroom. It invites them to think critically about the ways in which icons such as Bolívar are creatively utilized to instill patriotism in children. In a similar vein, a Cuban cartoon about Cuba's founding father, José Martí, depicts how a child is transformed into the future champion of independence and social justice when he witnesses the horrors of slavery (this video has now been removed from YouTube). With regard to the Mexican Revolution, one of the most important units of the class, YouTube offers some fascinating period film of the revolutionary icons Emiliano Zapata and Pancho Villa, and especially their deaths. Although I cannot say that these are visual texts that lend themselves to the kind of rich dialogue provoked by the aforementioned cartoons, they are nonetheless an engaging visual complement to readings, discussions and lectures.

Another course in which YouTube has played a part is my senior-level literature course on the Chilean Nobel Laureate Pablo Neruda. It may seem farfetched to use Internet video in a poetry class, but in this case, YouTube offers several useful media clips. I have utilized film clips in which Neruda's poetry appears (such as Patch Adams and Truly, Madly, Deeply), as well as music videos of Latin American singers who use lyrics by Neruda. More than anything that I could say in class, these videos illustrate the reach and enduring quality of Neruda's poetry in Latin American and North American culture. This said, there are a surprising number of student-produced videos about Neruda on YouTube that are cringe-worthy, the "Lazy Sunday" versions of the poet and his poetry. These are quite fascinating in and of themselves as instances in which young people use video to interpret and stage Neruda, in ways that might be set into dialogue with more literary and canonical constructions of his legacy, but I confess that I am not yet convinced of their pedagogical value.

In this regard, the case of Neruda is not so different from that of other literary figures, such as Emily Dickinson, Nathaniel Hawthorne and Robert Frost, who are also the subject of interesting home-made YouTube videos. What do we do, for example, with a Claymation film that recreates Frost's "The Road Not Taken"? I would argue that this film is interesting because it captures the banality of a certain canonical image or version of Robert Frost that is associated with self-congratulatory, folksy Hallmark Card moments.

There are all kinds of video with classroom potential on YouTube. Consider, for example, one of YouTube's greatest stars, Geriatric1927, a 79-year-old Englishman whose video diaries document his memories of World War II, as well as of other periods of English history. Then there are the Michel Foucault-Noam Chomsky debates, in which Foucault sketches out, in animated, subtitled conversation, the key arguments of seminal works such as Discipline and Punish. There's an excellent short slide show of period caricatures of Leon Trotsky, newsreels and lectures about the Spanish Civil War, rare footage of Woody Guthrie performing, Malcolm X at the University of Oxford, clips of Chicana activist Dolores Huerta discussing immigration reform and a peculiar musical montage, in reverse, about Che Guevara, beginning with images and reels of his death and ending with footage of him as a child.

Don't let me tell you what you can find; seek and ye shall receive.

YouTube is not necessary for good teaching, in the same way that wheeling a VCR into the classroom is not necessary, or bringing in PowerPoint slide shows with images, or audio recordings. YouTube simply makes more resources available to teachers than ever before, and allows for better classroom management. Rather than use up valuable time in class watching a film or video clips, such media can be assigned to students as homework in the same way that reading is assigned. However, to make it work, faculty should keep in mind that the best way to deliver this content is through a course blog. YouTube provides some simple code that bloggers can use to stream the videos on a blog, rather than having to watch them within the YouTube interface. This can be important because we may not want students to have to deal with advertisements or the obnoxious comments that many YouTube users leave on the more controversial video pages. On my free wordpress.com course blog, I can frame YouTube videos in a way that makes them look more professional and attractive. At this point, course blogging is so easy that even the least technologically minded can learn how to use services like Blogger or WordPress to post syllabi, course notes and Internet media.
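To give a concrete sense of what that "simple code" looks like, here is a minimal sketch of the object/embed markup YouTube supplied at the time of writing; VIDEO_ID is a placeholder for a real clip's identifier, and the exact attributes YouTube generates may differ:

    <!-- Illustrative sketch only: YouTube's embed code circa 2007 followed
         this object/embed pattern; VIDEO_ID stands in for a real video ID. -->
    <object width="425" height="350">
      <param name="movie" value="http://www.youtube.com/v/VIDEO_ID" />
      <embed src="http://www.youtube.com/v/VIDEO_ID"
             type="application/x-shockwave-flash"
             width="425" height="350" />
    </object>

Pasted into a blog post's HTML view, markup of this sort streams the video directly on the course page, so students never have to visit the YouTube site itself.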

There are problems, however, the most glaring of which is the legality of streaming a clip that may infringe on copyright. If I am not responsible for illegally uploading a video of Malcolm X onto the web, and yet I stream it from my course blog, am I complicit in infringing on someone's copyright? Now that Google has bought YouTube, and a more aggressive purging of copyright-protected works on the service has begun, will content useful for education dwindle over time? I don't have the answers to these urgent questions yet, but even in the worst of cases, we can assume that good, educational material will be made available, legally, on YouTube and other such services in the future, either for free or for a modest fee.

For example, I am confident that soon I will be able to tell my students that, in addition to buying One Hundred Years of Solitude for a class, they will have to purchase a $5 video interview with García Márquez off of the World Wide Web and watch it at home. And, even as I write this, podcasting technologies are already in place that will allow faculty members to tell their students that most of their lectures will be available for free downloading on iTunes so that class time can be used more productively for interactive learning activities, such as group work and presentations. Unlike more static and limited media, like PowerPoint and the decorative course Web page, video and audio-sharing help professors be more creative and ambitious in the classroom.

In sum, my friends, YouTube is not just for memorializing lazy Sundays when you want to "mack on some cupcakes." It can help your students "mack" on knowledge.


Christopher Conway is associate professor of modern languages and coordinator of the Spanish program at the University of Texas at Arlington, where he teaches Latin American literature and culture.

Ignorant of Their Ignorance

My undergraduate students can't accurately predict their academic performance or skill levels. Earlier in the semester, a writing assignment on study styles revealed that 14 percent of my undergraduate English composition students considered themselves "overachievers." Not one of those students was receiving an A in my course by midterm. Fifty percent were receiving a C, another third were receiving B's, and the remainder had earned failing grades. One student wrote, "overachievers like myself began a long time ago." She received a 70 percent on her first paper and a low C at midterm.

A solid 40 percent of my undergraduate English composition students described themselves as "overachieving if they liked the subject." The grades for these students, understandably, were scattered. Twenty-nine percent of my undergraduates described their study styles as "normal." Of these, 36 percent were working at a C level by midterm; another 18 percent were receiving a B, with another 18 percent receiving a D. The remaining 27 percent were failing. One student who described his study style as "normal" confessed that he rarely started assignments when they were first given out, waited until a few days before work was due to get started, and did a lot of his writing over the weekend. At midterm, he was receiving an F.

A whopping 17 percent of my undergraduates confessed to being "underachievers" -- studying at the last minute, not doing the reading, and only spending a few hours on major assignments.

My data -- though tremendously limited in scope -- seems to be supported by Douglas Hacker's findings. In "Test Prediction and Performance in a Classroom Context," an article published in the Journal of Educational Psychology, Hacker and colleagues at the University of Memphis found that, to a great degree, overconfidence is prevalent among low-performing students. True, Hacker's study was with introductory psychology undergraduates rather than English composition students. But it does give me a great deal of insight as to how students predict performance. And although I don't like the idea of considering my students "low-performers," I admit that my state does have a weak high school system, and my university doesn't turn paying students away. Even low-achievers are admitted under a "conditional" admission standard.

I don't think Hacker's findings are unique. Dozens of colleagues have told me that their undergraduates simply do not have the tools to criticize and evaluate their own work -- much less predict how well they will do on assignments. What's behind this great drop in the ability to assess performance?

A colleague of mine believes that primary and secondary schools, overwhelmed with students who were never well prepared for school, students with learning disabilities, addictions, and even severe discipline problems, have found themselves delivering a weakened curriculum. Yet a recent article in American Educator, "Balancing the Educational Agenda," by Jean Johnson et al., indicates that academic standards for secondary schools are rising -- a move supported not only by academics and administrators, but by parents as well. Perhaps this move is recent; those of us in postsecondary positions are, in effect, responding to the academic standards in place a decade ago. Or perhaps regions suffer differences in standards based on student population and the demands of the surrounding community. Another possibility, among others, is that the curriculum shifts when administrators attempt to adopt each new trend in education.

Just as an inconsistent curriculum can cause students pain and confusion, the move from high school to college can be a hair-raising leap. High school systems with a weak curriculum (or one that is not consistently applied) can create tremendous problems later in the academic system. At my current university, a large percentage of our undergraduates have brought their high-school experience with them. Some of them are under the impression that if they now come to their college classes every day, they will pass these courses. Many of these students are stunned when they fail their first major test or receive a D for what they thought was an award-winning essay.

Even when academic advisors warn, "college is not high school," many of these under-prepared students continue to believe that they will receive A's for a token effort. Clear class objectives and strongly worded syllabi are often ignored as students continue to overestimate their capabilities based on past performance. After the first major assessment, many of these students clutch at their professors' arms, lamenting, "But I got A's in high school."

Colleagues often commiserate about this particular student response. After all, it's almost impossible to respond to. Often we can only repeat that our expectations are clearly outlined in the syllabus and course outline, that we would be happy to define these further, and that they may want to drop the course if they cannot afford to dedicate time outside of class for study. One professor friend often tells students that the A's they received in high school are simply a step toward admittance to the local university -- not a guarantee of grades.

Another colleague says that the level of competition has changed from high school to college; until freshmen understand that, they will be inaccurately predicting performance. And the vilification of competition has set up many students to believe that they are all doing well -- regardless of outcome. As a friend of mine in teacher education says, "It's the result of the 'feel-good '70s,' where every child was deemed a winner. Competition was considered demoralizing. The result was a continuing trend in the '90s which focuses on reward across the board. Today, we have turned out a glut of students who not only can't assess themselves, but who have received awards for every little thing." When they enroll in college, students often still have no idea how they fare when compared with other undergraduates.

A good friend on staff at a university library says that helicopter parenting also contributes to the problem. When he escorts tour groups of grade school students through his facility for a hands-on learning tour, he often sees parents and grandparents hovering so much that instead of helping young students stay focused on assignments, the children end up being spectators instead of participants in what should be their chance to "try out" a college experience. The urge to spare children from the ego blows of failure, too, often results in parents actually doing homework for children -- not only in primary and secondary grades, but in college as well. Some parents, perhaps perfectionists, have rationalized that if they "assist" their child, the task will be done in a much shorter time. Unfortunately for these children, their formative years do not allow for effort, failure, increased effort, failure, and another attempt that results in success. This setup may produce college students who can only do the most superficial work before becoming discouraged.

Another academic friend says that an inability to focus and an overwhelming desire to multi-task make it almost impossible for students to succeed academically. Staff who manage study rooms and carrels often report that students seem to work "in dribs and drabs" while in the library. Backpacks in hand, they often loiter at computers and chat at tables instead of actually working. Dependent on high-tech gadgets, these same students often feel compelled to answer phones while in study groups, and constantly check e-mail or view sites such as Facebook or MySpace during hours they had dedicated to working on assignments or doing research.

One reference desk librarian reported that she would see students "studying for four minutes, goofing off for a half an hour, and then studying for another four minutes." Of course, these students often report to faculty that they've been studying for hours -- which in some ways must seem like an accurate appraisal. After all, they were in the library; therefore, they must have been studying. In the end, a diminished attention span combined with the feeling that doing one thing at a time is a waste of time almost guarantees that they will not be turning in top A-level work to their professors.

This narrative is very incomplete as a study. I'm sure that sociologists, education specialists and other experts have outlined a long history and a number of interrelated causes that explain this drop-off in students' self-knowledge.

As an instructor of undergraduate core classes, however, I realize that my responsibility does not stop at content. I cannot simply list assessment as a course objective and then feign ignorance when my students show me again and again that they cannot predict their own performance. Strategies -- not only for instruction, but also for exercises and assessment -- are integral in setting my students on the right path for the remainder of their college careers. To accomplish this, I realize that I will need to work much, much harder to help my undergraduates understand assignments and expectations, rubrics and assessments, in-class grades and the prediction of success.

Some of this is already in place. Like many English composition instructors, I build a peer-editing component into my writing courses -- not only to help students view writing as a process, but also to give them some tools and much-needed experience in evaluating student work. I provide instruction in how to apply rubrics to student work and often use past student work as "models." Some students are glad for the transparency of my courses; with a detailed 16-week course outline handed out at the first class, they can start relating course objectives to specific assignments throughout the semester. Lessons scaffold one on another; assessment follows thorough instruction. Still, there is much to be done. It's clear that I need to develop more tools to help my students learn to assess their own work and predict their academic performance more accurately.

Author/s: 
Shari Wilson
Author's email: 
info@insidehighered.com

Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.

Defining Academic Vision

“Administrators are supposed to have an academic vision. What’s yours?”

That’s the best question a faculty member has asked me since I became associate dean. In conversations that have followed, I’ve begun to understand that my vision, built upon a sense of curiosity and the impulse to teach, implies both a certain type of faculty and a certain type of institution.

The academy rests on a foundation of newly formulated and previously acquired knowledge, and a sense of wonder in its presence. That sense -- call it curiosity -- propels faculty to collect data, analyze it, and hazard generalizations in articles and books. It engenders creativity in music and the arts and drives academics to sift through mounds of evidence in the hopes of assembling something grander, be it historical argument or literary analysis. Curiosity requires mass spectrometers and gas chromatographs; it urges fundamental and foundational understanding of the world around us.

Yet none of us works in an institution where such delight in the new alone suffices. Sooner or later, every one of us wants someone else to understand what we’ve come to know. Perhaps it is a colleague or peer, if we work in a research institute or within the graduate school of a major research university. But in my vision, the faculty want to help shape younger minds, those of undergraduates. And most of the time undergraduates don’t have the background to really understand the faculty.

The mismatch of intellectual preparedness and complexity of information compels faculty to teach. This impulse will have a faculty member reduce chaos, ignore variables, abstract principles, and then oversimplify them -- all for the purpose of communicating to the relatively underprepared. The faculty I envision see such teaching as a craft and think about it continually. In their classrooms, students experience a variety of lectures, discussions, and small group work, all meant to stimulate curiosity and create a setting maximally conducive to learning.

So what’s my academic vision? I see the classic encounter of liberal education: expert faculty put their own ideas into dynamic tension with those of their colleagues, and then eagerly begin to engage students. I see classrooms where discovery and the boundaries of a field are the principal subjects, albeit explained at appropriate levels of simplification. These attributes -- curiosity and the impulse to teach -- explain why faculty labor over evidence, chisel away at concepts yet undiscovered, and manage syllabi of divergent topics, approaches, and problems.

I have also come to realize that good administrators likewise must take a deep interest in everything and put people in a position where they want to share their expertise.

Day after day during the search season, a dean’s calendar is filled with candidate meetings, during which the dean must talk with a vast array of potential faculty members, and then make wise decisions about competence, communications skills, and energy, to say nothing of their fit within a community of scholars and students. At the end of those long days, someone inevitably asks: How can you talk with people in such a wide variety of fields, especially since your degree is in some other equally narrow field? This question comes up outside of the search season too. Indeed what makes administrative life intellectually rich and rewarding is meeting with department chairs, program directors, individual faculty, other deans: all with different training, all specialists in their own areas.

An example: the leader of the Science and Math Advisory Group approaches the administration with a hefty repair bill for a gas chromatograph/mass spectrometer, a bill that puts his department over budget for repairs. The administrator’s task is to get this professor to teach her enough that her decision is well grounded: How do students use this particular piece of equipment? What research will have to wait until next July if it goes unrepaired?

In just the same way, a good dean will enter the free-form portion of a candidate interview and begin with deceptively simple curiosity: “Tell me about your work…?” Other questions, based on the answer received, keep arising: “Could your work on the economics of dental care help someone understand why health insurers don’t want to pay for preventative measures?” And the answer (dental sealants pay off only over the long haul, too long for the insurers) provokes another question, and so on and on.

Although I like to think there is a skill to such an interview, really it is all about putting the candidate in a position to be a teacher. The dean tries to draw from the expert a tidbit that summarizes a subject, in admittedly too simplistic a form, so as to ask for more detail, and perhaps a more cogent summary. The questions tip off the candidate as to how far to translate expert jargon into generally accessible ideas, complex ideas into simplifications comprehensible by a non-expert.

So what is my academic vision as an administrator? The task is to use budgets, hiring, and curricular leadership to promote faculty research and enhance student learning. And these skills should be no strangers to academic deans. Before moving into administration, over long careers in graduate school (and, for those of us lucky enough, in the classroom), curiosity and the impulse to teach defined our work -- as they do the work of the faculty we serve.

Of course, I’m only an associate dean. I work within a much larger structure, under a president and senior administrative team that combines with the faculty’s academic vision to build an institutional culture. Still, I’d argue that liberal arts colleges that embrace a culture of curiosity and teaching have a quite distinctive profile, in terms of curriculum, structure, and values.

Institutions with a culture of curiosity and teaching use the curriculum to help drive students to areas of study otherwise unthought of, and allow faculty to construct courses that test ideas in new contexts and combinations. General education programs range widely, helping students sample broadly enough to educate their academic palates, while major requirements sink deeply into subject matter, guiding young scholars toward the nuances of disciplinary cuisine.

Such a curriculum demands that the administration be nimble and open to change, supportive of both classical and emergent fields. The president must lead discussions defining institutional goals and the dean of faculty must propound a theory of which academic issues and programs trump dollar costs. And since no institution can spend all the money required to do everything, even the CFO will need to teach: how shall we reallocate resources effectively to bring on new programs while closing down those that no longer meet institutional goals?

Liberal arts colleges that pursue a vision of curiosity and teaching will also have certain predictable structures. Foremost among these is the wave of interdisciplinarity that began on campuses in the late 1970s. Interdisciplinary programs and centers arise when faculty and student curiosity about a topic exceeds disciplinary possibilities: for example, environmental studies is born when a group of faculty realizes that biology and botany cannot answer all of their critical questions, and wants to consult regularly with colleagues from chemistry, public policy, and sociology, as well as from literature and other fields.

The shifting nature of the disciplines raises questions that must be engaged: At the limits of interdisciplinarity, what guides the granting of positions, the allocation of budgets, the support of the community? An institution that has fostered curiosity in labs, studios, and classrooms will have answers to such questions, because curiosity and teaching propel faculty to build bridges between subjects, leading to multidisciplinary appointments and calls for newly intersecting programs and emerging fields. By contrast, an institution that has not attended to such matters will be caught up short when its faculty meet across the divide between disciplinarity and interdisciplinarity.

Finally, an institutional commitment to curiosity and teaching will result in an embrace of the values of liberal education, from critical thinking and self-development to understanding matters of difference and diversity.

The hallmark of small, residential liberal arts institutions is close student-faculty engagement in and out of the classroom, lab, and studio. Such apprenticeships of the mind aim to develop students’ abilities toward critical thinking. Students attempt to create principles abstracted from a set of facts and circumstances, and then to apply those principles in situations never before encountered. Successful students gain a facile (and curious) mind that is both critical and adaptable. Such intellectual formation must happen everywhere on campus and shape the student as a whole. And as students demonstrate what they have learned -- in written essays, oral presentations, in logical or mathematical proofs, scientific lab reports, and artistic presentations -- they go beyond mere mastery of facts to critical argument (and, indeed, to teaching one another).

The object of curiosity in this type of institution is the entire world around us, from the cosmos at one extreme to quantum states at the other. But a particular focus on humanity and our place within the universe of meaning emerges from the social nature of a residential college. That is to say, curiosity about “The Other” (here understood as a focus of inquiry, not an epithet) becomes a critical part of the academic curriculum. Institutional values of diversity and equity of necessity shift from the periphery toward the center; administrative support for such curricular and community attention emerges during complex conversations about resources and structures, all the while cognizant that a diverse faculty and curriculum can better serve a community curious to be taught about culture and difference.

In the end, of course, academic planning must begin with an institution’s mission and core values. And when that mission centers on liberal education, an entire community of students, faculty, and administrators must find common ground in the face of critical issues, from resource allocation to interdisciplinarity and diversity. I remain convinced of my original reply: the best academic vision builds on intellectual curiosity and the impulse to teach.

Author/s: 
Roger Brooks
Author's email: 
info@insidehighered.com

Roger Brooks is the Elie Wiesel Professor of Judaic Studies and associate dean of the faculty at Connecticut College.

The Disappointment of Portfolio-Based Teaching

When I was an art director, I loved the idea of showing my design portfolio to prospective employers. After seeing my best design work professionally produced and mounted on boards, I often received either an offer to work on staff, or at the very least, a chance to do freelance work for that advertising agency. I loved creating these pieces, and this format seemed to respect the artistic process more than the drudgery required of day-to-day work in the industry.

When I started to teach graphic design at a local community college, I used the portfolio format for my own students. Although they loved the idea of being able to discard their less effective pieces, I often wondered if I was accurately assessing their work. The outcome revealed an ability to produce beautiful artwork after much trial and error over the course of a semester; yet, the process did not seem to take into account the sometimes painful learning curve that most students experienced. Still, I continued using portfolios, convinced that the advantages outweighed the few negatives.

After being hired to teach composition, I was encouraged to use a portfolio system for my writing courses. What, I thought, could be better? This would encourage (and reward) students for revising their work. Given a chance to assess their own writing, they would move from passively learning to actively participating in their own education. They could showcase their best work and have a chance to reflect on writing as a process rather than as a simple outcome. And best of all, I could see their work as a progression rather than as staccato assignments that fell at particular times during a semester. Knowing that portfolios were the standard at a number of colleges -- and in many ways still considered "progressive" in my discipline -- I started gathering information from colleagues and industry publications to find out how to incorporate this process into my undergraduate courses.

After two years of teaching writing utilizing a portfolio system, I realized there were pitfalls. Some could be mitigated by a tight syllabus and clearly outlined course requirements; others seemed to cripple the outcomes that my department had deemed desirable.

First, all of my students were anxious about not knowing their in-class grade until the end of the course. In traditional writing classes, students received either a number or letter grade on each writing assignment. They could predict their final grades simply by keeping a tally of how they did on each essay and writing assignment. Faculty often listed how grades were figured at the top of each syllabus, making this even easier.

With the portfolio system, however, a large portion (sometimes as much as 75 percent) of a student's final class grade was based on the final portfolio -- which often comprised four to six essays. This, of course, was turned in at the end of the semester. Students often took their final and walked away from the campus without any clear idea of how they were doing in their portfolio-based class. Faculty then graded the portfolio, figured the students' final grades, and often turned final grades into the registrar's office without administrative review. Students had no way of knowing how they did until their final grades were posted by the campus. The number of students requesting grade review often escalated with this system -- if only because the students felt powerless and confused by this form of "blind review."
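To see why prediction was impossible, consider the arithmetic (a hypothetical illustration using the 75 percent weighting mentioned above, with a best-case student): someone who has earned every available point on the non-portfolio quarter of the course still controls only 25 of 100 final points.

$$\text{final grade} = 0.25 \times 100 + 0.75 \times P = 25 + 0.75P, \qquad P \in [0, 100]$$

Until the portfolio grade $P$ comes back, that student's final grade could land anywhere from 25 to 100 -- from failing to perfect -- and no amount of mid-semester bookkeeping narrows the range.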

I did everything I could to give students some information about how they were doing during the portfolio-based semester. I set due dates for assignments and gave detailed feedback on each written work. Rubrics that showed areas for improvement may have helped students rewrite papers for their portfolio, but still gave them no tangible evidence of their grade to date. Even when students came to my office and we went over essays together, they still could not see how this information might be reflected in their class grade-to-date. I ended up wasting many precious class hours trying to reassure students about the portfolio process.

My undergraduates' constant requests to nail down their grade-to-date made me aware that the flexibility and abstract nature of the portfolio system generated absolute fear in many of them. They simply were not prepared to trust this system.

After fielding over 50 phone calls and e-mail messages from students in a state of panic about their grades two weeks before their final portfolio was due, I decided to make a change. The next semester, I initiated what I called "advisory grades." When a student handed in an assignment, I evaluated it, wrote down the grade the assignment would receive in its current state, and logged this "advisory grade" into our campus online grading software. I advised students that when they turned in their portfolios, these "advisory grades" would be eliminated. The new grade replaced the old.
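A minimal sketch of the bookkeeping this required (hypothetical code: the essay mentions only "campus online grading software," so every name here is invented for illustration):

```python
# Hypothetical sketch of the "advisory grade" mechanism described above.
# The actual campus grading software is not specified in the essay;
# these function and field names are invented for illustration.

def record_advisory_grade(gradebook, student, assignment, grade):
    """Log a provisional grade when an assignment is first handed in."""
    gradebook.setdefault(student, {})[assignment] = {
        "grade": grade,
        "advisory": True,  # visible to the student, but not final
    }

def finalize_portfolio(gradebook, student, portfolio_grades):
    """On portfolio submission, each new grade replaces the advisory one."""
    for assignment, grade in portfolio_grades.items():
        gradebook.setdefault(student, {})[assignment] = {
            "grade": grade,
            "advisory": False,
        }

# Usage: the advisory grade shows a student where an essay stands
# mid-semester; the revised portfolio version later supersedes it.
gradebook = {}
record_advisory_grade(gradebook, "student_a", "essay_1", 78)
finalize_portfolio(gradebook, "student_a", {"essay_1": 88})
assert gradebook["student_a"]["essay_1"] == {"grade": 88, "advisory": False}
```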

Class-wide anxiety seemed to lessen because students were now able to see the grade their latest assignment had received -- and how they were doing in the class overall. Although this reduced the number of grade reviews that I suffered, it added an additional "step" in what was supposed to be a seamless venture. It also created a loophole. Students who approved of their "advisory grade" simply did not revise that assignment for the final portfolio. This, of course, negated one tremendous advantage of using the portfolio system -- the encouragement to revise.

Another concern was the responsibility of choice that we were now delegating to undergraduates. Some students saw the instruction to "pick the best four out of six" for inclusion in the portfolio as a way to avoid the most difficult and challenging work in my core classes. If my syllabus did not specifically state that all six assignments must be done, they would often complete only four. In that case, the all-important objective of having students evaluate and assess their work was eliminated.

Even when I began stipulating that all six assignments were required, a fair number of underachievers would produce what I would consider a "token effort" for two out of the six assignments. For example, if I asked for a 10-page paper, these students would produce a one- or two-page rough draft, confident that they were going to exclude this assignment from the final portfolio.

I also noticed that students who were going to eliminate a particular work from their portfolio tended to skip classes that focused on that work; what they didn't realize is that they were missing lessons and concepts that were building to the next assignment. These students saw grades falling rather than climbing; the number of those who met me at the podium after class to complain increased. Disappointingly, these students often refused to make appointments to see me to catch up on missed work -- they only saw the holes in their education as missed chances to gain a few grade points.

The next semester I initiated a punitive attendance policy. I hated treating my undergraduates like high school students, but it was clear that the weakest students did not understand the value of a day's lesson that did not immediately translate into grade points. I also indicated in my syllabus that anything less than a full-length paper would be returned without credit. In response, my less motivated students then turned in what would look like a pre-write -- something so unformed that it could not be considered college-level work. My evaluation of these assignments was wasted time; I knew that these students would never return to these rough pieces to work through initial difficulties to master these concepts. And through the magic of the portfolio process, the poor grade that these works received was eliminated.

Next, when allowed to rework and revise only four out of six assignments, my undergraduates immediately discarded the assignments they found most difficult. It was as if the two assignments that asked the most of them did not exist. This meant that they were reworking materials whose underlying concepts they had, in essence, already mastered. Here, again, part of my curriculum was being eliminated. Students would no longer meet my course objectives with pieces and parts discarded.

When given a choice, students dropped the most challenging assignments. They may have seen this as a wise budgeting of time and effort, yet I felt as if they were making two important statements: one, my expertise in that area was not important; and two, they were telling my department that they did not value that particular outcome. In my courses, students often dropped the more difficult argumentative essay -- or more often than not, the long research paper required for the course. Yet these specific assignments were the ones that would have prepared my students most effectively for courses in other disciplines. And the painful reality was that my department's desire to be democratic was, in effect, allowing under-prepared undergraduates to dictate their own curriculum.

When it came to revision, my overachievers immediately started reworking assignments the minute they received feedback. Yet, 90 percent often waited until the last possible moment to revise their work. Somehow, viewing four major assignments that desperately needed revision seemed to de-motivate them. In an effort to help, I encouraged students to come see me outside of class.

Each semester, I added eight or nine additional office hours a week during the last two or three weeks of class, hoping to lift my undergraduates from mediocre work. Still, I would find myself almost completely undisturbed. Here and there, an honors student would appear with a revised paper in hand, hoping to move from 90 or 95 percent to a perfect 100 percent. My other students simply did not see the value of free one-on-one tutoring with their instructor -- or they were intimidated by the portfolio system. In either case, they did not receive the help they needed to improve their work as a whole.

I finally started initiating the occasional "in-class work day," and placed my students in a computer lab. Here they could rework their papers. I "floated" from row to row, viewing their writing and making suggestions. Still, a minute or two per student did not give them substantial feedback.

Last year, I started requiring my students to see me for a 15-minute consultation once during a critical time in the semester. Although these individual conferences proved fruitful, this short time period was not enough to look at more than one revised assignment. Students may have walked away with concrete ideas to improve an assignment; yet, unless they were tremendously motivated, their other assignments went untouched.

My expectation that students would revise all six assignments and then ask for help in choosing the best work for their portfolio was quickly revealed as a pipe dream. Even my honors students knew the value of their time. Better to spend time pursuing more grade points on the four works that "counted" than waste time on all six. Yet the idea that the students and I were going to view their work holistically was what had sold me on the use of portfolio systems. And my experience seemed to suggest that other than a few overachieving students, I was the only one doing any form of "global review."

As an active writer, I can't help but find the writing process interesting. I loved the idea of encouraging my own students to reflect on their own writing process. Maybe I secretly hoped that one undergraduate out of a hundred would suddenly see the beauty in this creative venture and change their major to English literature, rhetoric, or journalism. The one concrete assignment where I could find out more about my students' writing experience was a "letter to the instructor," which promised 10 points without regard to content. Set inside their portfolio, I hoped this 250-word note would give me the inside track to improving my course and engaging students in my next course.

Unfortunately, the majority of my students used this platform to plead for better grades. Of course, I empathized. On one occasion, I was able to intervene and suggest that a student ask for a medical deferment for the semester's work. But I could only view the work they produced -- not the stressed, and sometimes troubled, person behind it. And, of course, I was no closer to truly understanding their writing process and the obstacles they had faced in producing the body of work I demanded that semester.

A small number of my most accomplished students did take the time to review their work and seriously discuss what they saw as their strengths and weaknesses. On occasion, they complimented my teaching, thanked me for "keeping on them," or made a concrete suggestion for my course. I kept these few notes in a special file to be reviewed when I felt overwhelmed and disappointed. I later began to suspect that the concept of only "showing your best work" was setting students up for failure. Because their worst work was eliminated, their final in-class grade was higher than normal. This source of "grade inflation" created several problems. First, the jump in difficulty to subsequent courses became even more jarring. Many students who performed well in a developmental course that used a portfolio system then did poorly in a traditionally assessed transfer-level course that followed. By midterm, some students were failing. Shocked, they would initiate grade reviews by the dozens.

Colleagues of mine who did not use a portfolio system started to view those of us who did with a critical eye. "Just what were we letting these students get away with?" they often asked each other. Although there was no official discussion of these concerns, this division did not help our already fragmented department.

There was also dissent among instructors who used portfolio-grading systems. One instructor who taught a lower-level composition course allowed students to discard 4 out of 10 major assignments. He also stipulated that these six successful works would count for 75 percent of the student's final grade. The result was that he turned in a slew of A's and B's each semester. His format looked enormously successful on paper -- yet those of us who taught his former students were in for trouble.

Even if the next course used a portfolio system as well, subtle differences in format could be devastating to students' expectations. Asking students to eliminate two assignments out of six would reflect their true abilities more closely, resulting in less "grade inflation." And with a portfolio worth 50 percent of a student's final in-class grade, there was more pressure on other parts of the course -- something that these students had not yet experienced at this level. The result was often constant complaint, and in some cases, grade review. I had questions, serious questions, about this process.

The portfolio system also required more work from already overwhelmed instructors. A colleague confessed to working at a university that required him and seven other colleagues to grade over 375 portfolios (each with three essays, including outlines, pre-writes, drafts, "final" papers, and rewrites) in one afternoon. After a "norming" session, each portfolio had to be blind reviewed by at least two instructors; a third would be used in a case where a portfolio grade fluctuated more than a half grade. Although my friend felt reassured knowing how he compared to colleagues when it came to assessing student work, he dreaded this day all semester. No number of after-review drinks at a local tavern washed away fatigue and a general sense of being taken advantage of by his university.

Most departments do not impose such a demanding regimen; still, the constant review of work often demands many more hours from faculty than teaching classes in a more traditional format does.

In graduate-level courses, I was sure that many of the obstacles I faced would be lessened or eliminated; still, my department chair had strongly encouraged me to apply these principles to my pool of undergraduates. As a contract employee, I felt compelled to do the best I could. Upon reflection, I realize that the students who did well within the portfolio format would also have succeeded in a traditional class. The students in survival mode would attempt to work the system, just as they would with any course. I did not sense that the portfolio system was a complete failure -- but I had a nagging sense of discontent about the process.

In the end, I'm most concerned that my curriculum is being negatively affected by what is considered a progressive form of assessment. In other disciplines, it seems to be applied more effectively. In graphic design courses, students are motivated to succeed in their specialty. Many of my design students worked to improve their complete body of work -- if only to have a greater number of pieces to show potential employers. Even in the fine arts, students may move into an area of concentration, but often move back to master other formats as they grow curious or bored. In both of these disciplines, students are motivated by discovery more so than grade points; therefore, the portfolio system fits well with the curriculum.

With undergraduate classes, however, a great number of students are motivated to "get the core over with" so they can go on to classes in their major. Anything that helps them scale back the amount of effort and still achieve the same grade in these bread-and-butter classes is desirable -- no matter what the effect on the curriculum. No matter how instructors struggle to hold the line, the portfolio system encourages "grade inflation" that is not only damaging to an undergraduate's academic experience, but to faculty, administrators, and to the college as a whole. This system also allows undergraduates to discard what may be tremendously important portions of the core class curriculum long before they are qualified to be making such decisions. These losses will be felt down the line in future classes, other disciplines, and even in future careers when the student is far from the university's reach.

Author/s: 
Shari Wilson
Author's email: 
info@insidehighered.com

Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.

Reject the 'Finish in 4' Fad

"Finish in four, I promise!" That is what Northern Arizona University is telling its incoming students. With a little better advising and a binding contract to take 15 credits per semester, the university promises that students can complete their undergrad degrees in four years. Utah State University, the University of Iowa and the University of Colorado at Boulder are also offering similar guarantees.

Now, there are some strings. As the Tucson Citizen notes, "It doesn't hold if students change majors midway through college or drop or flunk several courses. A few majors, such as engineering, are excluded because some students need to take pre-college math courses that can extend graduation beyond four years." So, do it right, make no changes, make no mistakes, and you can move efficiently through the university.

As someone who has to report to my university’s provost about what we will do to get our students to graduate in four years, I am sensitive to this newest fad. It affects how our institutions will be ranked and how parents will select the perfect place for their children to study. Yet, as a five-year undergrad myself, I am not sure why this is even a good goal. Yes, our federal loan money, and our state subsidies, will go to more students if we can push them through, but that is exactly what we would be doing ... pushing. And is that what we are here to do? For that matter, is efficiency a worthwhile measure of a college? Of a student?

When I attend events to recruit new students, I rejoice in those who don't know what they want to do. They come to the experience open for adventure, exploration, excitement, and challenge. I tell them that they will probably do better than those who have their future planned out. Why? Because most students change their majors. And, at a public university like mine, students are even more likely to change their majors than their private college counterparts.

Why do students change their majors? I think it is because students have little idea about (a) what jobs exist, (b) what majors correspond with what jobs, (c) what they are good at, and (d) what course of study would best use their abilities.

Hell, when I attend college major recruitment fairs, almost all the students and their parents line up for business, pre-med, and pre-law. (Working class folks tend to go for health sciences and business, because they hear there are jobs there.) I am tempted to just hand out fliers that say, "Business majors have to take accounting and advanced math. Pre-med (and health sciences) folks have to take a LOT of science courses... with labs! When you find you don't like those courses, or you fail a few of them because you actually have no special ability in advanced math or science, come check us out!"

That is how we get our majors, for the most part; the students realize that they picked a major for some bogus reason -- say, they knew someone with X job who made a lot of money -- and they realize as they take more classes in that area that it is not what they originally thought or that it does not suit them. Then they look for something that actually suits their interests and talents. So the parents who pushed them into their original major gnash their teeth and complain when their children have to take additional courses to meet our requirements, which differ from those of the original major, and their time is extended. Yet, while this can be more costly, it is such a bargain in the long term. Better to make the change in undergrad than to figure out, after earning the degree, that you are ill-suited for the professions for which you were prepared.

So, among those who don't finish in four, we first have the confused. Add to this number the students who party too much, who attend a college that doesn't suit them (that was my error), who have adjustment issues transitioning to undergraduate life, whose mental illness expresses itself during college, who have personal traumas in their lives (also my issue), whose families face financial downturns, who face discrimination or harassment, and/or who just bomb a class or two. Suddenly, our numbers look terrible! See how few students we graduate in four years!?! (And we aren't even counting the transfer students -- the year-to-degree numbers only count students who entered as freshmen. If we included those folks in our numbers, we would see how few students really graduate in four years.)

If we still have a perverse need to measure time to degree rates, we should extend the bar to six years of full-time study, as we do for athletes and for some federal reporting requirements. (Athletes are not the only ones balancing academics with other interests!) We should exclude students who move to part-time status from our count. But I would hope that we would not use these data to rate institutions.

Finish in four sends the wrong message. It says that college is simply utilitarian, a means to a financial end. We should recognize that college is not high school. It is about self-discovery, the investigation of different majors and fields, and intellectual exploration and development. Let's reject this fad and focus on the long-term goals: producing graduates who can write, read, and think critically, and who can contribute to our society.

Author/s: 
Lesboprof
Author's email: 
info@insidehighered.com

Lesboprof is the pseudonym of a faculty member and administrator at a public university in the Midwest where the official line is that four years and out is a good thing.

The Co-op Model's Relevance Today

Cooperative education is now more than 100 years old. The co-op approach, in which students alternate time in the classroom with professionally paid work directly related to their majors, was founded at the University of Cincinnati by Dean Herman Schneider in 1906. There are co-op programs today at 500 institutions in the United States.

The centennial marks a good time to take stock. How effective is co-op? What has been its impact on its three fundamental partners -- students, employers, and institutions of higher education? Is co-op still relevant? Still viable? What role should co-op play in 21st century education?

I see empirical evidence of co-op’s value every day at the University of Cincinnati. We have 3,800 students in 44 disciplines participating in co-op opportunities at more than 1,500 employers in 34 states and 9 foreign countries. At graduation, UC co-op students have an enviable head-start in their careers by virtue of their on-the-job work experience (an average of one-and-a-half years for UC students), marketable skills, impressive credentials, and networking connections. Many are hired immediately by the companies where they completed their co-ops.

Collectively, our co-op students earn about $35 million each year. Plain and simple, that money helps students pay for college. Moreover, if those dollars came in the form of scholarships, it would necessitate a university endowment totaling $875 million. In short, we would have to nearly double our endowment to support the program.
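The arithmetic behind that figure appears to assume the conventional 4 percent annual endowment payout rate (the rate isn't stated in the essay, but it is implied exactly by the numbers):

$$\text{required endowment} = \frac{\text{annual payout}}{\text{payout rate}} = \frac{\$35\ \text{million}}{0.04} = \$875\ \text{million}$$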

Beyond those signs of success, of course, our co-op students benefit from blending classroom learning with experience in the workforce -- applying theory to practice, as one researcher summarized it. Theirs is the ultimate school-to-work transition. And at the nexus where co-op takes place, benefits also accrue to participating employers and the sponsoring university.

We have long known of these benefits anecdotally. Over the past 20 years, a series of small studies have started to confirm co-op’s value in data. Overall the field needs broader studies and better longitudinal analysis, but the research that has been conducted to date tells a remarkably consistent story. Studies show definitively, for example, that co-op experiences help students explore career options, clarify goals, and find mentors. There’s now statistical evidence that co-op motivates students to learn and study, leads to higher GPAs, and improves individual self-confidence. There is further documentation of the value of co-op in improving individual communications and human relations skills. That’s all in addition to findings that co-op alumni get higher salary offers than their non-co-op peers.

Abstracts from over 40 years of research are available online.

Studies also now confirm the benefits of co-op for employers. Co-op serves as an effective screening and selection process in the recruitment of new talent and it leads employers to workers who are typically more motivated and more productive than other recruits. Co-op also has a positive effect on employee retention and productivity.

In its “Job Outlook 2005,” the National Association of Colleges and Employers reported that employers complain continually that too many new college graduates lack maturity, don’t know how to conduct themselves in a business environment, and don’t have an appropriate work ethic. Those are skill sets that co-op students develop during their education. It’s perhaps not surprising, therefore, that estimates of the number of co-op employers -- including Fortune 500 companies, small businesses, government, and nonprofit organizations -- have jumped in recent years from 50,000 to more than 120,000. Not surprising, either, is that such organizations as the Education Commission of the States and the State Higher Education Executive Officers have called for improved postsecondary attention to the school-to-work transition, which of course is at the heart of co-op education.

Colleges and universities benefit from co-op, too. Co-op students enhance learning by infusing classroom discussions with real-world experiences -- sometimes leading faculty to reform curricula.

In 2006 the highly ranked architecture program at my university combined employer feedback with faculty observations from the classroom and resolved to focus on the enhancement of students’ building construction skills. Similarly the civil engineering program used employer feedback as well as input from their accrediting body to redesign the curriculum to enhance students’ understanding of the fundamental concepts of structural analysis.

By its inherent nature, co-op leads institutions of higher education to better relationships with business, which in turn opens new doors for fundraising and partnerships beyond co-op. Another practical benefit is in student recruitment. Pace University found that a full half of incoming students were attracted to the university by its co-op opportunities. What’s more, their study showed, the student retention rate for those in the co-op program was 96 percent, compared to 52 percent for the institution as a whole. Other studies corroborate co-op’s positive impact on student retention.

Co-op programs drive colleges and universities to be continually innovative in curricula and learning processes in response to employers’ needs. In fact, a study under way at the U.S. Department of Education is helping document that co-op education is emerging as one of the few educational approaches that can help curricular development keep pace with industry needs. It may be time, then, for the U.S. Congress, as it works on re-authorizing the Higher Education Act, to take a fresh look at how co-op education can help enhance college affordability and ensure the relevance of higher education in the new century.

Our neighbors to the north have the right idea. The Province of Ontario offers tax breaks of up to 15 percent for companies hiring co-op students. Tax incentives for companies employing co-op students could be the best way of increasing participation in cooperative education. Tax breaks treat all sectors of industry equally, and are less likely to skew the production of graduates toward segments without a solid employment market.

One hundred years after co-op was created at the University of Cincinnati, our Professional Practice program is leading a $1 million study that will help create the next generation of co-operative education. We’re looking for ways to link measures of student performance in co-op with corporate feedback and curricular reform. Our work is just one of a number of current efforts looking to make co-op stronger pedagogically and even more relevant -- efforts, for example, to reinforce student learning through improved self-reflection, and to link co-op more deliberately with experiential and service learning.

The co-op approach creates necessary bridges between work and learning, between liberal education and professional education, and between universities, government, and business. Moreover, co-op prepares students extraordinarily well for work -- and life -- in today’s fluid, fast-paced, and globally interdependent workplace. By the time they graduate, co-op students have a firsthand perspective on international competition, business ethics, workplace diversity, corporate cultures and more. As we prepare students for their roles in the 21st century, the benefits and attributes of co-op education have never been more relevant, or more urgently needed.

Author/s: 
Nancy L. Zimpher
Author's email: 
info@insidehighered.com

Nancy L. Zimpher is president of the University of Cincinnati. As a faculty member, Zimpher directed hundreds of student teaching experiences and recalls fondly her own initial “real world” experience -- as a student teacher. 

Teachers and Teaching

During the summer months before I entered Harvard in the fall of 1953, I read The Education of Henry Adams (1918). His sardonic, world-weary recollection of his undergraduate years at Harvard from 1854 to 1858 was not reassuring. Harvard “taught little, and that little ill,” Adams wrote, and “the entire work of the four years could have been easily put into the work of any four months in after life.” The best he could say was that Harvard “left the mind open, free from bias, ignorant of facts, but docile.” In reflecting on his classmates, Adams dourly observed, “If any one of us had an ambition higher than that of making money; a motive better than that of expediency; a faith warmer than that of reasoning; a love purer than that of self; he has been slow to express it; still slower to urge it.” He had even fewer good words to bestow upon his teachers.

My own experience as an undergraduate 100 years later was quite different. It was indelibly marked by a number of teachers and writers who changed my life utterly and forever. They were models of the life of the mind in action. They made me want to be, if I could, precisely what they were: teachers and scholars.

Virtually from the day I entered Harvard, I wanted to be a professor. I found books intellectually exhilarating. Nothing gave me greater satisfaction than achieving a sense of mastery of the life and works of particular authors and thinkers -- not simply the sort that earns an outstanding grade on an exam but the kind that yields a rounded, nuanced appreciation. I came close to reaching this level of knowledge and insight, I thought, with Samuel Johnson, Sigmund Freud, George Bernard Shaw, and T.S. Eliot. I immersed myself in their most significant works, not once but again and again, and I read the leading works of criticism and secondary materials about them. Eventually I came to a fluent familiarity with the texture of their thought. Few intellectual efforts were more satisfying; few brought me closer to sensing the thrill of being a scholar.

My admiration for my teachers -- indeed, my wonder at how much they knew and how compellingly they wrote -- was unbounded. I thought of the words that Oliver Goldsmith used, in “The Deserted Village,” to describe the intellectual capacities of the parson: “And still they gaz’d, and still the wonder grew, / That one small head could carry all he knew.” I gazed in wonder at all that so many of my professors knew: Northrop Frye, seemingly about all literature; Perry Miller, about the New England Mind; Douglas Bush, about John Milton; Walter Jackson Bate, about Samuel Johnson and John Keats; Arthur M. Schlesinger, Jr., about American intellectual history. Harvard, in Nicholas Dawidoff’s phrase, was “a culture that served men who had spent a lifetime accumulating knowledge.” It honored men of learning, scholarship, and wisdom. I wanted dearly to become a part of that culture, wherever it might exist and however I might qualify for entry.

How, I would ask myself, had Harvard chosen its faculty members so well, especially when it had chosen them when they were so young? How did it recognize intellectual promise with such consistent perspicacity? Perhaps there was something about the capacity of Harvard to reinforce a sense of destiny that elevated the achievements of its faculty members as they matured, just as it did of many of its students.

Many histories describe the fifties as years of intellectual passivity, simplistic religiosity, and political meanness, of “the organization man” and “the man in the gray flannel suit.” And yet, because of the craft and character of the best of my teachers, I regard it as a period bursting with decidedly powerful ideas. I am astonished still by the boldness and enduring authority of many books of political and social criticism published during that decade. As undergraduates who were then coming of age intellectually, my friends and I wrestled intensely, often late into the night, with the encompassing claims of those contemporary philosophies to which our teachers introduced us: especially Freudianism, Marxism, Keynesianism, and existentialism.

Each new course was an awakening. I read books that were unconventional and pathbreaking, books with bold and synoptic themes that would change forever how we thought about the world and ourselves -- books like Isaiah Berlin’s The Hedgehog and the Fox (1953), Erik Erikson’s Childhood and Society (1950), Freud’s The Interpretation of Dreams (1899), Northrop Frye’s Anatomy of Criticism (1957), Richard Hofstadter’s The Age of Reform (1955), F.O. Matthiessen’s The Achievement of T.S. Eliot (1935), Reinhold Niebuhr’s The Children of Light and The Children of Darkness (1944), Morton White’s Social Thought in America (1948), and Edmund Wilson’s To the Finland Station (1940).

As my professors explored in their lectures the rugged intellectual terrain of these challenging books, they taught me the beauty of powerful ideas, as a liberal education should. They gave no slack. I studied the political and historical analyses of such demanding scholars as Joseph Schumpeter, George F. Kennan, Richard Hofstadter, and Louis Hartz. I devoured the works of important modern novelists: Lawrence, Conrad, and Forster, Hemingway and Gide, Malraux and Camus. No contemporary novelist overwhelmed me more than Faulkner, who was an entire universe in himself, as were the very greatest writers, like Balzac and Dickens, who came before him. I struggled with the dense, often difficult poetry of Eliot and Yeats, Stevens and Frost, Auden and cummings. And I embraced the icon-breaking plays of the modern dramatists: Ibsen, Strindberg, and Shaw, O’Neill, Williams, Miller, and Beckett.

As one who was fortunate to be an undergraduate at the time, I cannot accede to the conventional claim that these were years of intellectual and spiritual quiescence. These books and these men (my professors, in fact, were all men) made me want to be, throughout my lifetime, a reader, a learner, a teacher, a scholar.

By their loving immersion in their subjects, by the strenuous demands they made of their students, my teachers inspired me -- an anonymous student sitting in classes typically of several hundred -- to be passionate about the life of the mind. In the words of George Steiner, author of Lessons of the Masters (2003), each represented the ideal “of a true Master.” Steiner rightly adds, “The fortunate among us will have met with true Masters, be they Socrates or Emerson, Nadia Boulanger or Max Perutz.”

I yearned to become a member of their company of scholars. I hungered to write books like those they taught me so to admire. I wanted to partake of their professional way of life. What could be more thrilling or ennobling, I thought -- what could be more worthy or rewarding -- than a career as a teacher and scholar?

My naïveté about the possibility of teaching English at a good liberal arts college was brought home to me one day as I talked with a Radcliffe friend, herself the daughter of a distinguished professor. “I am wary about ever marrying an academic,” she said, “no matter how much I might love him.” I asked why, expecting that she would point perhaps to the modesty of academic salaries. “It might be that the best job he could get would be in Brunswick, Maine,” she replied. “Why would I want to spend the rest of my life in Brunswick, Maine?” She had injected realism into the conversation.

Almost all of my courses were taught in large lectures, typically of 100 to 300 students; occasionally a class would be as large as 400. Most of the professors had mastered the art of projecting to a large audience -- and this was in a period before the regular use of slides and other audiovisual aids. Some professors were accomplished orators or humorists, and some roamed the platform dramatically and with a practiced pace. Some timed their presentations to end in a grand flourish on precisely the stroke of the hour’s end. Many had established a reputation for one or two fabled lectures. Students would annually await the day of their delivery: Crane Brinton, a European historian, on the activities of Parisian prostitutes during the French Revolution; Walter Jackson Bate, a literary critic and biographer, on the death of Samuel Johnson; Arthur M. Schlesinger, Jr., a historian, on the embattled presidency of Andrew Jackson; David Owen, a historian of England, on British rule in India, always with the timeworn ditty:

  Briton meets native,
  Bible in hand.
  Native gets Bible.
  Briton gets land.

At the time I majored in English, the department had reached “a high plateau,” in the description of Morton and Phyllis Keller in Making Harvard Modern (2001), and become “the most notable of Harvard’s humanities departments.” Its intellectual leader was Walter Jackson Bate. Its senior professors, all “at or near the top of their games,” included Douglas Bush, Perry Miller, Harry Levin, Alfred Harbage, John V. Kelleher, Howard Mumford Jones, and Albert J. Guerard. And this list did not include such distinguished visiting professors as Northrop Frye and F. W. Dupee, both of whom attracted large student enrollments.

The greatest of all my Harvard teachers was Walter Jackson Bate, a man of immense learning whose humane exemplification of literature as a source of moral teaching shaped me in permanent ways. During a long career at Harvard he published the magisterial biographies Samuel Johnson (1977) and John Keats (1963), both of which won the Pulitzer Prize for biography, as well as many books of criticism. Bate taught three principal courses -- The Age of Johnson, Literary Criticism (from Aristotle to Matthew Arnold), and English Literature from 1750 to 1950 -- and I took all three during my sophomore year, when he published The Achievement of Samuel Johnson (1955).

Bate was a frail, delicate man whose frame, in Nicholas Dawidoff's words, “appeared to be constructed of twigs and mist.” His lectures were conversational but deeply felt, meditations of a sort that were themselves a metaphor for his striving to achieve a perception of life’s tragic truths. He described, in tones of melancholy nostalgia, how he had come to Harvard as a 16-year-old farm boy from Indiana, without a scholarship, and been awakened to the life of the mind by the lectures of Professor Raphael Demos in Philosophy 1. He could be ironic and mischievous. He seemed congenitally sad and weary. He was the most memorable teacher I ever had.

Bate’s most celebrated course was The Age of Johnson. During the weeks of a semester, Bate led us through Johnson’s Lives of the English Poets (1779), The Vanity of Human Wishes (1749), Rasselas (1759), and his achievement as a lexicographer in the two-volume Dictionary of the English Language (1755), in which Johnson sought, as he wrote in his preface, to capture “the boundless chaos of a living speech,” as well as generous, incomparable selections from Boswell’s Life of Samuel Johnson (1791). Bate argued, again and again, that through his efforts to invest existence with meaning, Johnson had lived a life of allegory, as Keats said of Shakespeare -- “his works are the comments on it.”

Literature, for Bate, was an instrument of moral education and development. “Man was not made for literature,” he often recited, paraphrasing the Bible; “literature was made for man.” A protégé of Alfred North Whitehead, Bate sometimes repeated Whitehead’s premise that “moral education is impossible apart from the habitual vision of greatness.” For Bate, Samuel Johnson was the preeminent example of a life straining toward moral meaning and emancipation from adversity. “Life is a progress from want to want, not from enjoyment to enjoyment,” wrote Johnson. Many of the illustrations that ornamented Bate’s lectures were drawn from Johnson’s tortured efforts to overcome his own idiosyncrasies, eccentricities, irascibility, and sloth. “The great business of his life,” Bate wrote, “was to escape from himself.”

I loved Johnson’s praise of Paradise Lost, with his conclusive reservation “None ever wished it longer,” and his famous observation that a second marriage represented the “triumph of hope over experience.” I admired the angry candor in his condemnation of self-righteous patriotism as “the last refuge of a scoundrel” and his weary observation “No man but a blockhead ever wrote except for money.” I took delight in his blunt rebuke of Lord Chesterfield: “Is not a patron, my Lord, one who looks with unconcern on a man struggling for life in the water and when he has reached ground encumbers him with help? The notice which you have been pleased to take of my labours, had it been early, had been kind, but it has been delayed till I am indifferent and cannot enjoy it, till I am solitary and cannot impart it, till I am known and do not want it.”

Bate especially admired the paradoxical reversals that lit up Johnson’s prose. For example, Johnson once dismissed a book as “both good and original, but that which was good in it was not original, and that which was original was not good.” He admired, too, Johnson’s psychological insight. Johnson once wrote, “So few of the hours of life are filled up with objects adequate to the mind of man ... that we are forced to have recourse every moment to the past and future for supplemental satisfactions.”

Among undergraduates, Bate was especially known for involuntarily losing his composure every year -- some thought he was reduced to tears -- in describing the death of Johnson. He seemed as genuinely moved by Johnson’s death as he would have been by the death of a beloved contemporary. It was one of Harvard’s most famous performances. In another course I took from Bate, he was equally moved in describing John Keats’s deathbed wish that there be no name upon his grave, no epitaph, only the words, “Here lies one whose name was writ in water.”

Under Bate’s gentle guidance, I came to love the literature of the 18th century: the periodic prose of Burke, the wit and irony of Gibbon, the rhyming couplets of Pope. I have ever since been able to recite from memory a certain amount of 18th-century poetry, especially those poignant lines from “The Deserted Village” by Goldsmith: “Ill fares the land, to hastening ills a prey / Where wealth accumulates, and men decay.”

In a memorial minute adopted after Bate’s death by the Harvard faculty of arts and sciences, his colleagues wrote that he “gave his students what he said Johnson had given so many, the greatest gift that any human can give another, the gift of hope: that human nature can overcome its frailties and follies and, in the face of ignorance and illness, can through courage still carve out something lasting and worthwhile, even something astonishing, something that will act as a support and friend to succeeding generations.”

Another teacher whom I greatly admired was Albert J. Guerard, a professor of English and comparative literature, who taught a brilliant course entitled Forms of the Modern Novel. Three mornings a week he lectured to a class of more than three hundred students on novels from Flaubert’s Madame Bovary (1857), Zola’s Germinal (1885), and Hardy’s Jude the Obscure (1895) across the first half of the 20th century to Camus’s The Plague (1948) and Faulkner’s Light in August (1932). In between -- it was an all-star list -- he assigned Gide’s The Immoralist (1902), Conrad’s Heart of Darkness (1902) and Lord Jim (1900), Joyce’s Portrait of the Artist as a Young Man (1916), and Greene’s The Power and the Glory (1940), among others. (I do not recall any reference at the time to the argument that Chinua Achebe would later make in “An Image of Africa” (1977); there Achebe denied that “a novel that depersonalizes a portion of the human race” -- he was referring to Heart of Darkness -- “can be called a great work of art.”)

Typically he covered a novel in an hour, always with luminous clarity and insight, as he introduced us to such critical themes as moral ambiguity and latent homosexuality. His great theme was the moral power of literature: “The greatest writers take us beyond our common sense and selective inattention, even to paradoxical sympathy with the lost and the damned -- take us, that is, to the recognition of humanity in its most hidden places.”

Professor Guerard was not only a novelist and critic; he was also a teacher of writing. Many of his students established themselves as novelists. One student of whom he was particularly proud was John Hawkes, whose experimental novel The Cannibal (1949) he warmly recommended.

Guerard reveled in the beauty of a novel’s first and last sentences. He loved the way in which first sentences -- like Melville’s “Call me Ishmael” in Moby-Dick (1851) -- can set a tone for a novel’s primary mission. He especially admired opening sentences that invited a sense of intimacy, like “This is the saddest story I have ever heard,” in The Good Soldier (1915) by Ford Madox Ford, or suggested a sense of quiet mystery, like “The past is a foreign country: they do things differently there,” in The Go-Between (1953) by L. P. Hartley. (My father loved first lines, too. His favorite was from Scaramouche [1921] by Rafael Sabatini: “He was born with a gift of laughter and a sense that the world was mad.” He loved the reckless, romantic sweep of that language. Another favorite was the haunting opening sentence of Rebecca [1938] by Daphne du Maurier: “Last night I dreamt I went to Manderley again.”)

I recalled from my high school reading the famous opening sentence of A Tale of Two Cities (1859) by Charles Dickens, “It was the best of times, it was the worst of times...,” and that of Pride and Prejudice (1813) by Jane Austen, which confidently asserts, “It is a truth universally acknowledged, that a single man in possession of a good fortune, must be in want of a wife.” As I became conscious of the tone-setting capacity of opening sentences, I looked in my reading for new examples. One that I admired appears in The Heart Is a Lonely Hunter (1940) by Carson McCullers: “In the town there were two mutes, and they were always together.” Another appears in The Stranger (1942) by Albert Camus: “Mother died today.”

There are, of course, any number of further examples. One of the best in European literature is “Happy families are all alike; every unhappy family is unhappy in its own way,” in Anna Karenina (1877) by Leo Tolstoy. One of the best in American literature is “You don’t know about me without you have read a book by the name of The Adventures of Tom Sawyer; but that ain’t no matter,” in The Adventures of Huckleberry Finn (1884) by Mark Twain.

I especially remember a poetic lecture that Professor Guerard delivered on the importance of a novel’s final sentences. Most endings, he said, were melodramatic or tired when they should have conveyed a cadenced finality. Few approached the quiet beauty of the last line of James Joyce’s “The Dead” -- “His soul swooned slowly as he heard the snow falling faintly through the universe and faintly falling, like the descent of their last end, upon all the living and the dead” -- or the lyric passion of the soliloquy of Molly Bloom that concludes Joyce’s Ulysses (1922): “... and then he asked me would I yes to say yes my mountain flower and first I put my arms around him yes and drew him down to me so he could feel my breasts all perfume yes and his heart was going like mad and yes I said yes I will Yes.” Few were as philosophically effective as the last line of F. Scott Fitzgerald’s The Great Gatsby (1925): “So we beat on, boats against the current, borne back ceaselessly into the past.” One of my favorite endings is that of The Sun Also Rises (1926) by Ernest Hemingway. “Oh, Jake,” Brett said, “we could have had such a damned good time together.” Jake responds, “Isn’t it pretty to think so?”

Professor Guerard was especially good at analyzing dialogue. The conversation that appears in fiction, he said, with the experience of one who had published several novels himself, is quite different from the conversation of everyday life. A tape recorder can capture the way most people actually speak -- in false starts, circuitous detours, and garrulous and prolix meanderings, with pointless and irrelevant insertions. But no matter how accurate the transcript, such a rendering will seem stilted in print. Fiction, by contrast, must use dialogue to achieve artificially the illusion of a reality that is more richly authentic and convincing than a tape-recorded transcript could ever be. For a conversation to seem natural to the reader, he said, the author must shape it, omitting the repetitive, relying on the telling phrase and the pivotal word. In the end, “nature requires the sculpting hand of art in order to appear in literature as nature.” The lesson was as true for the ornate drawing-room conversation of Henry James as it was for the terse, telegraphic dialogue of Ernest Hemingway.

Guerard spoke in a measured, husky voice. He was a mesmerizing lecturer, a dignified man of magnetic warmth. When the entire class spontaneously applauded a lecture he delivered early in the term, Guerard expressed his appreciation at the start of the next lecture but asked that we thereafter refrain from applause. He feared that, on all those more usual occasions when the class did not applaud, he would think his lecture had fallen short.

During his lectures on Light in August (1932), Guerard described his only meeting with William Faulkner, a meeting at which Faulkner insisted that he was a self-educated Mississippian who had never finished high school and had little to contribute to a conversation about literature. His only ambition, he had written to Malcolm Cowley, was “to be, as a private individual, abolished and voided from history, leaving it markless, no refuse save the printed books.” Guerard mentioned three books that he regularly taught: Notes from Underground (1864) by Dostoevsky, The Secret Sharer (1912) by Joseph Conrad, and The Plague (1948) by Camus. As it happened, Faulkner had a remarkably exact knowledge of all three. After that response, Guerard did not ask Faulkner about his obvious indebtedness to the cadences of the King James Bible and the plays of Shakespeare. Faulkner often asserted that he had never read Freud. “Neither did Shakespeare,” he told a Paris Review interviewer in 1956. “I doubt if Melville did either and I’m sure Moby-Dick didn’t.”

When the term ended, I sent a letter of appreciation to Professor Guerard, explaining that I intended to become a professor of English. I was thrilled to receive a reply. “There are many rewards to teaching,” he wrote, “but receiving such letters [as yours] is certainly one of the most satisfying. I’m a poor giver of advice, but would be glad to talk with you if you think I could be of any help.”

Several years later, in 1961, Guerard decamped for Stanford in a move that shocked Harvard: no one ever left Harvard. By the time of his death, he had published nine novels, six books of literary criticism, and a memoir. Among the subjects of his critical books were Conrad, Hardy, Gide, Dickens, Dostoevsky, and Faulkner.

Forty years after my graduation, when The New York Times reported my impending retirement as president of Dartmouth, Guerard sent me a beautiful letter. “It was a pleasure to see your beaming, youthful face,” he wrote. “You seem much too young to retire. On the other hand I think you can look forward to the reading of many books and perhaps writing one or two. At 83 I’m still at it.” And then, recalling a conversation that we had more than 10 years earlier when he had been my guest for dinner at the President’s House at the University of Iowa, he added, “I have had many fine and famous people in my classes but you were the only one able to recite the reading list years after taking my course.” Few letters have ever gratified me more.

Even as I admired Harvard professors like Guerard, I was intimidated by the prospect of emulating them. I shuddered at the lifelong burden of reading that a career as a professor of English would entail. I thought of the frustration of Eugene Gant, Thomas Wolfe’s protagonist in Of Time and the River (1935), who as a college student “would prowl the stacks of the library at night, pulling books out of a thousand shelves and reading them like a madman. The thought of those vast stacks of books would drive him mad: the more he read, the less he seemed to know -- the greater the number of books he read, the greater the immense uncountable number of those which he could never read would seem to be.”

“How,” I asked my father, “does Professor Guerard find the time to reread each year the novels that he is teaching, keep up with the scholarly literature, and read all of the new novels published in this country and Europe?”

“Don’t you suppose he enjoys it?” my father replied.

I also admired Northrop Frye, a compact, bespectacled man with a booming voice, who was a visiting professor from the University of Toronto. He had made his critical reputation a decade earlier with a book on Blake, Fearful Symmetry (1947). He lectured with a strong assurance and an unusual clarity. As an ordained minister in the United Church of Canada, he commanded both the Bible and the works of Shakespeare. Now he was about to publish one of his masterworks, Anatomy of Criticism (1957), which presented a complete worldview -- a coherent framework, comprising tragedy, comedy, and romance, in which all novels, poems, and plays had interconnected places. “Poetry can only be made out of other poems,” he wrote, “novels out of other novels.” His theory took the Bible as the mythological substructure of Western culture. All human thought, Frye argued, was shaped by that substructure. Anatomy was an elucidation of how an archetypal and mythological reading could illuminate all of literature. When I bought Anatomy at the Mandrake Book Store, the proprietor, comparing Frye’s volume to a current national best-seller, said, “This is our Auntie Mame.”

The qualities that most distinguished Frye were the breadth of his learning and the Euclidean clarity of his lectures. He seemed to be familiar with the whole of literature; he was as far from a period specialist as one could be. Frye pushed creative imagination to its limits. He admired the ways in which certain lines encapsulated thoughts with a near-perfect economy of words. Shakespeare, of course, was more adept at achieving this masterful concision of thought than any other writer. His plays abound with pertinent examples, of which the supreme instance is “To be or not to be.”

Frye could be devastating on literary trendiness. “The literary chit-chat which makes the reputations of poets boom and crash in an imaginary stock exchange is pseudo-criticism,” he wrote in Anatomy of Criticism. “That wealthy investor Mr. Eliot, after dumping Milton on the market, is now buying him again; Donne has probably reached his peak and will begin to taper off; Tennyson may be in for a slight flutter but the Shelley stocks are still bearish.”

Still another impressive professor was the American historian Arthur M. Schlesinger Jr. He was simply a wunderkind -- a brilliant intellect, a compelling writer, a scholar of breathtaking learning. The son of a distinguished Harvard historian, Schlesinger had had a meteoric career. The honors thesis that he had written as a senior had been published a year later as Orestes A. Brownson: A Pilgrim’s Progress (1939), and his work as a junior fellow at Harvard had resulted in The Age of Jackson (1945), for which he received the Pulitzer Prize for history at the age of 28. Shortly thereafter, Schlesinger was appointed an associate professor of history with tenure, to the surprise of some historians who believed that he had drawn forced historical parallels between the politics of Jackson’s administration and those of Franklin D. Roosevelt’s. (Only later did I read the seminal work on that tendency, The Whig Interpretation of History [1931] by Herbert Butterfield.) Schlesinger had a near-adulatory admiration for Roosevelt, who, he believed, had preserved capitalism from itself by introducing governmental regulation of its harshest features. During the term that I took Schlesinger’s course, he was completing The Crisis of the Old Order, 1919–1933 (1957), the first volume of his history The Age of Roosevelt.

Schlesinger was not only exceptionally skilled in dismantling the theories of others; he also was richly imaginative in building theories of his own. Many of Harvard’s courses in American history sought to define a national identity by emphasizing a narrative of accommodation and progress: consensus over conflict, the absence of a landed aristocracy, the liberating presence of the frontier, the opportunities for upward mobility, and the constant presence of renewal and rebirth. For Schlesinger, American history had been a series of conflicts between the forces of wealth and privilege and those of the poor and underprivileged -- what George Bancroft had called “the house of Have and the house of Want.” In an important passage in The Age of Jackson, Schlesinger wrote:

American history has been marked by recurrent conservatism and liberalism. During the periods of inaction, unsolved social problems pile up till the demand for reform becomes overwhelming. Then a liberal government comes to power, the dam breaks and a flood of change sweeps away a great deal in a short time. After 15 or 20 years the liberal impulse is exhausted, the day of consolidation and inaction arrives, and conservatism once again expresses the mood of the country, but generally in the terms of the liberalism it displaces.

Schlesinger was an admirer of Herbert Croly’s The Promise of American Life (1909), which argued for a strong central government to address the problem of growing inequality. In Schlesinger’s reading, American history had been an “enduring struggle between the business community and the rest of society.” That struggle, in turn, was “the guarantee of freedom in a liberal capitalist state.” The goal of a pragmatic liberalism, perhaps ironically, was to prevent the capitalists from destroying capitalism. For that reason, he championed what he called “the vital center” where compromise and experimentation could devise practical solutions to democratic problems.

One of Schlesinger’s central domestic themes was that the New Deal had solved the problems of quantitative liberalism, and that the next decades -- starting with the sixties -- would be dominated politically and socially by issues of qualitative liberalism. In contrasting the old “quantitative liberalism” with the new “qualitative liberalism,” Schlesinger wrote: “Today we dwell in the economy of abundance -- and our spiritual malaise seems greater than before. As a nation, the richer we grow, the more tense, insecure, and unhappy we seem to become. Yet too much of our liberal thought is still mired in the issues, the attitudes, and the rallying cries of the 1930’s.” The concern of liberalism in the next decades, he believed, should be “the quality of civilization to which our nation aspires in an age of ever-increasing abundance and leisure.”

The new liberalism that Schlesinger envisioned presumably would emphasize such quality-of-life issues as civil rights, racial justice, employment discrimination, capital punishment, the availability of health care, religious toleration, gender equity, fair housing, educational opportunity, and environmental protection. Ironically, some of the qualitative issues -- perhaps they are best called cultural issues -- that came to the fore in the next several decades, such as abortion, gun control, school prayer, and welfare reform, had a distinctively conservative tenor. They cast doubt on the consensus theory of American development and illustrated Pieter Geyl’s observation that “history is argument without end.”

Schlesinger was fascinated by the American presidency. Following in the footsteps of his father, he organized polls of historians to rank the presidents. In the poll conducted during my student days, six presidents were adjudged to be great: Washington, Jefferson, Jackson, Lincoln, Wilson, and Franklin D. Roosevelt. Perhaps to Schlesinger’s gratification, Truman was ranked near great, in the company of Polk, Cleveland, and Theodore Roosevelt. The ranking complemented Schlesinger’s thesis that periods of liberal and conservative ascendancy alternated in 30-year cycles.

At the podium, Schlesinger, always sporting a bow tie and often a bold-striped shirt, was an impressive presence. His mind was both agile and deep. His lectures were incisive, meticulously prepared, and polished. Never was a word out of place, a sentence left uncompleted. His course on American intellectual history was riveting -- the largest in the History Department (and that was a department that included Samuel Eliot Morison, John K. Fairbank, Frederick Merk, Crane Brinton, Charles H. Taylor, Myron Gilmore, David Owen, and Edwin Reischauer). He was as penetrating in discussing the sociology of William Graham Sumner and Walter Rauschenbusch as he was shrewd in analyzing the political machinations of Andrew Jackson and Franklin D. Roosevelt.

Because of his aplomb as a lecturer, I was surprised to read in the first volume of his autobiography, A Life in the Twentieth Century (2000), that Schlesinger felt great trepidation at the lectern:

I never quite escaped the imposter complex, the fear that I would one day be found out. My knowledge was by some standards considerable, but it was outweighed by my awareness of my ignorance. I always saw myself skating over thin ice. The imposter complex had its value. It created a great reluctance, for example, to impose my views on students.

Few professorial examples of intellectual humility impressed me as much as that of a colleague of Schlesinger’s, Professor Frederick Merk, a compelling lecturer who traced, with an unsurpassed skill, the westward movement, the role of the frontier, and the spirit of manifest destiny in American history. His lectures were clear, crisp, and witty. Students had affectionately named his most popular course “Wagon Wheels.” Near the end of the first term in his survey course on American history, Professor Merk announced that he did not know enough about the causes of the Civil War to lecture on it and that he had therefore asked Schlesinger to substitute for him in delivering the next four lectures. How many professors ever set their standards of intellectual humility so high?

My tutor during my junior and senior years was Professor John V. Kelleher, one of the world’s foremost scholars of Irish literature and culture, especially of the twentieth century. He held a chair in Celtic studies established by a Boston Brahmin expressly to promote understanding between the Yankee and Irish-American cultures.

Once a week I would thread my path through the Widener Library stacks for my tutorial hour with him. During the course of the two years, we read our way diligently through much of the poetry of Edmund Spenser (especially The Faerie Queene [1590] and “Epithalamion”) and John Donne. But the true lessons of these tutorial sessions lay not in the poetry itself, but in the conversations we had about the poetry. When Professor Kelleher read a poem aloud, my understanding of it grew. He taught me how to discover more and still more in the coded arrangement of words in poetic lines and stanzas. I was in awe of him.

Kelleher was a shy and modest man and a dedicated scholar. Crowned with a great shock of pure-white hair, he came from a blue-collar family in the mill city of Lawrence, Massachusetts. A graduate of Dartmouth College, he began his academic career as a junior fellow at Harvard and was appointed to the faculty without a Ph.D. Although he spoke with a severe stammer, he read poetry aloud in a deep and sonorous voice and with a lilting fluency, without any trace of a speech impediment. When he recited Spenser, his Irish accent captured the sound of Elizabethan English, he told me. He charmed me with his self-deprecating manner -- he was one of the most modest men I have ever known -- and his vivid recollections of many of the great Irish figures he had met: Maud Gonne, Jack B. Yeats, Frank O’Connor, Sean O’Faolain, and Samuel Beckett.

He loved to talk, too, about the gradual transformation that was occurring in Irish-American society -- a subject I had observed at an ethnic distance but that he knew at first hand. He saw the transformation, as he later wrote in an essay on his friend Edwin O’Connor, as a “rapid demise” characterized by “the rise of the funeral home and the destruction of the wake; the death of the old people, the last links with that vanished mid-nineteenth-century Ireland from which we were all originally recruited; the disappearance of the genial, uncomplimentary nicknames; and finally, the lack of any continuing force, like discrimination, or afterward the resentment of remembered discrimination, strong enough to hold the society together from without or within. Whatever happened, there came a time when nobody felt very Irish anymore, or had much reason to. By the late 1940s that society was practically all gone.”

Professor Kelleher probably understood the works of James Joyce (and Yeats, too) as deeply as anyone in the world. His copies of Ulysses (1922) and Finnegans Wake (1939) were extensively annotated and interlined with his comments on those often-baffling texts; they obviously constituted documents of exceptional critical brilliance. His favorite work of Joyce, however, was A Portrait of the Artist as a Young Man (1916). In an essay he gave me to read in typescript -- it later appeared as “The Perceptions of James Joyce” in the Atlantic Monthly (March 1958) -- he wrote, “I remember that when I first encountered Stephen Dedalus, I was twenty and wondered how Joyce could have known so much about me.”

One afternoon, as I was planning my course schedule for the next year, Professor Kelleher surprised me by saying that it probably did not make sense to take a course in Shakespeare. “No one can truly teach Shakespeare,” he said. “If you want to appreciate Shakespeare, you simply have to sit down and read him yourself, over and over again.”

Once I graduated from Harvard, I did not see Professor Kelleher again for thirty-two years, until he attended his 50th class reunion at Dartmouth in 1989. I was in the midst of my speech to his reunion luncheon -- at least 400 members of the class and their wives were packed into the room -- when I spotted him standing alone at a rear corner. His full head of pure-white hair was still a beacon. As soon as the lunch was over, I wove my way through the crowd, excited to greet him. “President Freedman,” he exclaimed, as we laughed in joyous reunion. For the first and only time, I corrected him: it was still okay to call me Jim.

One of Harvard’s most notable professors in the fifties was Perry Miller, who had returned from the war in 1946 as one of the university’s first professors of American literature. His wartime exploits as an OSS officer were well known; according to local legend, he had kept an Irish mistress, announced his intention to kill as many Nazis as he could, and accompanied the French war hero General Jacques Philippe Leclerc when the Free French forces liberated Alsace. Who knew whether any of this was true?

Upon his return, Miller began to offer his famous course, Romanticism in American Literature, concentrating on Cooper, Emerson, Hawthorne, Melville, and Thoreau. A year later, Miller offered one of the first courses in the new General Education program, Classics of the Christian Tradition. Miller went on to become an important intellectual and cultural historian, a leading exponent of Puritan thought, a gifted and exhaustive scholar with an unquenchable interest in theology, philosophy, and the history of ideas. He sought to capture what he referred to in Errand into the Wilderness (1956) as the “massive narrative of the movement of European culture into the vacant wilderness of America.” His work on the theological progression from seventeenth-century Puritanism to nineteenth-century Unitarianism was penetrating and original.

The two-volume The New England Mind (1939, 1953) that made his reputation had been published by the time I entered Harvard. So had his biography Jonathan Edwards (1949), with its evocation of the Great Awakening and its striking analysis of the role that Newton’s physics and Locke’s psychology had played in the formation of Edwards’s thought, and his anthology The Transcendentalists (1950). Miller published several other volumes while I was an undergraduate, including Errand into the Wilderness and The Raven and the Whale (1956), a study of Poe and Melville.

I took Miller’s survey course in American literature, which covered ground from Anne Bradstreet and Edward Taylor to John Steinbeck and William Faulkner. Miller’s teaching style was compelling. He was a man of physical gusto and intellectual enthusiasm. When he read from Jonathan Edwards’s famous sermon “Sinners in the Hands of an Angry God,” he fairly bellowed the preacher’s theme of eternal damnation in a fire of wrath.

In his book Exemplary Elders (1990), David Levin, a Harvard student in the years immediately after the war, recollected Miller’s teaching authority: “Miller’s great skill as a teacher was exemplary rather than sympathetically imaginative. He had a brilliantly intuitive mind, an extraordinary ability to find the heart of a seventeenth-, eighteenth-, or nineteenth-century text. That gift, and the art of dramatizing intellectual history so that young students who had virtually no knowledge of theology would see both the passion and the intellectual complexity in the debates of narrow Puritans or corpse-cold Unitarians, made him a priceless teacher.”

Once, when our teaching fellow was ill, Miller conducted our section of 15 students. Shifting uneasily in his chair, he told us that this was the first time in his entire career that he had ever taught a section of undergraduates. He virtually implored us to participate voluntarily so that he could get through the experience. Miller died much too early -- in 1963, at the age of 58.

Douglas Bush was another professor whom I greatly admired. He was a quiet man, modest and understated, but his vast knowledge of literature and his deferential demeanor made a deep impression on me. I took his course on Milton and have always regretted that I did not take his course on the Victorian novel. Bush had made his reputation with a magisterial book, English Literature in the Earlier Seventeenth Century, 1600–1660 (1952). He went on to display his critical virtuosity in more than a dozen other books, including studies of Jane Austen, Matthew Arnold, and John Keats.

(Nothing better illustrated his catholicity of taste than his unsuccessful efforts to nominate Edmund Wilson and Robert Frost for the Nobel Prize in Literature.)

Professor Bush’s method of teaching Milton was to read the poetry to the class, quietly, patiently, line by line, pausing every several lines to comment on their meaning, historical allusions, classical references or echoes, and events in Milton’s life. Often it appeared that he was reciting from memory, rather than reading. Once, when the classroom lights suddenly went out, he immediately recited an apt passage from Paradise Lost: “More safe I sing with mortal voice . . . / In darkness, and with dangers compass’d round.”

Under the tutelage of Professor Bush, I came to admire the power and beauty of Milton. I reveled in the lyrical reach of his metered lines. I loved “Lycidas” (“Fame is no plant that grows on mortal soil”) and the sonnets, especially “On His Blindness,” with its canonical line “They also serve who only stand and wait,” which John Berryman called “the greatest sonnet in the language.” I also admired Milton’s prose, especially Areopagitica, his argument against censorship, with its stirring rhetorical assertion “Who ever knew truth put to the worse in a free and open encounter?” Each year Bush asked his students to memorize 20 lines from Milton for the final exam. I took an easy path, choosing the opening passage of the short poem “L’Allegro,” a poem that Helen Vendler counts as “Milton’s first triumph,” and to this day I can recite that energetic passage on command: “Haste thee, Nymph and bring with thee / Jest, and youthful Jollity....”

During my undergraduate years, I had many opportunities to hear poets and novelists read their work. The occasion I remember most indelibly was related to Professor Bush -- a reading on May 29, 1955, by T. S. Eliot, who appeared in Sanders Theatre under the auspices of the Advocate, Harvard’s undergraduate literary magazine. Eliot had written for the Advocate as an undergraduate and now was helping the magazine to raise money. Because I had competed unsuccessfully for membership on the Advocate, I felt a special sense of yearning that evening, a desire to identify with this Harvard graduate who was perhaps the most significant living poet and critic.

After being introduced by Archibald MacLeish, poet, playwright, and Harvard professor, Eliot rose to speak. “I don’t think most people know or realize how important an undergraduate literary magazine can be at so critical a time in a young writer’s development,” he said. “It meant not only encouragement and companionship, but very salutary discouragement and criticism.” He went on to say that he wished that he had intended all the obscure classical references and complex layers of symbolism that scholars and teachers were “discovering” in his work and attributing to his scholarship.

And then he added a word of homage to Professor Bush. In a number of early essays, Eliot had downgraded Milton’s stature as an English poet. “While it must be admitted that Milton is a very great poet indeed,” he wrote in 1936, “it is something of a puzzle to decide in what his greatness consists. On analysis, the marks against him appear both more numerous and more significant than the marks to his credit.” In his celebrated rejection of Paradise Lost, Eliot wrote, “So far as I perceive anything, it is a glimpse of a theology that I find in large part repellent, expressed through a mythology that would have been better left in the Book of Genesis, upon which Milton has not improved.”

Now, Eliot announced, Professor Bush had persuaded him that Milton must indeed be ranked among the great English poets. I was stunned by the significance of that statement. It was, of course, a tribute to Professor Bush. But even more important, it was a confession of a critical mistake. Eliot’s admission of error was an epiphany; it brought the audience into the intimacy of a writer secure enough, generous enough, to admit his fallibility.

Professor Bush was an indomitable proponent of the humanities. He thought them more essential to a liberal education than the social sciences or the natural sciences; they were, he said, “the most basic of the three great bodies of knowledge and thought.”

With firm conviction as well as a fearful pessimism, Bush once wrote, “We may indeed reach a point in our new Dark Age -- at moments one may wonder if we have not reached it already -- where the literary creations of saner and nobler ages can no longer be assimilated or even dimly apprehended, where man has fulfilled his destiny as a mindless, heartless, will-less node. Meanwhile, no scientific problem is anywhere near so urgent as the preservation of individual man and his humane faculties and heritage.” I have always cherished the passion of his conviction.

When I took Economics 1 with Professor Seymour Harris during my sophomore year, the subject had not yet become a mathematical, model-building discipline. The basic textbook -- an early edition of the classic work by Paul Samuelson -- emphasized macroeconomic activity: the role of government in fostering aggregate demand and stabilizing the economy, managing the business cycle, correcting misallocations and market failures, and providing public goods. It covered basic neo-Keynesian topics of the mid–20th century, like supply and demand, business cycles, patterns of saving and spending, the pump-priming role of government, and the indeterminate influence of the imponderables that constitute consumer behavior.

It was in this course that I was introduced to one of the most engaging books about economists ever written, The Worldly Philosophers (1953) by Robert L. Heilbroner. The course’s intellectual heroes were Joseph Schumpeter, who highlighted the “perennial gale of creative destruction” at the heart of competitive markets, and John Maynard Keynes, the most influential economist of the century, whose emphasis on government spending to stimulate the economy animated the New Deal. Schumpeter, who taught at Harvard from 1932 until his death in 1950, emphasized the disruptive role of innovation and technological change in a competitive economy. His most famous book, Capitalism, Socialism, and Democracy (1942), was essential reading.

When it came time to write a term paper, I asked my section man if I might write on The Road to Serfdom (1944) by Friedrich A. Hayek, the Austrian economist who had studied with Ludwig von Mises and was perhaps the leading intellectual opponent of Keynesian orthodoxy. Although Hayek was a classical liberal, his book argued the conservative theme that the logic of the European welfare state implied the erosion of personal freedoms. He feared the results of central planning and social engineering; he admired individualism and the economic outcomes of unfettered markets. “Hayek?” my section man responded quizzically. “He is completely out of step with current thinking.” He expressed his disdain for Hayek’s so-called inevitability thesis: that if a nation experiments with intervention in the economy, it will eventually end up as a totalitarian state. He concluded, “I don’t see that there’s much you can do with that book.” And so I renewed the search for a paper topic. (Twenty years later, in 1974, Hayek was awarded the Nobel Prize in Economic Science.)

Having enjoyed Economics 1, I ventured into an advanced course in economics and political thought, taught by O. H. Taylor, a sad, shy man who led the seminar-size class with great gentleness through the work of the important theorists of the state and economic activity: Smith, Ricardo, Locke, Hume, Marx, Weber, and Veblen. Of all these thinkers, I was most intrigued by Weber and his argument, in The Protestant Ethic and the Spirit of Capitalism (1904), that Calvinist religious beliefs had provided the ethical foundation for the spirit of capitalism. Perhaps Taylor knew already that the place of this philosophical course in the economics curriculum would soon be doomed, at Harvard and elsewhere, by the increasingly econometric and empirical tendencies of the discipline.

Robert G. McCloskey, a political scientist, was a distinguished expert and fluent lecturer on the Supreme Court. His course was a stimulating review of the Court’s jurisprudence, emphasizing the historical forces that shaped the direction of the decisions of the Court. From him I first glimpsed something of the grandeur of public law. He especially emphasized the political alertness of the Court and the way in which it had historically tended to follow or confirm public opinion rather than challenge it. “[P]ublic concurrence sets an outer boundary for judicial policy making,” McCloskey wrote. “[J]udicial ideas of the good society can never be too far removed from the popular ideas.” In the 19th and 20th centuries, the Supreme Court occasionally challenged public opinion, often to its chagrin (as in the cases finding New Deal legislation unconstitutional), sometimes to its glory (as in Brown v. Board of Education, holding segregated public schools unconstitutional). Indeed, McCloskey emphasized the value to the Court, as a deliberative institution, of having one or more former elected officials (governors and senators) among its members.

During my sophomore year I took Edwin Honig’s course in creative writing, English C. I learned, to my grim disappointment, that I was not meant to be a writer of fiction. Honig was a poet, and he gave each of his 15 students detailed personal attention. He was a calming influence on his often tense, anxious students, never seeming to tire of reading endless manuscripts on the familiar subjects of first love, sexual initiation, and generational conflict. Honig appreciated that a teacher cannot teach students to write but can, by wise and gentle criticism, teach them to improve their writing.

I wrote a number of short stories for the course, all of them wooden and unimaginative, obvious and predictable in their plotting. As was his weekly practice, Honig read one of my stories anonymously to the class for criticism; I squirmed in the hope that my classmates would not recognize it as mine, even though it was the best of the impoverished lot that I wrote for the course.

I admired Honig -- he was a humane man, tall, craggy, shy in demeanor, halting in speech -- and I read most of his books of poetry as well as his critical book on the Spanish poet Federico García Lorca. From him, I learned that a writer must not only have a versatile command of language but must also have something to say. Novels must have themes and make points; the best writers are thinkers. As a fledgling writer, I had a thin imagination and was bereft of striking ideas. I had no conception of what I wanted to say. I concluded that I did not have the creative qualities of a writer....

For all my admiration for my Harvard teachers, as a student I never met or had a conversation with any of them, with the exception of Professor Kelleher. After completing a lecture, most professors hurried from the podium as quickly as they could, well before any student could come forward to ask a question. The Harvard system of undergraduate education was not conducive to faculty-student interaction. Professors did not hold office hours for undergraduates, and they rarely took meals or attended social events at Lowell House. They were apparently too busy or important to spend time with students. The system was designed to ensure that our moments of personal discourse were with the teaching fellows who taught our sections, not with members of the faculty.

A few professors, inevitably, were terrible lecturers, and I wondered why the quality of their teaching was not better. The issue usually was not substance but style. James Bryant Conant, Harvard’s former president, once quoted Edward Gibbon on Greek scholars in the 10th century: “[The teachers of the day] held in their lifeless hands the riches of their fathers without inheriting the spirit which had created and improved that sacred patrimony.”

Many professors didn’t seem to care about the organization or fluency of their presentations. Occasionally some seemed unprepared. Lecturing to a large audience was, I believed, an art that could be improved by instruction and practice -- wasn’t that what Dale Carnegie purported to do? -- and I assumed that professors themselves would find satisfaction in perfecting their lecturing styles.

Every faculty undoubtedly has its share of opinionated, self-centered teachers like Miss Jean Brodie, whose unorthodox prime is chronicled in Muriel Spark’s novel. For all her fervent dedication to her students, Miss Jean Brodie was a self-deluded admirer of fascist regimes who abused her position of authority. But my worst professors were not especially opinionated or self-centered -- merely dull. Were all of my teachers models of intellectual power and pedagogical clarity, let alone of moral stature and common sense? Surely, they were not, although I was probably too inexperienced -- or too dazzled by Harvard’s reputation -- to appreciate that.

Despite these limitations, I admired beyond measure the wisest, most learned members of the faculty and have been forever grateful for the models of the life of the mind that they provided me. From them I learned, as George Steiner wrote in Lessons of the Masters (2003), “There is no craft more privileged.... To awaken in another human being powers, dreams beyond one’s own; to induce in others a love for that which one loves; to make of one’s inward present their future: this is a three-fold adventure like no other.”

James O. Freedman

This essay is an excerpt from Finding the Words, an autobiography by James O. Freedman of the first 27 years of his life. Freedman served as president of Dartmouth College and the University of Iowa and was the author of Idealism and Liberal Education (University of Michigan Press) and Liberal Education and the Public Interest (University of Iowa Press). Freedman died of cancer last year, weeks before Finding the Words was to move into production at Princeton University Press. The press -- working with two of Freedman's friends, Stanley N. Katz of Princeton University and Howard Gardner of Harvard University -- finished the book, which has just been released. This excerpt is printed with permission of the Princeton University Press.

The Flawed Metaphor of the Spellings Summit

By the conclusion of Secretary of Education Margaret Spellings' recently convened Test of Leadership Summit on Higher Education, I finally understood why her proposals are so ... well, so ill-conceived. They rest on a faulty metaphor: the belief that education is essentially like manufacturing. High school students are "your raw material," as Rhode Island Gov. Donald Carcieri told us. We need "more productive delivery models," economies of scale, even something called "process redesign strategies." Underlying everything is the belief that business does things right, higher education does things wrong, and a crisis is almost upon us, best symbolized by that coming tsunami of Chinese and Indian scientists we hear so much about. Time for higher ed to shape up and adopt the wisdom of business.

But the whole metaphor is wrong. Education is nothing like business, especially not like manufacturing. Consider the Spellings Summit's faulty assumptions:

1. "If it isn't measured, it isn't happening." This slogan we heard in formal talks and casual conversations. Therefore more testing, more reporting, more oversight, as Spellings is proposing, should improve colleges and universities. The one certain result of the Spellings initiatives will be a mountain of new reporting by colleges and universities, funneled to the Federal government via accreditors. Without formal assessment, this view holds, nobody learns anything.

But for human beings, it's obviously wrong: unmeasured good things happen all the time. Left alone, a 5-year-old will explore, discover, and learn. So will a 20-year-old. They get up in the morning and do things, for at least a good part of the day, whether anyone watches and measures them or not. Many people read even if they aren't forced to. The professor does nothing; the student learns anyway. Medical doctors live by the dictum Primum non nocere: first, do no harm. Sometimes the best treatment is to leave the person alone. That's because -- unlike steel girders -- students are living creatures. (We'll return to this point.)

2. Motivation is simple. "Rewards drive behavior," said several speakers, with no more thought on the matter, moving easily to the use of money to guide institutions. Students and professors alike were considered to be easily directed. If tests are "high stakes," students will automatically want to do well, and if colleges as a whole do poorly, they should just be punished. Nowhere did the Spellings Commission report, or the "action plan" presented at the summit, consider that students might not like standardized tests, that administrators find report-writing onerous, or that professors could resent the nationalization of educational goals -- and quit teaching altogether. Coercion, it is believed, is a simple and effective method for directing people. After all, if you put a steel girder on a flatcar, it will stay there until moved. And if you heat a steel girder to 4,000 degrees F., it almost never gets angry and storms out of the room or broods.

Consider one of the immediate results of No Child Left Behind: the resignation of hundreds of fourth-grade teachers. Coercion costs; people will try to avoid it. They'll quit their jobs, for instance. They'll get angry and sulk in the back of the room. "Getting tough" is not the answer.

3. Clearly stated goals at the outset are a prerequisite for success. In machining, or the production of microchips, precise specifications, measured to the nanometer, are necessary. Everything must be planned, laid out in advance, then rationally carried through to completion. As several speakers said, "We all know what needs to be done," as if that were a simple thing.

But in fact, serendipity -- the occurrence of happy, if unpredicted, outcomes -- seems to have no place in this scheme. The great Peter Drucker recognized that in business, unplanned outcomes can be better than planned outcomes. Post-it Notes and Viagra, for instance, were not planned outcomes; they were huge successes.

People set their own (often conflicting) goals; they resist coercion; they often surprise us. Admittedly, that makes working with them (healing them, leading them to salvation, encouraging their curiosity) a messy process. But I've seen no evidence that business people are better at it than educators.


Daniel F. Chambliss

Daniel F. Chambliss is chair of the sociology department at Hamilton College and director of the Project for Assessment of Liberal Arts Education. He is the author of Champions: The Making of Olympic Swimmers and Beyond Caring: Hospitals, Nurses and the Social Organization of Ethics.
