"I saw a small iridescent sphere of almost unbearable brightness. At first I thought it was spinning; then I realized that the movement was an illusion produced by the dizzying spectacles inside it." --Jorge Luis Borges, "The Aleph"
On December 17, 2005, “Saturday Night Live” ran a skit by Chris Parnell and Andy Samberg called "Lazy Sunday," a rap video about going out on a "lazy Sunday" to see The Chronicles of Narnia and procuring some cupcakes with "bomb frostings" from the Magnolia Bakery in New York City. The rap touches on the logistics of getting to the theater on the Upper West Side: "Let's hit up Yahoo Maps to find the dopest route./ I prefer Mapquest!/ That's a good one too./ Google Maps is the best!/ True that! Double true!/ 68th and Broadway./ Step on it, sucka!"
Parnell and Samberg make it to the Magnolia for their cupcakes, go to a deli for more treats, and hide their junk food in a backpack for smuggling past movie security. They complain about the high movie prices at the box office ("You can call us Aaron Burr from the way we're dropping Hamiltons") and brag about participating in the pre-movie trivia quiz. It doesn't seem like much if you've never seen it, but for pure joie de vivre, and white suburban dorkiness, "Lazy Sunday" just can't be beat. What makes "Lazy Sunday" special, however, is how its original airing coincided with the birth of Internet video-sharing, enabling the two-minute clip to be viewed millions of times on YouTube, a free service that hosts videos posted by users. In fact, the popularity of the clip on YouTube was so great that NBC forced the site to remove it several months later, citing copyright infringement. The prospect of its programming being net-jacked by Internet geeks and magnified through YouTube's powerful interface was just too much for NBC.
I bring up "Lazy Sunday" to foreground my discussion of the pedagogical uses of YouTube because it sums up the site's spirit and helps us define the genre of video with which YouTube is most associated. Although YouTube is awash in clips from television and film, the sui generis YouTube video is the product of collaborative "lazy Sunday" moments when pals film each other or perform for the camera doing inane things like dancing, lip-synching or making bottles of Diet Coke become volcanic after dropping Mentos candies in them.
Parnell and Samberg's references to Internet tools and movie trivia, as well as their parody of rap, perfectly capture a zeitgeist in which all pleasures can be recreated, reinvented and repeated ad nauseam through the magic of the Web. As Sam Anderson describes it in Slate, YouTube is "an incoherent, totally chaotic accretion of amateurism -- pure webcam footage of the collective unconscious." Whatever you're looking for (except porn) can be found in this Borgesian hall of mirrors: videos of puppies, UFO footage, ghosts on film, musical memento mori about recently deceased celebrities, movie and documentary clips, real and faux video diaries, virtuoso guitar picking performances and all kinds of amateur films. In my case, the video that sold me on YouTube was "Where the Hell is Matt Harding Dancing Now?" -- a strangely uplifting video of a guy called Matt Harding who traveled around the world and danced in front of landmarks such as Machu Picchu in Peru, Area 51 in the U.S., the head-shaped monoliths of Easter Island, and the Great Wall of China, among many others.
OK, that's all nice, but what can YouTube do for professors, apart from giving them something to look at during their lunch breaks? Inside Higher Ed has reported on the ways in which YouTube is causing consternation among academics because it is being used by students to stage moments of guerrilla theater in the classroom, record lectures without permission, and ridicule their professors. Indeed, a search on YouTube for videos of professors can turn up disquieting clips of faculty behaving strangely in front of their students, like the professor who coolly walks over to a student who answers a ringing cell phone in class, politely asks for the device, and then violently smashes it on the floor before continuing his lecture as if nothing had happened. It could be staged (authenticity is more often than not a fiction on YouTube), but it is still disturbing.
But I would like to argue for an altogether different take on YouTube, one centered on the ways in which this medium can enrich the learning experience of college students by providing video realia to accompany their textbooks, in-class documentaries and course lectures. Although I can't speak to the applicability of YouTube to every discipline, in what follows I make a case for how the service can be harnessed by professors in the humanities and social sciences.
As a professor of Latin American literature and culture, I often teach an introductory, third-year course called Latin American Culture and Civilization in which students study history, literature and any other media that the instructor wishes to include in the course, such as music, film, comics and the visual arts. My version of the course emphasizes student engagement with foundational documents and writings that span all periods of Latin American history and that I have annotated for student use. One of the figures we study is President Hugo Chávez of Venezuela, whose outsized political persona has made him a YouTube star. Apart from having my students watch an excerpt of his "Bush as sulfurous devil" speech at the United Nations, I assigned a series of animated cartoons prepared by the Venezuelan state to educate children about the Bolivarian constitution championed by Chávez. These cartoons allow students to see the ways in which the legacy of the 19th-century Venezuelan Liberator, Simón Bolívar, remains alive today.
The textual richness of these cartoons invites students to visually experience Bolivarian nationalism in a way that cannot be otherwise recreated in the classroom. It invites them to think critically about the ways in which icons such as Bolívar are creatively utilized to instill patriotism in children. In a similar vein, a Cuban cartoon about Cuba's founding father, José Martí, depicts how a child is transformed into the future champion of independence and social justice when he witnesses the horrors of slavery (this video has now been removed from YouTube). With regard to the Mexican Revolution, one of the most important units of the class, YouTube offers some fascinating period film of the revolutionary icons Emiliano Zapata and Pancho Villa, and especially their deaths. Although I cannot say that these are visual texts that lend themselves to the kind of rich dialogue provoked by the aforementioned cartoons, they are nonetheless an engaging visual complement to readings, discussions and lectures.
Another course in which YouTube has played a part is my senior-level literature course on the Chilean Nobel Laureate Pablo Neruda. It may seem farfetched to use Internet video in a poetry class, but in this case, YouTube offers several useful media clips. I have utilized film clips in which Neruda's poetry appears (such as Patch Adams and Truly, Madly, Deeply), as well as music videos of Latin American singers who use lyrics by Neruda. More than anything that I could say in class, these videos illustrate the reach and enduring quality of Neruda's poetry in Latin American and North American culture. That said, there are a surprising number of student-produced videos about Neruda on YouTube that are cringe-worthy, the "Lazy Sunday" versions of the poet and his poetry. These are quite fascinating in and of themselves as instances in which young people use video to interpret and stage Neruda, in ways that might be set into dialogue with more literary and canonical constructions of his legacy, but I confess that I am not yet convinced of their pedagogical value.
In this regard, the case of Neruda is not so different from that of other literary figures, such as Emily Dickinson, Nathaniel Hawthorne and Robert Frost, who are also the subject of interesting home-made YouTube videos. What do we do, for example, with a Claymation film that recreates Frost's "The Road Not Taken"? I would argue that this film is interesting because it captures the banality of a certain canonical image or version of Robert Frost that is associated with self-congratulatory, folksy Hallmark Card moments.
There are all kinds of video with classroom potential on YouTube. Consider, for example, one of YouTube's greatest stars, Geriatric1927, a 79-year-old Englishman whose video diaries document his memories of World War II, as well as of other periods of English history. Then there are the Michel Foucault-Noam Chomsky debates, in which Foucault sketches out, in animated, subtitled conversation, the key arguments of seminal works such as Discipline and Punish. There's an excellent short slide show of period caricatures of Leon Trotsky, newsreels and lectures about the Spanish Civil War, rare footage of Woody Guthrie performing, Malcolm X at the University of Oxford, clips of Chicana activist Dolores Huerta discussing immigration reform and a peculiar musical montage, in reverse, about Che Guevara, beginning with images and reels of his death and ending with footage of him as a child.
Don't let me tell you what you can find; seek and ye shall receive.
YouTube is not necessary for good teaching, in the same way that wheeling a VCR into the classroom is not necessary, or bringing in PowerPoint slide shows with images, or audio recordings. YouTube simply makes more resources available to teachers than ever before, and allows for better classroom management. Rather than use up valuable time in class watching a film or video clips, such media can be assigned to students as homework in the same way that reading is assigned. However, to make it work, faculty should keep in mind that the best way to deliver this content is through a course blog. YouTube provides some simple code that bloggers can use to stream the videos on a blog, rather than having to watch them within the YouTube interface. This can be important because we may not want students to have to deal with advertisements or the obnoxious comments that many YouTube users leave on the more controversial video pages. On my free wordpress.com course blog, I can frame YouTube videos in a way that makes them look more professional and attractive (sample page here). At this point, course blogging is so easy that even the least technologically minded can learn how to use services like Blogger or WordPress to post syllabi, course notes and Internet media.
There are problems, however, the most glaring of which is the legality of streaming a clip that may infringe on copyright. If I am not responsible for illegally uploading a video of Malcolm X onto the web, and yet I stream it from my course blog, am I complicit in infringing on someone's copyright? Now that Google has bought YouTube, and a more aggressive purging of copyright-protected works on the service has begun, will content useful for education dwindle over time? I don't have the answers to these urgent questions yet, but even in the worst of cases, we can assume that good, educational material will be made available, legally, on YouTube and other such services in the future, either for free or for a modest fee.
For example, I am confident that soon I will be able to tell my students that, in addition to buying One Hundred Years of Solitude for a class, they will have to purchase a $5 video interview with García Márquez from the Web and watch it at home. And, even as I write this, podcasting technologies are already in place that will allow faculty members to tell their students that most of their lectures will be available for free downloading on iTunes so that class time can be used more productively for interactive learning activities, such as group work and presentations. Unlike more static and limited media, like PowerPoint and the decorative course Web page, video and audio-sharing help professors be more creative and ambitious in the classroom.
In sum, my friends, YouTube is not just for memorializing lazy Sundays when you want to "mack on some cupcakes." It can help your students "mack" on knowledge.
Christopher Conway is associate professor of modern languages and coordinator of the Spanish program at the University of Texas at Arlington, where he teaches Latin American literature and culture.
My undergraduate students can't accurately predict their academic performance or skill levels. Earlier in the semester, a writing assignment on study styles revealed that 14 percent of my undergraduate English composition students considered themselves "overachievers." Not one of those self-described overachievers was receiving an A in my course by midterm: half were receiving a C, another third were receiving B's, and the remainder had earned failing grades. One student wrote, "overachievers like myself began a long time ago." She received a 70 percent on her first paper and a low C at midterm.
A solid 40 percent of my undergraduate English composition students described themselves as "overachieving if they liked the subject." The grades for these students, understandably, were scattered. Twenty-nine percent of my undergraduates described their study styles as "normal." Of these, 36 percent were working at a C level by midterm; another 18 percent were receiving a B, with another 18 percent receiving a D. The remaining 27 percent were failing. One student who described his study style as "normal" confessed that he rarely started assignments when they were first given out, waited until a few days before work was due to get started, and did a lot of his writing over the weekend. At midterm, he was receiving an F.
A whopping 17 percent of my undergraduates confessed to being "underachievers" -- studying at the last minute, not doing the reading, and only spending a few hours on major assignments.
My data -- though tremendously limited in scope -- seem to be supported by Douglas Hacker's findings. In "Test Prediction and Performance in a Classroom Context," an article published in the Journal of Educational Psychology, Hacker and colleagues at the University of Memphis found that, to a great degree, overconfidence is prevalent among low-performing students. True, Hacker's study was with introductory psychology undergraduates rather than English composition students. But it does give me a great deal of insight into how students predict performance. And although I don't like the idea of considering my students "low performers," I admit that my state does have a weak high school system, and my university doesn't turn paying students away. Even low achievers are admitted under a "conditional" admission standard.
I don't think Hacker's finding is unique. Dozens of colleagues have told me that their undergraduates simply do not have the tools to criticize and evaluate their own work -- much less predict how well they will do on assignments. What's behind this great drop in the ability to assess performance?
A colleague of mine believes that primary and secondary schools, overwhelmed with students who were never well prepared for school -- students with learning disabilities, addictions, and even severe discipline problems -- have found themselves delivering a weakened curriculum. Yet a recent article in American Educator, "Balancing the Educational Agenda," by Jean Johnson et al., indicates that academic standards for secondary schools are rising -- a move supported not only by academics and administrators, but by parents as well. Perhaps this move is recent; those of us in postsecondary positions are, in effect, responding to the academic standards in place a decade ago. Or perhaps regions differ in standards based on student population and the demands of the surrounding community. Another possibility, among others, is that the curriculum shifts as administrators attempt to adopt each new trend in education.
Just as an inconsistent curriculum can cause students pain and confusion, the move from high school to college can be a hair-raising leap. High school systems with a weak curriculum (or one that is not consistently applied) can create tremendous problems later in the academic system. At my current university, a large percentage of our undergraduates have brought their high-school expectations with them. Some of them are under the impression that if they now come to their college classes every day, they will pass these courses. Many of these students are stunned when they fail their first major test or receive a D for what they thought was an award-winning essay.
Even when academic advisors warn, "college is not high school," many of these under-prepared students continue to believe that they will receive A's for a token effort. Clear class objectives and strongly worded syllabi are often ignored as students continue to overestimate their capabilities based on past performance. After the first major assessment, many of these students clutch at their professors' arms, lamenting, "But I got A's in high school."
Colleagues often commiserate about this particular student response. After all, it's almost impossible to respond to. Often we can only repeat that our expectations are clearly outlined in the syllabus and course outline, that we would be happy to define these further, and that they may want to drop the course if they cannot afford to dedicate time outside of class for study. One professor friend often tells students that the A's they received in high school are simply a step toward admittance to the local university-not a guarantee of grades.
Another colleague says that the level of competition has changed from high school to college; until freshmen understand that, they will keep inaccurately predicting their performance. And the vilification of competition has set up many students to believe that they are all doing well -- regardless of outcome. As a friend of mine in teacher education says, "It's the result of the 'feel-good '70s,' when every child was deemed a winner. Competition was considered demoralizing. The result was a continuing trend in the '90s that focused on reward across the board. Today, we have turned out a glut of students who not only can't assess themselves, but who have received awards for every little thing." When they enroll in college, students often still have no idea how they fare when compared with other undergraduates.
A good friend on staff at a university library says that helicopter parenting also contributes to the problem. When he escorts tour groups of grade school students through his facility for a hands-on learning tour, he often sees parents and grandparents hovering so much that, instead of staying focused on assignments, the children end up being spectators rather than participants in what should be their chance to "try out" a college experience. The urge to spare children from the ego blows of failure, too, often results in parents actually doing homework for children -- not only in primary and secondary grades, but in college as well. Some parents, perhaps perfectionists, have rationalized that if they "assist" their child, the task will be done in a much shorter time. Unfortunately for these children, their formative years do not allow for effort, failure, increased effort, failure, and another attempt that results in success. This setup may produce college students who can only do the most superficial work before becoming discouraged.
Another academic friend says that an inability to focus and an overwhelming desire to multi-task make it almost impossible for students to succeed academically. Staff who manage study rooms and carrels often report that students seem to work "in dribs and drabs" while in the library. Backpacks in hand, they often loiter at computers and chat at tables instead of actually working. Dependent on high-tech gadgets, these same students often feel compelled to answer phones while in study groups, and constantly check e-mail or view sites such as Facebook or MySpace during hours they had dedicated to working on assignments or doing research.
One reference desk librarian reported that she would see students "studying for four minutes, goofing off for half an hour, and then studying for another four minutes." Of course, these students often report to faculty that they've been studying for hours -- which in some ways must seem like an accurate appraisal. After all, they were in the library; therefore, they must have been studying. In the end, a diminished attention span combined with the feeling that doing one thing at a time is a waste of time almost guarantees that they will not be turning in top A-level work to their professors.
This narrative is very incomplete as a study. I'm sure that sociologists, education specialists and other experts have outlined a long history and a number of interrelated causes that explain this decline in students' ability to assess their own work.
As an instructor of undergraduate core classes, however, I realize that my responsibility does not stop at content. I cannot simply list assessment as a course objective and then feign ignorance when my students show me again and again that they cannot predict their own performance. Strategies -- not only for instruction, but also for exercises and assessment -- are integral in setting my students on the right path for the remainder of their college careers. To accomplish this, I realize that I will need to work much, much harder to help my undergraduates understand assignments and expectations, rubrics and assessments, in-class grades and the prediction of success.
Some of this is already in place. Like many English composition instructors, I build a peer-editing component into my writing courses -- not only to help students view writing as a process, but also to give them some tools and much-needed experience in evaluating student work. I provide instruction in how to apply rubrics to student work and often use past student work as "models." Some students are glad for the transparency of my courses; with a detailed 16-week course outline given out at the first class, they can start relating course objectives to specific assignments throughout the semester. Lessons scaffold one on another; assessment follows thorough instruction. Still, there is much to be done. It's clear that I need to develop more tools to help my students learn to assess their own work and predict their academic performance more accurately.
Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.
“Administrators are supposed to have an academic vision. What’s yours?”
That’s the best question a faculty member has asked me since I’ve become associate dean. In conversations that have followed, I’ve begun to understand that my vision, built upon a sense of curiosity and the impulse to teach, implies both a certain type of faculty and a certain type of institution.
The academy rests on a foundation of newly formulated and previously acquired knowledge, and a sense of wonder in its presence. That sense -- call it curiosity -- propels faculty to collect data, analyze it, and hazard generalizations in articles and books. It engenders creativity in music and the arts and drives academics to sift through mounds of evidence in the hopes of assembling something grander, be it historical argument or literary analysis. Curiosity requires mass spectrometers and gas chromatographs; it urges fundamental and foundational understanding of the world around us.
Yet none of us works in an institution where such delight in the new alone suffices. Sooner or later, every one of us wants someone else to understand what we’ve come to know. Perhaps it is a colleague or peer, if we work in a research institute or within the graduate school of a major research university. But in my vision, the faculty want to help shape younger minds, those of undergraduates. And most of the time undergraduates don’t have the background to really understand the faculty.
The mismatch of intellectual preparedness and complexity of information compels faculty to teach. This impulse will have a faculty member reduce chaos, ignore variables, abstract principles, and then oversimplify them -- all for the purpose of communicating to the relatively underprepared. The faculty I envision see such teaching as a craft and think about it continually. In their classrooms, students experience a variety of lectures, discussions, and small group work, all meant to stimulate curiosity and create a setting maximally conducive to learning.
So what’s my academic vision? I see the classic encounter of liberal education: expert faculty put their own ideas into dynamic tension with those of their colleagues, and then eagerly begin to engage students. I see classrooms where discovery and the boundaries of a field are the principal subjects, albeit explained at appropriate levels of simplification. These attributes -- curiosity and the impulse to teach -- explain why faculty labor over evidence, chisel away at concepts yet undiscovered, and manage syllabi of divergent topics, approaches, and problems.
I have also come to realize that good administrators likewise must take a deep interest in everything and put people in a position where they want to share their expertise.
Day after day during the search season, a dean's calendar is filled with candidate meetings, during which the dean must talk with a vast array of potential faculty members, and then make wise decisions about competence, communication skills, and energy, to say nothing of their fit within a community of scholars and students. At the end of those long days, someone inevitably asks: How can you talk with people in such a wide variety of fields, especially since your degree is in some other equally narrow field? This question comes up outside of the search season too. Indeed, what makes administrative life intellectually rich and rewarding is meeting with department chairs, program directors, individual faculty, other deans: all with different training, all specialists in their own areas.
An example: the leader of the Science and Math Advisory Group approaches the administration with a hefty repair bill for a gas chromatograph/mass spectrometer, a bill that puts her department over budget for repairs. The administrator's task is to get this professor to teach -- to explain enough that the administrator's decision is well grounded. How do students use this particular piece of equipment? What research will have to wait until next July if it goes unrepaired?
In just the same way, a good dean will enter the free-form portion of a candidate interview and begin with deceptively simple curiosity: “Tell me about your work…?” Other questions, based on the answer received, keep arising: “Could your work on the economics of dental care help someone understand why health insurers don’t want to pay for preventative measures?” And the answer (dental sealants pay off only over the long haul, too long for the insurers) provokes another question, and so on and on.
Although I like to think there is a skill to such an interview, really it is all about putting the candidate in a position to be a teacher. The dean tries to draw from the expert a tidbit that summarizes a subject, in admittedly too simplistic a form, so as to ask for more detail, and perhaps a more cogent summary. The questions tip off the candidate as to how far to translate expert jargon into generally accessible ideas, complex ideas into simplifications comprehensible by a non-expert.
So what is my academic vision as an administrator? The task is to use budgets, hiring, and curricular leadership to promote faculty research and enhance student learning. And the skills we're building should be no strangers to academic deans: before we moved into administration, over long careers in graduate school (and, for those of us lucky enough, in the classroom), curiosity and the impulse to teach defined our work -- as they do that of the faculty we serve.
Of course, I’m only an associate dean. I work within a much larger structure, under a president and senior administrative team that combines with the faculty’s academic vision to build an institutional culture. Still, I’d argue that liberal arts colleges that embrace a culture of curiosity and teaching have a quite distinctive profile, in terms of curriculum, structure, and values.
Institutions with a culture of curiosity and teaching use the curriculum to help drive students to areas of study otherwise unthought of, and allow faculty to construct courses that test ideas in new contexts and combinations. General education programs range widely, helping students sample broadly enough to educate their academic palates, while major requirements sink deeply into subject matter, guiding young scholars toward the nuances of disciplinary cuisine.
Such a curriculum demands that the administration be nimble and open to change, supportive of both classical and emergent fields. The president must lead discussions defining institutional goals and the dean of faculty must propound a theory of which academic issues and programs trump dollar costs. And since no institution can spend all the money required to do everything, even the CFO will need to teach: how shall we reallocate resources effectively to bring on new programs while closing down those that no longer meet institutional goals?
Liberal arts colleges that pursue a vision of curiosity and teaching will also have certain predictable structures. Foremost among these is the wave of interdisciplinarity that began on campuses in the late 1970s. Interdisciplinary programs and centers arise when faculty and student curiosity about a topic exceeds disciplinary possibilities: for example, environmental studies is born when a group of faculty realizes that biology and botany cannot answer all of their critical questions, and wants to consult regularly with colleagues from chemistry, public policy, and sociology, as well as literature and others.
The shifting nature of the disciplines raises questions that must be engaged: At the limits of interdisciplinarity, what guides the granting of positions, the allocation of budgets, the support of the community? An institution that has fostered curiosity in labs, studios, and classrooms will have answers to such questions, because curiosity and teaching propel faculty to build bridges between subjects, leading to multidisciplinary appointments and calls for newly intersecting programs and emerging fields. By contrast, an institution that has not attended to such matters will be caught short when its faculty meet across the divide between disciplinarity and interdisciplinarity.
Finally, an institutional commitment to curiosity and teaching will result in an embrace of the values of liberal education, from critical thinking and self-development to understanding matters of difference and diversity.
The hallmark of small, residential liberal arts institutions is close student-faculty engagement in and out of the classroom, lab, and studio. Such apprenticeships of the mind aim to develop students' abilities toward critical thinking. Students attempt to create principles abstracted from a set of facts and circumstances, and then to apply those principles in situations never before encountered. Successful students gain an agile (and curious) mind that is both critical and adaptable. Such intellectual formation must happen everywhere on campus and shape the student as a whole. And as students demonstrate what they have learned -- in written essays, oral presentations, in logical or mathematical proofs, scientific lab reports, and artistic presentations -- they go beyond mere mastery of facts to critical argument (and, indeed, to teaching one another).
The object of curiosity in this type of institution is the entire world around us, from the cosmos at one extreme to quantum states at the other. But a particular focus on humanity and our place within the universe of meaning emerges from the social nature of a residential college. That is to say, curiosity about “The Other” (here understood as a focus of inquiry, not an epithet) becomes a critical part of the academic curriculum. Institutional values of diversity and equity of necessity shift from the periphery toward the center; administrative support for such curricular and community attention emerges during complex conversations about resources and structures, all the while cognizant that a diverse faculty and curriculum can better serve a community curious to be taught about culture and difference.
In the end, of course, academic planning must begin with an institution’s mission and core values. And when that mission centers on liberal education, an entire community of students, faculty, and administrators must find common ground in the face of critical issues, from resource allocation to interdisciplinarity and diversity. I remain convinced of my original reply: the best academic vision builds on intellectual curiosity and the impulse to teach.
Roger Brooks is the Elie Wiesel Professor of Judaic Studies and associate dean of the faculty at Connecticut College.
When I was an art director, I loved the idea of showing my design portfolio to prospective employers. After seeing my best design work professionally produced and mounted on boards, I often received either an offer to work on staff, or at the very least, a chance to do freelance work for that advertising agency. I loved creating these pieces, and this format seemed to respect the artistic process more than the drudgery required of day-to-day work in the industry.
When I started to teach graphic design at a local community college, I used the portfolio format for my own students. Although they loved the idea of being able to discard their less effective pieces, I often wondered if I was accurately assessing their work. The outcome revealed an ability to produce beautiful artwork after much trial and error over the course of a semester; yet, the process did not seem to take into account the sometimes painful learning curve that most students experienced. Still, I continued using portfolios, convinced that the advantages outweighed the few negatives.
After being hired to teach composition, I was encouraged to use a portfolio system for my writing courses. What could be better, I thought? This would encourage (and reward) students for revising their work. Given a chance to assess their own writing, they would move from passively learning to actively participating in their own education. They could showcase their best work and have a chance to reflect on writing as a process rather than as a simple outcome. And best yet, I could see their work as a progression rather than as staccato assignments that fell during particular times during a semester. Knowing that portfolios were the standard at a number of colleges -- and in many ways, still considered "progressive" in my discipline -- I started gathering information from colleagues and industry publications to find out how to instill this process into my undergraduate courses.
After two years of teaching writing utilizing a portfolio system, I realized there were pitfalls. Some could be mitigated by a tight syllabus and clearly outlined course requirements; others seemed to cripple the outcomes that my department had deemed desirable.
First, all of my students were anxious about not knowing their in-class grade until the end of the course. In traditional writing classes, students received either a number or letter grade on each writing assignment. They could predict their final grades simply by keeping a tally of how they did on each essay and writing assignment. Faculty often listed how grades were figured at the top of each syllabus, making this even easier.
With the portfolio system, however, a large portion (sometimes as much as 75 percent) of a student's final class grade was based on the final portfolio -- often composed of four to six essays. This, of course, was turned in at the end of the semester. Students often took their final and walked away from the campus without any clear idea of how they were doing in their portfolio-based class. Faculty then graded the portfolios, figured the students' final grades, and often turned final grades into the registrar's office without administrative review. Students had no way of knowing how they did until their final grades were posted by the campus. The number of students requesting grade reviews often escalates with this system -- if only because the students feel powerless and confused by this form of "blind review."
I did everything I could to give students some sense of how they were doing during the portfolio-based semester. I set due dates for assignments and gave detailed feedback on each written work. Rubrics that showed areas for improvement may have helped students rewrite papers for their portfolios, but still gave them no tangible evidence of their grade to date. Even when students came to my office and we went over essays together, they still could not see how this information might be reflected in their class grade-to-date. I ended up wasting many precious class hours trying to reassure students about the portfolio process.
My undergraduates' constant requests to nail down their grade-to-date made me aware that the flexibility and abstract nature of the portfolio system generated absolute fear in many of them. They simply were not prepared to trust this system.
After fielding over 50 phone calls and e-mail messages from students in a state of panic about their grades two weeks before their final portfolio was due, I decided to make a change. The next semester, I initiated what I called "advisory grades." When a student handed in an assignment, I evaluated it, wrote down the grade the assignment would receive in its current state, and logged this "advisory grade" into our campus online grading software. I advised students that when they turned in their portfolios, these "advisory grades" would be eliminated. The new grade replaced the old.
Class-wide anxiety seemed to lessen because students were now able to see the grade their latest assignment had received -- and how they were doing in the class overall. Although this reduced the number of grade reviews that I suffered, it added an additional "step" in what was supposed to be a seamless venture. It also created a loophole. Students who approved of their "advisory grade" simply did not revise that assignment for the final portfolio. This, of course, negated one tremendous advantage of using the portfolio system -- the encouragement to revise.
Another concern was the responsibility of choice that we were now relegating to undergraduates. Some students saw the instruction to "pick the best four out of six" for inclusion in the portfolio as a way to avoid the most difficult and challenging work in my core classes. If my syllabus did not specifically state that all six assignments must be done, they would often only complete four. In this case, the all-important objective for students to evaluate and assess their work was now eliminated.
Even when I began stipulating that all six assignments were required, a fair number of underachievers would produce what I would consider a "token effort" for two out of the six assignments. For example, if I asked for a 10-page paper, these students would produce a one- or two-page rough draft, confident that they were going to exclude this assignment from the final portfolio.
I also noticed that students who were going to eliminate a particular work from their portfolio tended to skip classes that focused on that work; what they didn't realize is that they were missing lessons and concepts that were building to the next assignment. These students saw grades falling rather than climbing; the number of those who met me at the podium after class to complain increased. Disappointingly, these students often refused to make appointments to see me to catch up on missed work -- they only saw the holes in their education as missed chances to gain a few grade points.
The next semester I initiated a punitive attendance policy. I hated treating my undergraduates like high school students, but it was clear that the weakest students did not understand the value of a day's lesson that did not immediately translate into grade points. I also indicated in my syllabus that anything less than a full-length paper would be returned without credit. In response, my less motivated students then turned in what would look like a pre-write -- something so unformed that it could not be considered college-level work. My evaluation of these assignments was wasted time; I knew that these students would never return to these rough pieces to work through initial difficulties to master these concepts. And through the magic of the portfolio process, the poor grade that these works received was eliminated.
Next, when allowed to rework and revise only four out of six assignments, my undergraduates immediately discarded the assignments they found most difficult. It was as if the two assignments that asked the most of them did not exist. This meant that they were reworking materials whose underlying concepts they had, in essence, already mastered. Here, again, part of my curriculum was being eliminated. Students would no longer meet my course objectives with pieces and parts discarded.
When given a choice, students dropped the most challenging assignments. They may have seen this as a wise budgeting of time and effort, yet I felt as if they were making two important statements: one, my expertise in that area was not important; and two, they were telling my department that they did not value that particular outcome. In my courses, students often dropped the more difficult argumentative essay -- or more often than not, the long research paper required for the course. Yet these specific assignments were the ones that would have prepared my students most effectively for courses in other disciplines. And the painful reality was that my department's desire to be democratic was, in effect, allowing under-prepared undergraduates to dictate their own curriculum.
When it came to revision, my overachievers started reworking assignments the minute they received feedback. Yet 90 percent of my students waited until the last possible moment to revise their work. Somehow, facing four major assignments that desperately needed revision seemed to de-motivate them. In an effort to help, I encouraged students to come see me outside of class.
Each semester, I added eight or nine additional office hours a week during the last two or three weeks of class, hoping to lift my undergraduates from mediocre work. Still, I would find myself almost completely undisturbed. Here and there, an honors student would appear with a revised paper in hand, hoping to move from 90 or 95 percent to a perfect 100 percent. My other students simply did not see the value of free one-on-one tutoring with their instructor -- or they were intimidated by the portfolio system. In either case, they did not receive the help they needed to improve their work as a whole.
I finally began holding the occasional "in-class work day," placing my students in a computer lab where they could rework their papers. I "floated" from row to row, viewing their writing and making suggestions. Still, a minute or two per student did not amount to substantial feedback.
Last year, I started requiring my students to see me for a 15-minute consultation once during a critical time in the semester. Although these individual conferences proved fruitful, this short time period was not enough to look at more than one revised assignment. Students may have walked away with concrete ideas to improve an assignment; yet, unless they were tremendously motivated, their other assignments went untouched.
My expectation that students would revise all six assignments and then ask for help in choosing the best work for their portfolio was quickly revealed as a pipe dream. Even my honors students knew the value of their time. Better to spend time pursuing more grade points on the four works that "counted" than waste time on all six. Yet the idea that the students and I were going to view their work holistically was what had sold me on the use of portfolio systems. And my experience seemed to suggest that other than a few overachieving students, I was the only one doing any form of "global review."
As an active writer, I can't help but find the writing process interesting. I loved the idea of encouraging my own students to reflect on their own writing process. Maybe I secretly hoped that one undergraduate out of a hundred would suddenly see the beauty in this creative venture and change their major to English literature, rhetoric, or journalism. The one concrete assignment where I could find out more about my students' writing experience was a "letter to the instructor," which promised 10 points without regard to content. Set inside their portfolio, I hoped this 250-word note would give me the inside track to improving my course and engaging students in my next course.
Unfortunately, the majority of my students used this platform to plead for better grades. Of course, I empathized. On one occasion, I was able to intervene and suggest that a student ask for a medical deferment for the semester's work. But I could only view the work they produced -- not the stressed, and sometimes troubled, person behind it. And, of course, I was no closer to truly understanding their writing process and the obstacles they had faced in producing the body of work I demanded that semester.
A small number of my most accomplished students did take the time to review their work and seriously discuss what they saw as their strengths and weaknesses. On occasion, they complimented my teaching, thanked me for "keeping on them," or made a concrete suggestion for my course. I kept these few notes in a special file to be reviewed when I felt overwhelmed and disappointed.

I later began to suspect that the concept of only "showing your best work" was setting students up for failure. Because their worst work was eliminated, their final in-class grade was higher than normal. This source of "grade inflation" created several problems. First, the jump to other courses was even more substantial. Many students who performed well in a developmental course that used a portfolio system then did poorly in a traditionally assessed transfer-level course that followed. By midterm, some students were failing. Shocked, they would initiate grade reviews by the dozens.
Colleagues of mine who did not use a portfolio system started to view those of us who did with a critical eye. "Just what were we letting these students get away with?" they often asked each other. Although there was no official discussion of these concerns, this division did not help our already fragmented department.
There was also dissent among instructors who used portfolio-grading systems. One instructor who taught a lower-level composition course allowed students to discard 4 out of 10 major assignments. He also stipulated that these six successful works would count for 75 percent of the student's final grade. The result was that he turned in a slew of A's and B's each semester. His format looked enormously successful on paper -- yet those of us who taught his former students were in for trouble.
Even when the next course used a portfolio system as well, subtle differences in format could be devastating to students' expectations. Asking students to eliminate two assignments out of six would reflect their true abilities more closely, resulting in less "grade inflation." And with a portfolio worth 50 percent of a student's final in-class grade, there was more pressure on other parts of the course -- something these students had not yet experienced at this level. The result was often constant complaint, and in some cases, grade review. I had questions, serious questions, about this process.
The portfolio system also required more work from already overwhelmed instructors. A colleague confessed to working at a university that required him and seven other colleagues to grade over 375 portfolios (each with three essays, including outlines, pre-writes, drafts, "final" papers, and rewrites) in one afternoon. After a "norming" session, each portfolio had to be blind reviewed by at least two instructors; a third would be used in a case where a portfolio grade fluctuated more than a half grade. Although my friend felt reassured knowing how he compared to colleagues when it came to assessing student work, he dreaded this day all semester. No number of after-review drinks at a local tavern washed away fatigue and a general sense of being taken advantage of by his university.
Most departments do not install such a demanding regimen; still, the constant review of work often necessitated many more hours from faculty than those teaching classes in a more traditional format.
In graduate-level courses, I was sure that many of the obstacles I faced would be lessened or eliminated; still, my department chair had strongly encouraged me to apply these principles to my pool of undergraduates. As a contract employee, I felt compelled to do the best I could. Upon reflection, I realize that the students who did well within the portfolio format would also have succeeded in a traditional class. The students in survival mode would attempt to work the system, just as they would with any course. I did not sense that the portfolio system was a complete failure -- but I had a nagging sense of discontent about the process.
In the end, I'm most concerned that my curriculum is being negatively affected by what is considered a progressive form of assessment. In other disciplines, it seems to be applied more effectively. In graphic design courses, students are motivated to succeed in their specialty. Many of my design students worked to improve their complete body of work -- if only to have a greater number of pieces to show potential employers. Even in the fine arts, students may move into an area of concentration, but often move back to master other formats as they grow curious or bored. In both of these disciplines, students are motivated by discovery more so than grade points; therefore, the portfolio system fits well with the curriculum.
With undergraduate classes, however, a great number of students are motivated to "get the core over with" so they can go on to classes in their major. Anything that helps them scale back the amount of effort and still achieve the same grade in these bread-and-butter classes is desirable -- no matter what the effect on the curriculum. No matter how instructors struggle to hold the line, the portfolio system encourages "grade inflation" that is not only damaging to an undergraduate's academic experience, but to faculty, administrators, and to the college as a whole. This system also allows undergraduates to discard what may be tremendously important portions of the core class curriculum long before they are qualified to be making such decisions. These losses will be felt down the line in future classes, other disciplines, and even in future careers when the student is far from the university's reach.
Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.
Now, there are some strings. As the Tucson Citizen notes, "It doesn't hold if students change majors midway through college or drop or flunk several courses. A few majors, such as engineering, are excluded because some students need to take pre-college math courses that can extend graduation beyond four years." So, do it right, make no changes, make no mistakes, and you can move efficiently through the university.
As someone who has to report to my university’s provost about what we will do to get our students to graduate in four years, I am sensitive to this newest fad. It affects how our institutions will be ranked and how parents will select the perfect place for their children to study. Yet, as a five-year undergrad myself, I am not sure why this is even a good goal. Yes, our federal loan money, and our state subsidies, will go to more students if we can push them through, but that is exactly what we would be doing ... pushing. And is that what we are here to do? For that matter, is efficiency a worthwhile measure of a college? Of a student?
When I attend events to recruit new students, I rejoice in those who don't know what they want to do. They come to the experience open for adventure, exploration, excitement, and challenge. I tell them that they will probably do better than those who have their future planned out. Why? Because most students change their majors. And, at a public university like mine, students are even more likely to change their majors than their private college counterparts.
Why do students change their majors? I think it is because students have little idea about (a) what jobs exist, (b) what majors correspond with what jobs, (c) what they are good at, and (d) what course of study would best use their abilities.
Hell, when I attend college major recruitment fairs, almost all the students and their parents line up for business, pre-med, and pre-law. (Working class folks tend to go for health sciences and business, because they hear there are jobs there.) I am tempted to just hand out fliers that say, "Business majors have to take accounting and advanced math. Pre-med (and health sciences) folks have to take a LOT of science courses... with labs! When you find you don't like those courses, or you fail a few of them because you actually have no special ability in advanced math or science, come check us out!"
That is how we get our majors, for the most part: students realize that they picked a major for some bogus reason -- say, they knew someone who had X job and made a lot of money -- and discover, as they take more classes in that area, that it is not what they originally thought or that it does not suit them. Then they look for something that actually suits their interests and talents. So the parents who pushed them into their original major gnash their teeth and complain when their children have to take additional courses to meet requirements different from those of their original major, and their time is extended. Yet, while this can be more costly, it is such a bargain in the long term. Better to make the change as an undergraduate than to figure out, after earning the degree, that you are ill-suited for the professions for which you were prepared.
So, among those who don't finish in four, we first have the confused. Add to this number the students who party too much, who attend a college that doesn't suit them (that was my error), who have adjustment issues transitioning to undergraduate life, whose mental illness expresses itself during college, who have personal traumas in their lives (also my issue), whose families face financial downturns, who face discrimination or harassment, and/or who just bomb a class or two. Suddenly, our numbers look terrible! See how few students we graduate in four years!?! (And we aren't even counting the transfer students -- the year-to-degree numbers only count students who entered as freshmen. If we included those folks in our numbers, we would see how few students really graduate in four years.)
If we still have a perverse need to measure time to degree rates, we should extend the bar to six years of full-time study, as we do for athletes and for some federal reporting requirements. (Athletes are not the only ones balancing academics with other interests!) We should exclude students who move to part-time status from our count. But I would hope that we would not use these data to rate institutions.
Finish in four sends the wrong message. It says that college is simply utilitarian, a means to a financial end. We should recognize that college is not high school. It is about self-discovery, the investigation of different majors and fields, and intellectual exploration and development. Let's reject this fad and focus on the long-term goals: producing graduates who can write, read, and think critically, and who can contribute to our society.
Lesboprof is the pseudonym of a faculty member and administrator at a public university in the Midwest where the official line is that four years and out is a good thing.
Cooperative education is now more than 100 years old. The co-op approach, in which students alternate time in the classroom with professionally paid work directly related to their majors, was founded at the University of Cincinnati by Dean Herman Schneider in 1906. There are co-op programs today at 500 institutions in the United States.
The centennial marks a good time to take stock. How effective is co-op? What has been its impact on its three fundamental partners -- students, employers, and institutions of higher education? Is co-op still relevant? Still viable? What role should co-op play in 21st century education?
I see empirical evidence of co-op’s value every day at the University of Cincinnati. We have 3,800 students in 44 disciplines participating in co-op opportunities at more than 1,500 employers in 34 states and 9 foreign countries. At graduation, UC co-op students have an enviable head-start in their careers by virtue of their on-the-job work experience (an average of one-and-a-half years for UC students), marketable skills, impressive credentials, and networking connections. Many are hired immediately by the companies where they completed their co-ops.
Collectively, our co-op students earn about $35 million each year. Plain and simple, that money helps students pay for college. Moreover, if those dollars came in the form of scholarships, funding them at a typical 4 percent annual endowment payout would necessitate a university endowment totaling $875 million. In short, we would have to nearly double our endowment to support the program.
Beyond those signs of success, of course, our co-op students benefit from blending classroom learning with experience in the workforce -- applying theory to practice, as one researcher summarized it. Theirs is the ultimate school-to-work transition. And at the nexus where co-op takes place, benefits also accrue to participating employers and the sponsoring university.
We have long known of these benefits anecdotally. Over the past 20 years, a series of small studies has begun to confirm co-op's value with data. The field still needs broader studies and better longitudinal analysis, but the research conducted to date tells a remarkably consistent story. Studies show, for example, that co-op experiences help students explore career options, clarify goals, and find mentors. There is now statistical evidence that co-op motivates students to learn and study, leads to higher GPAs, and improves individual self-confidence. There is further documentation of the value of co-op in improving communications and human relations skills. That's all in addition to findings that co-op alumni receive higher salary offers than their non-co-op peers.
Studies also now confirm the benefits of co-op for employers. Co-op serves as an effective screening and selection process in the recruitment of new talent and it leads employers to workers who are typically more motivated and more productive than other recruits. Co-op also has a positive effect on employee retention and productivity.
In its “Job Outlook 2005,” the National Association of Colleges and Employers reported that employers complain continually that too many new college graduates lack maturity, don’t know how to conduct themselves in a business environment, and don’t have an appropriate work ethic. Those are skill sets that co-op students develop during their education. It’s perhaps not surprising, therefore, that estimates of the number of co-op employers -- including Fortune 500 companies, small businesses, government, and nonprofit organizations -- have jumped in recent years from 50,000 to more than 120,000. Not surprising, either, is that such organizations as the Education Commission of the States and the State Higher Education Executive Officers have called for improved postsecondary attention to the school-to-work transition, which of course is at the heart of co-op education.
Colleges and universities benefit from co-op, too. Co-op students enhance learning by infusing classroom discussions with real-world experiences -- sometimes leading faculty to reform curricula.
In 2006 the highly ranked architecture program at my university combined employer feedback with faculty observations from the classroom and resolved to focus on the enhancement of students’ building construction skills. Similarly the civil engineering program used employer feedback as well as input from their accrediting body to redesign the curriculum to enhance students’ understanding of the fundamental concepts of structural analysis.
By its inherent nature, co-op leads institutions of higher education to better relationships with business, which in turn opens new doors for fundraising and partnerships beyond co-op. Another practical benefit is in student recruitment. Pace University found that a full half of incoming students were attracted to the university by its co-op opportunities. What's more, the study showed, the student retention rate for those in the co-op program was 96 percent, compared to 52 percent for the institution as a whole. Other studies corroborate co-op's positive impact on student retention.
Co-op programs drive colleges and universities to be continually innovative in curricula and learning processes in response to employers’ needs. In fact, a study under way at the U.S. Department of Education is helping document that co-op education is emerging as one of the few educational approaches that can help curricular development keep pace with industry needs. It may be time, then, for the U.S. Congress, as it works on re-authorizing the Higher Education Act, to take a fresh look at how co-op education can help enhance college affordability and ensure the relevance of higher education in the new century.
Our neighbors to the north have the right idea. The Province of Ontario offers tax breaks of up to 15 percent for companies hiring co-op students. Tax incentives for companies employing co-op students could be the best way of increasing participation in cooperative education. Tax breaks treat all sectors of industry equally, and are less likely to skew the production of graduates toward segments without a solid employment market.
One hundred years after co-op was created at the University of Cincinnati, our Professional Practice program is leading a $1 million study that will help create the next generation of co-operative education. We’re looking for ways to link measures of student performance in co-op with corporate feedback and curricular reform. Our work is just one of a number of current efforts looking to make co-op stronger pedagogically and even more relevant -- efforts, for example, to reinforce student learning through improved self-reflection, and to link co-op more deliberately with experiential and service learning.
The co-op approach creates necessary bridges between work and learning, between liberal education and professional education, and between universities, government, and business. Moreover, co-op prepares students extraordinarily well for work -- and life -- in today’s fluid, fast-paced, and globally interdependent workplace. By the time they graduate, co-op students have a firsthand perspective on international competition, business ethics, workplace diversity, corporate cultures and more. As we prepare students for their roles in the 21st century, the benefits and attributes of co-op education have never been more relevant, or more urgently needed.
Nancy L. Zimpher
Nancy L. Zimpher is president of the University of Cincinnati. As a faculty member, Zimpher directed hundreds of student teaching experiences and recalls fondly her own initial “real world” experience -- as a student teacher.
During the summer months before I entered Harvard in the fall of 1953, I read The Education of Henry Adams (1918). His sardonic, world-weary recollection of his undergraduate years at Harvard from 1854 to 1858 was not reassuring. Harvard “taught little, and that little ill,” Adams wrote, and “the entire work of the four years could have been easily put into the work of any four months in after life.” The best he could say was that Harvard “left the mind open, free from bias, ignorant of facts, but docile.” In reflecting on his classmates, Adams dourly observed, “If any one of us had an ambition higher than that of making money; a motive better than that of expediency; a faith warmer than that of reasoning; a love purer than that of self; he has been slow to express it; still slower to urge it.” He had even fewer good words to bestow upon his teachers.
My own experience as an undergraduate 100 years later was quite different. It was indelibly marked by a number of teachers and writers who changed my life utterly and forever. They were models of the life of the mind in action. They made me want to be, if I could, precisely what they were: teachers and scholars.
Virtually from the day I entered Harvard, I wanted to be a professor. I found books intellectually exhilarating. Nothing gave me greater satisfaction than achieving a sense of mastery of the life and works of particular authors and thinkers -- not simply the sort that earns an outstanding grade on an exam but the kind that yields a rounded, nuanced appreciation. I came close to reaching this level of knowledge and insight, I thought, with Samuel Johnson, Sigmund Freud, George Bernard Shaw, and T.S. Eliot. I immersed myself in their most significant works, not once but again and again, and I read the leading works of criticism and secondary materials about them. Eventually I came to a fluent familiarity with the texture of their thought. Few intellectual efforts were more satisfying; few brought me closer to sensing the thrill of being a scholar.
My admiration for my teachers -- indeed, my wonder at how much they knew and how compellingly they wrote -- was unbounded. I thought of the words that Oliver Goldsmith used, in “The Deserted Village,” to describe the intellectual capacities of the parson: “And still they gaz’d, and still the wonder grew, / That one small head could carry all he knew.” I gazed in wonder at all that so many of my professors knew: Northrop Frye, seemingly about all literature; Perry Miller, about the New England Mind; Douglas Bush, about John Milton; Walter Jackson Bate, about Samuel Johnson and John Keats; Arthur M. Schlesinger, Jr., about American intellectual history. Harvard, in Nicholas Dawidoff’s phrase, was “a culture that served men who had spent a lifetime accumulating knowledge.” It honored men of learning, scholarship, and wisdom. I wanted dearly to become a part of that culture, wherever it might exist and however I might qualify for entry.
How, I would ask myself, had Harvard chosen its faculty members so well, especially when it had chosen them when they were so young? How did it recognize intellectual promise with such consistent perspicacity? Perhaps there was something about the capacity of Harvard to reinforce a sense of destiny that elevated the achievements of its faculty members as they matured, just as it did of many of its students.
Many histories describe the fifties as years of intellectual passivity, simplistic religiosity, and political meanness, of “the organization man” and “the man in the gray flannel suit.” And yet, because of the craft and character of the best of my teachers, I regard it as a period bursting with decidedly powerful ideas. I am astonished still by the boldness and enduring authority of many books of political and social criticism published during that decade. As undergraduates who were then coming of age intellectually, my friends and I wrestled intensely, often late into the night, with the encompassing claims of those contemporary philosophies to which our teachers introduced us: especially Freudianism, Marxism, Keynesianism, and existentialism.
Each new course was an awakening. I read books that were unconventional and pathbreaking, books with bold and synoptic themes that would change forever how we thought about the world and ourselves -- books like Isaiah Berlin’s The Hedgehog and the Fox (1953), Erik Erikson’s Childhood and Society (1950), Freud’s The Interpretation of Dreams (1899), Northrop Frye’s Anatomy of Criticism (1957), Richard Hofstadter’s The Age of Reform (1955), F.O. Matthiessen’s The Achievement of T.S. Eliot (1935), Reinhold Niebuhr’s The Children of Light and The Children of Darkness (1944), Morton White’s Social Thought in America (1948), and Edmund Wilson’s To the Finland Station (1940).
As my professors explored in their lectures the rugged intellectual terrain of these challenging books, they taught me the beauty of powerful ideas, as a liberal education should. They gave no slack. I studied the political and historical analyses of such demanding scholars as Joseph Schumpeter, George F. Kennan, Richard Hofstadter, and Louis Hartz. I devoured the works of important modern novelists: Lawrence, Conrad, and Forster, Hemingway and Gide, Malraux and Camus. No contemporary novelist overwhelmed me more than Faulkner, who was an entire universe in himself, as were the very greatest writers, like Balzac and Dickens, who came before him. I struggled with the dense, often difficult poetry of Eliot and Yeats, Stevens and Frost, Auden and cummings. And I embraced the icon-breaking plays of the modern dramatists: Ibsen, Strindberg, and Shaw, O’Neill, Williams, Miller, and Beckett.
As one who was fortunate to be an undergraduate at the time, I cannot accede to the conventional claim that these were years of intellectual and spiritual quiescence. These books and these men (my professors, in fact, were all men) made me want to be, throughout my lifetime, a reader, a learner, a teacher, a scholar.
By their loving immersion in their subjects, by the strenuous demands they made of their students, my teachers inspired me -- an anonymous student sitting in classes typically of several hundred -- to be passionate about the life of the mind. In the words of George Steiner, author of Lessons of the Masters (2003), each represented the ideal “of a true Master.” Steiner rightly adds, “The fortunate among us will have met with true Masters, be they Socrates or Emerson, Nadia Boulanger or Max Perutz.”
I yearned to become a member of their company of scholars. I hungered to write books like those they taught me so to admire. I wanted to partake of their professional way of life. What could be more thrilling or ennobling, I thought -- what could be more worthy or rewarding -- than a career as a teacher and scholar?
My naïveté about the possibility of teaching English at a good liberal arts college was brought home to me one day as I talked with a Radcliffe friend, herself the daughter of a distinguished professor. “I am wary about ever marrying an academic,” she said, “no matter how much I might love him.” I asked why, expecting that she would point perhaps to the modesty of academic salaries. “It might be that the best job he could get would be in Brunswick, Maine,” she replied. “Why would I want to spend the rest of my life in Brunswick, Maine?” She had injected realism into the conversation.
Almost all of my courses were taught in large lectures, typically of 100 to 300 students; occasionally a class would be as large as 400. Most of the professors had mastered the art of projecting to a large audience -- and this was in a period before the regular use of slides and other audiovisual aids. Some professors were accomplished orators or humorists, and some roamed the platform dramatically and with a practiced pace. Some timed their presentations to end in a grand flourish on precisely the stroke of the hour’s end. Many had established a reputation for one or two fabled lectures. Students would annually await the day of their delivery: Crane Brinton, a European historian, on the activities of Parisian prostitutes during the French Revolution; Walter Jackson Bate, a literary critic and biographer, on the death of Samuel Johnson; Arthur M. Schlesinger, Jr., a historian, on the embattled presidency of Andrew Jackson; David Owen, a historian of England, on British rule in India, always with the timeworn ditty:
At the time I majored in English, the department had reached “a high plateau,” in the description of Morton and Phyllis Keller in Making Harvard Modern (2001), and become “the most notable of Harvard’s humanities departments.” Its intellectual leader was Walter Jackson Bate. Its senior professors, all “at or near the top of their games,” included Douglas Bush, Perry Miller, Harry Levin, Alfred Harbage, John V. Kelleher, Howard Mumford Jones, and Albert J. Guerard. And this list did not include such distinguished visiting professors as Northrop Frye and F. W. Dupee, both of whom attracted large student enrollments.
The greatest of all my Harvard teachers was Walter Jackson Bate, a man of immense learning whose humane exemplification of literature as a source of moral teaching shaped me in permanent ways. During a long career at Harvard he published the magisterial biographies Samuel Johnson (1977) and John Keats (1963), both of which won the Pulitzer Prize for biography, as well as many books of criticism. Bate taught three principal courses -- The Age of Johnson, Literary Criticism (from Aristotle to Matthew Arnold), and English Literature from 1750 to 1950 -- and I took all three during my sophomore year, when he published The Achievement of Samuel Johnson (1955).
Bate was a frail, delicate man whose frame, in Nicholas Dawidoff's words, “appeared to be constructed of twigs and mist.” His lectures were conversational but deeply felt, meditations of a sort that were themselves a metaphor for his striving to achieve a perception of life’s tragic truths. He described, in tones of melancholy nostalgia, how he had come to Harvard as a 16-year-old farm boy from Indiana, without a scholarship, and been awakened to the life of the mind by the lectures of Professor Raphael Demos in Philosophy 1. He could be ironic and mischievous. He seemed congenitally sad and weary. He was the most memorable teacher I ever had.
Bate’s most celebrated course was The Age of Johnson. During the weeks of a semester, Bate led us through Johnson’s Lives of the English Poets (1779), The Vanity of Human Wishes (1749), Rasselas (1759), and his achievement as a lexicographer in the two-volume Dictionary of the English Language (1755), in which Johnson sought, as he wrote in his preface, to capture “the boundless chaos of a living speech,” as well as generous, incomparable selections from Boswell’s Life of Samuel Johnson (1791). Bate argued, again and again, that through his efforts to invest existence with meaning, Johnson had lived a life of allegory, as Keats said of Shakespeare -- “his works are the comments on it.”
Literature, for Bate, was an instrument of moral education and development. “Man was not made for literature,” he often recited, paraphrasing the Bible; “literature was made for man.” A protégé of Alfred North Whitehead, Bate sometimes repeated Whitehead’s premise that “moral education is impossible apart from the habitual vision of greatness.” For Bate, Samuel Johnson was the preeminent example of a life straining toward moral meaning and emancipation from adversity. “Life is a progress from want to want, not from enjoyment to enjoyment,” wrote Johnson. Many of the illustrations that ornamented Bate’s lectures were drawn from Johnson’s tortured efforts to overcome his own idiosyncrasies, eccentricities, irascibility, and sloth. “The great business of his life,” Bate wrote, “was to escape from himself.”
I loved Johnson’s praise of Paradise Lost, with his conclusive reservation “None ever wished it longer,” and his famous observation that a second marriage represented the “triumph of hope over experience.” I admired the angry candor in his condemnation of self-righteous patriotism as “the last refuge of a scoundrel” and his weary observation “No man but a blockhead ever wrote except for money.” I took delight in his blunt rebuke of Lord Chesterfield: “Is not a patron, my Lord, one who looks with unconcern on a man struggling for life in the water and when he has reached ground encumbers him with help? The notice which you have been pleased to take of my labours, had it been early, had been kind, but it has been delayed till I am indifferent and cannot enjoy it, till I am solitary and cannot impart it, till I am known and do not want it.”
Bate especially admired the paradoxical reversals that lit up Johnson’s prose. For example, Johnson once dismissed a book as “both good and original, but that which was good in it was not original, and that which was original was not good.” He admired, too, Johnson’s psychological insight. Johnson once wrote, “So few of the hours of life are filled up with objects adequate to the mind of man ... that we are forced to have recourse every moment to the past and future for supplemental satisfactions.”
Among undergraduates, Bate was especially known for involuntarily losing his composure every year -- some thought he was reduced to tears -- in describing the death of Johnson. He seemed as genuinely moved by Johnson’s death as he would have been by the death of a beloved contemporary. It was one of Harvard’s most famous performances. In another course I took from Bate, he was equally moved in describing John Keats’s deathbed wish that there be no name upon his grave, no epitaph, only the words, “Here lies one whose name was writ in water.”
Under Bate’s gentle guidance, I came to love the literature of the 18th century: the periodic prose of Burke, the wit and irony of Gibbon, the rhyming couplets of Pope. I have ever since been able to recite from memory a certain amount of 18th-century poetry, especially those poignant lines from “The Deserted Village” by Goldsmith: “Ill fares the land, to hastening ills a prey / Where wealth accumulates, and men decay.”
In a memorial minute adopted after Bate’s death by the Harvard faculty of arts and sciences, his colleagues wrote that he “gave his students what he said Johnson had given so many, the greatest gift that any human can give another, the gift of hope: that human nature can overcome its frailties and follies and, in the face of ignorance and illness, can through courage still carve out something lasting and worthwhile, even something astonishing, something that will act as a support and friend to succeeding generations.”
Another teacher whom I greatly admired was Albert J. Guerard, a professor of English and comparative literature, who taught a brilliant course entitled Forms of the Modern Novel. Three mornings a week he lectured to a class of more than three hundred students on novels from Flaubert’s Madame Bovary (1857), Zola’s Germinal (1885), and Hardy’s Jude the Obscure (1895) across the first half of the 20th century to Camus’s The Plague (1948) and Faulkner’s Light in August (1932). In between -- it was an all-star list -- he assigned Gide’s The Immoralist (1902), Conrad’s Heart of Darkness (1902) and Lord Jim (1900), Joyce’s Portrait of the Artist as a Young Man (1916), and Greene’s The Power and the Glory (1940), among others. (I do not recall any reference at the time to the argument that Chinua Achebe would later make in “An Image of Africa” (1977); there Achebe denied that “a novel that depersonalizes a portion of the human race” -- he was referring to Heart of Darkness -- “can be called a great work of art.”)
Typically he covered a novel in an hour, always with luminous clarity and insight, as he introduced us to such critical themes as moral ambiguity and latent homosexuality. His great theme was the moral power of literature: “The greatest writers take us beyond our common sense and selective inattention, even to paradoxical sympathy with the lost and the damned -- take us, that is, to the recognition of humanity in its most hidden places.”
Professor Guerard was not only a novelist and critic; he was also a teacher of writing. Many of his students established themselves as novelists. One student of whom he was particularly proud was John Hawkes, whose experimental novel The Cannibal (1949) he warmly recommended.
Guerard reveled in the beauty of a novel’s first and last sentences. He loved the way in which first sentences -- like Melville’s “Call me Ishmael” in Moby-Dick (1851) -- can set a tone for a novel’s primary mission. He especially admired opening sentences that invited a sense of intimacy, like “This is the saddest story I have ever heard,” in The Good Soldier (1915) by Ford Madox Ford, or suggested a sense of quiet mystery, like “The past is a foreign country: they do things differently there,” in The Go-Between (1953) by L. P. Hartley. (My father loved first lines, too. His favorite was from Scaramouche (1921) by Rafael Sabatini: “He was born with a gift of laughter and a sense that the world was mad.” He loved the reckless, romantic sweep of that language. Another favorite was the haunting opening sentence of Rebecca (1938) by Daphne du Maurier: “Last night I dreamt I went to Manderley again.”)
I recalled from my high school reading the famous opening sentence of A Tale of Two Cities (1859) by Charles Dickens, “It was the best of times, it was the worst of times...,” and that of Pride and Prejudice (1813) by Jane Austen, which confidently asserts, “It is a truth universally acknowledged, that a single man in possession of a good fortune, must be in want of a wife.” As I became conscious of the tone-setting capacity of opening sentences, I looked in my reading for new examples. One that I admired appears in The Heart Is a Lonely Hunter (1940) by Carson McCullers: “In the town there were two mutes, and they were always together.” Another appears in The Stranger (1942) by Albert Camus: “Mother died today.”
There are, of course, any number of further examples. One of the best in European literature is “Happy families are all alike; every unhappy family is unhappy in its own way,” in Anna Karenina (1877) by Leo Tolstoy. One of the best in American literature is “You don’t know about me without you have read a book by the name of The Adventures of Tom Sawyer; but that ain’t no matter,” in The Adventures of Huckleberry Finn (1884) by Mark Twain.
I especially remember a poetic lecture that Professor Guerard delivered on the importance of a novel’s final sentences. Most endings, he said, were melodramatic or tired when they should have conveyed a cadenced finality. Few approached the quiet beauty of the last line of James Joyce’s “The Dead” -- “His soul swooned slowly as he heard the snow falling faintly through the universe and faintly falling, like the descent of their last end, upon all the living and the dead” -- or the lyric passion of the soliloquy of Molly Bloom that concludes Joyce’s Ulysses (1922): “... and then he asked me would I say yes to say yes my mountain flower and first I put my arms around him yes and drew him down to me so he could feel my breasts all perfume yes and his heart was going like mad and yes I said yes I will Yes.” Few were as philosophically effective as the last line of F. Scott Fitzgerald’s The Great Gatsby (1925): “So we beat on, boats against the current, borne back ceaselessly into the past.” One of my favorite endings is that of The Sun Also Rises (1926) by Ernest Hemingway. “Oh, Jake,” Brett said, “we could have had such a damned good time together.” Jake responds, “Isn’t it pretty to think so.”
Professor Guerard was especially good at analyzing dialogue. The conversation that appears in fiction, he said, with the experience of one who had published several novels himself, is quite different from the conversation of everyday life. A tape recorder can capture the way most people actually speak -- in false starts, circuitous detours, and garrulous and prolix meanderings, with pointless and irrelevant insertions. But no matter how accurate the transcript, such a rendering will seem stilted in print. Fiction, by contrast, must use dialogue to achieve artificially the illusion of a reality that is more richly authentic and convincing than a tape-recorded transcript could ever be. For a conversation to seem natural to the reader, he said, the author must shape it, omitting the repetitive, relying on the telling phrase and the pivotal word. In the end, “nature requires the sculpting hand of art in order to appear in literature as nature.” The lesson was as true for the ornate, drawing room conversation of Henry James as it was for the terse, telegraphic dialogue of Ernest Hemingway.
Guerard spoke in a measured, husky voice. He was a mesmerizing lecturer, a dignified man of magnetic warmth. When the entire class spontaneously applauded a lecture he delivered early in the term, Guerard expressed his appreciation at the start of the next lecture but asked that we thereafter refrain from applause. He feared that he would think his lecture fell short on all of those more usual occasions when the class did not applaud.
During his lectures on Light in August (1932), Guerard described his only meeting with William Faulkner, a meeting at which Faulkner insisted that he was a self-educated Mississippian who had never finished high school and had little to contribute to a conversation about literature. His only ambition, he had written to Malcolm Cowley, was “to be, as a private individual, abolished and voided from history, leaving it markless, no refuse save the printed books.” Guerard mentioned three books that he regularly taught: Notes from Underground (1864) by Dostoevsky, The Secret Sharer (1912) by Joseph Conrad, and The Plague (1948) by Camus. As it happened, Faulkner had a remarkably exact knowledge of all three. After that response, Guerard did not ask Faulkner about his obvious indebtedness to the cadences of the King James Bible and the plays of Shakespeare. Faulkner often asserted that he had never read Freud. “Neither did Shakespeare,” he told a Paris Review interviewer in 1956. “I doubt if Melville did either and I’m sure Moby-Dick didn’t.”
When the term ended, I sent a letter of appreciation to Professor Guerard, explaining that I intended to become a professor of English. I was thrilled to receive a reply. “There are many rewards to teaching,” he wrote, “but receiving such letters [as yours] is certainly one of the most satisfying. I’m a poor giver of advice, but would be glad to talk with you if you think I could be of any help.”
Several years later, in 1961, Guerard decamped for Stanford in a move that shocked Harvard: no one ever left Harvard. By the time of his death, he had published nine novels, six books of literary criticism, and a memoir. Among the subjects of his critical books were Conrad, Hardy, Gide, Dickens, Dostoevsky, and Faulkner.
Forty years after my graduation, when The New York Times reported my impending retirement as president of Dartmouth, Guerard sent me a beautiful letter. “It was a pleasure to see your beaming, youthful face,” he wrote. “You seem much too young to retire. On the other hand I think you can look forward to the reading of many books and perhaps writing one or two. At 83 I’m still at it.” And then, recalling a conversation that we had more than 10 years earlier when he had been my guest for dinner at the President’s House at the University of Iowa, he added, “I have had many fine and famous people in my classes but you were the only one able to recite the reading list years after taking my course.” Few letters have ever gratified me more.
Even as I admired Harvard professors like Guerard, I was intimidated by the prospect of emulating them. I shuddered at the lifelong burden of reading that a career choice to become a professor of English would entail. I thought of the frustration of Eugene Gant, Thomas Wolfe’s protagonist in Of Time and the River (1935), who as a college student “would prowl the stacks of the library at night, pulling books out of a thousand shelves and reading them like a madman. The thought of those vast stacks of books would drive him mad: the more he read, the less he seemed to know -- the greater the number of books he read, the greater the immense uncountable number of those which he could never read would seem to be.”
“How,” I asked my father, “does Professor Guerard find the time to reread each year the novels that he is teaching, keep up with the scholarly literature, and read all of the new novels published in this country and Europe?”
“Don’t you suppose he enjoys it?” my father replied.
I also admired Northrop Frye, a compact, bespectacled man with a booming voice, who was a visiting professor from the University of Toronto. He had made his critical reputation a decade earlier with a book on Blake, Fearful Symmetry (1947). He lectured with a strong assurance and an unusual clarity. As an ordained minister in the United Church of Canada, he commanded both the Bible and the works of Shakespeare. Now he was about to publish one of his masterworks, Anatomy of Criticism (1957), which presented a complete worldview -- a coherent framework, comprising tragedy, comedy, and romance, in which all novels, poems, and plays had interconnected places. “Poetry can only be made out of other poems,” he wrote, “novels out of other novels.” His theory took the Bible as the mythological substructure of Western culture. All human thought, Frye argued, was shaped by that substructure. Anatomy was an elucidation of how an archetypal and mythological reading could illuminate all of literature. When I bought Anatomy at the Mandrake Book Store, the proprietor, comparing Frye’s volume to a current national best-seller, said, “This is our Auntie Mame.”
The qualities that most distinguished Frye were the breadth of his learning and the Euclidean clarity of his lectures. He seemed to be familiar with the whole of literary output; he was the furthest from a period specialist that one could be. Frye pushed creative imagination to the limits. He admired the ways in which certain lines encapsulated thoughts with a near-perfect economy of words. Shakespeare, of course, was more adept at achieving this masterful concision of thought than any other writer. His plays abound with pertinent examples, of which the supreme instance is “To be or not to be.”
Frye could be devastating on literary trendiness. “The literary chit-chat which makes the reputations of poets boom and crash in an imaginary stock exchange is pseudo-criticism,” he wrote in Anatomy of Criticism. “That wealthy investor Mr. Eliot, after dumping Milton on the market, is now buying him again; Donne has probably reached his peak and will begin to taper off; Tennyson may be in for a slight flutter but the Shelley stocks are still bearish.”
Still another impressive professor was the American historian Arthur M. Schlesinger Jr. He was simply a wunderkind -- a brilliant intellect, a compelling writer, a scholar of breathtaking learning. The son of a distinguished Harvard historian, Schlesinger had had a meteoric career. The honors thesis that he had written as a senior had been published a year later as Orestes A. Brownson: A Pilgrim’s Progress (1939), and his work as a junior fellow at Harvard had resulted in The Age of Jackson (1945), for which he received the Pulitzer Prize for history at the age of 28. Shortly thereafter, Schlesinger was appointed an associate professor of history with tenure, to the surprise of some historians who believed that he had drawn forced historical parallels between the politics of Jackson’s administration and that of Franklin D. Roosevelt. (Only later did I read the seminal work on that tendency, The Whig Interpretation of History (1931) by Herbert Butterfield.) Schlesinger had a near-adulatory admiration for Roosevelt who, he believed, had preserved capitalism from itself by introducing governmental regulation of its harshest features. During the term that I took Schlesinger’s course, he was completing The Crisis of the Old Order, 1919–1933 (1957), the first volume of his history The Age of Roosevelt.
Schlesinger was not only exceptionally skilled in dismantling the theories of others; he also was richly imaginative in building theories of his own. Many of Harvard’s courses in American history sought to define a national identity by emphasizing a narrative of accommodation and progress: consensus over conflict, the absence of a landed aristocracy, the liberating presence of the frontier, the opportunities for upward mobility, and the constant presence of renewal and rebirth. For Schlesinger, American history had been a series of conflicts between the forces of wealth and privilege and those of the poor and underprivileged -- what George Bancroft had called “the house of Have and the house of Want.” In an important passage in The Age of Jackson, Schlesinger wrote:
American history has been marked by recurrent conservatism and liberalism. During the periods of inaction, unsolved social problems pile up till the demand for reform becomes overwhelming. Then a liberal government comes to power, the dam breaks and a flood of change sweeps away a great deal in a short time. After 15 or 20 years the liberal impulse is exhausted, the day of consolidation and inaction arrives, and conservatism once again expresses the mood of the country, but generally in the terms of the liberalism it displaces.
Schlesinger was an admirer of Herbert Croly’s The Promise of American Life (1909), which argued for a strong central government to address the problem of growing inequality. In Schlesinger’s reading, American history had been an “enduring struggle between the business community and the rest of society.” That struggle, in turn, was “the guarantee of freedom in a liberal capitalist state.” The goal of a pragmatic liberalism, perhaps ironically, was to prevent the capitalists from destroying capitalism. For that reason, he championed what he called “the vital center” where compromise and experimentation could devise practical solutions to democratic problems.
One of Schlesinger’s central domestic themes was that the New Deal had solved the problems of quantitative liberalism, and that the next decades -- starting with the sixties -- would be dominated politically and socially by issues of qualitative liberalism. In contrasting the old “quantitative liberalism” with the new “qualitative liberalism,” Schlesinger wrote: “Today we dwell in the economy of abundance -- and our spiritual malaise seems greater than before. As a nation, the richer we grow, the more tense, insecure, and unhappy we seem to become. Yet too much of our liberal thought is still mired in the issues, the attitudes, and the rallying cries of the 1930’s.” The concern of liberalism in the next decades, he believed, should be “the quality of civilization to which our nation aspires in an age of ever-increasing abundance and leisure.”
The new liberalism that Schlesinger envisioned presumably would emphasize such quality-of-life issues as civil rights, racial justice, employment discrimination, capital punishment, the availability of health care, religious toleration, gender equity, fair housing, educational opportunity, and environmental protection. Ironically, some of the qualitative issues -- perhaps they are best called cultural issues -- that came to the fore in the next several decades, such as abortion, gun control, school prayer, and welfare reform, had a distinctively conservative tenor. They cast doubt on the consensus theory of American development and illustrated Pieter Geyl’s observation that “history is argument without end.”
Schlesinger was fascinated by the American presidency. Following in the footsteps of his father, he organized polls of historians to rank the presidents. In the poll conducted during my student days, six presidents were adjudged to be great: Washington, Jefferson, Jackson, Lincoln, Wilson, and Franklin D. Roosevelt. Perhaps gratifying to Schlesinger, Truman was ranked near great, in the company of Polk, Cleveland, and Theodore Roosevelt. The ranking complemented Schlesinger’s thesis that periods of liberal and conservative ascendancy alternated in 30-year cycles.
At the podium, Schlesinger, always sporting a bow tie and often a bold-striped shirt, was an impressive presence. His mind was both agile and deep. His lectures were incisive, meticulously prepared, and polished. Never was a word out of place, a sentence left uncompleted. His course on American intellectual history was riveting -- the largest in the History Department (and that was a department that included Samuel Eliot Morison, John K. Fairbank, Frederick Merk, Crane Brinton, Charles H. Taylor, Myron Gilmore, David Owen, and Edwin Reischauer). He was as penetrating in discussing the sociology of William Graham Sumner and Walter Rauschenbusch as he was shrewd in analyzing the political machinations of Andrew Jackson and Franklin D. Roosevelt.
Because of his aplomb as a lecturer, I was surprised to read in the first volume of his autobiography, A Life in the Twentieth Century (2000), that Schlesinger felt great trepidation at the lectern:
I never quite escaped the imposter complex, the fear that I would one day be found out. My knowledge was by some standards considerable, but it was outweighed by my awareness of my ignorance. I always saw myself skating over thin ice. The imposter complex had its value. It created a great reluctance, for example, to impose my views on students.
Few professorial examples of intellectual humility impressed me as much as that of a colleague of Schlesinger’s, Professor Frederick Merk, a compelling lecturer who traced, with an unsurpassed skill, the westward movement, the role of the frontier, and the spirit of manifest destiny in American history. His lectures were clear, crisp, and witty. Students had affectionately named his most popular course “Wagon Wheels.” Near the end of the first term in his survey course on American history, Professor Merk announced that he did not know enough about the causes of the Civil War to lecture on it and that he had therefore asked Schlesinger to substitute for him in delivering the next four lectures. How many professors ever set their standards of intellectual humility so high?
My tutor during my junior and senior years was Professor John V. Kelleher, one of the world’s foremost scholars of Irish literature and culture, especially of the twentieth century. He held a chair in Celtic studies established by a Boston Brahmin expressly to promote understanding between the Yankee and Irish-American cultures.
Once a week I would thread my path through the Widener Library stacks for my tutorial hour with him. During the course of the two years, we read our way diligently through much of the poetry of Edmund Spenser (especially The Faerie Queene and “Epithalamion”) and John Donne. But the true lessons of these tutorial sessions lay not in the poetry itself, but in the conversations we had about the poetry. When Professor Kelleher read a poem aloud, my understanding of it grew. He taught me how to discover more and still more in the coded arrangement of words in poetic lines and stanzas. I was in awe of him.
Kelleher was a shy and modest man and a dedicated scholar. Crowned with a great shock of pure-white hair, he came from a blue-collar family in the mill city of Lawrence, Massachusetts. A graduate of Dartmouth College, he began his academic career as a junior fellow at Harvard and was appointed to the faculty without a Ph.D. Although he spoke with a severe stammer, he read poetry aloud in a deep and sonorous voice and with a lilting fluency, without any trace of a speech impediment. When he recited Spenser, his Irish accent captured the sound of Elizabethan English, he told me. He charmed me with his self-deprecating manner -- he was one of the most modest men I have ever known -- and his vivid recollections of many of the great Irish figures he had met: Maud Gonne, Jack B. Yeats, Frank O’Connor, Sean O’Faolain, and Samuel Beckett.
He loved to talk, too, about the gradual transformation that was occurring in Irish-American society -- a subject I had observed at an ethnic distance but that he knew at first hand. He saw the transformation, as he later wrote in an essay on his friend Edwin O’Connor, as a “rapid demise” characterized by “the rise of the funeral home and the destruction of the wake; the death of the old people, the last links with that vanished mid-nineteenth-century Ireland from which we were all originally recruited; the disappearance of the genial, uncomplimentary nicknames; and finally, the lack of any continuing force, like discrimination, or afterward the resentment of remembered discrimination, strong enough to hold the society together from without or within. Whatever happened, there came a time when nobody felt very Irish anymore, or had much reason to. By the late 1940s that society was practically all gone.”
Professor Kelleher probably understood the works of James Joyce (and Yeats, too) as deeply as anyone in the world. His copies of Ulysses (1922) and Finnegans Wake (1939) were extensively annotated and interlined with his comments on those often-baffling texts; they obviously constituted documents of exceptional critical brilliance. His favorite work of Joyce, however, was A Portrait of the Artist as a Young Man (1916). In an essay he gave me to read in typescript -- it later appeared as “The Perceptions of James Joyce” in the Atlantic Monthly (March 1958) -- he wrote, “I remember that when I first encountered Stephen Dedalus, I was twenty and wondered how Joyce could have known so much about me.”
One afternoon, as I was planning my course schedule for the next year, Professor Kelleher surprised me by saying that it probably did not make sense to take a course in Shakespeare. “No one can truly teach Shakespeare,” he said. “If you want to appreciate Shakespeare, you simply have to sit down and read him yourself, over and over again.”
Once I graduated from Harvard, I did not see Professor Kelleher again for thirty-two years, until he attended his 50th class reunion at Dartmouth in 1989. I was in the midst of my speech to his reunion luncheon -- at least 400 members of the class and their wives were packed into the room -- when I spotted him standing alone at a rear corner. His full head of pure-white hair was still a beacon. As soon as the lunch was over, I wove my way through the crowd, excited to greet him. “President Freedman,” he exclaimed, as we laughed in joyous reunion. For the first and only time, I corrected him: it was still okay to call me Jim.
One of Harvard’s most notable professors in the fifties was Perry Miller, who had returned from the war in 1946 as one of the university’s first professors of American literature. His wartime exploits as an OSS officer were well known; according to local legend, he had kept an Irish mistress, announced his intention to kill as many Nazis as he could, and accompanied the French war hero General Jacques Philippe Leclerc when the Free French forces liberated Alsace. Who knew whether any of this was true?
Upon his return, Miller began to offer his famous course, Romanticism in American Literature, concentrating on Cooper, Emerson, Hawthorne, Melville, and Thoreau. A year later, Miller offered one of the first courses in the new General Education program, Classics of the Christian Tradition. Miller went on to become an important intellectual and cultural historian, a leading exponent of Puritan thought, a gifted and exhaustive scholar with an unquenchable interest in theology, philosophy, and the history of ideas. He sought to capture what he referred to in Errand into the Wilderness (1956) as the “massive narrative of the movement of European culture into the vacant wilderness of America.” His work on the theological progression from seventeenth-century Puritanism to nineteenth-century Unitarianism was penetrating and original.
The two-volume The New England Mind (1939, 1953) that made his reputation had been published by the time I entered Harvard. So had his biography Jonathan Edwards (1949), with its evocation of the Great Awakening and its striking analysis of the role that Newton’s physics and Locke’s psychology had played in the formation of Edwards’s thought, and his anthology The Transcendentalists (1950). Miller published several other volumes while I was an undergraduate, including Errand into the Wilderness and The Raven and the Whale (1956), a study of Poe and Melville.
I took Miller’s survey course in American literature, which covered ground from Anne Bradstreet and Edward Taylor to John Steinbeck and William Faulkner. Miller’s teaching style was compelling. He was a man of physical gusto and intellectual enthusiasm. When he read from Jonathan Edwards’s famous sermon “Sinners in the Hands of an Angry God,” he fairly bellowed the preacher’s theme of eternal damnation in a fire of wrath.
In his book Exemplary Elders (1990), David Levin, a Harvard student in the years immediately after the war, recollected Miller’s teaching authority: “Miller’s great skill as a teacher was exemplary rather than sympathetically imaginative. He had a brilliantly intuitive mind, an extraordinary ability to find the heart of a seventeenth-, eighteenth-, or nineteenth-century text. That gift, and the art of dramatizing intellectual history so that young students who had virtually no knowledge of theology would see both the passion and the intellectual complexity in the debates of narrow Puritans or corpse-cold Unitarians, made him a priceless teacher.”
Once, when our teaching fellow was ill, Miller conducted our section of 15 students. Shifting uneasily in his chair, he told us that this was the first time in his entire career that he had ever taught a section of undergraduates. He virtually implored us to participate voluntarily so that he could get through the experience. Miller died much too early -- in 1963, at the age of 58.
Douglas Bush was another professor whom I greatly admired. He was a quiet man, modest and understated, but his vast knowledge of literature and his deferential demeanor made a deep impression on me. I took his course on Milton and have always regretted that I did not take his course on the Victorian novel. Bush had made his reputation with a magisterial book, English Literature in the Earlier Seventeenth Century, 1600–1660 (1952). He went on to display his critical virtuosity in more than a dozen other books, including studies of Jane Austen, Matthew Arnold, and John Keats.
(Nothing better illustrated his catholicity of taste than his unsuccessful efforts in nominating Edmund Wilson and Robert Frost for the Nobel Prize in Literature.)
Professor Bush’s method of teaching Milton was to read the poetry to the class, quietly, patiently, line by line, pausing every several lines to comment on their meaning, historical allusions, classical references or echoes, and events in Milton’s life. Often it appeared that he was reciting from memory, rather than reading. Once, when the classroom lights suddenly went out, he immediately recited an apt passage from Paradise Lost: “More safe I sing with mortal voice . . . / In darkness, and with dangers compass’d round.”
Under the tutelage of Professor Bush, I came to admire the power and beauty of Milton. I reveled in the lyrical reach of his metered lines. I loved “Lycidas” (“Fame is no plant that grows on mortal soil”) and the sonnets, especially “On His Blindness,” with its canonical line “They also serve who only stand and wait,” which John Berryman called “the greatest sonnet in the language.” I also admired Milton’s prose, especially Areopagitica, his argument against censorship, with its stirring rhetorical assertion “Who ever knew truth put to the worse in a free and open encounter?” Each year Bush asked his students to memorize 20 lines from Milton for the final exam. I took an easy path, choosing the opening passage of the short poem “L’Allegro,” a poem that Helen Vendler counts as “Milton’s first triumph,” and to this day I can recite that energetic passage on command: “Haste thee, Nymph and bring with thee / Jest, and youthful Jollity....”
During my undergraduate years, I had many opportunities to hear poets and novelists read their work. The occasion I remember most indelibly was related to Professor Bush -- a reading on May 29, 1955, by T. S. Eliot, who appeared in Sanders Theatre under the auspices of the Advocate, Harvard’s undergraduate literary magazine. Eliot had written for the Advocate as an undergraduate and now was helping the magazine to raise money. Because I had competed unsuccessfully for membership on the Advocate, I felt a special sense of yearning that evening, a desire to identify with this Harvard graduate who was perhaps the most significant living poet and critic.
After being introduced by Archibald MacLeish, poet, playwright, and Harvard professor, Eliot rose to speak. “I don’t think most people know or realize how important an undergraduate literary magazine can be at so critical a time in a young writer’s development,” he said. “It meant not only encouragement and companionship, but very salutary discouragement and criticism.” He went on to say that he wished that he had intended all the obscure classical references and complex layers of symbolism that scholars and teachers were “discovering” in his work and attributing to his scholarship.
And then he added a word of homage to Professor Bush. In a number of early essays, Eliot had downgraded Milton’s stature as an English poet. “While it must be admitted that Milton is a very great poet indeed,” he wrote in 1936, “it is something of a puzzle to decide in what his greatness consists. On analysis, the marks against him appear both more numerous and more significant than the marks to his credit.” In his celebrated rejection of Paradise Lost, Eliot wrote, “So far as I perceive anything, it is a glimpse of a theology that I find in large part repellent, expressed through a mythology that would have been better left in the Book of Genesis, upon which Milton has not improved.”
Now, Eliot announced, Professor Bush had since persuaded him that Milton must indeed be ranked among the great English poets. I was stunned by the significance of that statement. It was, of course, a tribute to Professor Bush. But even more important, it was a confession of a critical mistake. Eliot’s confession of error was an epiphany; it brought the audience into the intimacy of a writer secure enough, generous enough, to admit his fallibility.
Professor Bush was an indomitable proponent of the humanities. He thought them more essential to a liberal education than the social sciences or the natural sciences; they were, he said, “the most basic of the three great bodies of knowledge and thought.”
With firm conviction as well as a fearful pessimism, Bush once wrote, “We may indeed reach a point in our new Dark Age -- at moments one may wonder if we have not reached it already -- where the literary creations of saner and nobler ages can no longer be assimilated or even dimly apprehended, where man has fulfilled his destiny as a mindless, heartless, will-less node. Meanwhile, no scientific problem is anywhere near so urgent as the preservation of individual man and his humane faculties and heritage.” I have always cherished the passion of his conviction.
When I took Economics 1 with Professor Seymour Harris during my sophomore year, the subject had not yet become a mathematical, model-building discipline. The basic textbook -- an early edition of the classic work by Paul Samuelson -- emphasized macroeconomic activity: the role of government in fostering aggregate demand and stabilizing the economy, managing the business cycle, correcting misallocations and market failures, and providing public goods. It covered basic neo-Keynesian topics of the mid–20th century, like supply and demand, business cycles, patterns of saving and spending, the pump-priming role of government, and the indeterminate influence of the imponderables that constitute consumer behavior.
It was in this course that I was introduced to one of the most engaging books about economists ever written, The Worldly Philosophers (1953) by Robert L. Heilbroner. The course’s intellectual heroes were Joseph Schumpeter, who highlighted the “perennial gale of creative destruction” at the heart of competitive markets, and John Maynard Keynes, the most influential economist of the century, whose emphasis on government spending to stimulate the economy animated the New Deal. Schumpeter, who taught at Harvard from 1932 until his death in 1950, emphasized the disruptive role of innovation and technological change in a competitive economy. His most famous book, Capitalism, Socialism, and Democracy (1942), was essential reading.
When it came time to write a term paper, I asked my section man if I might write on The Road to Serfdom (1944) by Friedrich A. Hayek, the Austrian economist who had studied with Ludwig von Mises and was perhaps the leading intellectual opponent of Keynesian orthodoxy. Although Hayek was a classical liberal, his book argued the conservative theme that the logic of the European welfare state implied the erosion of personal freedoms. He feared the results of central planning and social engineering; he admired individualism and the economic outcomes of unfettered markets. “Hayek?” my section man responded quizzically. “He is completely out of step with current thinking.” He expressed his disdain for Hayek’s so-called inevitability thesis: that if a nation experiments with intervention in the economy, it will eventually end up as a totalitarian state. He concluded, “I don’t see that there’s much you can do with that book.” And so I renewed the search for a paper topic. (Twenty years later, in 1974, Hayek was awarded the Nobel Prize in Economic Science.)
Having enjoyed Economics 1, I ventured into an advanced course in economics and political thought, taught by O.H. Taylor, a sad, shy man who led the seminar-size class with great gentleness through the work of the important theorists of the state and economic activity: Smith, Ricardo, Locke, Hume, Marx, Weber, and Veblen. Of all these thinkers, I was most intrigued by Weber and his argument, in The Protestant Ethic and the Spirit of Capitalism (1904), that Calvinist religious beliefs provided the economic basis of capitalism. Perhaps Taylor knew already that the place of this philosophical course in the economics curriculum would soon be doomed, at Harvard and elsewhere, by the increasingly econometric and empirical tendencies of the discipline.
Robert G. McCloskey, a political scientist, was a distinguished expert and fluent lecturer on the Supreme Court. His course was a stimulating review of the Court’s jurisprudence, emphasizing the historical forces that shaped the direction of the decisions of the Court. From him I first glimpsed something of the grandeur of public law. He especially emphasized the political alertness of the Court and the way in which it had historically tended to follow or confirm public opinion rather than challenge it. “[P]ublic concurrence sets an outer boundary for judicial policy making,” McCloskey wrote. “[J]udicial ideas of the good society can never be too far removed from the popular ideas.” In the 19th and 20th centuries, the Supreme Court occasionally challenged public opinion, often to its chagrin (as in the cases finding New Deal legislation unconstitutional), sometimes to its glory (as in Brown v. Board of Education, holding segregated public schools unconstitutional). Indeed, McCloskey emphasized the value to the Court, as a deliberative institution, of having one or more former elected officials (governors and senators) among its members.
During my sophomore year I took Edwin Honig’s course in creative writing, English C. I learned, to my grim disappointment, that I was not meant to be a writer of fiction. Honig was a poet, and he gave each of his 15 students detailed personal attention. He was a calming influence on his often tense, anxious students, never seeming to tire of reading endless manuscripts on the familiar subjects of first love, sexual initiation, and generational conflict. Honig appreciated that a teacher cannot teach students to write, but that he could, by wise and gentle criticism, teach them to improve their writing.
I wrote a number of short stories for the course, all of them wooden and unimaginative, obvious and predictable in their plotting. As was his weekly practice, Honig read one of my stories anonymously to the class for criticism; I squirmed in the hope that my classmates would not recognize it as mine, even though it was the best of the impoverished lot that I wrote for the course.
I admired Honig -- he was a humane man, tall, craggy, shy in demeanor, halting in speech -- and I read most of his books of poetry as well as his critical book on the Spanish poet Federico Garcia Lorca. From him, I learned that a writer must not only have a versatile command of language, he must also have something to say. Novels must have themes and make points; the best writers are thinkers. As a fledgling writer, I had a thin imagination and was bereft of striking ideas. I had no conception of what I wanted to say. I concluded that I did not have the creative qualities of a writer....
For all my admiration for my Harvard teachers, as a student I never met or had a conversation with any of them, with the exception of Professor Kelleher. After completing a lecture, most professors hurried from the podium as quickly as they could, well before any student could come forward to ask a question. The Harvard system of undergraduate education was not conducive to faculty-student interaction. Professors did not hold office hours for undergraduates, and they rarely took meals or attended social events at Lowell House. They were apparently too busy or important to spend time with students. The system was designed to ensure that students’ moments of personal discourse were with the teaching fellows who taught our sections, not with members of the faculty.
A few professors, inevitably, were terrible lecturers, and I wondered why the quality of their teaching was not better. The issue usually was not substance but style. James Bryant Conant, Harvard’s former president, once quoted Edward Gibbon on Greek scholars in the 10th century: “[The teachers of the day] held in their lifeless hands the riches of their fathers without inheriting the spirit which had created and improved that sacred patrimony.”
Many professors didn’t seem to care about the organization or fluency of their presentations. Occasionally some seemed unprepared. Lecturing to a large audience was, I believed, an art that could be improved by instruction and practice -- wasn’t that what Dale Carnegie purported to do? -- and I assumed that professors themselves would find satisfaction in perfecting their lecturing styles.
Every faculty undoubtedly has its share of opinionated, self-centered teachers like Miss Jean Brodie, whose unorthodox prime is chronicled in Muriel Spark’s novel. For all her fervent dedication to her students, Miss Jean Brodie was a self-deluded admirer of fascist regimes who abused her position of authority. But my worst professors were not especially opinionated or self-centered -- merely dull. Were all of my teachers models of intellectual power and pedagogical clarity, let alone of moral stature and common sense? Surely, they were not, although I was probably too inexperienced -- or too dazzled by Harvard’s reputation -- to appreciate that.
Despite these limitations, I admired beyond measure the wisest, most learned members of the faculty and have been forever grateful for the models of the life of the mind that they provided me. From them I learned, as George Steiner wrote in Lessons of the Masters (2003), “There is no craft more privileged.... To awaken in another human being powers, dreams beyond one’s own; to induce in others a love for that which one loves; to make of one’s inward present their future: this is a three-fold adventure like no other.”
James O. Freedman
This essay is an excerpt from Finding the Words, an autobiography by James O. Freedman of the first 27 years of his life. Freedman served as president of Dartmouth College and the University of Iowa and was the author of Idealism and Liberal Education (University of Michigan Press) and Liberal Education and the Public Interest (University of Iowa Press). Freedman died of cancer last year, weeks before Finding the Words was to move into production at Princeton University Press. The press -- working with two of Freedman's friends, Stanley N. Katz of Princeton University and Howard Gardner of Harvard University -- finished the book, which has just been released. This excerpt is printed with permission of the Princeton University Press.
By the conclusion of Secretary of Education Margaret Spellings' recently convened Test of Leadership Summit on Higher Education, I finally understood why her proposals are so ... well, so ill-conceived. They rest on a faulty metaphor: the belief that education is essentially like manufacturing. High school students are "your raw material," as Rhode Island Gov. Donald Carcieri told us. We need "more productive delivery models," economies of scale, even something called "process redesign strategies." Underlying everything is the belief that business does things right, higher education does things wrong, and a crisis is almost upon us, best symbolized by that coming tsunami of Chinese and Indian scientists we hear so much about. Time for higher ed to shape up and adopt the wisdom of business.
But the whole metaphor is wrong. Education is nothing like business, especially not like manufacturing. Consider the Spellings Summit's faulty assumptions:
1. "If it isn't measured, it isn't happening." This slogan we heard in formal talks and casual conversations. Therefore more testing, more reporting, more oversight, as Spellings is proposing, should improve colleges and universities. The one certain result of the Spellings initiatives will be a mountain of new reporting by colleges and universities, funneled to the Federal government via accreditors. Without formal assessment, this view holds, nobody learns anything.
But for human beings, it's obviously wrong: unmeasured good things happen all the time. Left alone, a 5-year-old will explore, discover, and learn. So will a 20-year-old. They get up in the morning and do things, for at least a good part of the day, whether anyone watches and measures them or not. Many people read even if they aren't forced to. The professor does nothing; the student learns anyway. Medical doctors live by the dictum Primum non nocere: first, do no harm. Sometimes the best treatment is to leave the person alone. That's because -- unlike steel girders -- students are living creatures. (We'll return to this point.)
2. Motivation is simple. "Rewards drive behavior," said several speakers with no more thought on the matter, moving easily to the use of money to guide institutions. Students and professors alike were considered to be easily directed. If tests are "high stakes," students will automatically want to do well, and if colleges as a whole do poorly, they should just be punished. Nowhere did the Spellings Commission report, or the "action plan" presented at the summit, consider that students might not like standardized tests, that administrators find report-writing onerous, or that professors could resent the nationalization of educational goals -- and quit teaching altogether. Coercion, it is believed, is a simple and effective method for directing people. After all, if you put a steel girder on a flatcar, it will stay there until moved. And if you melt a steel girder at 4,000 degrees F., it almost never gets angry and storms out of the room or broods.
Consider one of the immediate results of No Child Left Behind, the resignation of hundreds of fourth-grade teachers. Coercion costs; people will try to avoid it. They'll quit their job, for instance. They'll get angry and sulk in the back of the room. "Getting tough" is not the answer.
3. Clearly stated goals at the outset are a prerequisite for success. In machining, or the production of microchips, precise specifications, measured to the nanometer, are necessary. Everything must be planned, laid out in advance, then rationally carried through to completion. As several speakers said, "We all know what needs to be done," as if that were a simple thing.
But in fact, serendipity -- the occurrence of happy, if unpredicted, outcomes -- seems to have no place in this scheme. The great Peter Drucker recognized that in business, unplanned outcomes can be better than planned outcomes. Post-it Notes and Viagra, for instance, were not the intended outcomes of anyone's plan; they were huge successes.
People set their own (often conflicting) goals; they resist coercion; they often surprise us. Admittedly, that makes working with them (healing them, leading them to salvation, encouraging their curiosity) a messy process. But I've seen no evidence that business people are better at it than educators.
Daniel F. Chambliss
Daniel F. Chambliss is chair of the sociology department at Hamilton College and director of the Project for Assessment of Liberal Arts Education. He is the author of Champions: The Making of Olympic Swimmers and Beyond Caring: Hospitals, Nurses and the Social Organization of Ethics.
How often have we heard, “People with talent and ideas are America’s greatest resource”? And yet, while colleges and universities have as their primary goal the delivery of top-quality academic programs, few take full advantage of the talents available from the retired professionals in their communities to help meet this goal.
In most university and college communities there is a growing pool of talented retired or transitioning individuals who would like nothing more than to make a difference by using their knowledge and experience to improve their communities and institutions while continuing the process of their own personal development.
Added to this resource is the emerging wave of boomers who will not be retiring in the traditional way. They will be reinventing themselves as they enter new careers and develop new active roles of service. These will be professionals from a wide variety of fields (education, health, government, the arts, business and nonprofit management, science, engineering, and the military) who have the energy, interest, and ability to continue as active, contributing members of society for a longer period of time than any preceding generation. Each year, thousands of highly trained individuals are added to this growing but under-utilized pool of talent.
Unfortunately, few colleges and universities have made any formal attempt to develop a successful working relationship between the institution and this exciting and capable source of talent. Relationships have been more a matter of chance than conscious planning.
Most existing arrangements focus on the use of retired faculty living in the area, or local professionals, serving as part-time faculty to meet a very specific and unmet instructional need. For many retired individuals, this form of relationship is inappropriate, of little interest, or impractical, since they may be available for periods of time that do not mesh with the academic calendar. The question then becomes how best to take advantage of a more diverse group of individuals to improve the quality of our institutions.
There is a wide range of possible options for involving transitioning or fully retired persons in the day-to-day operation of every institution. These alternatives have the potential not only to be extremely beneficial to a college or university and to the community, but also to significantly improve the personal well-being of those who offer their services. The institution, the community, and the volunteer can all gain from this relationship.
Using the Talent
In addition to teaching a course for credit, other services that these individuals can provide are:
Professional Expertise: Building on their backgrounds, they can serve as guest lecturers, members of panels, or special advisers to students working on team projects. In addition, they can be tutors for students who enter courses with special needs or mentors to those students who would like assistance as they address advanced topics in greater depth. The challenge here for faculty is finding the right person or persons, with the right set of competencies, who will be able to mesh with the instructional sequence that is planned.
Life Experiences: One area of possible service that is often overlooked is the ability of these individuals to bring to the classroom a perspective that may have little or nothing to do with their professional fields of expertise. For example, in every community there are individuals who lived through the Depression of the early 1930s, served in the military in WWII or the wars that followed, survived the Holocaust or other major genocides, faced religious or racial intolerance, were active in the Civil Rights Movement, experienced the challenges of moving to the United States from another country, or spent parts of their careers working overseas. In each instance, their participation can add a unique dimension to any class studying these periods or subjects. Bringing experts in music, art, or theater into a discussion of a particular period of time or social movement, or inviting natives of other countries to discuss the culture and attitudes of different societies, can add a texture to a discussion that is otherwise impossible. The key, once again, is the creative use of these various talents within the context of courses and programs.
In nontraditional settings: As more institutions view the out-of-classroom environment as a vital element of the academic and learning experience, these individuals can be used as guest resident counselors, club advisers, program consultants, discussion leaders, etc. Not only can they add a vital element of reality that is so often missing in such activities but, in many cases, they may be available to students at times and in places when most faculty are not.
Adding another dimension: There is one additional use of these citizens that, while rarely taken advantage of, can be of significant benefit to the entire institution. Recent research on how people think has shown that as people mature they become what has been called “transformative” or “critical” thinkers, willing and able to question assumptions, beliefs and traditions. With their extensive backgrounds, these individuals have the potential of adding a unique element to a classroom and the campus. These mature and experienced people can help both students and institutional leaders make plans for the future and address new and often unique challenges.
There are a number of existing programs that can provide details on various approaches. As institutions and communities are different, so are the options. Every program reflects the unique culture of the sponsoring institution; they are not cut from any cookie cutter.
The Elderhostel Institute Network is a central office providing information and resources for Institutes for Learning in Retirement (ILR) in the United States, Canada and Bermuda. Elderhostel and OLLI programs (the Osher Lifelong Learning Institutes) provide a core of talented retired individuals. In many other countries these programs are known as Universities of the Third Age (U3A). See this Web site for a complete listing:
In the U.S. there are four interesting programs that reflect this diversity:
The Plato Society, at the University of California at Los Angeles, is a good example of an active program in a complex multipurpose university, with excellent outreach in the community.
The North Carolina Center for Creative Retirement is part of an extensive research program in adult learning issues. The outreach and variety of programs it offers has become a major force in drawing early retirees to this region of the country.
The Academy of Senior Professionals at Eckerd College is one of the earliest and most comprehensive programs in the U.S. In a single day, members may be advising students, participating in formal on-campus class activities, attending peer-led sessions for members on drama, studio arts, computer technology, science and society, the classics, magic, music, and current events, or offering a public forum on “The Politics of Identity in a Global Context.” Members with scientific backgrounds have, at the request of government officials, conducted a major study of water resources in the region, while others played a key role in designing a leadership training program for implementing change in school districts that was funded by a major community foundation in the area. In the course of a year, 28 forums and lecture series in archeology and musicology were given by members for the general public. Members served on many nonprofit boards and government agencies and played an active role in Elderhostel programs offered at the institution. An annual publication includes creative research and writings by members. Working with Eckerd College, the academy also serves as the sponsor and source of coaches for the college’s award-winning student participation in the annual national Ethics Bowl. Members have been requested to serve in about 100 classrooms as either “faculty colleagues” or “resource” persons. In addition, one member, a retired diplomat, funded an endowed scholarship in International Affairs, and the members contributed about $750,000 to renovate the building in which they meet, which was once the college president’s home.
Civic Ventures provides a portal through which active seniors can make a difference in society. While not necessarily related to a college or university, many of the Civic Ventures approaches can easily be applied to other programs.
The first challenge that institutions face is establishing a process to locate the individuals with the needed talents and willingness to participate; educate faculty and administrators about the potential use of this group; and make the match between needs and opportunities.
Most significantly, this relationship between the college or university and the community cannot be left to chance. It needs to be planned, communicated and perceived as an integral element in the mission of the institution. Fortunately, the costs involved are modest and the benefits will far outweigh the time, energy and the dollars required. Some key suggestions:
The first step is establishing an office to facilitate the program. While, in time, it has the potential of bringing financial resources to the institution, the program should be located in the office of academic affairs and not under development. Avoid any hint of second-class academic status in the initial design. It is vital that priorities be placed in three distinct areas: 1) the immediate and long-term needs of the institution; 2) the intellectual needs of the volunteers; and 3) the future needs of the community.
Provide some appropriate title (Fellows) with academic privileges such as access to library, research facilities and parking. While most volunteers would not expect to be paid for their services, some formal program of recognition and appreciation should be established.
Draw up an initial list of potential recruits from among distinguished professionals in fields related to your institution’s curriculum, strengths and needs, and in other fields of importance to the well-being of the community. It is important that this group be as diversified as possible and not dominated by any one profession or group.
Get faculty, administrative and community involvement from the beginning. Establish a high quality advisory board with representatives from all three categories.
Provide adequate space for meetings and for growth. The space can serve multiple purposes, but transitioning professionals require a “place” as a surrogate office where they can work, meet and network with colleagues. Since parking will be essential, a location near, but not necessarily on, campus is most important.
Provide funding and staff for the initial year or two. If the group is successfully meeting the needs of its members it will become self-sufficient in a relatively short period of time.
Create some simple, but formal, organizational structure through bylaws that will give the group an identity and a formal relationship to the office of Academic Affairs. Normally the group itself will be involved in this process during the first year of organization.
This program, if developed with care, has the potential of generating far more benefits to the institution, the individual volunteers and to the community than is immediately apparent. For example, in addition to their instructionally related functions, such a group might serve as:
Ambassadors of the school in the community (volunteers are more credible than paid employees).
A core think-tank, with sub-groups, on a wide variety of issues, and commissioned by community groups for special studies and tasks.
A source of potential research colleagues and collaborators for faculty.
The resource bank for speakers, consultants, etc.
The energy source and place from which professionals develop their own talents, form new professional relationships and spin off new enterprises.
A special “advisory” group for senior institutional officers and sounding board for testing new ideas, evaluation and planning.
A talent bank from which the community can draw pro bono professional services to benefit the nonprofit infrastructure and municipal government.
A Final Word of Caution
Working with talented and dedicated people is always challenging and rewarding for everyone involved. Therefore it is crucial in programs of this type that both the faculty members and resource persons keep their focus on the objectives of improving the quality of the academic experience for students, the well-being of the community and the health of the institution. If this primary goal is not clearly articulated from the beginning, some faculty and administrators may perceive this relationship as an attempt by experienced “outsiders” to take over the classroom or program. The potential for significant impact and a delightful personal experience for faculty, students, administrators and the resource persons is there. The key is to keep focusing on the mission of working together toward a common goal.
Robert M. Diamond and Merle F. Allshouse
Merle F. Allshouse was director of the Academy of Senior Professionals at Eckerd College from 1994-2002. He has been president of Bloomfield College, vice president of the University of Colorado Foundation, and a professor of philosophy and religion and associate academic dean at Dickinson College. He is a Fellow of the Florida Studies Program at the University of South Florida. Robert M. Diamond is president of the National Academy for Academic Leadership and professor emeritus at Syracuse University. His publications include Designing and Assessing Courses and Curricula. He has held joint administrative and faculty positions at Syracuse University, SUNY Fredonia, the University of Miami and San Jose State University.
“HITS WITH THE APPROXIMATE FORCE AND EFFECT OF ELECTROSHOCK THERAPY” raved Roger Kimball’s review in The New York Times, as quoted on the paperback jacket of Allan Bloom’s The Closing of the American Mind, a surprise best-seller in 1987 and the opening salvo in a ceaseless conservative war against the academic and cultural left. On the 20th anniversary of The Closing, and 15 years after Bloom’s death, the most salient issues concerning Bloom are his role in neoconservative Republican circles and his semi-closeted homosexuality, possibly culminating -- as in Saul Bellow’s thinly fictionalized account in Ravelstein -- in death from AIDS.
In Bloom's introductory chapter to his 1990 collection of essays Giants and Dwarfs, titled "Western Civ," previously published in Commentary, he responded to the reception of The Closing as a conservative tract by claiming that he was neither a conservative ("my teachers--Socrates, Machiavelli, Rousseau, and Nietzsche -- could hardly be called conservatives") nor a liberal, "although the preservation of liberal society is of central concern to me." He saw himself, rather, as an impartial Socratic philosopher, above political engagement or "attachment to a party" and denying, against leftist theory, that "the mind itself must be dominated by the spirit of party."
A close re-reading of his books, however, confirms that they are lofty-sounding ideological rationalizations for the policies of the Republican Party from Ronald Reagan to George W. Bush.
Bloom rages against the movements of the 60s -- campus protest, black power, feminism, affirmative action, and the counterculture -- while glossing over every injustice in American society and foreign policy (he scarcely mentions the Vietnam War).
Bloom’s personal affiliations further belied his boast of being above “attachment to a party” and captivity to “the spirit of party.” Today these statements appear to go beyond coyness into the kind of hypocrisy that has become boilerplate for conservative scholars, journalists, and organizations like the American Council of Trustees and Alumni or National Association of Scholars, whose leaders vaunt their dedication to intellectual disinterestedness while acting as propagandists for the Republican Party and its satellite political foundations. The magazine in which Bloom made these boasts, Commentary, and its then-editor Norman Podhoretz, were prime examples of this hypocrisy. Podhoretz proclaimed in his 1979 book Breaking Ranks about Commentary, “I could say that the reason for our effectiveness [against the New Left’s alleged subordination of intellectual integrity to political partisanship] was a high literary standard.” But in the 80s he turned Commentary into a fan mag for President Reagan and in 1991 commissioned David Brock, in his self-confessed “right-wing hit man” days, to write an encomium to the intellectual gravitas of Vice President Dan Quayle.
For years, Bloom was co-director of the John M. Olin Center for Inquiry into the Theory and Practice of Democracy at the University of Chicago, which received millions from the John M. Olin Foundation. That foundation, whose president was William E. Simon, multimillionaire financier and Secretary of the Treasury under President Ford, at its peak spent some $55 million a year on grants "intended to strengthen the economic, political, and cultural institutions upon which ... private enterprise is based."
William Kristol wrote a rave review of The Closing in The Wall Street Journal (where his father was on the editorial board), which is also quoted on the paperback jacket; he was at the time Vice President Quayle's chief of staff, and is now editor of Rupert Murdoch’s Weekly Standard. Kimball, the Times reviewer, was an editor of The New Criterion, and yet another Olin beneficiary. (So much for the Times’ fabled vetting of reviewers for conflicts of interest.) The supposedly liberal mainstream media have been complicit at worst, silent at best, in these conflicts of interest concerning Bloom and other conservative culture warriors, as in failing to consider how much the success of The Closing was attributable to Republican-front publicity channels. Yet conservatives have the chutzpah to accuse liberal academics and journalists of cronyism!
More significant for today is Bloom’s influence as mediator between the ideas of his mentor at the University of Chicago, Leo Strauss, and what has become known as the “neoconservative cabal” of Straussians behind the Iraq War in the administration of George W. Bush. Paul Wolfowitz was Bloom’s student; Bellow’s Ravelstein says of Wolfowitz’s fictitious counterpart, “It’s only a matter of time before Phil Gorman has cabinet rank, and a damn good thing for the country.” Ravelstein depicts Ravelstein’s apartment as a high-tech communications center with a Wolfowitz-like disciple in Washington and other movers and shakers in international affairs during the Reagan and first Bush administrations, including the Gulf War -- in which Ravelstein and his protégés (few of whose real-life counterparts ever served in the military) privately condemn President Bush for a failure of nerve in not taking Baghdad and toppling Saddam Hussein.
Avowed Straussians invoke Strauss’s key ideas such as the defense of the manly, militaristic exercise of power by nation-states acting for virtuous ends, the running of government by a behind-the-scenes intellectual elite serving as advisors to the ostensible rulers, and the elite’s use of “noble lies” extolling patriotism, war, religion, and family values, to manipulate the ignorant masses into supporting pursuit of tough-minded realpolitik. My sense, however, is that Strauss’s, and Bloom’s, high-minded philosophical formulations of these ideas have simply been vulgarized by Republicans as a pretext for unprincipled American imperialism, hypocritical manipulation of their conservative base’s faith in “moral values,” opportunism by would-be Machiavellian advisors-to-the-Prince, and the kind of tawdry autocracy, secretiveness, venality, and lying that came to mark George W. Bush’s administration. Bellow portrays Ravelstein reveling in the money, celebrity, and influence in Republican politics that ironically resulted from his best-selling book decrying such vulgar distractions from the life of the mind. He is thrilled at being feted by President Reagan and Prime Minister Thatcher, as Bloom was. The implicit moral is that intellectuals, whether of the left or right, who aspire to be the erudite power behind the throne typically end up groveling before it.
Even more anomalously, in the past decade or so, conservative attacks on liberal academics -- whom Bloom and others like Podhoretz earlier accused of betraying scholarly non-partisanship and intellectual standards -- have taken a turn toward ever-more-stridently populist, partisan derision of scholarship and intellect altogether. I made this point in a column here last year with reference to David Horowitz inciting know-nothing Republican state legislators like Larry Mumper of Ohio and Stacey Campfield of Tennessee to government interference with academic freedom under the banner of The Academic Bill of Rights. (All of the bloviating conservative commentators on my column evaded the issue of conservative flip-flopping between elitist and ad populum lines of argument.)
Horowitz, Ann Coulter, Bill O’Reilly, and Rush Limbaugh have presented themselves as champions of the rights of the ordinary people against the latté-sipping “cultural elitists” in university faculties, media, and politics itself -- as in the belittling of John Kerry in 2004 as French-looking, or the derision of Al Gore’s scholarly demeanor. Thus Bloom continues to be revered by conservatives, without their registering the bothersome fact that he advocated precisely the kind of cultural elitism that they now savage. (Never mind his atheism and preference for ancient Greek over Judeo-Christian culture, or his tidbits of gay lingo, as in his half-admiring description in The Closing of Mick Jagger, “male and female, heterosexual and homosexual ... tarting it up on the stage.”)
All these contradictions in Bloom’s texts and life are an exemplary case of the long-running schizophrenia of American conservatism in what resembles the old Good Cop-Bad Cop routine -- Good Cop intellectuals who lay claim to aristocratic traditions and high moral or academic/intellectual standards, and Bad Cop philistines who are the public face of conservatism, in presidents from Coolidge to Reagan to Bush II, in vulgarian billionaires like Rupert Murdoch and Richard Mellon Scaife, along with the corporations and their executives whose pursuit of ever-increasing profits debases culture to the level of the lowest common denominator of taste. The baddest of the Bad Cops are rabble-rousing enforcers like Coulter, O’Reilly, Limbaugh, and Horowitz. It is the utter failure of Bloom and other conservative intellectuals to dissociate themselves from or even acknowledge the vulgar variety of conservatism, that ultimately exposes the hypocrisy of their lofty ideals and their selective indignation against every variety of liberal/leftist villains.
The same compartmentalized thinking that enables highbrow conservatives to champion Straussian-Bloomian elitism yet not speak out against philistine conservatism has enabled them to evade the issue of Bloom’s homosexuality, particularly in regard to Bellow’s Ravelstein. Bellow avowed that his Ravelstein was modeled on Bloom in virtually every detail. The novel’s narrator, a Bellow near-double and Ravelstein’s best friend, repeatedly insists that Ravelstein designated him as memorialist and instructed him to tell the unvarnished truth. That Bloom, like Ravelstein, was homosexual and notoriously misogynistic is undisputed. In an article on Ravelstein in The New York Times Magazine, D.T. Max quoted Wolfowitz saying that in Bloom’s Chicago circle when he was alive, “‘It was sort of, Don’t ask, don’t tell.’” But whether Bloom had AIDS is disputed. Bellow’s narrator explicitly describes Ravelstein having the symptoms and medical treatment for HIV. After galley proofs circulated, Max reports, pressure was put on Bellow to revise, and he backed down to the extent of telling Max, “‘I don’t know that [Bloom] died of AIDS, really. It was just my impression that he may have.’” Yet Bellow subsequently made only minor revisions in the passages about HIV for the final book.
Furthermore, both Bloom and Ravelstein had a young, long-term male companion, whom they held in high regard and made their heir. In the galleys, they are said to be lovers, but in the published version Ravelstein “would sometimes lower his voice in speaking of Nikki, to say that there was no intimacy between them. ‘More father and son.’” Ravelstein also “disapproved of queer antics and of what he called ‘faggot behavior.’” Yet Bellow’s narrator also says Ravelstein “was doomed to die because of his irregular sexual ways.” Ravelstein himself says, “I’m fatally polluted. I think a lot about those pretty boys in Paris. If they catch the disease, they go back to their mothers, who care for them.’’
A rather vague sequence in which the dying Ravelstein says he still obtains sexual relief from “kids,” and asks the narrator to write a check for an unidentified one, is apparently a censored version of a passage in the galleys that, according to Christopher Hitchens in The Nation, read as follows:
Even toward the end Ravelstein was still cruising. It turned out that he went to gay bars.
One day he said to me, “Chick, I need a check drawn. It’s not a lot. Five hundred bucks.”
“Why can’t you write it yourself?”
“I want to avoid trouble with Nikki. He’d see it on the check-stub.”
“All right. How do you want it drawn?”
“Make it out to Eulace Harms.”
“That’s how the kid spells it. Pronounced Ulysee.”
There was no need to ask Ravelstein to explain. Harms was a boy he had brought home one night. . . . Eulace was the handsome little boy who had wandered about his apartment in the nude, physically so elegant. “No older than sixteen. Very well built...."
I wanted to ask, what did the kid do or offer that was worth five hundred dollars....
James Atlas’s biography of Bellow confirms Hitchens’ account of the galleys and adds, “On one occasion [Ravelstein] recruits a black youth from the neighborhood to satisfy him, insisting that he practices ‘safe sex.’” The racial factor here is disturbing, especially since the passage immediately follows a scornful comment by Ravelstein about the South Chicago “ghetto.” It also highlights the absence of any mention in The Closing of the vast black “neighborhood” that surrounds Bloom’s idyllic University of Chicago.
Atlas’s Bellow biography, published shortly after Ravelstein in 2000, contains only a few pages about the novel tacked on at the end, which cite Max and Hitchens but add little to their accounts of Bloom, other than the above sentence and another saying, “A frequenter of the sex emporiums of North Clark Street, Bloom confessed to Edward Shils that he ‘couldn’t keep away from boys.’”
Now, both at the time of Bloom’s death, when the obituaries labeled his cause of death a combination of bleeding ulcers and liver failure, and subsequently, the issues of Bloom’s predatory homosexuality and AIDS have been evaded by Bloom’s allies. Bellow’s own accounts are quite confusing. After proclaiming that Ravelstein was a true-to-life homage to his great friend Bloom, it would seem malicious beyond belief for him to have fabricated out of whole cloth a character “destroyed by his reckless sex habits,” whom he knew everyone would identify as Bloom -- especially if, as he later claimed in the Max interview, he really didn’t know the truth.
These questions might not be worth dwelling on if Bloom and his book had not been canonized by social conservatives and Straussians, both of whom anathematize homosexuality and sexual promiscuity of all kinds. If Bloom’s private life was indeed louche, doesn’t that render suspect the encomiums in The Closing to Platonic love, especially in The Symposium, and Bloom’s denunciation of modern sexual license? And might the stonewalling by Bloom’s allies be an instance of the Straussian “noble lie”? (In his Commentary review of Ravelstein, Podhoretz, who elsewhere rages against homosexual promiscuous “buggery” and pederasty, displayed his infallible double standard toward leftist adversaries versus rightist allies in ignoring the more tawdry details to give Bloom dispensation for keeping his homosexuality discreetly closeted, and in accepting at face value Bellow’s late disclaimer about Bloom having AIDS.) Indeed, the Bloom case might be paradigmatic of neoconservatives’ predisposition toward dissembling and covering up vices in their own ranks that belies their exacting of moral rectitude from everyone else.