Another semester, more syllabi. Was there ever a semester when I really just strolled in on the first day, scribbled the texts on the blackboard, mentioned the number of tests, asked if there were any questions, and then proceeded to take up most of the hour discoursing on the nature of the subject of the course? This is how I began semesters over 30 years ago.
No more. Now it is as conceivable to begin a course without a syllabus as it would be to begin by telling a racial joke. The syllabus demonstrates that departmental guidelines will be followed. The syllabus assures that the catalogue description will be honored. A course description? On the syllabus. Course outcomes? Listed. What about attendance policies? Allowance for students with "special needs"? Methods of assessment? All on the syllabus, along with bonuses such as, for example, a statement about fostering student growth if the course is part of a core curriculum or a statement about academic integrity. Some syllabi even include a description of material and tasks for every single day of the ensuing semester.
How can we explain why such excruciatingly detailed syllabi are now mandatory for each course? Simple: to defend against legal challenges by students -- most obviously concerning grades but finally encompassing any conceivable matter having to do with evaluation. Consequently, a professor faces opening day before students like a defense attorney preparing an opening statement to the jury.
But why have so many syllabi swelled to such length? The existence of syllabi as legal documents might explain why they have come into requisite being in the first place. It does not wholly explain why they have become encrusted with such details as the instructor's cell-phone number, the new assistant dean's office number, or links to all manner of Web sites.
It seems to me that we have become unsure about what not to put on syllabi because we have become unsure what a course is. It is no longer self-contained. My behavior decades ago on opening day was so carefree as to seem irresponsible today. It was as if the course were mine and mine alone. Of course it was not. For starters, it was the department's. But I felt as if the course were mine, if only because there were no assistant deans to whom any students had recourse if they flunked the midterm, and there were no e-mails to remind me to turn in two copies of each of my syllabi to the department secretary.
Today the more syllabus-heavy a course, I would argue, the more context-dependent. The course is now viewed as part of a department, the department is part of a program, the program is part of a division, the division is part of an institution, and so on. So when a syllabus details criteria for grading or methods of instruction today, it is not merely about the course anymore. The syllabus is burdened with a definition of a course so expanded that the very existence of an individual instructor threatens to become effaced.
The various imperatives that govern the disposition of any one course are far more decisive. Indeed, one consequence of these imperatives is to recast the teacher as an "instructor" rather than as a "professor." In fact, the instructor of any one course is likely to be an adjunct, since upwards of half of the college-level courses taught throughout the United States at the present time are taught by adjuncts. This fact alone provides much of the reason why syllabi have become so important.
Adjuncts are marginal to departments, by definition. No wonder they are expected to produce handsome syllabi, through which they publicly demonstrate -- to themselves, as well as to their departments, their institutions, their professions or even their states-- their fealty to the sovereign wholes that authorize them to appear before students in the first place.
No wonder also, though, that many make use of what space they have on the syllabus to embellish it further, with everything from idiosyncratic stylistic riffs on the course description or more minute calibrations of the grading scale to explorations of nuances concerning class attendance. Some measure of authority, not to say self-respect, is thereby gained. How much depends upon the individual instructor, through whom, like the director of a play, the directives of the syllabus still remain to be performed.
But the result may nonetheless emerge ill-timed or poorly acted. I recently heard the following story. A young adjunct was teaching his first course. If he was not sure of himself, he was sure of his syllabus, until one day a student missed a test. When she appeared at the next scheduled class, he confidently declared: "You missed the test. You can't retake it. See the syllabus." "I did," the student replied, "and it says that I can take the test if I have a written explanation. Here's the explanation."
She presented a piece of paper, with a flimsy excuse she had written by herself. "I meant a doctor's excuse," protested the young adjunct. "Well," countered the student, "that's not clear from the syllabus."
The adjunct had to admit it was not. So he relented when the student threatened to "go straight to the dean," and agreed to give the test to the student that day. But she refused, insisting that she could only take the test the next day, at 7 a.m. Then the adjunct refused. The two compromised: 8 a.m. He should not have been surprised when the student failed to show up. I never learned the rest of the story.
Among many possible morals, let me emphasize one: a syllabus is not a script. As a legal document, it may backfire. As a pedagogic statement, it will be incomplete. The forces that surround syllabi -- ranging from deans down the hall to mandates from the state capitol -- are now too powerful. Not only can they not be resisted, but in many cases, they cannot even be determined, until the semester begins. There is a distinct sense in which the most detailed syllabi, whether by design or not, act to defer the beginning of the semester to a timeless moment, when all is fresh and new, the curtain is ever about to rise, and everybody is on the same page.
Who has not dreamt of such a moment? Sad to have to admit that the dream is vain. Any syllabus is fated to yield to the messy circumstances of its course, with results that cannot be predicted. This is reason enough to be against syllabi; their presentation of a course as a fully reasoned, systematically organized thing is spurious. A course that is only its syllabus, day after day, is a course where spontaneity, improvisation, and risk have been banished. The loss is too great.
Syllabi always put me in mind of that celebrated notion of Jorge Luis Borges, about the map that has grown so ambitious and comprehensive that it is finally stretched to cover the earth completely. The map and its land are one. No matter, in contrast, that the syllabus and its course can never quite be one. We -- students and instructors both -- ought to oppose syllabi because of the presumption they express as well as the legalism they confirm. A map is not necessary for every destination. Some of the most memorable ones result from just getting lost.
Terry Caesar is an adjunct professor at San Antonio College. He is the author or co-editor of seven books, including three on academic life, the most recent being Traveling through the Boondocks.
Last week, Ohio became the latest state where legislators introduced an "Academic Bill of Rights for Higher Education."
The bill seeks to impose on all private and public colleges and universities an administrative code allegedly designed to prohibit political and religious discrimination. It calls on the institutions to guarantee student access "to a broad range of serious scholarly opinion" and expose them to "a plurality of serious scholarly methodologies and perspectives." It insists that students "be graded solely on the basis of their reasoned answers" and prohibits discrimination on the basis of "political, ideological, or religious beliefs." Faculty members would be forbidden from using their classrooms "for the purpose of political, ideological, religious, or antireligious indoctrination"; and they would be barred from "persistently introducing controversial matter into the classroom ... that has no relation to their subject of study and that serves no legitimate pedagogical purpose." The bill extends its dubious protections to all student organizations, to the hiring and promotion process, and even to "professional societies formed to advance knowledge within an area of research."
I have to guess that the vast majority of college faculty and administrators find this legislation baffling. Surely most honor the ideals of impartiality in dealing with students as part of the air we breathe; it goes without saying that these principles are the foundation of the university. So, at least here in Ohio, we're scratching our heads and wondering why the State Senate should be wasting its time considering legislation to fix something that isn't broken and correct a problem that doesn't exist.
But, of course, the oh-so-neutral language of the bill only hides its profoundly ideological purpose. The Ohio bill is just a knockoff of David Horowitz's war against higher education. Here, at least, there is no doubting the motives behind the bill. One of its main sponsors, his quotes crying out for placement in a Sinclair Lewis novel, told the Columbus Dispatch that the bill was necessary because "80% or so of [college faculty] are Democrats, liberals, or socialists or card-carrying Communists." When asked for evidence that these radicals were corrupting "young minds that haven't had a chance to form their own opinions," as he described college students, the senator contended that, after months of investigation, he had heard of a student who claimed to have been discriminated against because she supported Bush. One second-hand rumor is all he had after three months? His standards of evidence wouldn't get him through one of my introductory American Civ classes.
Given the intellectual dishonesty behind the bill, it is only reasonable to wonder what political forces are lurking behind it and whose agenda it is fulfilling. Horowitz long has found his calling in attacking the academic left, and he was prodded to obsession several years ago when some of his attempts to place ads opposing slavery reparations in various college newspapers were rebuffed. These incidents led to the establishment of the Students for Academic Freedom, a remake of the '60s-era Young Americans for Freedom that now claims 135 chapters. Spurred on through the heated atmosphere of the presidential election, Horowitz's now-organized obsession is finding sympathetic support among right-wing radicals in the various state wings of the Republican Party. Apparently, now that they can't attack John Kerry or gay marriage, the right-wing media machine and its followers in state governments have trained their sights on a next-most favored whipping boy, the university professor.
As parts of a larger ideological war, the Ohio bill is the political equivalent of a frat boy prank. It can do no good. It can do considerable harm, but only in the unlikely possibility that responsible people take it seriously. Any amateur can look at the bill as it stands and see what a sloppy piece of work it is. Nowhere does it define what constitutes "a plurality of serious scholarly methodologies," how "indoctrination" is to be measured, or how discrimination is to be detected.
When a Dispatch reporter asked the bill's sponsor what constituted "controversial matter" to be barred from the classroom, he didn't exactly narrow things down: "Religion and politics, those are the main things." There goes any discussion of Thomas Jefferson in my history classes, or Martin Luther King or -- well, pretty much any discussion of anything. The bill discriminates because it applies only to "humanities, the social sciences, and the arts," and leaves, thereby, those card-carrying Communists in business departments free to continue denouncing the evils of compound interest. And yet it is simultaneously so broad that the state's Bible colleges would have to shut down entirely. If this bill passed, we would either have to ignore it completely or stop teaching.
The sloppiness may well be intentional, since the goal isn't good law but political intimidation. The most plausible outcome is that the bill will die a quick but noisy death: After hearings in which radical right-wingers get headlines by blasting academics, college presidents pledge to promote fairness and the bill dies. Meanwhile, red-baiting students will get the not-surprising impression that they can level charges against any professor who makes the slightest polemical point, or, more important, who utters a disconcerting truth. Students who aren't satisfied with an administrative response are likely to sue. The university will waste precious money in either administrative or legal costs, and any atmosphere of robust and critical thought that now exists will dissipate as many instructors take the line of least resistance.
Not the least curiosity here is that the very same people who, 10 years ago, ridiculed the campus speech codes as "political correctness" now want to impose the most extreme sorts of speech codes through force of law and outrageous intimidation. The very people who howled about the debunking of the great Western traditions of free speech and critical reason are now engaged in a frontal action that can only squelch free speech and establish a radical subjectivity as the rule of the day.
After all, anything any student wishes to find discriminatory, under the law, could indeed be removed from the classroom; education would devolve into whatever pandered to the individual bias of every student. Truth, that noble thing conservatives always say they seek, will become the same degraded thing that it has become with the likes of Limbaugh, Fox News, and Horowitz: mere "spin." The radical right, it seems, has learned well from the postmodern left.
David Steigerwald is associate professor of history at Ohio State University and director of the history program at Ohio State's Marion campus. His latest book, Culture's Vanities (Rowman & Littlefield), is, incidentally, a critique of much that passes for academic leftwing thought today.
Submitted by KC Johnson on February 2, 2005 - 4:00am
Last week, Inside Higher Ed reported on the latest call by the Association of American Colleges and Universities (AAC&U) to reorient "liberal education." The new initiative reflects the organization's customary aim: abandoning the traditional goal of providing students with knowledge derived from the disciplines of the liberal arts and adopting an agenda focused on teaching students what to think about contemporary political and social issues. In the open, such a scheme could never obtain approval. So the AAC&U operates by stealth.
First, the organization employs commonly accepted words and phrases that sound unobjectionable but are vague enough to justify any type of instruction. Press releases outlining the new initiative, for example, spoke of "empowering" students to make "ethical judgments" as citizens of a "diverse democracy" by supplying them with a "practical" education that encourages "global knowledge and engagement" in "an era of greater expectations."
Second, the AAC&U targets non-elite, mostly public institutions, which usually lack regular involvement from parents or alumni, the figures most likely to oppose the feel-good, fuzzy curriculum that the organization promotes. These schools are also less likely to enroll students whose educational backgrounds would enable them to question ideologically biased classroom presentations.
Third, the organization champions a curriculum based not on transmitting knowledge but instead on providing students with skills -- critical thinking, effective writing, or "diversity skills." According to Debra Humphreys, the AAC&U's vice president for public affairs, "There's just no way that you can identify an educated person by a body of content."
As Humphreys well understands, however, college courses must teach students something -- even if they ostensibly stress skills. A glance at the institutions that have instituted an AAC&U-style curriculum reveals that the best for which students can hope is a dumbed-down set of classes from which they will learn nothing. One wonders how many AAC&U administrators or board members, whose education and salaries safely ensconce them in the upper levels of the middle class, would send their children to colleges that have implemented the organization's agenda.
For example, Indiana University-Purdue University-Indianapolis (IUPUI), with a student body of nearly 20,000, requires all freshmen to enroll in an interdisciplinary class teaching such "skills" as "a survey of campus resources" and "time management." The university's provost hopes that this structure eventually will allow students to receive academic credit for "self-acquired competency" through such means as "self-discovery."
Portland State, Oregon's largest university, requires a two-semester interdisciplinary course on how to "work in a diverse society and act in socially responsible ways." Students can avoid transparently one-sided courses such as "Us and Them: A History of Intolerance in America" only by enrolling in feel-good offerings such as "Empowerment of Youth on Probation -- Girl Power" or "The Spirituality of Being Awake." The latter course asks, "What is the cost of being wide awake?" At Portland State, apparently, the cost is tuition for six credit hours.
When courses at AAC&U-oriented schools offer content, the intent seems to be to indoctrinate rather than to educate. The catalog at Washington's Evergreen College, for instance, is filled with courses reflecting only one point of view on controversial political issues. A typical example, "Inherently Unequal" (Evergreen's class on U.S. history since the Brown decision), features a description stating -- as unquestioned fact -- that at the end of the 20th century, "racist opposition to African American progress and the resurgence of conservatism in all branches of government barricaded the road to desegregation."
The AAC&U envisions such openly political instruction as the norm. Shortly after the World Trade Center attacks, Senior Vice President Caryn McTighe Musil argued that the "heinous acts committed September 11" demonstrated the importance of "educating students in ways that promote active engagement" and that emphasized their status as citizens of a "diverse democracy." (The AAC&U always describes the United States as a "diverse democracy," not a "democracy," hinting that a fundamental difference exists between the two types of government.) Students, McTighe Musil continued, needed guidance "in advancing democracy and justice everywhere" and in creating "socially responsible, peaceful, and equitable societies."
The AAC&U seems unwilling to recognize that people, in good faith, define the path to "advancing democracy and justice" in very different ways, and so adopting such a goal requires colleges to take sides on political questions. Literally and theoretically, though never in practice, one could imagine a number of causes that would fit the organization's parameters -- fund raising for Israel, by demonstrating an "obligation to humanity" through defending innocent civilians against suicide murderers; or a Roman Catholic pro-life campaign, by promoting justice through preventing destruction of innocent life; or rallying for the war in Iraq, by "advancing democracy" in a country that never previously had a free election. But in the AAC&U's universe, matters such as the "heinous acts committed September 11" could yield only one set of policy recommendations -- the organization's own -- and college courses should teach this ideological approach as gospel.
By providing a fig leaf to administrators and professors who want to shape students' political opinions rather than to educate undergraduates, the AAC&U deserves condemnation. Yet the organization's insidious nature comes more from its shameless framing of a paternalist educational agenda in populist terms.
Despite their frequent calls for "empowering" students, AAC&U supporters actually have contempt for the intellectual abilities of the middle- and lower-class students that they claim to represent. One of the AAC&U's favorite presidents, Wagner College's Richard Guarasci, justified his curricular agenda by describing a campus that no objective observer would recognize. Students, he claimed, arrived at Wagner "fearing encounters with 'the stranger' " (this in New York City, the most diverse city in the world) and in "deep denial about the contours of inequality." Undergraduates who harbored such inappropriate beliefs could only learn "the arts of democracy" through a reoriented curriculum based on "intercultural and diversity education" that would promote "the objectives of pluralist or multicentric democracy." The AAC&U makes similar claims about student attitudes.
Perhaps, as Guarasci and the AAC&U imply but never state directly, a liberal arts education is appropriate only for students at elite institutions, and others should receive a "liberal" education that focuses more on skills and behavioral issues. But that theory requires accepting on faith two highly dubious assumptions: first, that racial, ethnic, and gender tensions are so extreme on today's campuses as to mandate "diversity skills" as the central goal of a college education; and second, that an AAC&U-style curriculum represents the only way to instill in students the values necessary to function as citizens of the United States.
Over the past three years at my own institution, Brooklyn College, various personnel and curricular controversies (including my tenure case) associated with the institution's adoption of AAC&U policies spawned a remarkable grassroots movement of students -- of differing genders, races, ethnicities, and socio-economic backgrounds -- that made clear that they did not need feel-good courses structured by condescending administrators. Dan Weininger, who led the movement while preparing for law school and interning for Federal District Judge Richard M. Berman, summed matters up for one reporter: "What students want is knowledge, not to be fed dogma." Martine Jean, who came to the United States from Haiti at the age of 11, graduated from Brooklyn as the winner of the Mellon and Ruth Kleinman fellowships; from Yale, where she is studying for her Ph.D. degree, Jean expressed her concern lest Brooklyn embrace an academic culture in which "mediocrity and partisanship are valued over quality of scholarship." Christine Sciascia transferred into Brooklyn and wound up a Phi Beta Kappa nominee; in published letters to The New York Sun and The Chronicle of Higher Education she excoriated the college for insufficiently valuing faculty research. Yehuda Katz, editor of the campus newspaper, published a devastating multi-part series explaining how the AAC&U's "liberal" education would devalue a Brooklyn degree; in response, the campus administration tried to shut down his newspaper. Apparently only those students who supported the AAC&U agenda should be "empowered."
A deep-seated class prejudice exists at the core of the AAC&U's philosophy. Stripping away the sloganeering, AAC&U activists never explain why students at public colleges -- students like Weininger, Jean, Sciascia, and Katz -- should be cheated of their access to a world of knowledge that would truly empower them to exercise their own free will in what is the world's most diverse democracy. Instead, the AAC&U operates by stealth, fully aware that in the light of day, most politicians, administrators, parents, faculty, and students would see their agenda for what it really is: an attempt to create a new generation of social activists through a watered-down, feel-good curriculum that no quality college or university ever would tolerate.
KC Johnson, a professor of history at Brooklyn College and the CUNY Graduate Center, is a visiting professor at Harvard University for the spring 2005 term.
Newspapers across the country paid significant attention last week to the publication of "Educating School Leaders," a report by Arthur Levine, president of Columbia University’s Teachers College. We at the Renaissance Group, a consortium of 36 universities that prepare 1 of every 10 new teachers for the nation’s classrooms and a significant number of principals and school system administrators, take very seriously the business of preparing school personnel.
And we take umbrage at yet another study that paints all colleges of education with the same broad brush, declaring how ineffective we are -- when, in fact, our accrediting agencies and clientele report how well we are doing our jobs and are impressed with the quality of graduates from our member institutions.
We agree with Levine that some school leadership preparation programs lack quality in preparing their students, and throughout the Renaissance Group’s 16 years of existence we have striven to engage in public debate to help improve these programs. But we feel it is wrong and dangerous to make the kind of sweeping generalizations that Levine does. Among our concerns with his work:
1) Levine’s study did in-depth interviews on only a few campuses with educational leadership programs. Using this small sample to represent the numerous educator preparation programs in the U.S. is misleading. The ultimate question that should be investigated and answered is whether or not those who are being prepared as building and district leaders have the knowledge, skills, and dispositions to promote vision, create positive learning outcomes for children, and be successful in today's ever-changing schools. Policy makers and politicians must accept the fact that not all individuals have what it takes to be an effective leader. Likewise, it is true that some institutions and programs are going to have to get serious about quality issues instead of focusing on quantity. If institutions are not willing to make these tough decisions, states will need to intervene.
2) We feel Levine's paper makes many unsubstantiated claims about educational leadership programs, which we don’t want to repeat here lest we lend them credence. No data are provided to support the negative statements in the paper.
3) Levine’s recommendation that a new degree be created, a master’s in educational administration, with a curriculum in both management and education, approaches the issue from a one-size-fits-all model. This is an old solution to a new set of issues and challenges. It fails to acknowledge that not all programs are alike and that institutions are right now redesigning their educational leadership programs to align them with the work of the public schools the colleges of education have relationships with, as well as with acknowledged standards of student learning and of preparation for school employees.
Numerous Renaissance Group institutions are not only using new and effective leadership preparation models but also have faculty actively and directly working with employees in the schools. Faculty members at various schools of education are currently working with local education agencies on principal mentoring programs, with an elementary school district on improving student achievement, and with administrators outside the United States on programs to improve their schools. Faculty routinely conduct reorganization studies and curriculum audits for school districts, and work with state boards of education.
The Renaissance Group would agree that certain school preparation programs need either a significant overhaul or to be closed. It is time both to start identifying those programs and institutions and to give more credit to those institutions that are effectively preparing school leaders. Let’s not throw the baby out with the bath water. The Renaissance Group’s vision is that its member institutions will be exemplars for P-16 collaboration, noted for their impact on student learning and leadership in professional education for America’s schools.
Leo W. Pauls
Leo W. Pauls is executive director of the Renaissance Group. Also contributing to this article were Sam Evans, dean of education at Western Kentucky University; Ric Keaster, associate dean of education at Western Kentucky; Tes Mehring, dean of education at Emporia State University; Bonnie Smith-Skripps, dean of education at Western Illinois University; and Tom Switzer, dean of education at the University of Toledo.
Nowadays, many liberal arts colleges promote the economic value of a liberal education. They boast that the impressive careers of liberal arts graduates offer an excellent return on students' tuition investment. Thus, while the cost of a quality liberal education may be high, the economic benefits down the line are greater still.
But while the economic success of liberal arts graduates is certainly worth lauding, we may be missing something more fundamental here. When, as a lawyer-turned-professor, I consider my own liberal education, I can see how it did much more than enhance my career prospects. In fundamental ways, it helped me connect my career aspirations to a meaningful, satisfying life. Looking back over 25 years now, I see how at its best my liberal education offered me increased possibilities not only of money, but significantly, of happiness.
An enduring puzzle of our times is why our well-documented rise in incomes has not led to an increase in our subjective well-being. While well educated Americans are clearly getting wealthier, we are not reporting higher levels of happiness.
Economist Robert Frank offers an intriguing explanation to this puzzle, one that bears on how we think about the value of a liberal arts education. The problem, he says, is not what we make, but how we spend it. "[G]ains in happiness that might have been expected to result from growth in absolute income have not materialized because of the ways in which people in affluent societies have generally spent their incomes."
The difficulty, according to Frank, is that we spend our money in conspicuous ways - such as on bigger houses - that are especially subject to the psychological process of adaptation. Under this process, as people generally buy bigger houses, the social norm for house size increases. Adapting to this rising standard, we need to spend more to get a house we can regard as acceptable. But while we come to spend more for our homes, we do not derive greater pleasure from them. Rather, the size of house that is needed to satisfy us has simply increased. If we wish our growing wealth to help make us happier, says Frank, we need to shift our resources to what he calls "inconspicuous goods." These goods aren’t really goods, but are conditions, like avoiding a long commute or leaving a stressful job. And when our wealth helps us do these things, it does make us happier.
The picture is different for long commutes and stressful jobs because such experiences are less subject to the psychological process of adaptation that occurs with the increasing number of larger houses. "As it turns out," writes Frank, "our capacity to adapt varies considerably across domains." While we easily get used to larger homes, we never completely adjust to longer commutes.
Thus, the key to happier lives is spending more of our resources on inconspicuous goods, those marked by our lesser capacity to adapt. Because increased spending on such goods is more likely to foster our subjective well-being, we are here better able to get our money's worth.
Frank's argument is an intriguing one for me, as at midlife I deepen my understanding of the value of my own liberal education. A central benefit of a liberal arts education is an enhanced capacity for critical thinking, the ability to subject to independent scrutiny the received norms of our environment. It is because of this enhanced capacity to scrutinize social convention that liberal education works to liberate individuals, enabling them to choose freely their own views, rather than simply relying on tradition or authority.
Thus in principle, a liberally educated individual should be less subject to the process of adaptation Frank describes. This is because this adaptation process is rooted in the very social norms the liberal arts graduate has developed the capacity to scrutinize critically.
Because a liberally educated person develops a critical distance from the norms of his environment, he has, under Frank's analysis, a greater potential for happiness. In conspicuous purchases such as houses, he is less likely to need to exceed the norm to insure happiness and more likely to avoid unhappiness if below the norm. Less bound to more conspicuous spending, he also has the freedom to devote more of his resources to the inconspicuous goods that offer a greater contribution to his well-being.
I saw this transformation in myself while undergoing my own liberal education. I had always been a night owl and fell easily into the rhythms of student life as an English major at Wesleyan University. As my college years progressed, I remember distinctly watching less TV. In classrooms and conversations, I was discovering a world more engaging and enduring than the world of conspicuous consumption then displayed on network television. I still kept my late-night hours, but the “Tonight Show” gave way to the stories of Melville and Kafka, two writers more concerned with understanding human psychology and relationships than acquiring material goods. The result was that, during my senior year, I don't recall ever discussing the size of house I hoped to live in. But I remember distinctly a line I repeated often when asked about my ambitions. I'd say: "Give me a library and the woman I love - and I'll be happy."
As a middle-aged family man, I lead a more complex life now, but its underlying values abide. I met - and married - the woman I love. She delights and surprises me almost daily. And in my current academic job, I enjoy access to a first-rate library that satisfies even my overly curious mind. To be sure, I've even come to live in a very nice home, one that's far larger than the national norm. But when my friend tells me he could never move back to a smaller house, I immediately sense a difference between us. I've learned that my happiness depends less on where I live and more on what I treasure.
Vocational training, by definition, is designed to enhance our productive capacities. It equips us with skills for occupations ranging from X-ray technician to software engineer. Liberal education contributes to our productive lives as well, as I know firsthand from my own legal career.
But liberal education can do more. Significantly, it affects not only our skills as producers, but also our discernment as consumers. When it works, it changes for the better the satisfactions we seek. Over the course of a lifetime, a discriminating sensibility in this regard can contribute more to our happiness than the raises our jobs provide.
Of course, liberal education performs this broader role only when it confers more than intellectual insights. A liberal education must reinforce such insights in a way that fosters in students a new set of habits and dispositions. Such an education's intellectual virtues must, in short, become moral ones.
I have no doubt that this has always been a difficult task. Indeed, as a professor teaching today, I see it's becoming harder as an already overly commercialized culture becomes even more so. But I know from my current vantage point how a liberal education succeeded with me in ways my earlier self couldn't have foreseen. More importantly, I see in my classes how students surprise themselves daily with the persons they are becoming.
Thus, in promoting the value of a liberal education to the wider public, we should attend to the way it can change the consumers we become. Altering the satisfactions a person seeks changes his life in ways more profound than the paycheck he receives. For the wider public, this is the story of liberal education that has yet to be told. I suspect we can tell it best by telling our own individual stories, how our liberal educations transformed our lives, and how happiness in an unexpected way became possible.
Jeffrey Nesteruk is a professor at Franklin & Marshall College.
Little doubt exists that the nation’s college faculty has become less intellectually diverse over the past generation. According to one recent study, self-described liberals or leftists increased from 39 percent in 1984 to 72 percent now, with even higher percentages among the ranks of humanities and social science professors. Speaking for the educational establishment, Jonathan Knight of the American Association of University Professors doubted "that these liberal views cut very deeply into the education of students."
Knight might have looked at teacher-training programs before issuing his comment. There, the faculty’s ideological imbalance has allowed three factors -- a new accreditation policy, changes in how students are evaluated and curricular orientation around a theme of “social justice” -- to impose a de facto political litmus test on the next cohort of public school teachers.
There would seem to be little or no reason why academic departments would seek to promote social justice, which is essentially a political goal. Though the concept derives from religious thought, “social justice” in contemporary society is guided primarily by a person’s political beliefs: on abortion, or the Middle East, or affirmative action, partisans on both sides deem their position socially just. In theory, though never in practice, education programs could define any number of causes as demonstrating a commitment to social justice -- perhaps championing Israel’s right to self-defense, so as to defend innocent civilians against suicide murderers; or celebrating a Roman Catholic anti-abortion initiative, so as to promote justice by preventing the destruction of innocent life; or opposing affirmative action, so as to achieve a socially just, color-blind, legal code. Yet, as surveys like those criticized by Knight suggest, adherents of such views are scarce in the academy.
Despite this clear threat of politicization, however, dozens of prominent education programs demand that their students promote social justice. For example:
At the State University of New York at Oneonta, prospective teachers must “provide evidence of their understanding of social justice in teaching activities, journals, and portfolios . . . and identify social action as the most advanced level.”
The program at the University of Kansas expects students to be “more global than national and concerned with ideals such as world peace, social justice, respect for diversity and preservation of the environment.”
The University of Vermont’s department envisions creating “a more humane and just society, free from oppression, that fosters respect for ethnic and cultural diversity.”
Marquette’s program “has a commitment to social justice in schools and society,” producing teachers who will use the classroom “to transcend the negative effects of the dominant culture.”
According to the University of Toledo, “Education is our prime vehicle for creating the ‘just’ society,” since “we are preparing citizens to lead productive lives in a democratic society characterized by social justice.”
This rhetoric is admirable. Yet, as the hotly contested campaigns of 2000 and 2004 amply demonstrated, people of good faith disagree on the components of a “just society,” or what constitutes the “negative effects of the dominant culture,” or how best to achieve “world peace . . . and preservation of the environment.”
An intellectually diverse academic culture would ensure that these vague sentiments did not yield one-sided policy prescriptions for students. But the professoriate cannot dismiss its ideological and political imbalance as meaningless while simultaneously implementing initiatives based on a fundamentally partisan agenda.
Instead of downplaying the issue, education programs have adjusted their evaluation criteria to increase its importance. Traditionally, prospective teachers needed to demonstrate knowledge of their subject field and mastery of essential educational skills. In recent years, however, an amorphous third criterion called “dispositions” has emerged. As one conference devoted to the concept explained, using this standard would produce “teachers who possess knowledge and discernment of what is good or virtuous.” Advocates leave ideologically one-sided education departments to determine “what is good or virtuous” in the world.
In 2002, the National Council for Accreditation of Teacher Education explicitly linked dispositions theory to ensuring ideological conformity among education students. Rather than asking why teachers’ political beliefs are in any way relevant to their ability to perform well in the classroom, NCATE issued new guidelines requiring education departments that listed social justice as a goal to “include some measure of a candidate’s commitment to social justice” when evaluating the “dispositions” of their students. As neither traditional morality nor a commitment to social justice in any way guarantees high-quality teachers, this strategy only deflects attention away from the all-important goal of training educators who have command of content and the ability to instruct.
The program at my own institution, Brooklyn College, exemplifies how application of NCATE’s new approach can easily be used to screen out potential public school teachers who hold undesirable political beliefs. Brooklyn’s education faculty, which assumes as fact that “an education centered on social justice prepares the highest quality of future teachers,” recently launched a pilot initiative to assess all education students on whether they are “knowledgeable about, sensitive to and responsive to issues of diversity and social justice as these influence curriculum and pedagogy, school culture, relationships with colleagues and members of the school community, and candidates’ analysis of student work and behavior.”
At the undergraduate level, these high-sounding principles have been translated into practice through a required class called “Language and Literacy Development in Secondary Education.” According to numerous students, the course’s instructor demanded that they recognize “white English” as the “oppressors’ language.” Without explanation, the class spent its session before Election Day screening Michael Moore’s Fahrenheit 9/11. When several students complained to the professor about the course’s politicized content, they were informed that their previous education had left them “brainwashed” on matters relating to race and social justice.
Troubled by this response, at least five students filed written complaints with the department chair last December. They received no formal reply, but soon discovered that their coming forward had negative consequences. One senior was told to leave Brooklyn and take an equivalent course at a community college. Two other students were accused of violating the college’s “academic integrity” policy and refused permission to bring a witness, a tape recorder, or an attorney to a meeting with the dean of undergraduate studies to discuss the allegation. Despite the unseemly nature of retaliating against student whistleblowers, Brooklyn’s overall manner of assessing commitment to “social justice” conforms to NCATE’s recommendations, previewing what we can expect as other education programs more aggressively scrutinize their students’ “dispositions” on the matter.
Must prospective public school teachers accept a professor’s argument that “white English” is the “oppressors’ language” in order to enter the profession? In our ideologically imbalanced academic climate, the combination of dispositions theory and the new NCATE guidelines risks producing a new generation of educators certified not because they mastered their subject but because they expressed fealty to the professoriate’s conception of “social justice.”
KC Johnson is a professor of history at Brooklyn College and the CUNY Graduate Center.
Over the last generation, most colleges and universities have experienced considerable grade inflation. Much lamented by traditionalists and explained away or minimized by more permissive faculty, the phenomenon presents itself both as an increase in students’ grade point averages at graduation and as an increase in high grades and a decrease in low grades recorded for individual courses. More prevalent in humanities and social science than in science and math courses, and in elite private institutions than in public institutions, grade inflation generates a great deal of heat in discussion, if not always as much light.
While the debate on the moral virtues of any particular form of grade distribution fascinates as cultural artifact, the variability of grading standards has a more practical consequence. As grades increasingly reflect idiosyncratic and locally defined performance levels, their value for outside consumers of university products declines. Who knows what an "A" in American History means? Is the A student one of the top 10 percent in the class or one of the top 50 percent?
Fuzziness in grading reflects a general fuzziness in defining clearly what we teach our students and what we expect of them. When asked to defend our grading practices by external observers -- parents, employers, graduate schools, or professional schools -- our answers tend toward a vague if earnest exposition on the complexity of learning, the motivational differences in evaluation techniques, and the pedagogical value of learning over grading. All of this may well be true in some abstract sense, but our consumers find our explanations unpersuasive and on occasion misleading.
They turn, then, to various forms of standardized testing. When the grades of an undergraduate have an unpredictable relevance to a standard measure of performance, and when high quality institutions that should set the performance standard routinely give large proportions of their students “A” grades, others must look elsewhere for some reliable reference. A 3.95 GPA should reflect the same level of preparation for students from different institutions.
Because they do not, we turn to the GMAT, LSAT, GRE, or MCAT, to take four famous examples. These tests normalize the results from the standards-free zone of American higher education. The students who aspire to law or medical school all have good grades, especially in history or organic chemistry. In some cases, a student’s college grades may prove little more than his or her ability to fulfill requirements and mean considerably less than the results of a standardized test that attempts to identify precisely what the student knows that is relevant to the next level of academic activity.
Although many of us worry that these tests may be biased against various subpopulations, emphasize the wrong kind of knowledge, and encourage students to waste time and money on test prep courses, they have one virtue our grading system does not provide: The tests offer a standardized measure of a specific and clearly defined subset of knowledge deemed useful by those who require them for admission to graduate or professional study.
Measuring State Investment
If the confusion over the value of grades and test scores were not enough, we discover that at least for public institutions, our state accountability systems focus heavily on an attempt to determine whether student performance reflects a reasonable value for taxpayer investment in colleges and universities. This accountability process engages a wide range of measures -- time to degree, graduation rate, student satisfaction, employment, graduate and professional admission, and other indicators of undergraduate performance -- but even with the serious defects in most of these systems, they respond to the same problems as do standardized tests.
Our friends and supporters have little confidence in the self-generated mechanisms we use to specify the achievement of our students. If the legislature believed that students graduating with a 3.0 GPA were all good performers measured against a rigorous national standard applied to reasonably comparable curricula, they would not worry much about accountability. They would just observe whether our students learned enough to earn a nationally normed 3.0 GPA.
Of course, we have no such mechanism to validate the performance of our students. We do not know whether our graduates leave better or worse prepared than the students from other institutions. We too, in recognition of the abdication of our own academic authority as undergraduate institutions, rely on the GRE, MCAT, LSAT, and GMAT to tell us whether the students who apply (including our own graduates) can meet the challenges of advanced study at our own universities.
Partly this follows from another peculiarity of the competitive nature of the American higher education industry. Those institutions we deem most selective enroll students with high SATs on average (recognizing that a high school record is valuable only when validated in some fashion by a standardized test). Moreover, because selective institutions admit smart students who have the ability to perform well, and because these institutions have gone to such trouble to recruit them, elite colleges often feel compelled to fulfill the prophecy of the students’ potential by ensuring that most graduate with GPAs in the A range. After all, they may say, average does not apply to our students because they are all, by definition, above average.
When reliable standards of performance weaken in any significant and highly competitive industry, consumers seek alternative external means of validating the quality of the services provided. The reluctance of colleges and universities, especially the best among us, to define what they expect from their students in any rigorous and comparable way, brings accreditation agencies, athletic organizations, standardized test providers, and state accountability commissions into the conversation, measuring the value of the institution’s results against various nationally consistent expectations of performance.
We academics dislike these intrusions into our academic space because they coerce us to teach to the tests or the accountability systems, but the real enemy is our own unwillingness to adopt rigorous national standards of our own.
In its 1966 declaration on professional ethics, the American Association of University Professors, the professoriate’s representative organization, states:
"Professors, guided by a deep conviction of the worth and dignity of the advancement of knowledge, recognize the special responsibilities placed upon them....They hold before them the best scholarly and ethical standards of their discipline.… They acknowledge significant academic or scholarly assistance from (their students)."
Notwithstanding such pronouncements, higher education recently has provided the public with a series of ethical solecisms, most spectacularly the University of Colorado professor Ward Churchill’s recidivistic plagiarism and duplicitous claim of Native American ancestry, along with his denunciations of 9/11 victims. While plagiarism and fraud presumably remain exceptional, accusations and complaints of such wrongdoing increasingly come to light.
Some examples include Demas v. Levitsky at Cornell, where a doctoral student filed a legal complaint against her adviser’s failure to acknowledge her contribution to a grant proposal; Professor C. William Kauffman’s complaint against the University of Michigan for submitting a grant proposal without acknowledging his authorship; and charges of plagiarism against Louis W. Roberts, the now-retired classics chair at the State University of New York at Albany. Additional plagiarism complaints have been made against Eugene M. Tobin, former president of Hamilton College, and Richard L. Judd, former president of Central Connecticut State University.
In his book Academic Ethics, Neil Hamilton observes that most doctoral programs fail to educate students about academic ethics, so that knowledge of it is eroding. Lack of emphasis on ethics in graduate programs leads to skepticism about the necessity of learning about ethics and about how to teach it. Moreover, nihilist philosophies that have gained currency within the academy itself, such as Stanley Fish’s “antifoundationalism,” contribute to the neglect of ethics education. For these reasons academics generally do not seriously consider how ethics education might be creatively revived. In reaction to the Enron corporate scandal, for instance, some business schools have tacked an ethics course onto an otherwise ethically vacuous M.B.A. program. While a step in the right direction, a single course in a program otherwise uninformed by ethics will do little to change the program’s culture, and may even engender cynicism among students.
Similarly, until recently, ethics education had been lacking throughout the American educational system. In response, ethicists such as Kevin Ryan and Karen Bohlin have advocated a radical renewal of ethics education in elementary schools. They claim that comprehensive ethics education can improve ethical standards. In Building Character in Schools, Ryan and Bohlin compare an elementary school to a polis, or Greek city state, and urge that ethics be fostered everywhere in the educational polis.
Teachers, they say, need to set standards and serve as ethical models for young students in a variety of ways and throughout the school. They find that manipulation and cheating tend to increase where academic achievement is prized but broader ethical values are not. They maintain that many aspects of school life, from the student cafeteria to the faculty lounge, ought to provide opportunities, among other things, to demonstrate concern for others. They also propose the use of vision statements that identify core virtues along with the implementation of this vision through appropriate involvement by staff and students.
We would argue that, like elementary schools, universities have an obligation to ethically nurture undergraduate and graduate students. Although the earliest years of life are most important for the formation of ethical habits, universities can influence ethics as well. Like the Greek polis, universities become ethical when they become communities of virtue that foster and demonstrate ethical excellence. Lack of commitment to teaching, lack of concern for student outcomes, false advertising about job opportunities open to graduates, and diploma-mill teaching practices are examples of institutional practices that corrode rather than nourish ethics on campuses.
Competency-based education, broadly considered, is increasingly of interest in business schools. Under the competency-based approach (advocated, for example, by Rick Boyatzis of Case Western Reserve University, David Whetten of Brigham Young University, and Kim Cameron of the University of Michigan), students are exposed not only to theoretical concepts, but also to specific competencies that apply the theory. They are expected to learn how to apply in their lives the competencies learned in the classroom, for instance those relating to communication and motivating others. Important ethical competencies (or virtues) should be included and fostered alongside such competencies. Indeed, in applied programs such as business, each discipline and subject can readily be linked to ethical virtues. Any applied field, from traffic engineering to finance, can and should include ethical competencies as an integral part of each course.
For example, one of us currently teaches a course on managerial skills, one portion of which focuses on stress management. The stress management portion includes a discussion of personal mission setting, which is interpreted as a form of stress management. The lecture emphasizes how ethics can intersect with practical, real world decision making and how it can relate to competencies such as achievement orientation. In the context of this discussion, which is based on a perspective that originated with Aristotle, a tape is shown of Warren Buffett suggesting to M.B.A. students at the University of North Carolina that virtue is the most important element of personal success.
When giving this lecture, we have found that street smart undergraduate business students at Brooklyn College and graduates in the evening Langone program of the Stern School of Business of New York University respond well to Buffett’s testimony, perhaps better than they would to Aristotle’s timeless discussions in Nicomachean Ethics.
Many academics will probably resist integration of ethical competencies into their course curriculums, and in recent years it has become fashionable to blame economists for such resistance. For example, in his book Moral Dimension, Amitai Etzioni equates the neoclassical economic paradigm with disregard for ethics. Sumantra Ghoshal’s article “Bad Management Theories are Destroying Good Management Practices,” in Academy of Management Learning and Education Journal, blames ethical decay on the compensation and management practices that evolved from economic theory’s emphasis on incentives.
We disagree that economics has been all that influential. Instead, the problem is much more fundamental to the humanities and social sciences and has its root in philosophy. True, economics can exhibit nihilism. For example, the efficient markets hypothesis, which has influenced finance, holds that human knowledge is impotent in the face of efficient markets. This would imply that moral choice is impotent because all choice is so. But the efficient markets hypothesis is itself a reflection of a deeper and broader philosophical positivism that is now pandemic to the entire academy.
Over the past two centuries the assaults on the rational basis for morals have created an atmosphere that stymies interest in ethical education. In the 18th century, the philosopher David Hume wrote that one cannot derive an “ought” from an “is,” so that morals are emotional and cannot be proven true. Today’s academic luminaries have thoroughly imbibed this “emotivist” perspective. For example, Stanley Fish holds that even though academics do exhibit morality by condemning “cheating, academic fraud and plagiarism,” there is no universal morality beyond this kind of “local practice.”
Whatever its outcome, the debate over the rational derivability of ethical laws from a set of clear and certain axioms that hold universally is of little significance in and of itself. It will not determine whether ethics is more or less important in our lives; nor will it provide a disproof of relativism -- since defenders of relativism can still choose not to accept the validity of the derivation.
Yet ethics must still be lived -- even though the knowledge, competency, skill or talent that is needed to lead a moral life, a life of virtue, may not be derived from any clear and certain axioms. There is no need for derivation of the need, for instance, for good interpersonal skills. Rather, civilization depends on competency, skill and talent as much as it depends on practical ethics. Ethical virtue does not require, nor is it sustained by, logical derivation; it becomes most manifest, perhaps, through its absence, as revealed in the anomie and social decline that ensue from its abandonment. Philosophy is beside the point.
Based on much evidence of such a breakdown, ethics education experts such as Thomas Lickona of SUNY's College at Cortland have concluded that to learn to act ethically, human beings need to be exposed to living models of ethical emotion, intention and habit. Far removed from such living models, college students today are incessantly exposed to varying degrees of nihilism: anti-ethical or disembodied, hyper-rational positions that Professor Fish calls “poststructuralist” and “antifoundationalist.” In contrast, there is scant emphasis in universities on ethical virtue as a pre-requisite for participation in a civilized world. Academics tend to ignore this ethical pre-requisite, preferring to pretend that doing so has no social repercussions.
They are disingenuous – and wrong.
It is at the least counterintuitive to deny that the growing influence of nihilism within the academy is deeply, and causally, connected to increasing ethical breaches by academics (such as the cases of plagiarism and fraud that we cited earlier). Abstract theorizing about ethics has most assuredly affected academics’ professional behavior.
The academy’s influence on behavior extends, of course, far beyond its walls, for students carry the habits they have learned into society at large. The Enron scandal, for instance, had more roots in the academy than many academics have realized or would care to acknowledge. Kenneth Lay, Enron’s former chairman, holds a Ph.D. in economics from the University of Houston. Jeff Skilling, Enron’s former CEO, is a Harvard M.B.A. who had been a partner at the McKinsey consulting firm, one of the chief employers of top-tier M.B.A. graduates. According to Malcolm Gladwell in The New Yorker, Enron had followed McKinsey’s lead, habitually hiring the brightest M.B.A. graduates from leading business schools, most often from the Wharton School. Compared to most other firms, it had more aggressively placed these graduates in important decision-making posts. Thus, the crimes committed at Enron cannot be divorced from decision-making by the best and brightest of the newly minted M.B.A. graduates of the 1990s.
As we have seen, the 1966 AAUP statement implies the crucial importance of an ethical foundation to academic life. Yet ethics no longer occupies a central place in campus life, and universities are not always run ethically. With news of academic misdeeds (not to mention more spectacular academic scandals, such as the Churchill affair) continuing to unfold, the public rightly grows distrustful of universities.
It is time for the academy to heed the AAUP’s 1915 declaration, which warned that if the professoriate “should prove itself unwilling to purge its ranks of … the unworthy… it is certain that the task will be performed by others.”
Must universities learn the practical value of ethical virtue by having it imposed from without? Or is ethical revival possible from within?
Candace de Russy and Mitchell Langbert
Candace de Russy is a trustee of the State University of New York and a Hudson Institute Adjunct Fellow. Mitchell Langbert is associate professor of business at Brooklyn College of the City University of New York.
"They're just darlings," my co-worker said. "Absolute darlings."
"Uh-huh," I agreed, staring at my grading sheet. She was discussing five athletes at the private, four-year university where we teach. As another part-time foreign-language instructor came in, I overheard their conversation.
"Well, they'll never get through nine chapters," said the "darling" woman.
"Oh," responded my friend, a woman who teaches Spanish.
"I'm going back to Chapter Five," said the first instructor, "I just love teaching these darling, darling boys."
I sat there, stunned. Was I hearing correctly? Was she simply dropping half of the curriculum to cater to a few students who couldn't do the work? Later, when we were alone in the office, I commented, "It's so hard to get them to work, but I keep pushing. I've got to get them through the whole book or they're sunk next semester."
"Oh, well, that's how it is with English comp, I'm sure," said the "darling" woman. "I mean you've got to cover the material."
"How is that different with Spanish?" I finally asked.
"Oh, well, I've got to make sure that they really get it," she responded. Frustrated, I couldn't think of anything else to say. This adjunct had developed a curriculum based on department-approved course objectives. She had turned in copies of her syllabus to the academic dean for approval. Then, frustrated by her students' inability or unwillingness to learn, she had simply chopped off the back end of her course.
Later she had confided that there were a few students who were "getting it," but that they would simply have to review the same materials over and over until the end of the semester because she was catering to the athletes. That morning, as my colleague left for her class, I jotted the note, "curriculum rip-off" in my notebook. Something would come of this, I thought. Something.
At lunch that day, with the provost at the head of the table, I commented that a fellow instructor wasn't teaching the curriculum. "What do you mean?" the provost asked, his voice surprisingly kind for a man in power.
"She said the athletes in her class weren't learning," I paused, unsure if I should go on, "so she cut out the last four chapters of the book."
"You're kidding!" said a physics instructor to my right.
"She'll review them later, right?" the provost asked.
Trembling, I kept my hands in my lap. "I got the impression that she wasn't going to teach the last four chapters at all."
"Really," said the provost. "What's her name?"
"Oh, I really couldn't say," I mumbled, gathering up my half-finished tray.
Face reddening, I made my way to drop off my tray. What had made me speak up? Me, an adjunct? A part-timer with no tenure, no security, no voice. I didn't bring it up again. In the days that followed, I asked co-workers innocuous questions about their classes. I found it hard to make eye contact with the provost.
What had made me speak up? Anger. A feeling that not only would the next instructors to teach these students be frustrated, their jobs made that much more difficult, but that the students themselves were being ripped off wholesale.
According to the students, the less they were taught, the better. But I knew better. And I had been on the receiving end of some of these half-taught students. One of my colleagues at a large community college in California had confessed that he passed any student who would sit through his course. With no work by which to grade them, he simply gave them all C's. He was not the only one, I realized.
When I struggled with a student whose grammar was shockingly poor and who could not form a decent paragraph or essay, I sometimes wondered whether that student had simply tested well on the eligibility exam or whether an unwitting colleague had passed him or her on to me.
And what did the students get out of this? Yes, their semester was easier. Yes, they had less homework. Yes, they could spend more time on sports. But at what cost? Their education was being whittled away by instructors who could not or would not insist on the curriculum. It was a simple matter of trading the long-term goal for short-term ease. Given the choice, I knew that only a small percentage of the students would vote for learning all that they were promised. Yes, some would complain and wheedle, but I must believe that instructors know better.
We are in a position of power and we must not misuse that power by stealing. And when we lop off a part of the curriculum that is too bothersome or too difficult for some students, we are stealing from all of the students. One colleague confessed that she often had to switch lesson plans around to teach what she needed to -- but she always covered the chapters that she had promised.
I'm not sure if she had been burned by a colleague or if she simply knew what the right thing to do was, but I admire her stance. I, too, frequently find that I need to "borrow from Peter to pay Paul" in lesson making, but I always cover the curriculum. Even in the classroom, when I am tempted to cut out a section that once seemed important, I review the materials later in my office and talk to senior instructors who can guide me.
It is dangerous to make impromptu decisions at the chalkboard. More often than not, I am dreaming of new ways to teach something that seems tedious -- a new essay, a new exercise, or examples taken from my own classes. Anything to get them to see the lesson in a new way. My struggle sometimes reminds me of my effort to clip my terrier's nails. After an hour of my struggling and his howling, I finally brought my dog to the local veterinarian and paid the $15. His nails did get clipped. In the same way, I struggle with curriculum, but in the end, it gets taught.
My last concern was a big one -- what about our accreditation? This four-year university already had a poor reputation. Once known as a feeder campus for Stanford University, its price tag now seemed to have no correlation to its rigor or value. What if our accreditors found that we were not teaching the curriculum? What if they somehow found out that we were not achieving the course objectives that they had originally approved? What then?
After working on committees at the large community college in California, I had learned a healthy respect for the powers that be. Whether one was a tenured full-time instructor or an adjunct, we simply did not have the right to make such decisions on our own.
Suddenly I was thankful for those who had mentored me -- even those kind souls who sat at lunch with me. Their opinions, ideas and suggestions were helping to shape me. Every day, every semester. So many teachers, struggling, wrangling, working to be sure that curriculum gets taught. What a blessing to be one of those who hold the line. And those who benefit? We do. Instructors, administrators, and, most importantly, the students.
Shari Wilson is the pseudonym of an adjunct who has taught at many colleges in California. In a column last month, she wrote about the unintended consequences of the "six-year rule" on faculty members who are off the tenure track.
Ethical lapses are in the news again. Former CEOs on trial. Journalists receiving secret payments. Congress revising its ethics rules to protect one of its own. In such a troubling environment, we understandably hear calls for colleges and universities to incorporate more ethics into their programs of study. But amid such calls, I see little appreciation of the deeply personal challenge of teaching ethics.
After almost 20 years of teaching ethics, I'm still trying to get my footing. This isn't because of my lack of familiarity with the subject. I've long studied the classics in my field and eagerly devour the latest in journal articles. Nor am I at a loss for ways to bring the subject to my students. I've taught ethics in a variety of settings, from an elite business school to a struggling community college to the selective liberal arts institution that's currently my home. In each academic setting, I've been able to discover a set of pedagogical techniques that fostered lively class discussions.
My struggle in teaching ethics involves something more. It involves, as Parker Palmer states in The Courage to Teach, "the self you bring to the project, your identity and integrity." In the morally shifting and conflicted world in which we live, my identity as an ethicist has always been a precarious enterprise.
This is because my identity as an ethicist is tied to the moral coherence of the culture in which I live. James Boyd White once defined culture as "a set of ways of claiming meaning for experience." Without a common wellspring of values, it's difficult for an ethicist to claim a definitive meaning for the work he does. Yet I teach within a culture that's morally at odds with itself. Contemporary life combines a pervasive skepticism about traditional morality with the strident reemergence of such morality. This moral dissonance can even come to reside in our individual psyches. My guess is that more than a few Christian fundamentalists watch "Desperate Housewives."
Within this divided moral culture, academia offers only limited options. Thus, David Brooks writes that young people seeking moral guidance on college campuses typically encounter two possibilities. There are those mostly on the left "who tell them to renounce commercialism, materialism, and vulgar endeavoring." There are those mostly on the right who counsel "a consciousness of ... original sin" and a commitment to "the fixed truth of natural law" and "the traditions of orthodox faith."
Within such a cultural mix, my choice of identity as an ethicist has always posed a risk. This is because the wholeness that individual integrity presupposes is difficult in a culture so morally divided. Within the dichotomy that Brooks identifies, for example, where does someone like me -- someone who regularly both recycles and prays -- fit?
The cultural awkwardness of my identity as an ethicist poses a distinctive challenge in the classroom. This is because "the self you bring to the project" is central to teaching ethics. Moral education is about more than the information you convey to students or the skills you help them develop. It is ultimately about the persons they become in the process. With a subject as intimately linked to students' development as ethics, the self you bring to a course is crucial to its outcome. Students view all I do and say in the classroom through the lens of who they think I am.
The cultural sensibilities they bring to their assessments of my character also complicate matters. My students' outlooks are often unreceptive to the possibilities of a common moral dialogue. They've grown up in a world in which the country has been carved up into red states and blue states. They log on to blogs that reinforce their own tastes and ignore those of others. They tune in to Fox News with its portrayal of issues from stem cell research to affirmative action as an ongoing morality play between secular humanists and religious conservatives. Thus, if I appear to come at things from either side of a cultural divide, I'll lose at least half the members of the class, even if they are unwilling to tell me exactly why.
Teaching across our cultural divides as an ethicist today requires drawing on an understanding of the moral life that is noticeably absent from our public media. The flaming practices of cyberspace and the shouting matches of talk radio encourage us to see the essence of our moral lives as residing in the views we espouse. Across much of our vast electronic commons, you are, morally speaking, what you believe. Speak favorably, for example, of gay marriage and you become, depending on who's judging, either morally enlightened or morally corrupt.
But away from the public airwaves, a different and deeper understanding of the moral life prevails in the more intimate relations of our daily lives. It is an understanding of the moral life that allows us to continue to talk to our neighbors, swap recipes, borrow drills, and enjoy our kids playing together, even if we voted differently in the last presidential election. This is an understanding of the moral life as depending more on the dispositions you have than the views you hold. The moral life, after all, is primarily something we do rather than something we talk about. It depends on traits deeper than the views we hold of the hot-button moral issues of our time. It centers instead on what Aristotle would have called virtues, our basic dispositions or ways of being in the world. Asked to describe the moral life, we typically include traits such as our capacity for kindness, our aspirations toward integrity, our respect for principle, our desire for worthy accomplishments.
More and more, I am drawing on this deeper understanding of the moral life in my courses. In a morally polarized world, it offers a classroom identity that keeps open the potential for a common moral dialogue. There are daily opportunities. A few minutes spent listening to a student relate his anxieties over an upcoming exam. Stooping down to help a student when she drops her books on the floor. When my actions reveal I care about my students before they discover my views of capital punishment, I take on for them an identity that still has resonance in a morally fragmented world. I'm a person who is trying, however imperfectly, to lead a moral life.
Acknowledging the moral life as a practice we all imperfectly engage in engenders a distinctive understanding of the moral life. As an imperfectly realized practice, the moral life is always richer than our conceptions of it. We are always learning the meaning of kindness as we encounter it in its myriad manifestations. Each time we are able to stand on principle, we deepen our appreciation of integrity.
Recognizing the moral life as richer than our conceptions has a poignant value in the classroom. This is because of the way this recognition can cultivate our respect for our moral differences. In order to be meaningful, this respect must be genuine, not a lazy or indifferent tolerance. It needs to be a respect that compels us to want to learn more about why we disagree with each other.
Lately, I've noticed a recurring reaction I get from my students. They put it in different ways, but its essence is this: "You always treated everything we said as if it had value." In whatever form this reaction takes, it's one of my favorite compliments. For, as is true of us all, everything a student says has value. Not equal value, of course. There are some classroom comments that are arrestingly insightful. There are plenty that are downright silly. But errors, even of the grievous sort, have value, if for no other reason than that they force us to better articulate the truth. Recognizing this, my students are beginning the kind of genuine moral conversation so many of our public pundits seemingly no longer believe is possible.
Jeffrey Nesteruk is a professor of legal studies at Franklin & Marshall College.