Academic freedom

Proving the Critics' Case

Inside Higher Ed recently reported on four University of Pittsburgh professors critiquing the latest survey suggesting ideological one-sidedness in the academy. According to the Pitt quartet, self-selection accounts for findings that the faculty of elite universities disproportionately tilts to the Left. "Many conservatives," the Pitt professors mused, "may deliberately choose not to seek employment at top-tier research universities because they object, on philosophical grounds, to one of the fundamental tenets undergirding such institutions: the scientific method."
 
Imagine the appropriate outrage that would have occurred had the above critique referred to feminists, minorities, or Socialists. Yet the Pitt quartet's line of reasoning -- that faculty ideological imbalance reflects the academy functioning as it should -- has appeared with regularity, and has been, unintentionally, most revealing. Indeed, the very defense offered by the academic Establishment, rather than the statistical surveys themselves, has gone a long way toward proving the case of critics who say that the academy lacks sufficient intellectual diversity.

In theory, ideology should have no bearing on how a professor teaches, say, physics. Even so, shouldn't responsible administrators consider whether the overwhelming partisan disparity merits further inquiry? And, in theory, parents who make their money in traditionally conservative professions such as investment banking or corporate law probably do not encourage their children to enter academe. Yet, since money-making fields have always been attractive to conservatives, why has the proportion of self-professed liberals or Leftists in the academy nearly doubled in the last generation?

Had members of the academic Establishment confined themselves to such arguments (or had they ignored the partisan-breakdown studies altogether), the intellectual diversity issue would have received little attention. Instead, the last two years have seen proud, often inflammatory, defenses of the professoriate's ideological imbalance. These arguments, which have fallen into three categories, raise grave concerns about the academy's overall direction.
 
1. The cultural left is, simply, more intelligent than anyone else. As SUNY-Albany's Ron McClamrock reasoned, "Lefties are overrepresented in academia because on average, we're just f-ing smarter." The first recent survey came in early 2004, when the Duke Conservative Union disclosed that Duke's humanities departments contained 142 registered Democrats and 8 registered Republicans. Philosophy Department chairman Robert Brandon considered the results unsurprising: "If, as John Stuart Mill said, stupid people are generally conservative, then there are lots of conservatives we will never hire."
 
In a slightly different vein, UCLA professor John McCumber informed The New York Times that "a successful career in academia, after all, requires willingness to be critical of yourself and to learn from experience," qualities "antithetical to Republicanism as it has recently come to be." In another Times article, Berkeley professor George Lakoff asserted that Leftists predominate in the academy because, "unlike conservatives, they believe in working for the public good and social justice, as well as knowledge and art for their own sake." Again, imagine the appropriate outcry if prominent academics employed such sweeping generalizations to dismiss statistical disparities suggesting underrepresentation of women, gays, or minorities.
 
These arguments become even more disturbing given the remarkably broad definition of "conservative" employed in many academic quarters. Take the case of Yeshiva University's Ellen Schrecker, recently elected to a term on the AAUP's general council. This past spring, Schrecker denounced Columbia students who wanted to broaden instruction about the Middle East for "trying to impose orthodoxy at this university." The issue, she lamented, amounted to "right wing propaganda."
 
The leaders of the Columbia student group, who ranged from registered Republicans to backers of Ralph Nader's 2000 presidential bid, were united only in their belief that matters relating to Israel should be treated objectively in the classroom. Probably 98 percent of the U.S. Congress and all of the nation's governors would fit under such a definition of "right wing."
 
Indeed, it seems as if the academic Establishment considers anyone who does not accept the primacy of a race/class/gender interpretation to be "conservative." To most outside of the academy, such a definition would suggest that professors are using stereotypes to abuse the inherently subjective nature of the hiring process.

2. A left-leaning tilt in the faculty is a pedagogical necessity, because professors must expose gender, racial, and class bias while promoting peace, "diversity" and "cultural competence." According to Montclair State's Grover Furr, "colleges and universities do not need a single additional 'conservative' .... What they do need, and would much benefit from, is more Marxists, radicals, leftists -- all terms conventionally applied to those who fight against exploitation, racism, sexism, and capitalism. We can never have too many of these, just as we can never have too few 'conservatives.'"

Furr's remarks echoed those of Connecticut College's Rhonda Garelick, who decried student "disgruntlement" when she used her French class to discuss her opposition to the war in Iraq and teach "'wakeful' political literacy." Rashid Khalidi, meanwhile, rationalized anti-Israel instruction as necessary to undo the false impressions held by all incoming Columbia students except for "Arab-Americans, who know that the ideas spouted by the major newspapers, television stations, and politicians are completely at odds with everything they know to be true."

To John Burness, Duke’s senior vice president for public affairs, such statements reflect a proper professorial role. The "creativity" in humanities and social science disciplines, he noted, addresses issues of race, class, and gender, leading to a "perfectly logical criticism of the current society" in the classroom.

At some universities, this mindset has even shaped curricular or personnel policies. Though its release generated widespread criticism and hints from administrators that it would not be adopted, a proposal to make "cultural competence" a key factor in all personnel decisions remains the working draft of the University of Oregon's new diversity plan. Columbia recently set aside $15 million for hiring women and minorities -- and white males who would "in some way promote the diversity goals of the university." And the University of Arizona's hiring blueprint includes requiring new faculty in some disciplines to "conduct research and contribute to the growing body of knowledge on the importance of valuing diversity."

On the curricular front, my own institution's provost, Roberta Matthews (who has written that "teaching is a political act"), intends for the college's new general education curriculum to produce "global citizens" -- who, she commented, are those "sensitized to issues of race, class, and gender."

Given such initiatives, it is worth remembering the traditional ideal of a university education: for faculty committed to free intellectual exchange in pursuit of the truth to expose undergraduates to the disciplines of the liberal arts canon, in the expectation that college graduates will possess the wide range of knowledge and skills necessary to function as democratic citizens.

3. A left-leaning professoriate is a structural necessity, because the liberal arts faculty must balance business school faculty and/or the general conservative political culture. University of Michigan professor Juan Cole, denouncing the "ridiculous and pernicious line" that major universities need greater intellectual diversity, complained about insufficient attention to the ideological breakdown of "Business Schools, Medical Schools, [and] Engineering schools." UCLA's Russell Jacoby wondered why "conservatives seem unconcerned about the political orientation of the business professors." Duke Law professor Erwin Chemerinsky more ambitiously claimed that "it's hard to see this as a time of liberal dominance" given conservative control of the three branches of government.

Professional schools reflect the mindset of their professions: Socialists are about as common on business school faculties as home-schooling advocates are among education school professors. But, unlike business schools, liberal arts colleges and universities do not exist to train students for a single profession. Nor are they supposed to balance the existing political culture. If the Democrats reclaim the presidency and Congress in the 2008 elections, should the academy suddenly adopt an anti-liberal posture?

The intellectual diversity issue shows no signs of fading away. Ideological one-sidedness among the professoriate seems to be, if anything, expanding. And so, no doubt, will we see additional surveys suggesting a heavy ideological imbalance among the nation's faculty -- followed by new inflammatory statements from the academic Establishment that only reinforce the critics' claims about bias in the personnel process.
 
In an ideal world, campus administrators would have rectified this problem long ago. A few have made small steps. Brown University's president, Ruth Simmons, for instance, has expressed concern that the "chilling effect caused by the dominance of certain voices on the spectrum of moral and political thought" might negatively affect a quality education; her university's Political Theory Project represents a model that other institutions could follow.

To my knowledge, however, no academic administration has made the creation of an intellectually and pedagogically diverse faculty its primary goal. This statement, it should be noted, applies equally to institutions frequently praised by conservatives, such as Hillsdale College. Such an initiative, of course, would encounter ferocious faculty resistance. But it would also, just as surely, excite parents, donors, and trustees. If successful, an institution that made intellectual diversity its hallmark would encourage imitation -- if only because other colleges would face the free-market pressures of losing talented students and faculty. So, the question becomes, do we have an administration anywhere in the country willing to take up the cause?


KC Johnson is a professor of history at Brooklyn College and the CUNY Graduate Center.

Designed to Please

If intelligent design gets taught in the college classroom, here are some other propositions we can look forward to:

Was Shakespeare the author of all those plays? Competing theories suggest that the Earl of Oxford, Francis Bacon, or even Queen Elizabeth herself penned those immortal lines. You be the judge. Henceforth, the prefaces to all those editions by “William Shakespeare” should be rewritten to give equal time to the alternate-authorship idea.

Does oxygen actually support that flickering candle flame, or is an invisible, weightless substance called phlogiston at work? First suggested by J. J. Becher near the end of the 17th century, the existence of phlogiston was eventually pooh-poohed by supporters of the oxygen hypothesis, but, as they say in the legal profession, the jury’s still out on this one.

Drop a candy bar on the sidewalk, and come back to find ants swarming all over it. Or put a piece of rotten meat in a cup and later find maggots in it, having come out of nowhere! This is called spontaneous generation. Biologists eventually decided that airborne spores, like little men from parachutes, wafted onto the food and set up shop there, but does that make any sense to you?

In the morning, the sun rises over the tree line, and by noon it’s directly overhead. At night, as the popular song has it, “I hate to see that evening sun go down.” Then why do so many people think that the earth moves instead of the sun? Could this be a grand conspiracy coincident with the rise of that Italian renegade Galileo, four centuries ago? Go out and look at the sunset! As they say, seeing is believing.

Proper grammar, the correct way of speaking, the expository essay model -- how rigid and prescriptive! There are as many ways to talk as there are people on this good, green earth, and language is a living organism. Or like jazz, an endless symphony of improvisation. No speech is wrong, just different, and anyone who says otherwise is just showing an ugly bias that supports white hegemony.

“History is bunk,” declared the famous industrialist and great American Henry Ford. All those names and dates -- why learn any of that when not even the so-called experts can agree on exactly what happened? Besides, most of those historical figures are dead by now, so what’s the point? From now on, all history departments must issue disclaimers, and anything presented as a narrative will be taught in the creative writing program.

Speaking of which, creative writing itself has long been controlled by a bunch of poets and fiction writers who determine who wins what in the world of letters. But who really knows whether the latest Nobel Prize winner is any better than, say, that last Tom Clancy novel you read. It all boils down to a matter of taste, doesn’t it?

Or what about that "Shakespeare"? Was he/she/it really any better than the Farrelly brothers? Let’s all take a vote on this, okay?


David Galef is a professor of English and administrator of the M.F.A. program in creative writing at the University of Mississippi. His latest book is the short story collection Laugh Track (2002).

Teach Only What You Know

About two weeks before the 2004 presidential election, one of the students in a government class that I was teaching raised his hand and demanded to know who I was supporting for president. I paused for a moment, somewhat taken aback by the stridency of the student’s request. Noticing my reaction, he offered some background, explaining that he was not the only one in the class who had this question. We had, after all, been talking about the election during nearly every session, and my reticence with regard to what seemed to the students to be a crucial point was a source of confusion.

Despite his protests, I refused to answer, and quickly moved to the topic of the day. Later on, however, I had some time to consider the exchange. And the more that I thought about the student’s question, the more pleased I became. This was, I thought, one of the best evaluations that I had ever received. Here was real evidence that I was doing my job!

Here’s why: My students should not be able to tell, at least from what I say in class, whom I prefer to sit in the Oval Office. For one thing, this would be a form of “bait and switch,” since nothing about the sharing of my political opinions appears in the catalogue that the students presumably consult before paying their money and scheduling my course.

More to the point, however, is that I am not qualified to teach students about who should be elected. In fact, I am no more qualified to tell people who they should vote for than I am to teach a class in quantum mechanics. I have colleagues over in the physics department who are qualified to offer a course in the latter subject; none of us has the same credibility when it comes to the former. Indeed, in an important way, this blanket incompetence is a part of the class lesson -- particularly, though not exclusively, in a class on American government. It is an implicit argument for democracy, or at least democratic equality. It is also, however, an argument about education.

If professors, or anybody else for that matter, actually "knew" who the president should be, then voting, especially by those who did not know, would be unnecessary, and probably counterproductive. This is easy to illustrate by considering the following example: Suppose that I feel ill, and would like to know what I might do to feel better. One approach would be to poll my friends, asking each of them what I should do. But suppose that among my friends was a medical doctor. Would it not make sense to follow her advice, eschewing the opinions of the rest of my friends? Now, what if I were on a deserted island, with no trained medical professionals available? Then, I might as well seek out the advice of friends, summing their opinions. When we are all equally ignorant, we might as well vote.

Most Americans seem to intuitively grasp this notion, and have gradually moved our political system away from any form of “rule by the experts.” The best example of this may be found within the evolution of our electoral system for choosing the president. If one reads carefully through the Constitution, one finds that the document does not call for the popular election of the president. Instead, state legislatures are charged with appointing presidential electors (the real voters) in any manner they see fit.

By practice, though not amendment, Americans have reformed this process. Indeed, fairly quickly, legislative appointment was replaced by the popular election of presidential electors. The reason elections like the one in 2000 -- in which the electoral and popular votes do not reach the same outcome -- are so disturbing is that most Americans think that they do, and should, select the president. No one stands up for an independent board of electors, because scarcely anyone believes that a qualified electoral elite exists. Again, where there are no experts, let’s let everyone have their say. This should serve as a reminder -- particularly to my colleagues in the academy -- about equality. We are all equally entitled to our opinion on electoral matters. That is why we vote.

This understanding has implications for the classroom that extend beyond politics. What we know, we should teach. We ought to keep our opinions to ourselves. This is an important point to keep in mind as we read polls, including a recent one by the Zogby organization, that suggest that the public thinks that political bias among academics is a real problem. The public might well have a valid point.

Too much is made of the fact that the views expressed by these academics seem at best out of the mainstream, and at worst dangerously radical. One would, after all, expect those who have dedicated themselves to the careful study of a subject to know more than most about their area of expertise. And those who know should not be bound by -- or be expected to teach about -- the opinions of those who do not know, even if those opinions are held by a majority of people.

This leads to the real objection that ought to be lodged against those who bring their political opinions into the classroom: Do they know what they are talking about? In the classroom, a basic distinction ought to be maintained between knowledge and opinion. To return to my earlier example, I “know” how the mechanics of the electoral system work. I have an opinion about who should be elected using this system. Therefore, I should teach only the former; not because I might offend the delicate political sensibilities of my students, but rather because this distinction between knowledge and opinion is fundamental to any academic endeavor.

Ideally, what scholars seek -- indeed what every educated person hopes to attain, however partially -- is to replace opinion with knowledge. Through both what and how we teach, instructors inspire in their students a sense of both what is known, and how much remains to be discovered. This is what the philosopher Socrates meant when he argued that the first step in the educational process is "to know what we do not know." By becoming aware of how little we know, we are motivated to learn.

The sin committed by teachers who spout their political views in the classroom is, therefore, not political, but academic. By feigning certainty where there is only opinion, they encourage ignorance in their students. Teachers are free to hold and express (outside of the classroom) any opinions that they wish. What they must not do (in the classroom) is to pretend to know more than they do.

As the writer G. K. Chesterton wisely observed, "It is not bigotry to be certain we are right; but it is bigotry to be unable to imagine how we might possibly have gone wrong." This type of bigotry does not serve our students or our democratic system. Avoiding it is not always easy, but it is our job.


Paul A. Sracic is a professor and chair of the political science department at Youngstown State University.

Academic Freedom, Outside the Academy

I recently had a discussion that led me to a basic question: Why is the concept of academic freedom as a semi-protected activity limited by custom to people who teach in universities? Why doesn’t it apply to any person engaged in research and publication on issues important in our lives? What is the theoretical underpinning of the argument that non-faculty don’t have academic freedom in the same sense that faculty do? What is it that faculty actually do that is different from what I do, at least part of the time?

Is it that faculty need to be free to publish important books and articles? I have published four books as author or contributing editor (three with a university press), one of which is a five-pounder and is considered the definitive modern work in its field. I have published chapters in other major books, 36 articles or commentaries on education issues, 75 on ornithology (mostly in non-refereed outlets) and another two dozen that don’t fit neatly into categories. This doesn’t count work that I produce in my job as a college evaluator. I’m also the new book review editor for a small, well-respected refereed journal and a glorious but undiscovered poet.

Because I work as a college evaluator and routinely review faculty qualifications, I can say that my actual output of what would normally be considered scholarly work is quite similar to what I would expect of a mid-career professor at a mid-level college. In short, in terms of tangible product, I do what they do.

Is it that faculty teach? Let us define teaching. Let me know when you’re done -- with luck, I will have retired by then. I suppose we have an obligation to at least attempt to answer the question, but allow me to argue that teaching and learning take place all the time in all parts of society, whether or not a traditional cage is constructed around the putative teachers and learners.

Is the difference that I as a non-faculty member have been classified by society as fit for some tasks but not for others? By whose order? Under what theory? With what brief? Certainly as a state employee I am obligated to perform the tasks that are in my job description, and likewise obligated not to go about publicly trashing the goals of my employer. Beyond this, am I not free to pursue the truth wherever it may take me?

Universities have traditionally been assigned by society the role of pursuing truth and transferring knowledge in a semi-protected setting, if not beyond the reach of interfering powers, at least having some defenses against those powers. This is a good thing, but doesn’t it seem strange that a special kind of institution in society must be set aside for this purpose?

I do not think that the traditional collegiate cloister as our sole reservation for academic freedom works very well any more. The ability of independent scholars to operate outside institutions has increased along with the utility of the Internet. The Supreme Court wrote, in an era before the personal computer, PDA and cell phone (to say nothing of iPhone), that:

“Our nation is deeply committed to safeguarding academic freedom, which is of transcendent value to all of us and not merely to the teachers concerned. That freedom is therefore a special concern of the First Amendment, which does not tolerate laws that cast a pall of orthodoxy over the classroom.” (Keyishian v. Board of Regents, 1967)

Where, and what, is the classroom today, 40 years downstream from Keyishian? If a friend of mine publishes a detailed study of hospital spending practices, molt strategies in the American Wigeon or the perfidy of Donald Rumsfeld on a blog, Web site or other nontraditional venue, and invites comment from all comers, isn’t that just as much a classroom as an enclosed space in which one human is bleating in person at a roomful of (mostly) younger humans? Certainly the gray area is taking on more and more layers and shades with the advent of more varieties of distance-learning.

To spend a moment longer in the relatively cramped legal arena, the Supreme Court has also granted certain kinds of academic freedom protections to universities themselves, under a theory that they as institutions have a special role in society and need some protection from unseemly attempts to influence their work. Yes, to be sure, that is true, but there are other institutions in society -- publishers, think tanks, foundations -- whose role is, if not the same in structure, surely overlapping in goal and function.

At a time when more and more people of all ages get their news and information off the Internet, and when young people of traditional college age do a vast amount of their fact-gathering online (whether the facts are, if you will, true, is another question), the argument that universities need a special protected status as our principal conductors of information and values to young adults has been losing force for years.

We see more and more corporate sponsorships of research or faculty positions and degree programs that, as a practical matter, relate solely to the products of one or two companies. The idea that the university is separate from the pressures of the outer world (and therefore that people who work there should have a special status for themselves and their work) is getting harder to sustain. Should people employed by banks, supermarkets or governments who publish academic work be afforded protection under an academic freedom theory from retaliation by their employer if the employer happens to dislike the work? I can’t think why not.

When we have resources as good as, for example, Reginald Shepherd’s teaching-blog on poetry, the argument that the traditional classroom is necessary as a baseline for the theory, practice and legal protections of academic freedom begins to look like an argument that a sufficiency of draft horses is necessary for national security.

Norms move forward. I argued a while ago ("Accrediting Individual Instructors," The Independent Scholar 18(1):10-12, Winter 2004) that we need to stop accrediting colleges and start accrediting teachers. The fact that a top-flight poet like Shepherd now contracts with students privately and engages in significant dialogues on poetry and culture via a blog is but one example of an educational trend that militates toward recognition that academic freedom, in its purposes, results and legal classification, needs to be decoupled from the nature of an individual scholar’s employment.

Academic freedom adheres to the purpose and function of academic inquiry, not to technicalities of institutional affiliation. Anyone who engages in inquiry and publication according to the norms of academe is entitled to the scholar’s woolen cloak. It may not protect against all enemies, but it serves to reduce the chill of unpopular thought.


Alan Contreras works for the State of Oregon, where Article 1, Section 8 of the Oregon Constitution allows him to publish what he pleases. His views do not necessarily represent those of the commission. He blogs at oregonreview.blogspot.com.

A Crisis of Ethic Proportion

The financial sector catastrophe and consequent worldwide recession are a crisis of “ethic” proportion, in Vanguard founder John Bogle’s words. Higher education’s own responsibility for the failures of ethical leadership in business, the gatekeeper professions, and government should trigger a careful self-assessment. Could it be that the academic profession, whose members both educate and serve as role models in the formative years for leaders in business, government, and all the other peer review professions, is falling short in its own ethical responsibilities?

A major theme of "The Future of the Professoriate: Academic Freedom, Peer Review, and Shared Governance," the first in the Association of American Colleges and Universities' new Intentional Leadership in the New Academy series of essays, is that the academic profession has been failing for many years in its ethical duty to acculturate new entrants into the tradition and ethics of the profession. The central argument in "The Future of the Professoriate" is that members of a peer-review profession cannot aggressively justify and defend their control over professional work when they do not both understand the profession’s social contract and internalize their responsibilities under the social contract. The social contract of each peer-review profession is the tacit agreement between society and members of a profession that regulates their relationship with each other, in particular the profession’s control over professional work. Essentially, in order for the public to grant a peer-review profession greater autonomy and control over its work than society and employers exercise over other occupations, the public must trust that the profession and its members will use that autonomy at least to some degree to benefit the public in the area of the profession’s responsibility, not abuse occupational control over the work merely to serve self-interest.

The simple fact is that all the data available indicate that a substantial proportion of graduate students and faculty members do not clearly understand the profession's social contract, academic freedom, shared governance, and each professor's and the faculty's specific duties that justify the profession's claims to autonomy. Osmosis-like diffusion of these concepts and duties does not work. There must be required education on professional ethics for graduate students and entering and veteran faculty just as there is for law students in all states and members of the legal profession in many states. (Academic Ethics, published by the American Council on Education/Oryx Press in 2002, outlines the content of this education.)

The governing boards of many colleges and universities represent the public in the social contract between the public and the academic profession. "The Future of the Professoriate" argues that the boards and their senior administrative teams have faced substantial market changes in higher education in recent decades; the current budgetary disaster driven by reduced taxpayer support for public higher education and reduced endowments is among the most difficult of these market changes. While members of all peer-review professions carry an ongoing burden to justify to the public (and the boards representing the public) the profession’s occupational control over the work, carrying this burden is particularly critical during a time of rapid market change.

The report's analysis is that during this period of market change, the academic profession has been almost totally missing in action in mounting a robust public defense of both how the public benefits from the profession’s autonomy and control over its work in the form of academic freedom, peer review, and shared governance and how the profession and its members are actively fulfilling their duties under the social contract. Paradoxically, while we are educators, we are not educating. The situation is similar to the failure of the medical profession to mount a robust public defense of its autonomy during the 1980s and 1990s when the health care market changed toward managed care that dramatically reduced the medical profession’s control over its professional work.

At a significant swath of institutions, the academic profession’s defense of the social contract has focused on rights and job security. As Eliot Freidson in Professionalism: The Third Logic (University of Chicago Press, 2001) has observed, when the peer-review professions defend their social contracts, they typically rely on a rhetoric of rights, job security, and “good intentions, which [are] belied by the patently self-interested character of many of their activities. What they almost never do is spell out the principles underlying the institutions that organize and support the way they do their work and take active responsibility for [the realization of the principles].” They do not undertake responsibility for assuring the quality of their members’ work. The academic profession’s anemic defense of its social contract confirms Freidson’s observation.

The predictable result of an anemic defense of a profession’s social contract during a time of market change is that the society and employers will restructure control of the profession’s work toward the regulatory and employer control typical for other occupations -- essentially the default employment arrangements in a market economy. This is what has been happening to the academic profession. The boards at many colleges and universities have been renegotiating a sweeping change in the academic profession’s social contract over many years to reduce the profession’s autonomy and control over professional work. "The Future of the Professoriate" details how the renegotiation is most evident in the dramatic increase in contingent faculty, to the point that, by 2003, 59 percent of all newly hired full-time faculty started in non-tenure-track positions.

The academic profession must not resign itself to the current trend toward contingent faculty, but it cannot reverse the trends toward a higher proportion of contingent faculty and less occupational control over professional work by employing a rhetoric of rights, job security, and good intentions. However, professors cannot defend the social contract without both having the knowledge necessary to make the defense and actively meeting their duties under the social contract. The single most important step for the profession is improving the acculturation of graduate students and veteran academics into the tradition and ethics of the profession. The best starting point at each institution may be a simple faculty self-assessment of the degree to which the faculty is helping new and veteran faculty members understand and internalize both the minimum standards of competence and ethical conduct for the profession (the ethics of duty) and the core values and ideals of the profession (the ethics of aspiration).

If the academic profession at many institutions does not undertake these responsibilities, then this crisis of ethic proportion will continue, and the trajectory for the academic profession for the next twenty years will, in all likelihood, look like the trajectory for the last thirty years. Members of the profession will continue a slow transformation toward employment as technical experts subject to the dominant market model of employer control over work.

While many in the profession believe the battle is against oppressive governing boards, administrators, and market forces, the battle is actually for the soul of the profession. Imagine a world in which each professor at an institution had fully internalized the tradition and ethics of the profession. We are educators. From a position of knowledge and moral authority, not just self-interest, we could then convince the public -- and, most importantly, the governing boards and administrative leadership who are trustees for the public good of creating and disseminating knowledge -- that academic freedom, peer review, and shared governance best serve the institution’s mission.


Neil Hamilton is professor of law and director of the Holloran Center for Ethical Leadership in the Professions at the University of St. Thomas.

Don't Avoid Conflicts; Mine Them

It was late at night on a spring evening in 2006 at Columbia University, and a dozen of us remained around a table; no one wanted to leave. Earlier I had spoken about how to identify what was and what was not anti-Semitism. This group of progressive Jewish students wanted to keep talking. I had expected their post-presentation conversation to be about Zionism or definitions of anti-Semitism, but what made the students want to stay for that last hour was a discussion about the college experience itself.

After talking about the expected topics, one student had said, “This is the first time I’ve felt comfortable saying what I really think about Israel.”

When I asked why that was, she said, “Because I always have to gauge, if I say what I think, whether that would impact a grade or a friendship.”

“Is this only about Israel that you find yourself repressing your views?” I asked. “No,” she said. Others agreed – this culture of double-checking one’s thoughts, they said, applied to many issues, and was experienced by non-Jewish students at Columbia, too, as well as by students they knew at other campuses.

How depressing that at an institution designed to shake up the thinking of smart young people, the message heard instead is the importance of self-censoring. Not because of harassment or intimidation, but because there was insufficient space created and cultivated for students to take intellectual risks. College should be the time when students receive encouragement to say things that others might find difficult or even offensive, as part of the learning process.

The flip side of this problem occurred Feb. 8 at the University of California at Irvine. Israeli Ambassador Michael Oren spoke, or at least he tried to: he was repeatedly interrupted by heckling from anti-Israel students.

This is part of a disturbing trend of Israeli speakers on campus being denied the ability to speak, or to speak without harassment (as has happened at the University of California at Los Angeles, the University of Pittsburgh, the University of Chicago and elsewhere). The UCI campus has a long history of anti-Israel and anti-Semitic incidents, usually tied to its Muslim Student Union. These students were not afraid to say what they thought, but they displayed a complete unwillingness to listen. They interfered not only with Oren’s ability to share his ideas and experiences, but, even more importantly, with the ability of their classmates to learn.

To the university’s credit, protesters were removed and arrested. An Irvine official noted that disrupting a speaker violates the campus code of conduct, and that suspensions or expulsions might ensue. The students who disrupted the event must be disciplined. What they did directly undermines the integrity of the academic process, much as plagiarism does, and should not be tolerated.

But there is a larger issue here. As young adults become engaged with new ideas, especially ones that touch some supercharged aspect of their identity, they may lose the capacity to see complexities and grays, and self-righteously see themselves as arbiters of correct thoughts or morality.

Rather than just accept this developmental zealotry as a fact of life, university leaders should strive to educate students who can think clearly about -- as opposed to demonize and dismiss, or be fearful of engaging -- ideas with which they do not agree.

The problem is that students do not sufficiently understand the nature of the academic enterprise, and what is expected of them. To help them learn, faculty members and university leaders must mine conflicts (about anything, not just views toward the Middle East), not avoid them. It is from difficult and contentious questions, not the easy or formalistic ones, that students can learn the most.

Yes, it is true that students identify with one faction or another and may have a great desire to “win” a political contest. But universities should be much clearer about defining the importance, and uniqueness, of the academic enterprise and the culture it requires. No student should be afraid to say what he or she thinks, and no student should prohibit another from learning. It is no accident that the smartest people I know are more likely to begin a sentence with “I might be wrong, but....” It would help if students learned they might be too, and that being wrong is not the end of the world.

Academic freedom, of course, requires that people have the right to talk on campus. But for that freedom to be real, rather than a nice-sounding notion, much more than policy statements and enforcement of rules is required. Campus administrators need to have a clear goal: that every student should understand, in his or her core, that the purpose of a college education is to learn one thing -- to be a critical thinker.

A critical thinker appreciates ideas that challenge or contradict more than those that endorse or confirm. And a critical thinker takes risks.


Kenneth Stern is director of the American Jewish Committee's Division on Antisemitism and Extremism.

The N-Word

As a child of the civil rights movement of the 1950s and 1960s, I cannot say the n-word unself-consciously. Nevertheless, it is regularly placed on my tongue — not by Joseph Conrad, since I no longer teach fiction, but rather by African American poets from Langston Hughes to Amiri Baraka. Some faculty members no longer teach Conrad’s The Nigger of the Narcissus, since the title alone is enough to place an impassable roadblock on the syllabus. I have no choice. I am not about to teach African-American poetry while repressing its diction. So much for the notion that the word is forbidden in the classroom.

I take a certain pride in reading poetry dramatically. I urge my students to perform the week’s poems at home out loud before coming to class. I perform Robert Bly’s Vietnam poem “Counting Small-Boned Bodies” much as he did, in a wrinkled rubber mask suggesting ancient evil gloating over death. I try to occupy the bodies of the racist speakers in Robert Hayden’s “Night, Death, Mississippi.” But the n-word exits my mouth in a flat, decathected tone that does no justice to its injustice. The word “nigger” defeats me.

I would not myself say what Allen Zaruba did this semester, though I understand how both personal experience and a certain collective cultural inertia brought him there. A part-time faculty member at Towson University, he described himself in class as “a nigger on the corporate university plantation.” People now regularly refer to the corporate university’s plantation mentality, to an authoritarian style of top-down management invariably indifferent to its underpaid and exploited employees. And I, among others, have called its contingent teachers “wage slaves,” drawing the usage from Marx and Emma Goldman and the long history of the labor movement. The phrase resounds through Joe Hill’s songs. So Zaruba’s next step was surely inevitable. Indeed the administration’s decision to fire him summarily exposed the extreme vulnerability of contingent faculty members and reflected the very power relations his declaration evoked.

Zaruba was not teaching a Langston Hughes poem, so Towson University administrators finessed their abridgement of his academic freedom and due process rights by saying the declaration was not specific to the classroom context. But it was germane to every moment of every class he and others teach at Towson. It underpins the classes Towson’s students take. The exploitation of contingent teachers is now the bedrock of American higher education. Zaruba has the right to testify to his working conditions and all students’ learning conditions in every class he teaches. He cannot devote his entire course to those facts, but neither can he be compelled to suppress them. While the analogy between today’s contingent teachers and plantation era slaves is far from exact, and it is arguably clumsy and historically inept, the formulation is well within his pedagogical rights.

The staff of the American Association of University Professors has appropriately emphasized that Towson violated our recommended standards. Zaruba was entitled to a hearing before his faculty peers, not dismissal by administrative fiat. The 13th section of the AAUP’s Recommended Institutional Regulations makes it clear that part-time faculty members, like full-time faculty members, can be fired before the terms of their appointments end only for cause. “Cause” must be adjudicated at a faculty hearing. And every faculty member is due a hearing when a violation of academic freedom is asserted. Zaruba never received the educational equivalent of his day in court.

Of course a more enlightened and courageous administration might have expressed regret at Zaruba’s rhetoric while defending his right to use it and might even have acknowledged a certain logic underlying his claim. But that would have entailed a critique of Towson’s hiring practices. Instead administrators chose to behave like plantation managers, punishing him by fiat. If administrators wanted to sanction Zaruba in any way, they had to base their action on the recommendation of a faculty hearing, a procedure they failed to follow. But I do not believe Zaruba’s remark justified either intervention or punishment.

Zaruba apologized abjectly, more than he should have needed to. That should have put the matter to rest, but the apology was to no avail. The administration treated what was in fact an economic and political analysis in miniature — decidedly within his academic rights — as a crime so monstrous that frontier justice had to be meted out without delay.

Whatever else it was, Zaruba’s remark was not a racial slur. It was not directed at people of color generally, as was its 2007 use by a board member at Roger Williams University who was compelled to resign. It was not uttered as an assault on one or more students, as it was when a baseball coach at the University of Oklahoma lost his job in 2005. Zaruba’s declaration was a warranted expression of personal anguish, an instructor’s reflexive act of higher education witness. Firing him summarily undermines academic freedom, eviscerates shared governance, and diminishes us all. Allen Zaruba deserves his job back.

The New Campus Culture Wars

I was an undergraduate in the mid-1990s, the heyday of identity politics. We read copious amounts of Cornel West and bell hooks, demanded multicultural centers and gender studies departments, applauded Ellen’s coming out and protested demeaning mascots like Chief Illiniwek. “Race-class-gender-ethnicity-sexuality” was repeated so often, it almost became a single word.

No doubt parts of the identity politics movement went off the cliff (down with Western Civ!). And there was pushback, of course (only the West has civilization!). But over all, university administrations recognized an important opportunity and charted a sensible middle course.

In a society with too much racism and sexism, in a globalized world with too much ignorance and misunderstanding, campuses could be alternate universes – models where equity, harmony and appreciative knowledge of other cultures were the norm, launching pads for leaders who absorbed that larger vision and learned the skill set to improve the broader society when they graduated.

So new centers were started, new professors hired, new course requirements added. And most importantly, new norms were set. College leaders at the highest level defined their campuses as models of inclusiveness and open-minded learning. Incidents that were seen as marginalizing a particular group (white students showing up to a party in blackface), or books like The Bell Curve that argued that some races simply had lower aptitude than others, were met with the higher education equivalent of social outrage. Of course, the flags of “free speech” and “academic inquiry” were raised, but the mantle of building an inclusive learning community carried the day.

Muslim students waking up to chalk drawings mocking the Prophet Muhammad on their college quads are probably wondering why their identity is not a cherished part of the college ethos of inclusiveness. In case you missed it, “Everybody Draw Muhammad Day,” which is today, is a campaign that has hit several campuses already and has the potential to scale, fast.

The only ingredients you need are a handful of students who believe they are crusaders for free speech, some chalk and the cover of darkness. The campaign was sparked by Comedy Central’s decision to censor an episode of “South Park” that depicted the Prophet Muhammad in a demeaning manner. “South Park” has a reputation for offending roundly, and is, of course, on a cable channel people pay for and opt in to. A college quad is a public place where there is an implicit promise by the university that students of all backgrounds will feel safe and accepted.

When there is a racially demeaning event on a college campus – like the Compton CookOut at the University of California at San Diego – higher education responds like it’s a five-alarm fire. Administrators organize town hall meetings to discuss the threats to inclusiveness. Presidents send out e-mails to the whole campus calling for racial sensitivity. Faculty committees are formed to submit recommendations on how to make minority students feel welcome. The incident is used, appropriately, as a teachable moment, an opportunity to affirm and expand the university as an inclusive learning environment.

If there was any alarm raised by higher education in response to the chalking Muhammad incidents, it’s been hard to hear. (With the important exception of chaplaincies on certain campuses that have adapted to engage religious diversity.) For the most part, the discussion has been in the free speech vs. Fundamentalist Islam frame. But isn’t this incident also a teachable moment about identity? Shouldn’t universities be boldly advancing the narrative of actions that build an inclusive campus vs. actions that marginalize a community?

While this particular incident may be about the sensitivities of Muslim students, there is a much larger issue at play here. What the race-class-gender-ethnicity-sexuality movement of the 1990s missed was religion. But faith can’t be swept under the rug any longer. Religion is the new fault line in the culture wars. From the “The Passion of the Christ” to the passions raised by the Middle East, from the new aggressive atheism to the religious revival among evangelicals and Muslims, conflicts in the culture are quickly becoming conflicts on the quad.

Colleges ought to view this as an opportunity to be embraced, rather than a headache to be ignored. Just as campuses became models of multiculturalism, so too can they become models of interfaith cooperation. After all, campuses gather students from different religious backgrounds (including no religion at all), they view themselves as a vanguard sector that models positive behavior for the broader culture, and they already have an ethos of pluralism.

An awful lot is at stake here, especially if campuses want to maintain their reputation as inclusive learning environments. Just about the only agreement among different religious student groups right now is that the only identity you can openly insult on a campus without inviting social outrage is religion.

And as far as being the nation’s flagship learning environments, higher education ought to consider this: Probably the most salient thing many U.S. college students know about the central figure in the world’s second largest religion -- among the most influential people in history -- is that Comedy Central won’t let him be portrayed on “South Park.”


Eboo Patel is the founder and executive director of Interfaith Youth Core, an organization that works with college campuses on religious diversity issues.

New Orleans, Back in the Fold

AAUP clears two New Orleans universities from its censure list, but cites Bethune-Cookman, Idaho State and RPI.

Trial Balloon on Ward Churchill

University of Colorado officials find themselves boxed into a corner as they try to figure out what to do about Ward Churchill.
