Toby (not his real name) flunked a graduate course I taught last year. He failed the in-class assignment (a mid-term essay exam) as well as the out-of-class assignments (a couple of case analyses and a take-home exam). Reviewing Toby’s work was excruciating; extracting coherence from his paragraphs was a futile exercise, even with repeated readings. Theoretical analysis in his writing was virtually nonexistent. Put simply, this was an academic train wreck.
As I interacted with Toby over the course of the term, I kept asking myself, “How did this pleasant young man ever manage to obtain an undergraduate degree?” He certainly had one, awarded by a regionally accredited institution (not mine). And how did he get into yet another institution (my institution, but not my program) to pursue a master’s degree?
Welcome to the world of Lower Education. Toby’s case may be extreme, but it underscores a fundamental reality that shapes a major segment of higher education in the United States: Colleges cannot survive without students, so colleges that have a difficult time competing for the “best” students compete for the “next best” ones. And colleges that have trouble securing the “next best” students focus on the “next-next best” ones, and on and on and on, until a point is reached where the word “best” is no longer relevant. When this occurs, students who are not prepared to be in college, and certainly not prepared to be in graduate school, end up in our classrooms.
This is not startling news. It’s a rare college or university that does not have an academic remediation/triage center of some kind on campus, where an enormous amount of time is spent teaching students skills they should have learned in high school. To be sure, many of these unprepared students drop out of college before graduation, but a significant percentage do make it to the finish line. Some of the latter will have indeed earned their degree through great effort and what they’ve learned from us. But others will have muddled through without displaying the skills we should require of all students. My 35 years of university experience tell me that in these cases faculty collusion is often a contributing factor.
What is the nature of this collusion? In far too many instances, little is required of students in terms of the quality and quantity of their academic work, little is produced, and the little produced is, to put it mildly, graded generously. Some might argue that the mind-numbing proportions of A’s we often see these days, along with the relative scarcity of low grades, are a reflection of more effective teaching strategies being employed by professors, coupled with a growing population of bright students committed to academic excellence. Unfortunately, this uplifting scenario strikes me as much less persuasive than one that implicates factors such as transactional/contract grading (“5 article reviews equal an A, 4 equals a B,” etc.), faculty who wish to avoid arguing with increasingly aggressive students about grades, faculty who believe that awarding high grades generates positive student evaluations, faculty who express their philosophical opposition to grading by giving high grades, and the growing percentage of courses taught by part-time and non-tenure-track faculty members who might see the assigning of a conspicuous number of low grades as a threat to their being re-hired.
One of the most pernicious consequences of this state of affairs is cynicism toward higher education among those most directly responsible for delivering higher education -- the faculty. Research suggests that one of the most powerful sources of motivation for outstanding employee performance is goal/value internalization. This occurs when espoused organizational goals and values are “owned” by organizational members, who then strive to achieve the goals and live up to the values in their work. Colleges and universities have traditionally been in a privileged position with respect to drawing upon this type of motivation, given their educational mission. The beliefs associated with this mission can include a sizable chunk of myth, but as societal myths go, the ones embraced by higher education (e.g., the ability of research, knowledge, and analytical skill to enhance the public good) tend to have high social value.
In the current zeitgeist, however, many faculty are dismayed to see the provision of educational credentials trumping the actual provision of education. (Fifty might not be the new forty, but the master’s degree is certainly the new bachelor’s.) This perception is enhanced by a proliferation of curriculum-delivery formats (weekend courses, accelerated and online programs, etc.) whose pedagogical soundness often receives much less attention than the ability of the formats to penetrate untapped educational markets. It is difficult for a strong commitment to academic integrity to thrive in such environments.
Faculty who are distressed over all of this should not wait for presidents, provosts and deans to rescue higher education from itself. Moreover, regional accrediting bodies, despite their growing emphasis on outcomes assessment, do not typically focus on courses, programs and admissions standards in a way that allows them to adequately address these issues. For the most part it is faculty who teach the classes, design and implement curricula, and, at least at the graduate level, establish admissions policies for programs. What should faculty do? I offer three modest suggestions:
At the departmental level, work to develop a culture where expectations for student performance are high. When faculty members believe that teaching challenging courses is “the way we do things here,” they are less likely to offer non-challenging ones.
Advocate throughout the institution for the centrality of academic quality to policy making, program development, and program implementation. The question “What are we doing to ensure that X embodies a commitment to academic excellence?” should never be left implicit.
Create opportunities for faculty and administrators to come together in small groups to explore the issues raised by Lower Education. These two constituencies need to find a way to collaborate more effectively, and the mutual stereotyping that frequently characterizes their relationship represents a major obstacle. If we want our conversations relevant to Lower Education to change, let’s experiment with changing the structure within which some of those conversations take place.
Contemplating Lower Education reminds us that faculty members will always face pressures to compromise their academic principles. But explanations of unethical behavior should never be confused with justifications for such behavior. Ultimately, it was the faculty who gave Toby his credential of a bachelor’s degree. They shouldn’t have.
Michael Morris is professor of psychology at the University of New Haven, where he directs the master’s program in community psychology. Since 1998 he has served as an evaluator for NEASC, the New England Association of Schools and Colleges.
Even with the anniversary approaching, reading about 9/11 feels like a matter of duty, not desire. Especially with the anniversary, in fact: Magazines and PDF printouts devoted to the 10th anniversary of 9/11 accumulated on my desk for more than a week before I found the will to do more than stare at them. Eventually, the work ethic asserted itself, and this column will digest some of the recently published material on 9/11.
But this spell of hesitation bears mentioning, because a temporary failure of nerve was probably more than my idiosyncrasy. The event itself is hard to think about -- just as it was at the time. My recollection of that day is not primarily one of fear, though there was plenty of that. (We live in Washington; according to a news report that morning, a car bomb had gone off downtown; this proved false, but it stuck with me.) Rather, it was a state of extremely vivid confusion -- of being keenly aware of each passing hour, yet unable to take in the situation, let alone to anticipate very much of anything.
Who could? The experience was unprecedented. So much of the past decade of American life can be traced back to that day: wars, drones, security, surveillance, detention, “enhanced interrogation,” torture porn, the extremes of public emotion about having a president whose middle name is Hussein…. One thing that changed after 9/11 was that, after a while, people quit saying that “everything changed” on that date. But something did change, so that it is difficult to consider the way we live now without returning, sooner or later, to 9/11.
It is a date that names an era. Melvyn P. Leffler’s essay “9/11 in Retrospect,” appearing in Foreign Affairs, tries “to place the era in context and assess it as judiciously as possible.” That means from the perspective of an unexcitable centrism, with an eye to calculating the long-term effects on U.S. power. Leffler, a professor of history at the University of Virginia, is co-editor, with Jeffrey Legro, of In Uncertain Times: American Foreign Policy After the Berlin Wall and 9/11, published this summer by Cornell University Press. (At the time of this writing, his article is behind the journal’s paywall.)
Against those of us who believe that George W. Bush came into office with the intention of taking on Iraq, Leffler maintains that the administration was overwhelmingly preoccupied with domestic policy before 9/11 and improvised its doctrine of “anticipatory self defense [or] preventative warfare” out of “a feeling of responsibility for the public and a sense of guilt over having allowed the country to be struck.” In shifting gears, Bush and his advisers “had trouble weaving the elements of their policy into a coherent strategy that could address the challenges they considered most urgent.”
The combination of tax cuts and increased military expenditures “seriously eroded” the country’s “financial strength and flexibility,” even as occupations and counterinsurgencies undermined U.S. credibility as a force in the Middle East and Persian Gulf. “Iraq was largely eliminated as a counterbalance to Iran,” writes Leffler, “Iran’s ability to meddle beyond its borders increased, and the United States’ ability to mediate Israeli-Palestinian negotiations declined.” Meanwhile, “China’s growing military capability” began “endanger[ing] the United States’ supremacy in East and Southeast Asia” -- which was probably not high on Osama Bin Laden’s agenda, but history is all about the unexpected consequences.
The attacks on 9/11 “alerted the country to the fragility of its security,” Leffler concludes, as well as “the anger, bitterness, and resentment toward the United States residing elsewhere, particularly in parts of the Islamic world. But if 9/11 highlighted vulnerabilities, its aftermath illustrated how the mobilization of U.S. power, unless disciplined, calibrated, and done in conjunction with allies, has the potential to undermine the global commons as well as protect them.”
“It’s been a sad, lost, and enervating decade,” says the editorial note introducing the discussion of 9/11 in Democracy, a quarterly journal calling for “a vibrant and vital progressivism for the 21st century.” With contributions by 11 academics and journalists -- running to 35 pages of the fall issue -- there is too much to synopsize, but the title sums things up reasonably well: “America Astray.” (The symposium is currently posted online, in advance of the print edition.)
But two interventions stand out from the prevailing tone of frustration and worry. Being the gloomy sort myself, I want to emphasize them here, just to see what that’s like.
Elizabeth Anderson, who is a professor of philosophy and women’s studies at the University of Michigan at Ann Arbor, writes with evident disappointment that Bush’s legacy lives on: “Overall, Obama’s record on executive power and civil liberties diverges little from his predecessor. In certain respects it is even worse....” She refers to continued domestic spying, huge expenditures for the National Security Agency, the prosecution of leakers “on an unprecedented scale,” and Obama’s targeting of an American citizen, Anwar al-Awlaki, for “extrajudicial killing … even outside any battlefield context.”
“The traumatic experience of 9/11 lies behind all of these [actions and policies],” Anderson writes. But the revival of “public demand for privacy, civil liberties, and greater transparency is likely -- one hopes, anyway -- to override the fears that underwrite state violations of constitutional rights.” The profound demographic shifts of the coming decades mean that political parties “will soon see that they have more to gain by integrating immigrants and their American children into society than by pandering to anti-immigrant prejudice.”
Well, it’s hard to make predictions, especially about the future (as Yogi Berra said, or should have), but the notion of moving beyond the post-9/11 rut is certainly appealing. The other Democracy contributor to offer an encouraging word is Fawaz A. Gerges, the director of the Middle East Center at the London School of Economics, who recapitulates some of the argument from his book The Rise and Fall of Al-Qaeda, just published by Oxford University Press.
“The Arab Spring reinforced what many of us have known for a while,” he writes. “Al Qaeda’s core message is in conflict with the universal aspirations of the Arab world…. Bin Laden and his successor, Ayman al-Zawahiri, neither speak for the umma (the global Muslim community) nor exercise any influence on Arab public opinion.”
The organization has shrunk from three or four thousand fighters to perhaps a tenth of that, with its best cadres now either dead or “choosing personal safety over operational efficiency.” While Zawahiri is dangerous, Gerges says, he lacks Bin Laden’s charisma or strategic sense. The best way to undermine what remains of the organization would be to withdraw American troops from Muslim countries.
Far bleaker is Michael Scheuer's assessment in “The Zawahiri Era,” published in the new issue of The National Interest, a conservative policy journal best known as Francis Fukuyama's venue for proclaiming “The End of History” in 1989. Scheuer is a former CIA analyst and the author of Osama Bin Laden (Oxford University Press, 2011). While noting Zawahiri’s “potentially debilitating personality traits and leadership quirks,” Scheuer also calls him “a rational, prudent, brave, dedicated and media-savvy leader,” fully capable of rebuilding the movement.
But that assumes Zawahiri can attract new fighters. Whatever recruitment spike Al Qaeda enjoyed after 9/11 has long since exhausted itself, to judge by the excerpt from The Missing Martyrs by Charles Kurzman appearing in the September-October issue of Foreign Policy (which is something like Foreign Affairs' younger, better-dressed sibling). Kurzman is a professor of sociology at the University of North Carolina at Chapel Hill. As with Gerges and Scheuer, his book is from Oxford University Press. “By my calculation,” he writes, “global Islamist terrorists have managed to recruit fewer than 1 in 15,000 Muslims over the past quarter century and fewer than 1 in 100,000 Muslims since 9/11.” (The article is available to subscribers.)
Mohammad Atta and his associates were not riding the wave of the future, then: “There aren’t very many Islamist terrorists,” Kurzman says, “and most are incompetent. They fight each other as much as they fight anybody else, and they fight their potential state sponsors most of all. They are outlaws on the run in almost every country in the world, and their bases have been reduced to ever-wilder patches of remote territory, where they have to limit their training activities to avoid satellite surveillance.”
So much of the discussion leading up to this anniversary looks to the present or the future -- as if 9/11 were not in the past, but rather something that still abides. As Jürgen Habermas said in an interview a few years ago, 9/11 may have been the first event to be experienced, as it was happening, on a really global scale. That may have something to do with the way it seems to have irradiated everything, and to linger in the air.
In their article “The September 11 Digital Archive,” appearing in the fall issue of Radical History Review, Stephen Brier and Joshua Brown seem to echo the philosopher’s point. “One difference demarcating September 11, 2001, from previous epochal historical moments,” they write, “was its status as the first truly digital event of world historical importance: a significant part of the historical record – from email to photography to audio to video – was expressed, captured, disseminated, or viewed in (or converted to) digital forms and formats.”
To preserve these traces for the future was an undertaking both urgent and vast. Within two months of the attacks, the American Social History Project at the City University of New York Graduate Center and the Center for History and New Media at George Mason University began working to gather and store such material, as well as thousands of recollections of the day submitted by the public. (Brier was a co-founder of the ASHP and Brown is currently its executive director.)
While still under development -- adding adequate metadata to the files, for example -- the September 11 Digital Archive is available online and should be taken over by the Library of Congress in 2013. It should not be confused with the LoC’s September 11, 2001, Web Archive, which has screen shots of websites around the world that were taken, according to the library’s description, between September 11 and December 1 of 2001. Unfortunately, the collection is rather primitive and unreliable. A number of items are actually from late 2002 and have no bearing on 9/11; some entries in the register turn out to have no corresponding webpage.
No doubt a much better digital archive for 9/11 is on an NSA server somewhere. It may be some while before historians get to see it – maybe by the centennial? In the meantime, the rest of RHR's special issue "Historicizing 9/11" can be downloaded here.
Always excited by conflict, media commentators have recently been riveted by the even more dramatic spectacle of impasse -- more specifically, by the ever-proliferating standoffs between adversaries who refuse to budge: players and owners in the NBA and NFL at odds and triggering lockouts in both leagues; the legislature and the governor in Minnesota holding their ground and shutting down state services; and Republicans and Democrats in Washington failing to agree on anything.
These stalemates – and others I could cite — are challenging the reputation of colleges and universities as unrivaled paragons of inaction. In fact, at its most dysfunctional – in the grip of a filibuster, say, or tangled up in arcane rules – the stalled, quarreling U.S. Senate can make the Faculty Senate look like a SWAT team.
I want here to look at what universities can learn from legislative paralysis, particularly the gridlock stymieing Washington. I start from the assumption that universities, more than most organizations, emphasize achieving consensus in decisions. At times, many of us in academe take pride in our commitment to consensus-based decision making, aligning it with such positive values as involving people in the decisions that affect them and favoring persuasion over coercion. At other times, however, even the most forceful advocates of consensus-based decision making, among whom I count myself, get impatient. Our frustration leads to familiar complaints about herding cats, never getting anything accomplished, and enduring interminable meetings that only complicate problems instead of resolving them.
Our commitment to consensus waxes and wanes for many reasons but primarily because we are ambivalent about compromise. Compromise is almost always essential to achieving consensus in higher education. A proposed major change in a university – for example, a revision in the academic calendar or curriculum – typically attracts a core of supporters and an equally vocal group of naysayers. Between these extremes lies a not yet committed, more or less curious group, sometimes a majority of faculty members, who need to be brought along if the proposal is going to succeed. I say "succeed" rather than "pass" because without sufficient support, even a proposal approved by the majority can still be sabotaged or at least stalled. Tenured faculty opponents of the change can continue their dissent with impunity. Lukewarm faculty members can maintain their disengagement, refusing to staff key committees that may be necessary to implementing the change. Although unanimity is neither essential nor realistic, sufficient consensus, not just a majority vote, is crucial.
Measuring "sufficient consensus" is a judgment call administrative and faculty leaders must make before moving on. Familiar marketplace metaphors often guide our reasoning. The "buy-in" of the uncommitted results from "selling" them something. It can be something tangibly in their self-interest – e.g., the curricular change might lower teaching loads – but often in colleges and universities, carrots are as hard to come by as sticks, especially now, when budgetary pressures are increasing class size, freezing salaries, and whittling away travel support. Rewarding cooperation becomes as problematic as punishing intransigence. Buy-in accordingly comes from allowing the initially disaffected to leave their mark on the proposal that results: offering amendments, rewriting sections, raising objections to be dealt with later, all with the ultimate goal of achieving broad "ownership."
The difficulty of reaching this goal is compounded not only by the paucity of material incentives in universities but by a culture that justifiably affirms the intellectual independence and creativity of its members and has difficulty mustering enthusiasm for anything that sounds written by a committee. Absent fiscal exigency, ending a campus discussion of a contentious issue thus becomes as difficult as starting one.
Some critics have used the slow pace of decision making in universities against them -- as evidence, for example, that universities need to be run more efficiently, like businesses, or that tenure allows professors to remain narcissistic, irresponsible adolescents who never learn to work with others. Fans of for-profit higher education, where CEOs need not wait for a faculty committee to review anything, like to talk about the glacial pace of deliberation in traditional higher education.
But insulting professors and universities deflects attention from the even slower progress of national legislative decision making, where much more is at stake and deadlines loom ineffectually, like warning signs no one reads. At this level, suspicion of compromise has given way to hostility, with President Obama the target, contributing to the national legislative gridlock. It is striking how criticisms of Obama from the left and the right consistently disparage compromise. From the point of view of Frank Rich, Paul Krugman and other liberal columnists, Obama is a disappointing centrist who caves in too readily to his adversaries. From the point of view of Tea Party Republicans, however, he is a steamrolling socialist who must be resisted at every turn, not appeased in any way.
Either way, compromise gets stigmatized: as something the president engages in too readily or as a trap his right-wing adversaries must avoid. The only resolution of their differences that these ideological opponents can imagine is somehow tilting the balance of power in their favor: a game-changing election that will finally allow their side to get something definitive done. The game being changed or ended is the process of debate and negotiation across differences, which few are confident will lead to a better outcome than their own initial position. We are left with paralysis, short-term fixes, posturing for one’s allies, and endless searches for opportunities to weaken the other party.
Here is one example among many of liberal columnists’ wanting to toughen up Obama’s resolve by curtailing what they see as his penchant for compromise: in his June 10, 2011, New York Times opinion piece, Joe Nocera expressed disappointment with Obama’s failure to nominate consumer advocate Elizabeth Warren to direct the new Consumer Financial Protection Bureau. Seemingly cowed by Republican opposition to Warren, "the president's response has been to dither" and search in vain for a compromise candidate. For Nocera, the root of the problem is that Obama is "a president who sees himself as a consensus-seeker. His first instinct is to try to cut a deal." Nocera admits that "there are certainly times when compromise is the right approach." But he goes on to say "this is not one of those times." Obama should finally do the right, not the most expedient, thing: nominate Warren and engage in the partisan fight that would result. Taking a firm stand would "redound nicely to the president’s advantage" by repairing his credibility as a leader in the eyes of the American people, even if Warren would end up not being confirmed.
Much as I concur with the political position of Nocera, it is hard for me to see how acting on his advice would break the impasse that frustrates him. This stalemate results from Republican intransigence, which Nocera is asking Obama to emulate by refusing to budge on certain key points. Fighting fire with fire -- responding to one non-negotiable demand with another -- is always tempting when stuck in a disagreement. But exchanging ultimatums only exacerbates the standoff one is trying to move beyond, like talking louder in a noisy restaurant. That isn’t to say that Obama should give in to every demand. It is to say that the root of the problem is not his preference for negotiating with his political opponents but their refusal to meet him halfway.
Two recent books -- Avishai Margalit’s On Compromise and Rotten Compromises (2010) and Robert Mnookin’s Bargaining with the Devil: When to Negotiate, When to Fight (2010), the source of my title -- shed new light on the rejection of negotiation and the compromises that negotiation inevitably occasions. Margalit and Mnookin see the give-and-take of negotiation as essential to democratic political life and most relationships, from friendship and marriage to business partnerships. Openness to compromise signals respect for other points of view, trust in someone else’s word, and willingness to cooperate and work things out for the good of the relationship or community.
Nevertheless, despite their predilection for compromise, Margalit and Mnookin agree that in certain extreme situations "bargaining with the devil" should be rejected. In identifying these situations, they set the bar high. For Margalit, we should never enter into discussions where the outcome is likely "to establish or maintain an inhuman regime, a regime of cruelty and humiliation, that is, a regime that does not treat humans as humans." Mnookin similarly allows for rare occasions when personal moral objections to engaging in any kind of a dialogue with an adversary can override pragmatic considerations. For both writers, one refusal to negotiate meets these strict conditions: Churchill’s decision not to negotiate with Hitler would qualify.
I seem to have strayed far afield from the Republicans’ refusal to bend in their negotiations with Obama. But keep in mind the association of Obama with Hitler on numerous right-wing websites, not to mention Glenn Beck’s notorious injunction to read Mein Kampf as a guide to Obama’s policies. I am not suggesting that Republican senators and representatives see Hitler in Obama and recoil. I am saying that an undertow of fanaticism keeps them from moving beyond their all-or-nothing demands. "Fanaticism" is the conservative commentator David Brooks’s word for the Republicans’ disdain for "the logic of compromise, no matter how sweet the terms," their willingness to sacrifice everything to the “idol” of their ideological position ("The Mother of All No-Brainers," New York Times, July 4, 2011).
Mnookin offers the following safeguard against fanaticism: always discuss key decisions regarding negotiation with people who see things differently. The need to seek out other opinions is especially important for leaders whose decisions affect others. Gut feelings need to be exposed to public debate even, or maybe especially, when we are sure we are right.
The absence of debate with people who hold different views sustains the fanaticism and demonization that are fracturing national political discussions. Numerous commentators have pointed out how gerrymandered House districts ensure the reelection of incumbents or expose them to primary challenges only from candidates to their right or left. In addition, the electronic media foster what Cass Sunstein has called "enclave extremism" and "cyberpolarization": individuals coming together electronically to ratify and compound one another’s biases, suspicious of outsiders and sheltered from opposing views, even ones that claim the backing of empirical evidence and fact. The logical conclusion of this insularity is Sarah Palin’s acolytes rushing to Wikipedia, not to verify her account of Paul Revere but to rewrite what contradicts it, bringing every recalcitrant source of information into her orbit.
This narrow-mindedness is antithetical to everything we teach and value in academe. But before we congratulate ourselves too much, we should remember where our own ambivalence toward negotiation and compromise can take us when it goes too far: to self-serving dogmatism that fuels the categorical rejection of bargaining with whomever we are tempted to see as the devils in our everyday professional lives -- the colleague we can never agree with, the department chair who seems always to say no, the senior administrator whose every word sounds false.
Giving up on dialogue with these individuals, denying them the possibility of change with totalizing words like "never," "always," and "every," inspires dreams of escape and revenge that make us susceptible to much the same self-pity and bitterness that motivate Palin. For some faculty members, the blanket refusal of negotiation results in opting out of university service, seeing the classroom or the study as a refuge from an otherwise hostile institution, the only places where they feel vindicated and whole. For administrators, thoroughgoing disenchantment with negotiation can lead to staying cloistered with like-minded supporters, bypassing consultative processes, and issuing edicts from on high, chiding whoever dares to dissent.
I am particularly concerned here with college and university leaders, who bear a special responsibility for counteracting these forms of withdrawal and the myopia they can promote. As Mnookin suggests, leaders need to set the example of seeking out opposing views and striking the right balance between empathy (understanding someone else’s needs and perspective) and assertiveness (clearly articulating one’s own point of view), between patience and prodding.
Listening is especially important to fostering constructive conversations. When people feel unheard, they clam up or shout. It is hard to listen to someone else when we ourselves feel unacknowledged, when we are stewing over our own bottled up thoughts and feelings instead of expressing them to a responsive audience. The best university leaders show how we all can move from monologues – venting to friends, lecturing to subordinates, complaining to a spouse or partner – into learning conversations with the very people we want to avoid.
"Learning conversations" comes from the Harvard Negotiation Project, in particular the influential book Difficult Conversations: How To Discuss What Matters Most (1999). The Harvard project assumes that when we are caught up in a conflict and anticipate confronting the person or group we are at odds with, we fear at best a difficult conversation or at worst a shouting match. We feel defensive, anxious, and unsure how to proceed, like students the first day of a challenging class. Our most appealing options seem to be fight (marshalling arguments, zeroing in on the vulnerabilities of our adversaries) or flight (escaping to our comfort zone).
The Harvard project aims at moving us past attacking or retreating and toward seeing disagreement not as a zero-sum power struggle but as an opportunity for mutual enlightenment. If this growth toward engagement were easy, there would be no need for the many books, seminars, and workshops that promote it. Well aware of the obstacles in our way, the Harvard project nevertheless encourages us to create a community strengthened not by lockstep agreement but by edifying debate.
If by bargaining we mean the learning conversations celebrated by the Harvard project – not the bickering and showboating that pass for debate in our national politics – then we should be bargaining with the provost, the Faculty Senate, and our colleagues every chance we get. A healthy culture of collaborative decision-making should characterize universities as much as effective teaching and exciting research, especially now, when confidence in negotiation and compromise is crumbling in other institutions. A healthy culture of collaborative decision-making means not only getting things expeditiously accomplished, but also creating educational opportunities each step of the way. The open-endedness of university discussions, their characteristic lack of fiscal urgency, encourages us to make the process as meaningful as the anticipated outcome.
The absence of economic incentives for engaging in these discussions -- more harshly, the fact that they take time and do not pay -- can lend credence to the cynical adage that academic debates are so vicious because there is so little at stake. But it can also mean that individuals join in these discussions for the best of reasons: for the relationships they enable, the insights they provide, and the changes they bring about along the way and in the end.
We have become accustomed to assessing universities by their graduation rates, student learning outcomes, and other quantitative measures. I have been suggesting a different gauge -- what great universities sound like: lively conversations outside as well as inside the classroom, informed by new ideas and energized by respectful disagreement and widespread participation.
Michael Fischer is vice president for faculty and student affairs at Trinity University, in Texas.