Last week, an independent investigation of the American Psychological Association found that several of its leaders aided the U.S. Department of Defense’s controversial enhanced interrogation program by loosening constraints on military psychologists. It was another bombshell in the ongoing saga of the U.S. war on terror in which psychologists have long served as foot soldiers. Now, it appears, psychologists were among its instigators, too.
Leaders of the APA used the profession’s ethics policy to promote unethical activity, rather than to curb it. How? Between 2000 and 2008, APA leaders changed their ethics policy to match the unethical activities that some psychologists wanted to carry out -- and thus make potential torture appear ethical. “The evidence supports the conclusion that APA officials colluded with DoD officials to, at the least, adopt and maintain APA ethics policies that were not more restrictive than the guidelines that key DoD officials wanted,” the investigation found, “and that were as closely aligned as possible with DoD policies, guidelines, practices or preferences, as articulated to APA by these DoD officials.” Among the main culprits was the APA’s own ethics director.
Commentators claim that the organization is unique, and in some ways it is. The APA’s leaders had the uncommonly poor judgment and moral weakness to intentionally alter the organization’s ethics policy to aid their own enlistment in the war on terror. Then they had the exceptionally bad luck to get caught.
Yet the focus on a few moral monsters misses a massive, systemic quirk in how the APA -- and many other organizations -- creates its code of ethics. The elite professionals who are empowered to write and change an ethics policy have tremendous influence over its content. But ethics policies are anonymous because they have force only to the extent that they appear to represent the position of an entire organization, not a few powerful people. The process is designed to erase the marks of the heavy hands that write the rules for everyone.
The APA’s current scandal may be new, but its problems on this front are decades old. The APA passed its first comprehensive code of ethics in 1973 after seven years of work by six top U.S. psychologists who had been appointed by the APA’s leadership. I have examined the records of this committee’s work housed at the Library of Congress and recently published my findings in the Journal of the History of the Behavioral Sciences. The men were given an impossible task: to write a code that represented the ethical views of all psychologists and at the same time erase their own biases and interests. The effort was prompted by worries that if the organization neglected to regulate itself, the government would do it for them. “President Nixon is moving rapidly in this area,” as one psychologist put it at the time. “Behavioral scientists must stay ahead of him or we will be in big trouble.” Among the troubles they were facing within the profession was how psychologists could continue to be employed and funded by the U.S. military and not appear to break the profession’s ethics policy -- precisely the contradiction that resulted in APA’s current imbroglio.
In an effort to appear democratic and transparent, the members of the 1973 ethics committee collected survey responses from thousands of psychologists and interviewed key stakeholders in the profession. Psychologists reported back with descriptions of activities that ranged from callous to criminal -- research with LSD, government-backed counterinsurgency efforts, neglect of informed consent. Still, the six psychologists had to boil down an ocean of responses into an ethics code that purported to fit with all psychologists’ needs and perspectives -- which included their own.
At the height of the Cold War, scores of psychologists painted a picture of a profession rife with secrecy and dodgy funding sources. They specifically told of military research that appeared to require an abdication of ethics. “These are seen as highly necessary studies,” one psychologist reported regarding research he did for the Defense Department. “Unless the research is highly realistic, it will not provoke psychological stress and hence will be useless.” In one study, the human subject was led to believe he was in an underwater chamber. “The subject sits in this chamber and performs specific tasks at an equipment console. If water rises inside the chamber one of the controls is supposed to exhaust it. At first the control operates. Later, however, it fails and the water gradually rises higher and higher around the subject’s body.” But the human subject was not really underwater and the psychologist was in control. “It is the practice to stop the experience at various points for different subjects, depending upon the amount of excitement they appear to show at different water levels.”
Studies like this were hotly disputed among psychologists at the time. Some felt that being deceived or hurt, especially by an authority figure like a psychologist, fundamentally damaged people. Humans are fragile, the line went, and can be psychologically scarred by psychologists themselves.
Yet the six members of the 1973 ethics committee were skeptical. The committee’s leader, Stuart Cook, found the position implausible based on his own experience as a researcher and his early training as a student. “When I was a subject I expected to be deceived; I knew that performance under stress was an issue,” he reflected. After talking with colleagues about the trade-offs of tighter ethics for psychologists, Cook delivered the punch line: “We should cut down our obligation to fully inform.”
Another ethics committee member, William McGuire, regarded the “fragile self” view as ludicrous in general and its main (female) proponents as ridiculous in particular. McGuire had made a celebrated career studying persuasion -- largely funded by the U.S. government in light of its Cold War concerns about political indoctrination. McGuire is a good example of how the ethical views of the policy writers did not stray far from their own personal stakes in ethics policies. “My feeling is that the field must face up to the fact that there are a lot of moral costs in psychological research and that this can be done only by going through two steps,” McGuire told a colleague. “The first step is to admit, well, all right, there is something morally bothersome about many aspects of the research including leaning ever so slightly on people to get them to participate, or especially misleading them about the nature of the research even in minor ways, using their behavior or behavioral traces without their explicit consent, etc. But going through this first step frankly and admitting there are unpleasant aspects of the research does not mean that we cannot do it. On the contrary,” he continued, “it is necessary to go through the second step and decide whether the reasons for doing the research outweigh these reasons for not doing it.” This view fit tidily with support of military research using stress, deception, drugs and other contested methods.
In 1971, the committee published a draft of the ethics policy they had created to gauge APA members’ responses. When a few of the ethics committee members considered taking seriously the complaints from that large faction of psychologists who raised concerns about the laxity of the draft ethics code, McGuire threatened to quit. “It seems to me that there has been a change in mood in the committee in a somewhat conservative direction, which surprised me a little bit and made me worry lest I might have fallen out of tune with the other committee members,” he explained. “I do want to mention that the committee members had moved in a direction and distance that I had not quite anticipated so that perhaps I would be perceived as holding back progress or being an obstructionist.”
Instead, William McGuire, Stuart Cook and the four other psychologists stuck together and ushered in an ethics policy that corresponded to their own research needs and interests. The final version of the 1973 ethics code, for example, eased restrictions on psychologists’ use of deception that had appeared in earlier drafts. The final policy allowed researchers to lie -- for the sake of science -- despite the loudly announced disagreement from many psychologists that deception, stress and other forms of harm, however temporary, could do long-term damage to people and deserved to be controlled through the APA’s code of ethics.
In 1973, as in the events leading to the APA’s current crisis, the organization’s ethics policy bore the marks of the handful of psychologists who were empowered to write the rules. Like anyone, they had their own political and scientific interests in the content of the ethics policy. But unlike others, and to varying degrees, they were able to manage those interests by changing the policy to suit them.
In recent weeks, critics have rightly and roundly condemned the current APA leaders who are at fault in the recent scandal. But it is misguided to think that the APA’s problem of professional ethics can be solved by throwing out a few exceptionally bad apples.
Next month, thousands of psychologists are meeting for the APA’s annual convention. They will have plenty to discuss. It is clear that some leaders behaved condemnably -- perhaps criminally -- and three have already been forced out. Yet continuing to castigate individuals alone misses the larger problem.
The APA’s current ethics mess is a problem inherent to its method of setting professional ethics policy and a problem that faces professional organizations more broadly. Professions’ codes of ethics are made to seem anonymous, dropped into the world by some higher moral authority. But ethics codes have authors. In the long term, the APA’s problems will not be solved by repeating the same process that empowers a select elite to write ethics policy, then removes their connection to it.
All ethics codes have authors who work to erase the appearance of their influence. Personal interests are inevitable, if not unmanageable, and it may be best for the APA -- and other professional groups -- to keep the link between an ethics policy and its authors. Take a new lesson from the Hippocratic oath by observing its name. The APA should make its ethics policies like most other papers that scientists write: give the code of ethics a byline.
It’s a widely noted fact that colleges and universities are under new pressure to justify their value and function. The same is true of tenure-track faculty members, who are at the heart of the higher education system whose benefits much of society now claims to find mysterious, and whose job security is increasingly criticized.
While colleges face criticism for converting most of their teaching posts to non-tenure-track status, they also face criticism for offering tenure to the rest. The final decision by the Wisconsin Legislature to weaken tenure and shared governance in the University of Wisconsin System teaches a lesson that should resonate beyond Wisconsin: the standard defense of tenure and shared governance isn’t good enough to address widespread skepticism about their public benefits.
Faculty members have gone as far as they can by pleading an academic exemption from the financial control and autocratic management that typify the U.S. workplace, crystallized in the power of summary dismissal. Faculty members now need to explain the value not only of their own job security but also of job security in the workforce as a whole. We will need to be much clearer about why tenure and shared governance enable core functions of the university and also of any productive, creative workplace.
I am aware of the dangers of this kind of escalation and expansion of what we’ve been taught are unpopular job protections. And yet academics can no longer defend tenure and shared governance as minority exemptions. We need to explain their principles and benefits for an overall workforce that has suffered from their absence -- and is now unmoved by our special pleading.
In the important case of Wisconsin, the state Legislature and governor have now passed and signed major qualifications of UW System tenure and governance, including student governance over the expenditures of their fees. One section introduces language legalizing layoffs of tenured faculty “due to budget or program decision,” and then offers a long, ornate set of procedures for dismissing tenured faculty as a result of pretty much any programmatic change. Another section eliminates statutory language that gives faculty members direct managerial authority in the university by vesting them “with responsibility for the immediate governance of [their] institution” while expecting them to “actively participate in institutional policy development.” Though tenured faculty members aren’t yet living in the at-will employment utopia of the American right, where one can be fired without cause or due process, the plan makes them vulnerable to restructuring strategies that a range of commentators equate with making universities more efficient.
Since these proposals will now change UW significantly, and perhaps model changes in other states, what should faculty members do next?
The Typical Faculty Response
Let’s start with what faculty members usually do. The current state of the art was on display at a multicampus academic senate meeting in Madison where faculty members had gathered to discuss the situation. One much-admired intervention was delivered by Professor David J. Vanness, who argued that the weakening of tenure and of faculty governance threatened core academic activity:
"This is not an issue of Democrats versus Republicans. This is an issue of academic freedom. Freedom to discover and to teach new knowledge, regardless of whether it offends (or enriches) a specific business interest or political party …. If we allow ourselves to be led down this path laid out before us … there will be nobody left to 'follow the indications of truth wherever they may lead.' We will sift where it is safe to sift. We will winnow where we are told to winnow. Our pace of discovery will slow and our reputation will falter."
I heartily agree. But I am already inside the academic consensus that the pursuit of truth requires intellectual freedom and professional self-governance. Since most people don’t enjoy either of these in their working or even their personal lives, they wouldn’t immediately see why empowering chancellors will hurt teaching or slow the pace of discovery.
Rather than revealing the inner workings and effects of tenure and shared governance, faculty members generally do three other things. We cast tenure and shared governance as constitutional principles beyond the legitimate reach of politics. We instrumentalize these practices in the name of competitive excellence. We put our defense in the hands of our university’s senior managers. Each of these three moves made sense at various times in the past, but they are now serious mistakes.
First, what happens when faculty present academic freedom as transcending politics? The question was brought home to me again by a good op-ed called “What is driving Scott Walker's war on Wisconsin universities?” The author, Saul Newton, an Army veteran studying at UW-Waukesha, discusses the conservative Bradley Foundation’s role in intellectualizing reasons to bring education to heel. He cites a 1994 article by the foundation’s president that, in Newton’s phrase, justified “demolishing public institutions, specifically public education.”
I followed Newton’s advice and read the Bradley Foundation article, whose ideas about K-12 governance are now being applied to public universities. I was struck by two features. First, the piece advanced a quasi-Foucauldian vision of society in which any group’s principles lie within society’s structures of power rather than outside them. “Educational policy is always and everywhere a profoundly political matter,” wrote foundation president Michael S. Joyce. Second, it defined its attack on an “exhausted” progressivism as a movement for democratic accountability: “If educational policy is finally and irrevocably political, then surely, in a self-governing polity, the people themselves are the source of educational policy -- not a distant bureaucracy.”
When Joyce moved on to demonize teachers for wielding the “political hegemony of the ‘helping and caring’ professionals and bureaucrats,” he did so in the name of restoring democracy. It doesn’t matter whether this framework is right or wrong (it’s wrong). Once it has been established, and faculty then defend tenure as a privilege of their intellectual status, they don’t rebut the right’s democratic critique but validate it. The democracy frame makes academic freedom look like a license to ignore public concerns rather than to engage them in dialogue from an independent position.
On the second error: university administrators and faculty alike predict that quality decline will follow any weakening of tenure. A group of distinguished chaired professors at UW-Madison stated that qualifying tenure would make the university “suffer significant competitive disadvantages.” Competitiveness is often measured in rankings shorthand: UW-Madison is 47th in U.S. News and World Report’s rankings this year, ranks among the top 15 public universities, and has a large number of top-20 departments, all of which may fall in the rankings if it begins to lose contests for top candidates to peers with stronger tenure protections.
But how much would lowered rankings reduce faculty quality and public benefit? Top rankings mostly concern the Madison campus, and so involve only a minority of the students and faculty in the UW System. Politicians also know that hundreds of qualified people apply for every good tenure-track position, and thus assume that the UW System will still enjoy a surplus of excellent candidates. Wisconsin departments may have a harder time landing their top one or two picks who have offers from other major universities, but politicians may reasonably doubt that their third or fourth candidates will offer a noticeably lesser student experience.
More fundamentally, departmental or university stature is an inaccurate proxy for the competitiveness most people care about, which is the economic kind that raises the standard of living. Universities have constantly asserted their direct economic impact, and conservatives are taking this rhetoric literally. Thus an alleged blueprint for the Walker changes, a report called “Beyond the Ivory Tower” that was published by the Wisconsin Policy Research Institute and authored by the longtime chancellor of UW-Stout, justifies its call for more flexible tenure and governance on the grounds that this will “help the UW System better fulfill its mission to help produce economic development.” University administrators agree that this is their mission, and STEM fields have benefited for decades from the emphasis on technological outputs, often at the expense of funding broad liberal arts-based capabilities. So faculty members’ talk of staying competitive encourages conservatives to ask UW to show them the money. In the U.S. business system, making money normally involves giving management a free hand over employees, thus hoisting professors by their own petard.
We arrive at the third faculty habit, in which a faculty assembly calls on senior managers to protect tenure and shared governance. There are two issues here. One is the academic freedom to produce research even when its evidence contradicts the beliefs of politicians or business leaders, who then may seek to discredit the study, as recently happened in Wisconsin, by calling it “partisan, garbage research,” and/or by defunding an entire program, as happened in North Carolina. Senior managers often hang tough on this point, and defend the research autonomy of their faculty and their institution.
The other issue is direct faculty control over university policy that goes beyond offering nonbinding advice. I noted that the now-deleted Wisconsin statute expects faculty to be directly involved in “the immediate governance of [their] institution.” Governor Walker does not want this strong version of shared governance. But do System President Ray Cross or UW-Madison Chancellor Rebecca Blank? Careful Wisconsin faculty observers like Nicholas Fleisher, Richard Grusin and Chuck Rybak think not, and I can’t call to mind a senior manager who does want full co-governance with faculty.
In addition, UW’s senior managers have some history of efforts to increase their own authority. As Lenora Hanson and Elsa Noteman argue, former Madison Chancellor Biddy Martin’s “New Badger Partnership” sought to remove much state oversight of the university’s budgeting and human resources policies. The current UW administration continued the campaign under another name, even at the cost of accepting state funding cuts. Chancellor Blank told local television that the university could make up for cuts with more freedom from the state, if it just had more time. In other words, senior university managers de facto agreed with the core tenet of movement conservatism that state oversight lowers efficiency while executive authority increases it. Since so much of the conservative business position matched the university’s official position, the voting public could be forgiven for not seeing why the statute changes would affect faculty much.
General Public Benefits, Not Special Privileges
So what would motivate the wider public to fight for academic tenure and shared governance? To present them as general public benefits rather than as our special privileges.
To do this, we will need to undo each of the three mistakes I’ve described. First, rather than casting tenure and shared governance as necessary exceptions to normal workplace politics, we should define them as necessary to workplaces in general. Tenure is a simple idea: protection from the at-will employment practice of firing any employee without cause or due process. Tenure places an obligation on the employer not only to identify specific reasons for termination but to convince others of their validity.
Tenure doesn’t just protect academic freedom; it protects all employees’ investments in their skills, relationships, know-how, and commitment to their organization. I have always thought that tenure should appeal to conservatives, since it defends liberty by protecting one party against another’s arbitrary exercise of authority. Tea Partiers who accuse Barack Obama of being a dictator should logically favor limits on the lawful tyranny of the private sector supervisor. At the same time, Democrats should like generalized tenure for enabling a limited type of workplace democracy. A hundred years ago, the American Association of University Professors constructed academic freedom as the great exception to the autocratic managerialism of American business life. Faculty members will now need to promote workplace freedom from at-will dismissal as right for employees everywhere.
On the second mistake, of touting their competitiveness, faculty members should reject competition as a main driver of high-quality work. We enjoy top rankings and status as much as managers do, and yet in the long run they depend on research and teaching achievements that come from persistence, security, obtuseness, heretical thinking and collaboration. It’s not just that competition encourages wasteful duplication and intellectual imitation, but also that intellectual progress depends profoundly on complicated forms of cooperation among all kinds of people and expertise. Universities teach people to address massively complicated problems that require both individual originality and collaboration. The U.S. doesn’t have a competitiveness disadvantage: it has a collaborative disadvantage, and universities are needed more than ever to develop new kinds of collaborative capabilities. In addition, public universities help their regions, states and nations not by being better than other universities but by doing transformative work in the place they are and with the students they have. Faculty should help the American workplace move in the same direction.
The third mistake: instead of looking to senior management for defense, faculty members should look to employees in other workplaces in advocating democratic rather than autocratic organization. Until our current neo-Taylorist management revival, the efficiency of peer-to-peer self-management was widely understood. The uber-mainstream features the historian David Montgomery chronicling the contributions of indigenous and immigrant craft skill to 19th-century American industrialization, the management gurus Tom Peters and Robert Waterman advocating employee empowerment in their 1980s blockbuster In Search of Excellence, the sociologist Richard Sennett analyzing the centrality of mutually developed craft practices to effective work, and, in a backhanded way, the neoclassical economists now warning about the “skills gap,” since if top-down management were so great companies could simply boss their hirelings to competence.
Such research has established academic analogs, starting with peer review. Wisconsin faculty have pointed out that tenured faculty members must meet their own colleagues’ rigorous performance standards to get tenure and must then continue to satisfy them to progress. Another common academic practice is the combination of outcomes evaluation with freedom to organize everyday work. Although professionals have had an easier time claiming this right to direct their own work, to whom does this principle not apply? Everyone needs training and ongoing feedback, and everyone needs latitude to shape their own efforts.
The faculty’s central political problem is that their assertion of their tenure and governance rights is read as a tacit denial of those rights to everyone else. The problem starts with the “new faculty majority” of non-tenure-track professors on campus and spreads out from there. This sense of tenure as a special privilege (error one) is the cornerstone of the politically powerful stereotype of the elitist professors who proclaim their superiority to other people (error two) and can’t deal with regular people directly (error three). In making these mistakes, we have played into our opponents’ hands.
Rather than claiming academic freedom, tenure and fair governance as a special perk of our unique standing, we should hold them out as the general economic and social justice virtues that they are. Faculty have models of collaborative self-governance that we now rarely bother to develop, that we have allowed to serve an ever-smaller share of our colleagues, that are not taken seriously by many administrations, but that are designed to allow both intellectual originality and decent, honorable workplaces. Faculty must now model how shared governance, if spread to other workplaces, would improve society as a whole. And we are going to have to do it soon.
We talk, we text, we tweet. And that’s fine. We needn’t require ourselves to think deeply at all hours, in all weather. Some things just don’t need the time, space and gravity we associate with such terms as “discourse,” “debate” and “dialogue.”
But many things do. And one of the dangers for a society that gets too used to the frenetic and featherweight -- and to media tailored to delivering little else -- is that when a real issue comes along, with conflicting ideas and multiple facets, and complexity and weight and much in the balance, we simply have no way to discuss it.
Think immigration in Arizona, justice in Ferguson, religious freedom in Indiana, water wars in California, Confederate iconography in the South, sexual assault on college campuses.
And, importantly, issues like these don’t “come along.” They’re always with us, constantly testing us, and our decisions about them matter. They determine what lives we lead, and what world we’ll leave behind.
So it makes a difference that discourse today, when it happens at all, is often rife with personality, politics, opinion and noise but short on facts -- much less analysis and insight.
This predicament touches on a counterintuitive point that goes to the heart of the problem: facts are not enough. Daniel Moynihan’s eminently quotable “Everyone is entitled to his own opinion, but not his own facts” is true enough as far as it goes. But simply memorizing those facts, then trading them with people who already agree with you, advances no argument and makes no decision easier or wiser.
To argue in the sense of debate, to hold a constructive, reasoned conversation with the potential to change minds, we first have to engage the minds we would change: our own as well as others’. Students don’t necessarily arrive at college with the skills to do that, unfortunately, and it’s no wonder. Our digital bubbles and the social media hall of mirrors make it possible to seem connected to every person alive, discussing every topic under the sun, when in truth we’re often engaging merely with enclaves of the like-minded.
A hall of mirrors doesn’t add perspectives, it only multiplies your own; its depth is all pretense.
This is a major paradox of our time. Think, for example, of the heyday of broadcast TV, when news networks numbered exactly three and were indisputably more homogeneous than today. And yet they routinely offered opposing points of view. Today our options for news and views have grown exponentially, but paradoxically so too have our tools for sorting ourselves into virtual silos. Two points of view are rarely sufficient in any case, as almost nothing worth arguing about has just two sides. But now, with 1,000 channels and countless blogs that double as echo chambers, we don’t have to listen to even that many.
And while real debate would doubtless improve our current landscape, perhaps “dialogue” is the best word for what is most needed today, and always. For true dialogue, two things are required, besides the will to think for oneself rather than accept some authority’s shrink-wrapped opinion package.
The first is to find a sense in which we’re in it together. “We” can be students in a classroom, business competitors, House and Senate colleagues, or newly established neighborhood associations, but the default position has to be the same: if an “us vs. them” dynamic prevails, everybody loses.
In this way, dialogue is as much pledge as practice: it urges us to uphold a sense of community above all, no matter the size of the controversy or the intensity of the conflict. It’s more huddle, less face-off.
It’s also our best tool for delivering productive, civil and nuanced results from even the most passionate disagreements -- which, it’s worth pointing out, are not only inevitable but desirable in a place dedicated to the life of the mind.
Here higher education plays a role that can be easily obscured by the very proper focus on difference. College should most definitely put young people in touch with the vast variety of human thought and experience. But if we do it right, they should also have a growing appreciation for what unites us beneath our differences in color and country, class and gender, age and era: the reassuring bedrock of the genuine human needs, abilities, drives and virtues that we hold in common.
And let’s not pretend any of this is easily done or effortlessly taught. It takes considerable self-awareness, patience and discipline to contribute to such complex conversations, and it takes even more to lead them, to say nothing of the skills and wisdom needed to teach others to do the same. It should be the goal of every intellectual community to advance the depth, breadth and sustaining power of face-to-face dialogue.
The second requirement for true dialogue may be even more important. It depends on a mind-set that can be expressed in four words: I might be wrong.
“The spirit of liberty,” Judge Learned Hand famously said in a 1944 speech, “is the spirit which is not too sure that it is right.” The rest of his sentence is less often quoted but equally pertinent: “The spirit of liberty is the spirit which seeks to understand the minds of other men and women … which weighs their interests alongside its own without bias.”
That’s a high bar for a species as tribal and fallible as ours, but it’s a worthy one. Truth be told, it’s the only way we’ll ever make progress in judging how best to live together, whether that means in colleges, communities or countries.
The best way to start clearing that bar?
Helping our young people to value reflection over reflexes, giving them effective ways to listen, think, converse and cooperate -- not offering them “cut flowers,” as the educational leader John Gardner once put it, but “teaching them to grow their own plants.”
Ryan Hays is executive vice president at the University of Cincinnati.
The most distracting thing about costume dramas set in any period before roughly the turn of the 20th century -- in my experience, anyway -- is the thought that everything and everyone on screen must have smelled really bad. The most refined lords and gentry on Wolf Hall did not bathe on anything we would regard today as a regular basis.
No doubt there were exceptions. But until fairly recently in human history, even the most fastidious city dweller must have grown accustomed to the sight of human waste products from chamber pots that had been emptied in the street. (And not just the sight of it, of course.) Once in a while a movie or television program will evince something of a previous era’s ordinary grunge, as in The Return of Martin Guerre or Deadwood, where almost everything looks soiled, fetid and vividly uncomfortable. But that, too, is exceptional. The audience for costume drama is often looking for charm, nostalgia or escapism, and so the past usually wears a deodorant.
The wider public may not have heard of it, but a “sensory turn” among American historians has made itself felt in recent years -- an attention, that is, to the smells, tastes, textures and sounds of earlier periods. I refer to just four senses, because the importance of sight was taken for granted well before the turn. In their more polemical moments, sensory historians have even referred to “the tyranny of the visual” within their discipline.
That seems a little melodramatic, but point taken: historians have tended to scrutinize the past using documents, images, maps and other artifacts that chiefly address the eye. Coming in second as the organ of perception most likely to play a role in historical research would undoubtedly be the ear, thanks to the advent of recorded sound. The remaining senses tie for last place simply because they leave so few traces -- which, in any case, are not systematically preserved the way audiovisual materials are. We have no olfactory or haptic archives; it is difficult to imagine a library of flavors.
Calls to overcome these obstacles -- to analyze whatever evidence could be found about how everyday life once sounded, smelled, felt, etc. -- came from American historians in the early 1990s, with a few pioneers at work in Europe even before that. But the field of sensory history really came into its own over the past decade or so, with Mark M. Smith’s How Race Is Made: Slavery, Segregation and the Senses (University of North Carolina Press, 2006) and Sensing the Past: Seeing, Hearing, Smelling, Tasting and Touching in History (University of California Press, 2007) being among the landmarks. Smith, a professor of history at the University of South Carolina, also convened a roundtable on the sensory turn published in the September 2008 issue of The Journal of American History. A number of the contributors are on the editorial board of the Studies in Sensory History series published by the University of Illinois Press, which launched in 2011.
The series’ fifth and most recent title is Sensing Chicago: Noisemakers, Strikebreakers and Muckrakers by Adam Mack, an assistant professor of history at the School of the Art Institute of Chicago. Beyond the monographic focus -- it covers about fifty years of the city’s history -- the book demonstrates how much of the sensory field of an earlier era can be reconstructed, and why doing so can be of interest.
Overemphasis on the visual dimension of an urban landscape “mirrors a set of modern cultural values that valorize the eye as the barometer of truth and reason,” we read in the introduction, “and tend to devalue the proximate, ‘lower’ senses as crude and less rational.” Having thus recapitulated one of sensory history’s founding premises, the author wastes no time before heading to one site that must have forced its way deep into the memory of anyone who got near it in the 19th century: the Chicago River.
“A bed of filth,” one contemporary observer called it, where manure, blood, swill and unusable chunks of carcass from the slaughterhouses ended up, along with human sewage and dead animals -- all of it (an editorialist wrote) “rotting in the sun, boiling and bubbling like the lake of brimstone, and emitting the most villainous fumes,” not to mention drawing clouds of flies. A letter writer from 1862 mentions that the water drawn from his kitchen hydrant contained “half a pint or more of rotten fish.” Many people concluded that it was safest just to drink beer instead.
Laws against dumping were passed and commissions appointed to investigate the problem, for all the good it did. The poorest people had to live closest to the river, so disgust at the stench combined in various ways with middle- and upper-class attitudes towards them, as well as with nativist prejudices.
The horrific odor undermined efforts to construct a modern, rationally organized city. Imposing a grid of streets on the landscape might please the eye, but smell didn’t respect geometry. The same principle applied to the Great Fire of 1871, the subject of Mack’s next chapter. The heat and sheer sensory overload were overwhelming, and the disaster threw people from all walks of life together in the streets in a way that made social status irrelevant, at least for a while. The interplay between social hierarchy and sensory experience (exemplified in references to “the roar of the mob”) is the thread running through the rest of the book. Thinking of the “‘lower’ senses as crude and less rational” -- to quote the author’s phrase again -- went along with assumptions about refinement or coarseness as markers of class background.
The sources consulted by the author are much the same as any other historian might use: newspapers, civic records, private or otherwise unpublished writings by long-forgotten people, such as the recollections of the Great Fire by witnesses, on file at the Chicago History Museum. The contrast is at the level of detail -- that is, the kinds of detail the historian looks for and interprets. Perhaps the next step would be for historians to enhance their work with direct sensory documentation.
A prototype might be found in the work of John Waters, who released his 1981 film Polyester in Odorama. Audience members received cards with numbered scratch-and-sniff patches, which they consulted when prompted by a message on the screen.
On second thought, it was difficult enough to read Mack’s account of the Chicago River in the 19th century without tickling the gag reflex. Olfactory realism might push historical accuracy farther than anyone really wants it to go.
Adjunct faculty members at the Community College of Allegheny County have voted 394-64 to unionize with the American Federation of Teachers. Another AFT unit has represented full-time faculty members at the college for more than 40 years.
Historians are reacting with outrage to the ruling of a German court that the estate of Joseph Goebbels, the Nazi minister of propaganda, may claim royalties on excerpts from his diaries in a new scholarly biography, Times Higher Education reported. The suit itself raised concern from many scholars, who had assumed they could quote freely from the diaries of long-dead Nazis. “It’s quite shocking,” said Neil Gregor, professor of history at the University of Southampton, “that these diaries … are being used, effectively, to profit so shamelessly from one of the chief culprits of Nazi genocide.” The suit involved Goebbels, by Peter Longerich, professor of modern German history at Royal Holloway, University of London. Random House Germany, Longerich's publisher, is planning an appeal to the German Supreme Court.