History

Interview with editors of 'The American Yawp,' a free history textbook published online

"I sound my barbaric yawp over the roofs of the world," Walt Whitman declares in Leaves of Grass. How he ended the line without an exclamation point always puzzled me, but maybe it was implicit. The poet sang "the body electric," and every line was meant to zap the reader into a higher state of awareness.

Whitman would have been pleased to see the new American history textbook called The American Yawp -- and not just for its allusive title. As a sometime school teacher and educational reformer, he wanted "free, ample and up-to-date textbooks, preferably by the best historians" (to quote one discussion of this aspect of the poet's life). Yawp's 30 chapters cover American history from the last ice age through the appearance of the millennial generation. It has plenty about the founders and the origins of the U.S., but avoids a triumphalist tone and includes material on inequality -- including economic inequality -- throughout. It was prepared through the collaborative efforts of scores of historians. And the creators have published it online, for free.

The beta version was released, with no fanfare at all, at the start of the current academic year. By the fall, a revision will be issued in e-book format, suitable for use in an undergraduate survey course -- again, for free. Walt would surely approve.

I contacted the editors -- Joseph Locke, an assistant professor of history at the University of Houston-Victoria, and Ben Wright, an assistant professor of history and political science at the Abraham Baldwin Agricultural College in south Georgia -- to find out more about The American Yawp. They collaborated in responding to my questions by e-mail. A transcript of the discussion follows.

Q: How did you go about writing (assembling?) your textbook? Did you collaborate via Listservs? Were there any face-to-face meetings?

A: Traditional textbooks usually begin with a single editor or a small team of editors searching for some unifying theme to tie together the many thematic strands of American history. Instead, we mirrored the way our profession already works. We believed that a narrative synthesis could emerge through the many innovations of our profession’s various subfields no less than through a preselected central theme. We therefore looked to a large and diverse yet loosely coordinated group of contributors to construct a narrative.

We began by mapping out potential contributions for all 30 chapters based on our experiences teaching the survey and in informal conversations with colleagues and potential chapter editors. We came up with things like “500 words on the election of 1860” and “300 words on the music and art of the Civil War.” We compiled these into lists.

Then, after tapping into the networks of scholars we knew, as well as scouring recent editions of major history journals, combing through lists of recent dissertations, browsing the rosters of university programs with traditional strengths in particular eras and soliciting contributors through social media and H-Net’s many history Listservs, we targeted scholars to write on these themes.

We had no trouble recruiting an adequate pool of qualified contributors. In fact, we ended up with over 300 historians writing for the project. This work was done almost exclusively online.

Q: Was it a matter of one person preparing a draft chapter and then other participants proposing changes?

A: Since a textbook should be more than a series of brief, disjointed topical entries, we began the work of synthesis. We recruited talented writers and scholars as chapter editors who went to work stitching submissions into coherent chapters. We then reviewed and edited drafts of all 30 chapters, particularly with an eye on ensuring greater narrative cohesion across the text.

During our beta year, we are soliciting feedback not only from our esteemed board of editorial advisers but also from contributors and users through our parallel CommentPress platform. With that feedback in hand, we will publish a refined version of the text and begin a second phase that incorporates interactive digital content and further explores what a digital textbook is truly capable of.

Q: Are you aware of anyone teaching with the beta version? Have you had commitments from individuals or departments to use it during the next academic year?

A: Students are currently working with the text at a variety of institutions ranging from major state universities (such as the University of Georgia and the University of Florida) to various community colleges (such as Central New Mexico Community College and Bronx Community College) and everything in between (Rice University, Georgia State University, the University of Texas at Dallas and others). We don’t solicit formal commitments for use, but we’ve already heard from additional instructors and history departments hoping to adopt the text next fall. We are historians, not marketers, but we believe continued positive feedback and our formal launch in the fall will also encourage additional adoptions.

Q: In the culture wars, American history is one of the more harried battlegrounds. Did that factor into the textbook’s preparation in any way?

A: We believe history should be written by historians. We have no interest in the culture war, beyond mitigating the way that some have used it to wildly distort the past. Instead, we've trusted in our profession; our desire has been to reflect all the very best of contemporary scholarship.

On the other hand, we have been conscious of the challenge of properly synthesizing the American past. What gets included, and what doesn't? This is a difficult issue, and we have enlisted the historical profession to help guide us. And we remain open to critical feedback.

Q: The talk page of a Wikipedia entry tends to become a forum for debate, informed and otherwise. Yawp is not in wiki format, of course, but will the comments component be moderated?

A: We've seen very little rancor in our CommentPress platform. Disagreements have mostly taken the form of highly specialized critiques. Historians are argumentative, but we've been pleased to see that all have followed the standards of professional decorum. We therefore have no plans to moderate discussions. And, unlike a wiki, disruptive comments cannot filter into the text without editorial decisions.

Q: It seems as if The American Yawp could serve as a model for other textbooks. Is that the plan?

A: Our model is completely reproducible. We've accomplished this without institutions, grants or rarefied technological know-how. And we very much hope that others will follow our example. We already know, for example, that within our own profession there is quite a bit of interest in a similar project in world history.

Q: A commercially produced textbook can be financially rewarding for everybody involved in its creation, and it counts for something on an author's CV. These seem like powerful incentives for stasis. What would it take for your mode of textbook production to establish itself as viable over the long run?

A: Of course, a commercially produced textbook is not financially rewarding for everybody involved -- it is often quite financially punitive for our students. (The College Board, for instance, found that the typical student now spends $1,200 a year on textbooks and supplies.) And outside of a few textbooks written by a few academics for a few major presses, financial rewards can be extremely limited for textbook producers.

Still, the reputational economics of academia do matter. Professional consideration of projects such as this will certainly shift as academia continues to adjust to the digital age, but we also did not embark upon the project for economic or professional gain. This has been and will continue to be a labor of love. We entered the historical profession because we believe there is a moral imperative to study the American past and to share that knowledge with students and with the public. The rising cost of higher education makes that difficult. Academics recognize this, and we believe that's why over 300 academic historians were so willing to participate in this project.

We believe our model is viable in the long term. This is not a start-up having to satisfy investors or foundation boards. This is simply a collective of historians who have come together to share the knowledge of our profession. That doesn't mean certain developments couldn't further secure the long-term viability of projects such as this, of course. For instance, we have been looking into possible partnerships with innovative university presses to help satisfy the very reputational implications you cited.

Historians reject vote on controversial anti-Israel resolutions

At business meeting of American Historical Association, members refuse to consider last-minute proposals to condemn Israel's treatment of Palestinians.

Historians discuss the pleasures and pitfalls of teaching popular history

From Little House on the Prairie to comics to flappers, historians at annual meeting discuss the pleasures and pitfalls of teaching popular history.

Essay on "The Americans," and Margaret Peacock, "Innocent Weapons: The Soviet and American Politics of Childhood in the Cold War"

It was too prolonged for there to be any specific date, or dates, to mark it. But perhaps this is as good a time as any to mark the 25th anniversary of a process that started with the fall of the Berlin Wall in early November 1989 and reached a kind of peak with the events in Romania late that December.

The scale and pace of change were hard to process then, and difficult to remember now. Ceausescu had barely recovered from the shock of being heckled before he and his wife faced a firing squad. It was not how anyone expected the Cold War to end; insofar as we ever imagined it could end, the images that came to mind involved mutually assured destruction and nuclear winter.

A few years ago, Daniel T. Rodgers characterized the intellectual history of the final decades of the 20th century as an “age of fracture” – an era in which the grand narratives and overarching conceptual schemata were constantly displaced by “piecemeal, context-driven, occasional, and… instrumental” ideas and perspectives in the humanities, social sciences, and public life. Fair enough; just try finding a vintage, unshattered paradigm these days. But a system of bipolar geopolitical hostilities prevailed throughout most of that period, and the contradictory structure of conflict-and-stasis seemed very durable, if not permanent.

Until, suddenly, it wasn’t. One smart and well-executed treatment of the world that came to an end a quarter-century ago is a recent television series called "The Americans," set in the early 1980s. The first season is now available in DVD and streaming video formats, and the second will be out in two weeks, just in time for binge-viewing over the holidays.

"The Americans" is a Cold War spy drama as framed by the “secret life amidst suburban normality” subgenre, the basic tropes of which were inaugurated by "The Sopranos." In it, the Jenningses, a married couple, run a travel agency in Washington, where they live with their two early-adolescent kids. But they are actually KGB agents who entered the United States some 20 years earlier. They have operated from behind cover identities for so long that they blend right in, which makes them very effective in their covert work. While gathering information on the Strategic Defense Initiative, for example, they even get access to the Advanced Research Projects Agency Network -- aka ARPANET -- which allows communication between computers, or something.

The comparison shouldn’t be pushed too hard, but the paradox of the deep-cover agent is right out of John le Carré: A divided identity makes for divided loyalties. At the very least it puts considerable strain on whatever commitment the couple started out with, back in the late Khrushchev era. We get occasional flashbacks to their life as young Soviet citizens. With the onset of “Cold War II,” the motherland is imperiled once again (not only by the American arms buildup but also by the reflexes of KGB leadership at “the Center”) and the Jenningses have decidedly mixed feelings about raising kids under rampant consumerism, even if they’ve grown accustomed to it themselves.

The moral ambiguities and mixed motives build up nicely. Life as a couple, or in a family, proves to be more than a layer of the agents’ disguise: love is another demand on their already precarious balance of loyalties. Yet the real menace of thermonuclear showdown is always there, underneath it all. Some viewers will know that things came very close to the point of no return at least once during this period, during NATO’s “Able Archer” exercise in November 1983. Whatever sympathy the audience may develop toward the Jenningses (played with real chemistry by Keri Russell and Matthew Rhys) is regularly tested as they perform their KGB assignments with perfect ruthlessness. They are soldiers behind enemy lines, after all, and war always has innocent casualties.

The conflict has gone on so long, and with no end in sight, that the characters on screen don’t even feel the need to justify their actions. The spycraft that the show portrays is historically accurate, and it gets the anxious ground-tone of the period right, or as I remember it anyway. But very seldom does "The Americans" hint at the impending collapse of almost every motive driving its core story -- something the viewer cannot not know. (Pardon the double negative. But it seems to fit, given the slightly askew way it keeps the audience from taking for granted either the Cold War or the fact that it ended.)

The focus on the family in "The Americans" takes on added meaning in the light of Margaret Peacock’s Innocent Weapons: The Soviet and American Politics of Childhood in the Cold War, recently published by the University of North Carolina Press. The scriptwriters really ought to spend some time with the book. At the very least, it would be a gold mine of nuances and points of character development. More generally, Innocent Weapons is a reminder of just how much ideological freight can be packed into a few messages rendered familiar through mass media, advertising, and propaganda.

Peacock, an assistant professor of history at the University of Alabama at Tuscaloosa, examines the hopes and fears about youngsters reflected in images from the mid-1940s through the late 1960s. The U.S. and the USSR each experienced a baby boom following World War II. But the outpouring of articles, books, movies, and magazine illustrations focusing on children was not solely a response to the concerns of new parents. It might be more accurate to say the imagery and arguments were a way to point the public’s attention in the right direction, as determined by the authorities in their respective countries.

Children are the future, as no politician can afford to tire of saying, and the images from just after the defeat of fascism were tinged with plenty of optimism. The standard of living was rising on both sides of the Iron Curtain. In 1950 President Truman promised parents “the most peaceful times the world has ever seen.” Around the same time, the Soviet slogan of the day was “Thank You Comrade Stalin for Our Happy Childhood!”, illustrated with a painting of exuberant kids delivering an armful of roses to the General Secretary, whose eyes fairly twinkle with hearty good nature.

But vows of peace and plenty on either side were only as good as the leaders’ ability to hold their ground in the Cold War. That, in turn, required that young citizens be imbued with the values of patriotism, hard work, and strong character. Sadly enough, children on the other side were denied the benefits of growing up in the best of societies.

The Soviet media portrayed American youth as aimless, cynical jazz enthusiasts facing Dickensian work conditions after years of a school system with courses in such topics as “home economics” and “driver’s education.” The Americans, in turn, depicted Soviet youth as brainwashed, stultified, and intimidated by the state. (And that was on a good day.)

By the late 1950s, the authorities and media on each side were looking at their own young people with a more critical eye (alarmed at “juvenile delinquency,” for example, or “hooliganism,” as the Soviets preferred to call it) -- while also grudgingly admitting that the other side was somehow bringing up a generation that possessed certain alarming virtues. Khrushchev-era educational reformers worried that their students had endured so much rote instruction that they lacked the creativity needed for scientific and technological progress, while American leaders were alarmed that so many young Soviets were successfully tackling subjects their own students could never pass -- especially in science and math. (The news that 8 million Soviet students were learning English, while just 8,000 Americans were taking Russian, was also cause for concern.)

The arc of Cold War discourse and imagery concerning childhood, as Peacock traces it, starts out with a fairly simplistic identification of youth’s well-being with the values of those in charge, then goes through a number of shifts in emphasis. By the late 1960s, the hard realities facing children on either side were increasingly understood as failures of the social system they had grown up in. In the U.S., a famous television commercial showed a little girl plucking the petals of a daisy as a nuclear missile counted down to launch; while the ad was intended to sway voters against Barry Goldwater, it drew on imagery that the Committee for a Sane Nuclear Policy (better known as SANE) and Women Strike for Peace first used to oppose nuclear testing a few years earlier. Nothing quite so emblematic emerged in the Soviet bloc, but the sarcastic use of a slogan from the Komsomol (Young Communist Union) became a sort of inside joke about the government’s self-delusion.

“To varying degrees,” writes Peacock, “both countries found themselves over the course of these years betraying their ideals to win the [Cold] war, maintain power, and defend the status quo…. Even images like that of the innocent child can become volatile when the people who profess to defend the young become the ones who imperil them.”

Georgia Southern University investigating a professor accused of proselytizing in his classes

Georgia Southern U. is looking into claims that a history professor is pushing his own anti-evolution views in his classes at the public university.

Review of Chris Walsh, 'Cowardice: A Brief History'

In recent years we’ve had quite a few books on the negative emotions – disgust, malice, humiliation, shame – from scholars in the humanities. In addition, Oxford University Press published its series of little books on the Seven Deadly Sins. Apparently envy is the most interesting vice, to judge by the sales ranks on Amazon, followed by anger -- with lust straggling in third place. (A poor showing, given its considerable claims on human attention.)

The audience for monographs putting unpleasant or painful feelings into cultural and historical context probably doesn’t overlap very much with the far larger pop-psychology readership. But their interests do converge on at least one point. Negative affects do have some benefits, but most of us try to avoid or minimize them, both in ourselves and in others, to disguise them when necessary or, failing that, to do damage control. And because the urge to limit them is so strong, so is the need to comprehend where the feelings come from and how they operate.

Arguably the poets, historians, and philosophers have produced richer understandings of negative emotions, in all their messiness. As for what the likes of Dr. Phil bring to the table, I have no opinion – though obviously they’re the ones leaving it with the biggest bags of money.

But the avoidance / interest dynamic really goes AWOL with the topic Chris Walsh explores in Cowardice: A Brief History (Princeton University Press). The Library of Congress catalog has a subject heading called “Cowardice — history,” with Walsh’s book being the sole entry. That’s a clerical error: Marquette University Press published Lesley J. Gordon’s “I Never Was a Coward”: Questions of Bravery in a Civil War Regiment in 2005. It is 43 pages long, making Walsh the preeminent scholar in the field by a sizable margin. (He is also associate director of the College of Arts and Sciences Writing Program at Boston University.)

“[P]ondering cowardice,” he writes, “illuminates (from underneath, as it were) our moral world. What we think about cowardice reveals a great deal about our conceptions of human nature and responsibility, about what we think an individual person can and should have to endure, and how much one owes to others, to community and cause.”

But apart from a typically thought-provoking paper by William Ian Miller a few years ago, cowardice has gone largely unpondered. Plato brought it up while en route to discussing courage. Aristotle stressed the symmetry between cowardice (too much fear, too little confidence) and rashness (too much confidence, too little fear) and went on to observe that rash men tended to be cowards hiding behind bluster.

That insight has survived the test of time, though it’s one of the few analyses of cowardice that Walsh can draw on. But in the historical and literary record, cowardice is always much more concrete. (In that regard it’s worth noting that the LOC catalog lists 44 novels about cowardice, as against just two nonfiction works.)

Until sometime in the 19th century, cowardice seems to have been equated simply and directly with fear. It was the immoral and unmanly lack of yearning for the chance at slaughter and glory. The author refers to the American Civil War as a possible turning point, or at least the beginning of a change, in the United States. By the Second World War, the U.S. Army gave new soldiers a pamphlet stating, up front, YOU’LL BE SCARED and even acknowledging their anxiety that they might prove cowards once in battle.

Courage was not an absence of fear but the ability to act in spite of it. This represented a significant change in attitude, and it had the advantage of being sane. But it did not get around a fundamental issue that Walsh shows coming up repeatedly, and one well-depicted in James Jones’s novel The Thin Red Line:

“[S]omewhere in the back of each soldier’s mind, like a fingernail picking uncontrollably at a scabby sore, was the small voice saying: but is it worth it? Is it really worth it to die, to be dead, just to prove to everybody you’re not a coward?”

The answer given by the narrator of Louis-Ferdinand Céline’s Journey to the End of the Night concerning the First World War (“I wasn’t very bright myself, but at least I had sense enough to opt for cowardice once and for all”) sounds a lot like Mark Twain’s considered opinion in the matter: “The human race is a race of cowards, and I am not only marching in that procession but carrying a banner.”

Both were satirists, but there may be more to the convergence of sentiment than that. In the late 19th and early 20th centuries, war became mechanized and total, with poison gas and machine guns (just a taste of improvements to come) and whole populations mobilized by propaganda and thrown onto the battlefield. The moral defect of the coward was sometimes less than obvious, especially with some hindsight.

In Twain’s case, the remark about fundamental human cowardice wasn’t an excuse for his own military record, which was not glorious. (He numbered himself among the thousands who "entered the war, got just a taste of it, and then stepped out again permanently.") Walsh provides a crucial bit of context by quoting Twain’s comment that “man’s commonest weakness, his aversion to being unpleasantly conspicuous, pointed at, shunned” is better understood as moral cowardice, “the supreme make-up of 9,999 men in the 10,000.”

I’ve indicated a few of Walsh’s themes here, and neglected a few. (The yellow cover, for example, is a reminder of his pages on the link between cowardice and that color.) Someone might well write an essay about how overwhelmingly androcentric the discussion tends to be, except insofar as a male labeled as a coward is called womanly. This is strange. When the time comes for battle, a man can try to flee, but I’ve never heard of anyone escaping childbirth that way. And the relationship between moral cowardice (or courage) and the military sort seems complex enough for another book.

Review of Zephyr Teachout, 'Corruption in America: From Benjamin Franklin's Snuff Box to Citizens United'

Writing in 1860, a journalist depicted Washington as a miserable little Podunk on the Potomac, quite unworthy of its status as the nation’s capital. He called it an “out of the way, one-horse town, whose population consists of office-holders, lobby buzzards, landlords, loafers, blacklegs, hackmen, and cyprians – all subsisting on public plunder.”

"Hackmen" meant horse-powered cabbies. "Blacklegs" were crooked gamblers. And cyprians (lower-case) were prostitutes -- a classical allusion turned slur, since Cyprus was a legendary birthplace of Aphrodite. Out-of-towners presumably asked hackmen where to find blacklegs and cyprians.

But sordid entertainment was really the least of D.C. vices. “The paramount, overshadowing occupation of the residents,” the newsman continued, having just gotten warmed up, “is office-holding and lobbying, and the prize of life is a grab at the contents of Uncle Sam’s till. The public-plunder interest swallows up all others, and makes the city a great festering, unbearable sore on the body politic. No healthy public opinion can reach down here to purify the moral atmosphere of Washington.”

Plus ça change! To be fair, the place has grown more metropolitan and now generates at least some revenue from tourism (plundering the public by other means). Zephyr Teachout quotes this description in Corruption in America: From Benjamin Franklin’s Snuff Box to Citizens United (Harvard University Press), a book that merits the large readership it may get thanks to the author’s recent appearance on "The Daily Show," even if much of that interview concerned her remarkable dark-horse gubernatorial campaign in New York state's Democratic primary, in which anti-corruption was one of her major themes. (Teachout is associate professor of law at Fordham University.)

The indignant commentator of 1860 could include lobbyists in the list of ne’er-do-wells and assume readers would share his disapproval. “Lobby buzzards” were about as respectable as card sharks and hookers. You can still draw cheers for denouncing their influence, of course, but Teachout suggests that something much deeper than cynicism was involved in the complaint. It had a moral logic – one implying a very different set of standards and expectations than prevails now, to judge by recent Supreme Court rulings.

Teachout’s narrative spans the history of the United States from its beginnings through Chief Justice John Roberts’s decision in McCutcheon v. FEC, less than six months ago. One of the books that gripped the country’s early leaders was Edward Gibbon’s Decline and Fall of the Roman Empire, the first volume of which happened to come out in 1776, and Teachout regards the spirit they shared with Gibbon as something like the crucial genetic material in the early republic’s ideological DNA.

To be clear, she doesn’t argue that Gibbon influenced the founders. Rather, they found in his history exceptionally clear and vivid confirmation of their understanding of republican virtue and the need to safeguard it by every possible means. A passage from Montesquieu that Thomas Jefferson copied into his notebook explained that a republican ethos “requires a constant preference of public to private interest [and] is the source of all private virtues….”

That “constant preference” required constant vigilance. The early U.S. statesmen looked to the ancient Roman republic as a model (“in creating something that has never yet existed,” a German political commentator later noted, political leaders “anxiously conjure up the spirits of the past to their service and borrow from them names, battle cries, and costumes in order to present the new scene of world history in this time-honored disguise and this borrowed language”).

But the founders also took from history the lesson that republics, like fish, rot from the head down. The moral authority, not just of this or that elected official, but of the whole government demanded the utmost scruple – otherwise, the whole society would end up as a fetid moral cesspool, like Europe. (The tendency to define American identity against the European other runs deep.)

Translating this rather anxious ideology into clear, sharp legislation was a major concern in the early republic, as Teachout recounts in sometimes entertaining detail. It was the diplomatic protocol of the day for a country’s dignitaries to present lavish gifts to foreign ambassadors -- as when the king of France gave diamond-encrusted snuffboxes, with his majesty’s portrait on them, to Benjamin Franklin and Thomas Jefferson. In Franklin’s case, at least, the gift expressed admiration and affection for him as an individual at least as much as it did respect for his official role.

But all the more reason to require Congressional approval. Doing one’s public duty must be its own reward, not an occasion for private benefit. Franklin received official permission to accept the snuffboxes, as did two other figures Teachout discusses. The practice grated on American sensibilities, but had to be tolerated to avoid offending an ally. Jefferson failed to disclose the gift to Congress and quietly arranged to have the diamonds plucked off and sold to cover his expenses.

Like the separation of powers among the executive, legislative, and judicial branches (another idea taken from Montesquieu), the division of Congress into House and Senate was also designed to preempt corruption: “The improbability of sinister combinations,” wrote Madison, “will be in proportion to the dissimilarity in genius of the two bodies.” Teachout quotes one delegate to the Constitutional Convention referring sarcastically to the “mercenary & depraved ambition” of “those generous & benevolent characters who will do justice to each other’s merit, by carving out offices & rewards for it.”

Hence the need for measures such as the clause in Article 1, Section 6 forbidding legislators from serving simultaneously in an appointed government position. It also prevented them from accepting an appointed position created during their term in office. The potential for abuse was clear, but it could be contained. The clause was an effort “to avoid as much as possible every motive for corruption,” in another delegate’s words.

Corruption, so understood, clearly entails far more than bribery, nepotism, and the like – things done with an intent to influence the performance of official duties in order to yield a particular benefit. The quid pro quo is only the most obvious level of the injustice. Beyond violating a rule or law, it undermines the legitimacy of the whole process. It erodes trust in even the ideal of disinterested official power. Public service itself begins to look like private interest carried on duplicitously.

The public-mindedness and lofty republican principles cultivated in the decades just after the American revolution soon enough clashed with the political and economic realities of a country expanding rapidly westward. There were fortunes to be made, and bribes to be taken. But as late as the 1880s, states were putting laws on the books to wipe out lobbying, on the grounds that it did damage to res publica.

Clearly a prolonged and messy process has intervened in the meantime, which we’ll consider in the next column, along with some of the criticism of Teachout’s ideas that has emerged since she began presenting them in legal journals a few years ago. Until then, consider the proposal that the newspaper writer of 1860 offered for how to clean the Augean stables of Washington: To clear out corruption, the nation’s capital should be moved to New York City, where it would be under a more watchful eye. Brilliant! What could possibly go wrong?

Review of Jo Guldi and David Armitage, "The History Manifesto"

When young sociologists would consult with C. Wright Mills, it’s said, he would end his recommendations with what was clearly a personal motto: “Take it big!” It was the concentrated expression of an ethos: Tackle major issues. Ask wide-ranging questions. Use the tools of your profession, but be careful not to let them dig a mental rut you can’t escape.

Jo Guldi and David Armitage give much the same advice to their colleagues, and especially their colleagues-to-be, in The History Manifesto, a new book from Cambridge University Press. (Guldi is an assistant professor of history at Brown University, while Armitage is chair of the history department at Harvard.) Only by “taking it big” can their field regain the power and influence it once had in public life – and lost, somewhere along the line, to economics, with its faith in quantification and the seeming rigor of its concepts.

But issues such as climate change and growing economic inequality must be understood in terms of decades and centuries. The role of economists as counselors to the powerful has certainly been up for question over the past six years. Meanwhile, the world’s financial system continues to be shaped by computerized transactions conducted at speeds only a little slower than the decay of subatomic particles. And so, with their manifesto, the authors raise the call: Now is the time for all good historians to come to the aid of their planet.

But first, the discipline needs some major recalibration. “In 1900,” Guldi and Armitage write, “the average number of years covered [by the subject matter of] doctoral dissertations in history in the United States was about 75 years; by 1975, it was closer to 30.” The span covered in a given study is not the only thing that’s narrowed over the intervening four decades. Dissertations have “concentrated on the local and the specific as an arena in which the historian can exercise her skills of biography, archival reading, and periodization within the petri-dish of a handful of years.”

The problem isn’t with the monographs themselves, which are often virtuoso analyses by scholars exhibiting an almost athletic stamina for archival research. Guldi and Armitage recognize the need for highly focused and exhaustively documented studies in recovering the history of labor, racial and religious minorities, women, immigrants, LGBT people, and so forth.

But after two or three generations, the “ever narrower yet ever deeper” mode has become normative. The authors complain that it "determines how we write our studies, where we look for sources, and which debates we engage. It also determines where we break off the conversation.”

Or, indeed, whether the conversation includes a historical perspective at all. “As students in classrooms were told to narrow and to focus” their research interests, “the professionals who deal with past and future began to restrict not only their sources and their data, but sometimes also their ideas.”

In referring to “professionals who deal with past and future,” the authors do not mean historians themselves -- at least not exclusively -- but rather leaders active at all levels of society. The relevance of historical knowledge to public affairs (and vice versa) once seemed obvious. Guldi and Armitage point to Machiavelli’s commentary on Livy as one example of a political figure practicing history, while Eric Williams, who wrote Capitalism and Slavery for his doctoral dissertation, went on to serve as Trinidad’s first prime minister after it became independent.

Between extreme specialization by historians and politicians whose temporal horizons are defined by the election cycle, things look bad. That understanding the past might have some bearing on actions in the present may not seem all that difficult to grasp. But consider the recent American president who invaded a Middle Eastern country without knowing that its population consisted of two religious groups who, over the past millennium or so, have been less than friendly toward one another. (For some reason, I thought of that a couple of times while reading Guldi and Armitage.) Anyway, it did turn out to be kind of an issue.

A manifesto requires more than complaint. It must also offer a program and, as much as possible, rally some forces for realizing its demands. The cure for short-term thinking in politics that Guldi and Armitage propose is the systematic cultivation of long-term thinking in history.

And to begin with, that means putting the norms of what they call “microhistory” in context – keeping in mind that it is really a fairly recent development within the profession. (Not so many decades ago, a historical study covering no more than a hundred years ran the risk of being dismissed as a bit of a lightweight.) The authors call for a revival of the great Fernand Braudel’s commitment to study historical processes “of long, even of very long, duration,” as he said in the late 1950s.

Braudel’s longue durée was the scale on which developments such as the consolidation of trade routes or the growth of a world religion took place: centuries, or millennia. These phenomena “lasted longer than economic cycles, to be sure,” Guldi and Armitage write, but “were significantly shorter than the imperceptibly shifting shapes of mountains or seas, or the rhythms of nomadism or transhumance.”

Braudel counterposed the longue durée to “the history of events,” which documented ephemeral matters such as wars, political upheaval, and whatnot. The History Manifesto is not nearly so Olympian as that. The aim is not to obliterate what the authors call “the Short Past” but rather to encourage research that would put “events” in the perspective of rhythms of change extending beyond a single human lifetime.

The tools are available. Guldi and Armitage’s proposed course seems inspired by Big Data as much as by Braudel. Drawing on pools of scattered information about “weather, trade, agricultural production, food consumption, and other material realities,” historians could create broad but detailed accounts of how social and environmental conditions change over long periods.

“Layering known patterns of reality upon each other,” the authors say, “produces startling indicators of how the world has changed – for instance the concentration of aerosols identified from the mid-twentieth century in parts of India have proven to have disrupted the pattern of the monsoon…. By placing government data about farms next to data on the weather, history allows us to see the interplay of material change with human experience, and how a changing climate has already been creating different sets of winners and losers over decades.”

Any number of questions come to mind about causality, the adequacy of available documents, and whether one’s methodology identifies patterns or creates them. But that’s always the case, whatever the scale a historian is working on.

The History Manifesto is exactly as tendentious as the title would suggest -- and if the authors find it easier to make their case against “microhistory” by ignoring the work of contemporary “macrohistorians”… well, that’s the nature of the genre. A few examples off the top of my head: Perry Anderson’s Passages from Antiquity to Feudalism and Lineages of the Absolutist State, Michael Mann’s multivolume study of social power over the past five thousand years, and the essays by Gopal Balakrishnan collected in Antagonistics, which grapple with the longue durée in terms both cosmopolitan and stratospheric. The authors also shuffle quietly past the work of Oswald Spengler, Arnold Toynbee, and Carroll Quigley – an understandable oversight, given the questions that would come up about where megahistory ends and megalomania takes over.

Moments of discretion aside, The History Manifesto is a feisty and suggestive little book, and it should be interesting to see whether much of the next generation of historians will gather beneath its banner.

New NEH director welcomes digital humanities grant recipients to the agency's new home

Grants for digital humanities projects are an established tradition as the new chairman of the National Endowment for the Humanities welcomes grant recipients to the agency's new home in Washington.

Review of Anna M. Young, "Prophets, Gurus, and Pundits: Rhetorical Styles and Public Engagement"

Many a thick academic tome turns out to be a journal article wearing a fat suit. So all due credit to Anna M. Young, whose Prophets, Gurus, and Pundits: Rhetorical Styles and Public Engagement was published by Southern Illinois University Press this year. Her premise is sound; her line of argument looks promising; and she gets right to work without the rigmarole associated with what someone once described as the scholarly, “Yes, I read that one too” tic. 

Indeed, several quite good papers could be written exploring the implicit or underdeveloped aspects of her approach to the role and the rhetoric of the public intellectual. Young is an associate professor of communication at Pacific Lutheran University, in Tacoma, Washington. Much of the book is extremely contemporary in emphasis (to a fault, really, just to get my complaint about it out front here). But the issue it explores goes back at least to ancient Rome -- quite a while before C. Wright Mills got around to coining the expression “public intellectual” in 1958, in any case.

The matter in question emerges in Cicero’s dialogue De Oratore, where Young finds discussed a basic problem in public life, then and now. Cicero, or his stand-in character anyway, states that for someone who wants to contribute to the public discussion of important matters, “knowledge of a vast number of things is necessary, without which volubility of words is empty and ridiculous.”

On the other hand -- as Cicero has a different character point out -- mere possession of learning, however deep and wide, is no guarantee of being able to communicate that learning to others. (The point will not be lost on those of you surreptitiously reading this column on your mobile phones at a conference.)

Nobody “can be eloquent on a subject that he does not understand,” says Cicero. Yet even “if he understands a subject ever so well, but is ignorant of how to form and polish his speech, he cannot express himself eloquently about what he does understand.”

And so what is required is the supplementary form of knowledge called rhetoric. The field had its detractors well before Cicero came along. But rhetoric as defined by Aristotle referred not to elegant and flowery bullshit but rather to the art of making cogent and persuasive arguments.

Rhetoric taught how to convey information, ideas, and attitudes by selecting the right words, in the right order, to deliver in a manner appropriate to a particular audience -- thereby convincing it of an argument, generally as a step toward moving it to take a given action or come to a certain judgment or decision. The ancient treatises contain not a little of what would later count as psychology and sociology, and modern rhetorical theory extends its interdisciplinary mandate beyond the study of speech, into all other forms of media. But in its applied form, rhetoric continues to be a skill of skills – the art of using and coordinating a number of registers of communication at the same time: determining the vocabulary, gestures, tone and volume of voice, and so on best-suited to message and audience.  

When the expression “public intellectual” was revived by Russell Jacoby in the late 1980s, it served in large part to express unhappiness with the rhetorical obtuseness of academics, particularly in the humanities and social sciences. The frustration was not usually expressed quite that way. It instead took the form of a complaint that intellectuals were selling their birthright as engaged social and cultural critics in exchange for the mess of pottage known as tenure. It left them stuck in niches of hyperspecialized expertise. There they cultivated insular concerns and leaden prose styles, as well as inexplicable delusions of political relevance.

The public intellectual was a negation of all of this. He or she was a free-range generalist who wrote accessibly, and could sometimes be heard on National Public Radio. In select cases the public intellectual was known to Charlie Rose by first name.

I use the past tense here but would prefer to give the term a subscript: The public intellectual model ca. 1990 was understood to operate largely or even entirely outside academe, but that changed over the following decade, as the most prominent examples of the public intellectual tended to be full-time professors, such as Cornel West and Martha Nussbaum, or at least to teach occasionally, like Judge Richard Posner, a senior lecturer in law at the University of Chicago.

And while the category continues to be defined to some degree by contrast with certain tried-and-true caricatures of academic sensibility, the 2014 model of the public intellectual can hardly be said to have resisted the blandishments of academe. The danger of succumbing to the desire for tenure is hardly the issue it once might have seemed. 

Professor Young’s guiding insight is that public intellectuals might well reward study through rhetorical analysis -- with particular emphasis on aspects that would tend to be missed otherwise. They come together under the heading “style.” She does not mean the diction and syntax of their sentences, whether written or spoken, but rather style of demeanor, comportment, and personality (or what’s publicly visible of it).

Style in Young’s account includes what might be called discursive tact. Among other things it includes the gift of knowing how and when to stop talking, and even to listen to another person’s questions attentively enough to clarify, and even to answer them. The author also discusses the “physiological style” of various public intellectuals – an unfortunate coinage (my first guess was that it had something to do with metabolism) that refers mostly to how they dress.

A public intellectual, then, has mastered the elements of style that the “traditional intellectual” (meaning, for the most part, the professorial sort) typically does not. The public perceives the academic “to be a failure of rhetorical style in reaching the public. He is dressed inappropriately. She carries herself strangely. He describes ideas in ways we cannot understand. She holds the floor too long and seems to find herself very self-important.” (That last sentence is problematic in that a besetting vice of the self-important is that they do not find themselves self-important; if they did, they’d probably dial it down a bit.)

Now, generations of satirical novels about university life have made clear that the very things Young regards as lapses of style are, in fact, perfectly sensible and effective rhetorical moves on their own terms. (The professor who wears the same argyle sweater year-round has at least persuaded you that he would rather think about the possible influence of the Scottish Enlightenment on The Federalist Papers than about the sweater’s admittedly large holes.)

But she longs for a more inclusive and democratic mode of engagement of scholarship with the public – and of the public with ideas and information it needs. To that end, Young identifies a number of public-intellectual character types that seem to her exemplary and effective. “At different times,” she writes, “and in different cultural milieus, different rhetorical styles emerge as particularly relevant, powerful, and persuasive.” And by Young’s count, six of them prevail in America at present: Prophet, Guru, Sustainer, Pundit, Narrator, and Scientist.

“The Prophet is called by a higher power at a time of crisis to judge sinners in the community and outline a path of redemption. The Guru is the teacher who gains a following of disciples and leads them to enlightenment. The Sustainer innovates products and processes that sustain natural, social, and political environments. The Pundit is a subject expert who discusses the issues of the day in a more superficial way via the mass media. The Narrator weaves experiences with context, creating relationships between event and communities and offering a form of evidence that flies below the radar in order to provide access to information.” Finally, the Scientist “rhetorically constructs his or her project as one that answers questions that have plagued humankind since the beginnings….”

The list is presumably not meant to be exhaustive, but Young finds examples of people working successfully in each mode. Next week we'll take a look at what the schema implies -- and at the grounds for thinking of each style as successful.
