In 2011, Paul Clemens, a writer from Detroit, published an up-close and personal look at deindustrialization called Punching Out: One Year in a Closing Auto Plant. It recorded the year he spent on a work team hired to dismantle and gut one of the city’s remaining factories. I wrote about the book when it came out, and won’t recycle anything here, but recall a few paragraphs expressing a particular kind of frustration that a non-Detroiter can only sympathize with, not really share.
The cause was a tendency by outsiders – or a subset of them at any rate – to treat the city’s decline as a perverse kind of tourist attraction or as raw material for pontification. Clemens had lost all patience with arty photographs of abandoned buildings and pundits’ blatherskite about the “creative destruction” intrinsic to dynamic capitalism. He also complained about the other side of the coin, the spirit of “we’re turning the corner!” boosterism. “No Parisian is as impatient with American mispronunciation,” he wrote, “no New Yorker as disdainful of tourists needing directions, as is a born-and-bred Detroiter with the optimism of recent arrivals and their various schemes for the city’s improvement.”
Dora Apel, a professor of art history and visual culture at Wayne State University, has, in effect, gathered everything that dismays and offends Clemens between the covers of Beautiful Terrible Ruins: Detroit and the Anxiety of Decline (Rutgers University Press).
She calls Detroit “the quintessential urban nightmare in a world where the majority of people live in cities.” But nightmare imagery can be seductive, making Detroit “a thriving destination for artists, urban explorers, academics, and other curious travelers and researchers who want to experience for themselves industrial, civic, and residential abandonment on a massive scale.”
That lure is felt most keenly by people who, after the “experience,” enjoy the luxury of going home to someplace stable, orderly, and altogether more pleasant. It’s evident Apel finds something ghoulish about taking pleasure from a scene of disaster, “feeding off the city’s misery while understanding little about its problems, histories, or dreams.” But the aesthetic appeal of ruins – the celebration of old buildings crumbling picturesquely, of columns broken but partly standing, of statuary fractured and eroded by time – goes back at least to the 18th century, and it can’t be reduced to mere gloating. The author makes a brief but effective survey of “ruin lust,” a taste defined by “the beautiful and melancholic struggle between nature and culture,” as well as the feelings of contrast between ancient and modern life that ruins could evoke in the viewer, in pleasurable ways. She also points out how, in previous eras, this taste often involved feelings of national superiority, as with well-off travelers enjoying the sight of another country’s half-demolished architecture. (At least a tinge of gloating, in that case.)
It’s not difficult to recognize classical elements of the ruins aesthetic in "The Tree" by James D. Griffioen. One of a number of images reproduced in the book, the photograph was taken in the Detroit Public Schools Book Depository, where a sapling has taken root in the mulch created by a layer of decomposing textbooks – an almost schematic case of “the beautiful and melancholic struggle between nature and culture.” But Apel underscores the differences between the 21st-century mode of “ruination” and the taste cultivated in earlier periods. For one thing, “modernist architecture refuses the return of culture to nature in the manner of ancient ruins in large part because the building materials of concrete, steel, and glass do not delicately crumble in the picturesque way that stone does.”
More importantly, though, the fragments aren’t poking up from some barely imaginable gulf in time and culture: Detroit was, in effect, the world capital of industrial society within living memory. In his autobiography published in the early 1990s, then-Mayor Coleman Young wrote: “In the evolutionary urban order, Detroit today has always been your town tomorrow.” The implications of that thought are considerably gloomier than they were even 20 years ago. The implication of imagery such as "The Tree" is that it offers not a reminder of the recent past so much as a glimpse of the ruins of the not-too-distant future.
Photography is not the only cultural register in which the fascination with contemporary ruins makes itself evident: There is “urban exploration,” for example, a subculture consisting (it seems) mainly of young guys who trespass on ruined property to take in the ambience while also enjoying the dangers and challenges of moving around in collapsing architecture. Apel also writes about artists in Detroit who have colonized depopulated areas both to reclaim them as living space and to incorporate the ruins into their creative work.
The effects are not strictly local: “the borders between art, media, advertising, and popular culture have become increasingly permeable,” Apel writes, “as visual imagery easily ranges across these formats and as people produce their own imagery on websites and social media.” And the aestheticized ruination of Detroit feeds into a more widespread (even global) “anxiety of decline” expressed in post-apocalyptic videogame scenarios, survivalist television programs, zombie movies, and so on. Not that Detroit is the inspiration in each case, but it provides the most concrete, real-world example of dystopia.
“As the largest deteriorating former urban manufacturing center,” Apel writes, contemporary Detroit is a product of an understanding of society in which “rights are dependent on what people can offer to the state’s economic well-being, rather than vice-versa,” and in which “the lost protection of the state means vastly inadequate living conditions and the most menial and unprotected forms of labor in cities that are divested of many of their social services and left to their own devices.”
Much of the imagery analyzed in Beautiful Terrible Ruins seems to play right along with that social vision. The nicely composed photographs of crumbling buildings are usually empty of any human presence, while horror movies fill their urban landscapes with the hungry undead -- the shape of dreaded things to come.
The most distracting thing about costume dramas set in any period before roughly the turn of the 20th century -- in my experience, anyway -- is the thought that everything and everyone on screen must have smelled really bad. The most refined lords and gentry on Wolf Hall did not bathe on anything we would regard today as a regular basis.
No doubt there were exceptions. But until fairly recently in human history, even the most fastidious city dweller must have grown accustomed to the sight of human waste products from chamber pots that had been emptied in the street. (And not just the sight of it, of course.) Once in a while a movie or television program will evince something of a previous era’s ordinary grunge, as in The Return of Martin Guerre or Deadwood, where almost everything looks soiled, fetid and vividly uncomfortable. But that, too, is exceptional. The audience for costume drama is often looking for charm, nostalgia or escapism, and so the past usually wears a deodorant.
The wider public may not have heard of it, but a “sensory turn” among American historians has made itself felt in recent years -- an attention, that is, to the smells, tastes, textures and sounds of earlier periods. I refer to just four senses, because the importance of sight was taken for granted well before the turn. In their more polemical moments, sensory historians have even referred to “the tyranny of the visual” within their discipline.
That seems a little melodramatic, but point taken: historians have tended to scrutinize the past using documents, images, maps and other artifacts that chiefly address the eye. Coming in second as the organ of perception most likely to play a role in historical research would undoubtedly be the ear, thanks to the advent of recorded sound. The remaining senses tie for last place simply because they leave so few traces -- which, in any case, are not systematically preserved the way audiovisual materials are. We have no olfactory or haptic archives; it is difficult to imagine a library of flavors.
Calls to overcome these obstacles -- to analyze whatever evidence could be found about how everyday life once sounded, smelled, felt, etc. -- came from American historians in the early 1990s, with a few pioneers at work in Europe even before that. But the field of sensory history really came into its own over the past decade or so, with Mark M. Smith’s How Race Is Made: Slavery, Segregation and the Senses (University of North Carolina Press, 2006) and Sensing the Past: Seeing, Hearing, Smelling, Tasting and Touching in History (University of California Press, 2007) being among the landmarks. Smith, a professor of history at the University of South Carolina, also convened a roundtable on the sensory turn published in the September 2008 issue of The Journal of American History. A number of the contributors are on the editorial board of the Studies in Sensory History series published by the University of Illinois Press, which launched in 2011.
The series’ fifth and most recent title is Sensing Chicago: Noisemakers, Strikebreakers and Muckrakers by Adam Mack, an assistant professor of history at the School of the Art Institute of Chicago. Beyond the monographic focus -- it covers about fifty years of the city’s history -- the book demonstrates how much of the sensory field of an earlier era can be reconstructed, and why doing so can be of interest.
Overemphasis on the visual dimension of an urban landscape “mirrors a set of modern cultural values that valorize the eye as the barometer of truth and reason,” we read in the introduction, “and tend to devalue the proximate, ‘lower’ senses as crude and less rational.” Having thus recapitulated one of sensory history’s founding premises, the author wastes no time before heading to one site that must have forced its way deep into the memory of anyone who got near it in the 19th century: the Chicago River.
“A bed of filth,” one contemporary observer called it, where manure, blood, swill and unusable chunks of carcass from the slaughterhouses ended up, along with human sewage and dead animals -- all of it (an editorialist wrote) “rotting in the sun, boiling and bubbling like the lake of brimstone, and emitting the most villainous fumes,” not to mention drawing clouds of flies. A letter writer from 1862 mentions that the water drawn from his kitchen hydrant contained “half a pint or more of rotten fish.” Many people concluded that it was safest just to drink beer instead.
Laws against dumping were passed and commissions appointed to investigate the problem, for all the good it did. The poorest people had to live closest to the river, so disgust at the stench combined in various ways with middle- and upper-class attitudes towards them, as well as with nativist prejudices.
The horrific odor undermined efforts to construct a modern, rationally organized city. Imposing a grid of streets on the landscape might please the eye, but smell didn’t respect geometry. The same principle applied to the Great Fire of 1871, the subject of Mack’s next chapter. The heat and sheer sensory overload were overwhelming, and the disaster threw people from all walks of life together in the streets in a way that made social status irrelevant, at least for a while. The interplay between social hierarchy and sensory experience (exemplified in references to “the roar of the mob”) is the thread running through the rest of the book. Thinking of the “‘lower’ senses as crude and less rational” -- to quote the author’s phrase again -- went along with assumptions about refinement or coarseness as markers of class background.
The sources consulted by the author are much the same as any other historian might use: newspapers, civic records, private or otherwise unpublished writings by long-forgotten people, such as the recollections of the Great Fire by witnesses, on file at the Chicago History Museum. The contrast is at the level of detail -- that is, the kinds of detail the historian looks for and interprets. Perhaps the next step would be for historians to enhance their work with direct sensory documentation.
A prototype might be found in the work of John Waters, who released one of his movies in Odorama. Audience members received cards with numbered scratch-and-sniff patches, which they consulted when prompted by a message on the screen.
On second thought, it was difficult enough to read Mack’s account of the Chicago River in the 19th century without tickling the gag reflex. Olfactory realism might push historical accuracy farther than anyone really wants it to go.
The South is home for me, but to my students in Minnesota, it’s an exotic place from which I am an ambassador. So when Dylann Roof massacred congregants at Emanuel African Methodist Episcopal Church in Charleston, S.C., last month and students began asking me about the killings and the debate they reignited over the Confederate flag, I did not know whether they sought my analysis as a scholar of the Confederacy and its legacies or my feelings as a transplanted Southerner. My uncertainty deepened because the questions came between semesters, from men and women who had taken courses with me last spring and would do so again in the fall. Did the timing of the questions change my relationship to the people who asked them, and therefore inform which part of me -- the professor or the person -- answered?
The difference between my answers depends on whether I want my students to embrace or reject the polemic through which we discuss the Confederacy, its cause and its symbols, treating them as representing either virulent hatred or regional pride and nostalgia. In a “Room for Debate” feature on June 19, The New York Times pitted former Georgia Congressman Ben Jones’s views of the flag as “A Matter of Pride and Heritage” against three authors who emphasized the flag’s postwar uses as a banner for Jim Crow violence, reactionary resistance to integration and civil rights, and the most obdurate hate groups in the contemporary United States. Governor Nikki Haley invoked a similar framing in her speech calling for the South Carolina legislature to remove the flag from the state capitol grounds. The governor presented the flag’s dual meanings on an almost equal footing; which interpretation a person chose, she implied, depended on their race. For white people, the flag meant honoring the “respect, integrity and duty” of Confederate ancestors -- “That is not hate, nor is it racism,” she said of that interpretation -- while “for many others … the flag is a deeply offensive symbol of a brutally oppressive past.”
In asking the South Carolina legislature to remove the flag from the statehouse grounds, the governor posed the meaning of the Confederate flag as a choice, and she refused to pick sides because she understood the sympathies of those in both interpretive camps. Because many of those who honor the state’s Confederate past neither commemorate nor act out of hate, in the governor’s logic they are not wrong -- merely out of sync with the political needs of 2015.
As a person, I want my students to take sides in that polemic, to know that Confederate “heritage” is the wrong cause to celebrate in any context. I want my students to know that the Confederacy was created from states that not only embraced slavery but, as Ta-Nehisi Coates has demonstrated beyond refutation, proudly defined their political world as a violent, diabolical contest for racial mastery. I want them to understand that the Civil War rendered a verdict on secession and, in the words of historian Stephanie McCurry, on “a modern pro-slavery and antidemocratic state, dedicated to the proposition that all men were not created equal.”
I want them to scrutinize, as John Coski has done in his excellent book The Confederate Battle Flag: America’s Most Embattled Symbol, the flag’s long use by those who reject equal citizenship. I want my (overwhelmingly white) students to grasp why the flags of the Confederacy in their many iterations -- on pickup trucks, college campuses and statehouse grounds -- tell African-Americans that they are not, and cannot be, equal citizens. I want them to feel the imperative in the words of President Obama’s eulogy for Reverend Clementa Pinckney, in which the president claimed that only by rejecting the shared wrongs of slavery, Jim Crow and the denial of civil rights can we strive for “an honest accounting of America’s history; a modest but meaningful balm for so many unhealed wounds.”
As a historian, I want more. I don’t want my students to simply choose sides in a polemic between heritage and hate; rather, I hope they will interrogate the Confederacy’s white supremacist project on more complex terms. A simple dichotomy of heritage or hate misses something essential about both the Confederacy and the social construction of racism: then as now, you don’t need to hate to be a racist. Many Confederate soldiers held views that cohered perfectly with the reactionary, violent and indeed hateful lens through which Dylann Roof sees race. After the Battle of Petersburg, Major Rufus Barrier celebrated the “slaughter … in earnest” of black soldiers and relished how “the blood ran in streams from their worthless carcasses.”
But others, like Confederate officer Francis Marion Parker, grounded their commitment to white supremacy not in jagged words of hate, but in the softer tones of family. Explaining his reasons for going to war in a letter to his wife and children, Parker promised that “home will be so sweet, when our difficulties are settled and we are permitted to return to the bosom of our families, to enjoy our rights and privileges” -- that is, slaveholding -- “under the glorious flag of the Confederacy.”
I want my students to see that men and women of differing temperaments and qualities supported the Confederacy’s white supremacist project and justified their support through a variety of ethics, appeals and emotions. I want them to overcome rhetorical paradigms that allow modern-day defenders of Confederate heritage to divorce the character of the men who fought for the “Lost Cause” from the cause itself. I want them to think critically about how otherwise honorable, courageous men as well as vicious, hate-filled racists came to embrace a cause informed, in the words of Confederate Vice President Alexander Stephens, by “the great truth that the Negro is not the equal of the white man, that slavery, subordination to the superior race, is his natural and normal condition.”
I hope my students will draw a bit from both of my answers, bringing careful scrutiny of the past into dialogue with the urgency of the present. As they do so, I hope they will think a bit about what historical meaning is and why history demands their scrutiny. If history becomes a mere servant of contemporary “truthiness,” reduced to selective anecdotes deployed as weapons in polarizing debates, then we are merely choosing camps in a contest of identities. Such one-dimensional choices leave space for those who equate the Confederacy with nostalgia and a kind of inherited pride to use the Confederate flag as shorthand for who they are and where they come from without any mention of race or white supremacy. Yet if historical interpretation remains antiquarian and refuses to speak to the present, it leaves us self-satisfied in the illusion that we have transcended the people and societies we study. One day generations yet unborn will scrutinize us and find us wanting, too. If we critique the people of the past and the choices they made not only with an eye to distancing ourselves from their worst extremes but also with a sense of how easy, how normal and how justifiable unequal citizenship can appear to be, the tragedy in Charleston and the history it invokes may teach a resonant lesson.
David C. Williard is assistant professor of history at the University of Saint Thomas, in Minnesota.
Reading the Emancipation Proclamation for the first time is an unforgettable experience. Nothing prepares you for how dull it turns out to be. Ranking only behind the Declaration of Independence and the Constitution in its consequences for U.S. history, the document contains not one sentence that has passed into popular memory. It was the work, not of Lincoln the wordsmith and orator, but of Lincoln the attorney. In fact, it sounds like something drafted by a group of lawyers, with Lincoln himself just signing off on it.
Destroying an institution of systematic brutalization -- one in such contradiction to the republic’s professed founding principles that Jefferson’s phrase “all men are created equal” initially drew protests from slave owners -- would seem to require a word or two about justice. But the proclamation is strictly a procedural document. The main thrust comes from an executive order issued in late September 1862, “containing, among other things, the following, to wit: ‘That on the first day of January, in the year of our Lord one thousand eight hundred and sixty-three, all persons held as slaves within any State or designated part of a State, the people whereof shall then be in rebellion against the United States, shall be then, thenceforward, and forever free….’”
Then -- as if to contain the revolutionary implications of that last phrase -- the text doubles down on the lawyerese. The proclamation itself was issued on the aforesaid date, in accord with the stipulations of the party of the first part, including the provision recognizing “the fact that any State, or the people thereof, shall on that day be, in good faith, represented in the Congress of the United States by members chosen thereto at elections wherein a majority of the qualified voters of such State shall have participated, shall, in the absence of strong countervailing testimony, be deemed conclusive evidence that such State, and the people thereof, are not then in rebellion against the United States.”
In other words: “If you are a state, or part of a state, that recognizes the union enough to send representatives to Congress, don’t worry about your slaves being freed right away and without compensation. We’ll work something out.”
Richard Hofstadter got it exactly right in The American Political Tradition (1948) when he wrote that the Emancipation Proclamation had “all the moral grandeur of a bill of lading.” It is difficult to believe the same author could pen the great memorial speech delivered at Gettysburg a few months later -- much less the Second Inaugural Address.
But to revisit the proclamation after reading Edna Greene Medford’s Lincoln and Emancipation (Southern Illinois University Press) is also a remarkable experience -- a revelation of how deliberate, even strategic, its lawyerly ineloquence really was.
Medford, a professor of history at Howard University, was one of the contributors to The Emancipation Proclamation: Three Views (Louisiana State University Press, 2006). Her new book is part of SIUP’s Concise Lincoln Library, now up to 17 volumes. Medford’s subject overlaps with topics covered by earlier titles in the series (especially the ones on race, Reconstruction and the Thirteenth Amendment) as well as with works such as Eric Foner’s The Fiery Trial: Abraham Lincoln and American Slavery (Norton, 2010).
Even so, Medford establishes her own approach by focusing not only on Lincoln’s ambivalent and changing sense of what he could and ought to do about slavery (a complex enough topic in its own right) but also on the attitudes and activities of a heterogeneous and dispersed African-American public with its own priorities.
For Lincoln, abolishing the institutionalized evils of slavery was a worthy goal but not, as such, an urgent one. As of 1860, his primary concern was that it not spread to the new states. After 1861, it was to defeat the slaveholders’ secession -- but without making any claim to the power to end slavery itself. He did support efforts to phase it out by compensating slave owners for manumission. (Property rights must be respected, after all, went the thinking of the day.) His proposed long-term solution for racial conflict was to send the emancipated slaves to Haiti, Liberia, or someplace in Central America to be determined.
Thanks in part to newspapers such as The Weekly Anglo-African, we know how free black citizens in the North responded to Lincoln, and it is clear that some were less than impressed with his antislavery credentials. “We want Nat Turner -- not speeches,” wrote one editorialist; “Denmark Vesey -- not resolutions; John Brown -- not meetings.” Especially galling, it seems, were Lincoln’s plans to reimburse former slave owners for their trouble while uprooting ex-slaves from land they had worked for decades. African-American commentators argued that Lincoln was getting it backward. They suggested that the ex-slaves be compensated and their former masters shipped off instead.
To boil Medford’s succinct but rich narrative down into something much more schematic, I’ll just say that Lincoln’s cautious regard for the rights of property backfired. Frederick Douglass wrote that the slaves “[gave] Mr. Lincoln credit for having intentions towards them far more benevolent and just than any he is known to cherish…. His pledges to protect and uphold slavery in the States have not reached them, while certain dim, undefined, but large and exaggerated notions of his emancipating purpose have taken firm hold of them, and have grown larger and firmer with every look, nod, and undertone of their oppressors.” African-American Northerners and self-emancipating slaves alike joined the Union army, despite all the risks and the obstacles.
The advantage this gave the North, and the disruption it created in the South, changed abolition from a moral or political concern to a concrete factor in the balance of forces -- and the Emancipation Proclamation, for all its uninspired and uninspiring language, was Lincoln’s concession to that reality. He claimed the authority to free the slaves of the Confederacy “by virtue of the power in me vested as Commander-in-Chief, of the Army and Navy of the United States in time of actual armed rebellion against the authority and government of the United States, and as a fit and necessary war measure for suppressing said rebellion.”
Despite its fundamentally practical motivation and its avoidance of overt questions about justice, the proclamation was a challenge to the American social and political order that had come before. And it seems to have taken another two years before the president himself could spell out its implications in full, in his speech at the Second Inaugural. The depth of the challenge is reflected in each week's headlines, though to understand it better you might want to read Medford's little dynamite stick of a book first.
"I sound my barbaric yawp over the roofs of the world," Walt Whitman declares in Leaves of Grass. How he ended the line without an exclamation point always puzzled me, but maybe it was implicit. The poet sang "the body electric," and every line was meant to zap the reader into a higher state of awareness.
Whitman would have been pleased to see the new American history textbook called The American Yawp -- and not just for its allusive title. As a sometime school teacher and educational reformer, he wanted "free, ample and up-to-date textbooks, preferably by the best historians" (to quote one discussion of this aspect of the poet's life). Yawp's 30 chapters cover American history from the last ice age through the appearance of the millennial generation. It has plenty about the founders and the origins of the U.S., but avoids a triumphalist tone and includes material on inequality -- including economic inequality -- throughout. It was prepared through the collaborative efforts of scores of historians. And the creators have published it online, for free.
The beta version was released, with no fanfare at all, at the start of the current academic year. By the fall, a revision will be issued in e-book format, suitable for use in an undergraduate survey course -- again, for free. Walt would surely approve.
I contacted the editors -- Joseph Locke, an assistant professor of history at the University of Houston-Victoria, and Ben Wright, an assistant professor of history and political science at the Abraham Baldwin Agricultural College in south Georgia -- to find out more about The American Yawp. They collaborated in responding to my questions by e-mail. A transcript of the discussion follows.
Q: How did you go about writing (assembling?) your textbook? Did you collaborate via Listservs? Were there any face-to-face meetings?
A: Traditional textbooks usually begin with a single editor or a small team of editors searching for some unifying theme to tie together the many thematic strands of American history. Instead, we mirrored the way our profession already works. We believed that a narrative synthesis could emerge through the many innovations of our profession’s various subfields no less than through a preselected central theme. We therefore looked to a large and diverse yet loosely coordinated group of contributors to construct a narrative.
We began by mapping out potential contributions for all 30 chapters based on our experiences teaching the survey and in informal conversations with colleagues and potential chapter editors. We came up with things like “500 words on the election of 1860” and “300 words on the music and art of the Civil War.” We compiled these into lists.
Then, after tapping into the networks of scholars we knew, as well as scouring recent editions of major history journals, combing through lists of recent dissertations, browsing the rosters of university programs with traditional strengths in particular eras and soliciting contributors through social media and H-Net’s many history Listservs, we targeted scholars to write on these themes.
We had no trouble recruiting an adequate pool of qualified contributors. In fact, we ended up with over 300 historians writing for the project. This work was done almost exclusively online.
Q: Was it a matter of one person preparing a draft chapter and then other participants proposing changes?
A: Since a textbook should be more than a series of brief, disjointed topical entries, we began the work of synthesis. We recruited talented writers and scholars as chapter editors who went to work stitching submissions into coherent chapters. We then reviewed and edited drafts of all 30 chapters, particularly with an eye on ensuring greater narrative cohesion across the text.
During our beta year, we are soliciting feedback not only from our esteemed board of editorial advisers but from contributors and users through our parallel Comment Press platform. With that feedback in hand, we will publish a refined version of the text and begin a second phase that incorporates interactive digital content and further explores what a digital textbook is truly capable of.
Q: Are you aware of anyone teaching with the beta version? Have you had commitments from individuals or departments to use it during the next academic year?
A: Students are currently working with the text at a variety of institutions ranging from major state universities (such as the University of Georgia and the University of Florida) to various community colleges (such as Central New Mexico Community College and Bronx Community College) and everything in between (Rice University, Georgia State University, the University of Texas at Dallas and others). We don’t solicit formal commitments for use, but we’ve already heard from additional instructors and history departments hoping to adopt the text next fall. We are historians, not marketers, but we believe continued positive feedback and our formal launch in the fall will also encourage additional adoptions.
Q: In the culture wars, American history is one of the more harried battlegrounds. Did that factor into the textbook’s preparation in any way?
A: We believe history should be written by historians. We have no interest in the culture war, beyond mitigating the way that some have used it to wildly distort the past. Instead, we've trusted in our profession; our desire has been to reflect the very best of contemporary scholarship.
On the other hand, we have been conscious about how to properly synthesize the American past. What gets included, and what doesn't? This is a difficult issue and we have enlisted the historical profession to help guide us. And we remain open to critical feedback.
Q: The talk page of a Wikipedia entry tends to become a forum for debate, informed and otherwise. Yawp is not in wiki format, of course, but will the comments component be moderated?
A: We've seen very little rancor on our CommentPress platform. Disagreements have mostly taken the form of highly specialized critiques. Historians are argumentative, but we've been pleased to see that all have followed the standards of professional decorum. We therefore have no plans to moderate discussions. And, unlike on a wiki, disruptive comments cannot filter into the text without editorial decisions.
Q: It seems as if The American Yawp could serve as a model for other textbooks. Is that the plan?
A: Our model is completely reproducible. We've accomplished this without institutions, grants or rarefied technological know-how. And we very much hope that others will follow our example. We already know, for example, that within our own profession there is quite a bit of interest in a similar project in world history.
Q: A commercially produced textbook can be financially rewarding for everybody involved in its creation, and it counts as a credit on an author's CV. These seem like powerful incentives for stasis. What would it take for your mode of textbook production to establish itself as viable over the long run?
A: Of course, a commercially produced textbook is not financially rewarding for everybody involved -- it is often quite financially punitive for our students. (The College Board, for instance, found that the typical student now spends $1,200 a year on textbooks and supplies.) And outside of a few textbooks written by a few academics for a few major presses, financial rewards can be extremely limited for textbook producers.
Still, the reputational economics of academia do matter. Professional consideration of projects such as this will certainly shift as academia continues to adjust to the digital age, but we also did not embark upon the project for economic or professional gain. This has been and will continue to be a labor of love. We entered the historical profession because we believe there is a moral imperative to study the American past and to share that knowledge with students and with the public. The rising costs of higher education make that difficult. Academics recognize this, and we believe that's why over 300 academic historians were so willing to participate in this project.
We believe our model is viable in the long term. This is not a start-up having to satisfy investors or foundation boards. This is simply a collective of historians who have come together to share the knowledge of our profession. That doesn't mean certain developments couldn't further secure the long-term viability of projects such as this, of course. For instance, we have been looking into possible partnerships with innovative university presses to help satisfy the very reputational implications you cited.
It was too prolonged for there to be any specific date, or dates, to mark it. But perhaps this is as good a time as any to mark the 25th anniversary of a process that started with the fall of the Berlin Wall in early November 1989 and reached a kind of peak with the events in Romania late that December.
The scale and pace of change were hard to process then, and difficult to remember now. Ceausescu had barely recovered from the shock of being heckled before he and his wife faced a firing squad. It was not how anyone expected the Cold War to end; insofar as we ever imagined it could end, the images that came to mind involved mutually assured destruction and nuclear winter.
A few years ago, Daniel T. Rodgers characterized the intellectual history of the final decades of the 20th century as an “age of fracture” – an era in which the grand narratives and overarching conceptual schemata were constantly displaced by “piecemeal, context-driven, occasional, and… instrumental” ideas and perspectives in the humanities, social sciences, and public life. Fair enough; just try finding a vintage, unshattered paradigm these days. But a system of bipolar geopolitical hostilities prevailed throughout most of that period, and the contradictory structure of conflict-and-stasis seemed very durable, if not permanent.
Until, suddenly, it wasn’t. One smart and well-executed treatment of the world that came to an end a quarter-century ago is a recent television series called "The Americans," set in the early 1980s. The first season is now available in DVD and streaming video formats, and the second will be available in two weeks, just in time for binge-viewing over the holidays.
"The Americans" is a Cold War spy drama as framed by the “secret life amidst suburban normality” subgenre, the basic tropes of which were inaugurated by "The Sopranos." In it, the Jenningses, a married couple, run a travel agency in Washington, where they live with their two early-adolescent kids. But they are actually KGB agents who entered the United States some 20 years earlier. They have operated from behind cover identities for so long that they blend right in, which makes them very effective in their covert work. While gathering information on the Strategic Defense Initiative, for example, they even get access to the Advanced Research Projects Agency Network -- aka ARPANET -- which allows communication between computers, or something.
The comparison shouldn’t be pushed too hard, but the paradox of the deep-cover agent is right out of John le Carré: a divided identity makes for divided loyalties. At the very least it puts considerable strain on whatever commitment the couple started out with, back in the late Khrushchev era. We get occasional flashbacks to their life as young Soviet citizens. With the onset of “Cold War II,” the motherland is imperiled once again (not only by the American arms buildup but also by the reflexes of KGB leadership at “the Center”) and the Jenningses have decidedly mixed feelings about raising kids amid rampant consumerism, even if they’ve grown accustomed to it themselves.
The moral ambiguities and mixed motives build up nicely. Life as a couple, or in a family, proves to be more than a layer of the agents’ disguise: love is another demand on their already precarious balance of loyalties. Yet the real menace of thermonuclear showdown is always there, underneath it all. Some viewers will know that things came very close to the point of no return at least once during this period, during NATO’s “Able Archer” exercise in November 1983. Whatever sympathy the audience may develop toward the Jenningses (played with real chemistry by Keri Russell and Matthew Rhys) is regularly tested as they perform their KGB assignments with perfect ruthlessness. They are soldiers behind enemy lines, after all, and war always has innocent casualties.
The conflict has gone on so long, and with no end in sight, that the characters on screen don’t even feel the need to justify their actions. The spycraft that the show portrays is historically accurate, and it gets the anxious ground-tone of the period right, or as I remember it anyway. But very seldom does "The Americans" hint at the impending collapse of almost every motive driving its core story -- something the viewer cannot not know. (Pardon the double negative. But it seems to fit, given the slightly askew way it keeps the audience from taking for granted either the Cold War or the fact that it ended.)
The focus on the family in "The Americans" takes on added meaning in the light of Margaret Peacock’s Innocent Weapons: The Soviet and American Politics of Childhood in the Cold War, recently published by the University of North Carolina Press. The scriptwriters really ought to spend some time with the book. At the very least, it would be a gold mine of nuances and points of character development. More generally, Innocent Weapons is a reminder of just how much ideological freight can be packed into a few messages rendered familiar through mass media, advertising, and propaganda.
Peacock, an assistant professor of history at the University of Alabama at Tuscaloosa, examines the hopes and fears about youngsters reflected in images from the mid-1940s through the late 1960s. The U.S. and the USSR each experienced a baby boom following World War II. But the outpouring of articles, books, movies, and magazine illustrations focusing on children was not solely a response to the concerns of new parents. It might be more accurate to say the imagery and arguments were a way to point the public’s attention in the right direction, as determined by the authorities in their respective countries.
Children are the future, as no politician can afford to tire of saying, and the images from just after the defeat of fascism were tinged with plenty of optimism. The standard of living was rising on both sides of the Iron Curtain. In 1950 President Truman promised parents “the most peaceful times the world has ever seen.” Around the same time, the Soviet slogan of the day was “Thank You Comrade Stalin for Our Happy Childhood!”, illustrated with a painting of exuberant kids delivering an armful of roses to the General Secretary, whose eyes fairly twinkle with hearty good nature.
But vows of peace and plenty on either side were only as good as the leaders’ ability to hold their ground in the Cold War. That, in turn, required that young citizens be imbued with the values of patriotism, hard work, and strong character. Sadly enough, children on the other side were denied the benefits of growing up in the best of societies.
The Soviet media portrayed American youth as aimless, cynical jazz enthusiasts facing Dickensian work conditions after years of a school system with courses in such topics as “home economics” and “driver’s education.” The Americans, in turn, depicted Soviet youth as brainwashed, stultified, and intimidated by the state. (And that was on a good day.)
By the late 1950s, the authorities and media on each side were looking at their own young people with a more critical eye (alarmed at “juvenile delinquency,” for example, or “hooliganism,” as the Soviets preferred to call it) -- while also grudgingly admitting that the other side was somehow bringing up a generation that possessed certain alarming virtues. Khrushchev-era educational reformers worried that their students had endured so much rote instruction that they lacked the creativity needed for scientific and technological progress, while American leaders were alarmed that so many young Soviets were successfully tackling subjects their own students could never pass -- especially in science and math. (The news that 8 million Soviet students were learning English, while just 8,000 Americans were taking Russian, was also cause for concern.)
The arc of Cold War discourse and imagery concerning childhood, as Peacock traces it, starts out with a fairly simplistic identification of youth’s well-being with the values of those in charge, then goes through a number of shifts in emphasis. By the late 1960s, the hard realities facing children on either side were increasingly understood as failures of the social system they had grown up in. In the U.S., a famous television commercial showed a little girl plucking the petals of a daisy as a nuclear missile counted down to launch; while the ad was intended to sway voters against Barry Goldwater, it drew on imagery that the Committee for a Sane Nuclear Policy (better known as SANE) and Women Strike for Peace had first used to oppose nuclear testing a few years earlier. Nothing quite so emblematic emerged in the Soviet bloc, but the sarcastic use of a slogan from the Komsomol (Young Communist Union) became a sort of inside joke about the government’s self-delusion.
“To varying degrees,” writes Peacock, “both countries found themselves over the course of these years betraying their ideals to win the [Cold] war, maintain power, and defend the status quo…. Even images like that of the innocent child can become volatile when the people who profess to defend the young become the ones who imperil them.”