History

Are historians the ideal futurists?


Is history really future studies in reverse? Panel at recent gathering of historians' association makes case for teaching the future.

Historians' panel centers on who should write military history, for whom


Historians' panel centers on what it takes to write good military history. Hint -- it's not brass.

Sacramento State ends investigation of disagreement over what professor said about Native Americans


Sacramento State finds no wrongdoing by professor who was accused of insulting a Native American student, nor by the student who challenged what the professor said about genocide.

Sacramento State student says she was kicked out of class for arguing that Native Americans were victims of genocide


At Sacramento State, student says she was kicked out of class for insisting that Native Americans were victims of genocide. As incident is investigated, debate grows over whether she was treated unfairly -- and how to handle classroom discussions of this sort.

Review of Jacques Le Goff, 'Must We Divide History Into Periods?'

George Orwell opened one of his broadcasts on the BBC in the early 1940s by recounting how he’d learned history in his school days. The past, as his teachers depicted it, was “a sort of long scroll with thick black lines ruled across it at intervals,” he said. “Each of these lines marked the end of what was called a ‘period,’ and you were given to understand that what came afterwards was completely different from what had gone before.”

The thick black lines were like borders between countries that didn’t know one another’s languages. “For instance,” he explained, “in 1499 you were still in the Middle Ages, with knights in plate armour riding at one another with long lances, and then suddenly the clock struck 1500, and you were in something called the Renaissance, and everyone wore ruffs and doublets and was busy robbing treasure ships on the Spanish Main. There was another very thick black line drawn at the year 1700. After that it was the Eighteenth Century, and people suddenly stopped being Cavaliers and Roundheads and became extraordinarily elegant gentlemen in knee breeches and three-cornered hats … The whole of history was like that in my mind -- a series of completely different periods changing abruptly at the end of a century, or at any rate at some sharply defined date.”

His complaint was that chopping up history and simplistically labeling the pieces was a terrible way to teach the subject. It is a sentiment one can share now only up to a point. Orwell had been an average student; today it would be the mark of a successful American school district if the average student knew that the Renaissance came after the Middle Ages, much less that it started around 1500. (A thick black line separates his day and ours, drawn at 1950, when television sales started to boom.)

Besides, the disadvantages of possessing a schematic or clichéd notion of history are small by contrast to the pleasure that may come later, from learning that the past was richer (and the borders between periods more porous) than the scroll made it appear.

Must We Divide History Into Periods? asked Jacques Le Goff in the title of his last book, published in France shortly before his death in April 2014 and translated by M. B. DeBevoise for the European Perspectives series from Columbia University Press. A director of studies at L'École des Hautes Études en Sciences Sociales in Paris, Le Goff was a prolific and influential historian with a particular interest in medieval European cities. He belonged to the Annales school of historians, which focused on social, economic and political developments over very long spans of time -- although his work also exhibits a close interest in medieval art, literature and philosophy (where changes were slow by modern standards, but faster than those in, say, agricultural technique).

Le Goff’s final book revisits ideas from his earlier work, but in a manner of relaxed erudition clearly meant to address people whose sense of the past is roughly that of young Orwell. And in fact it is that heavy mark on the scroll at the year 1500 -- the break between the Middle Ages and the Renaissance -- that Le Goff especially wants the student to reconsider. (I say “student” rather than “reader” because time with the book feels like sitting in a lecture hall with a memorably gifted teacher.)

He quotes one recent historian who draws the line a little earlier, with the voyage of Christopher Columbus in 1492: “The Middle Ages ended, the modern age dawned, and the world became suddenly larger.” Le Goff is not interested in the date but in the stark contrast that is always implied. Usually the Middle Ages are figured as “a period of obscurity whose outstanding characteristic was ignorance” -- happily dispelled by a new appreciation for ancient, secular literature and a sudden flourishing of artistic genius.

Calling something “medieval” is never a compliment; the image that comes to mind is probably that of a witch trial. By contrast, “Renaissance” would more typically evoke a page from Leonardo da Vinci’s notebooks. Such invidious comparison is not hard to challenge. Witch trials were rare in the Middle Ages, while the Malleus Maleficarum appeared in “the late fifteenth century,” Le Goff notes, “when the Renaissance was already well underway, according to its advocates.”

Given his expertise on the medieval city -- with its unique institutional innovation, the university -- Le Goff makes quick work of demolishing the notion of the Middle Ages having a perpetually bovine and stagnant cultural life. The status of the artist as someone “inspired by the desire to create something beautiful” who had “devoted his life to doing just this” in pursuit of “something more than a trade, nearer to a destiny,” is recognized by the 13th century. And a passage from John of Salisbury describes the upheaval underway in the 12th century, under the influence of Aristotle:

“Novelty was introduced everywhere, with innovations in grammar, changes in dialectic, rhetoric declared irrelevant and the rules of previous teachers expelled from the very sanctuary of philosophy to make way for the promulgation of new systems …”

I can’t say that the name meant anything to me before now, but the entry on John of Salisbury in the Stanford Encyclopedia of Philosophy makes it sound as if Metalogicon (the work just quoted) was the original higher ed polemic. It was “ostensibly written as a defense of the study of logic, grammar and rhetoric against the charges of the pseudonymous Cornificius and his followers. There was probably not a single person named Cornificius; more likely John was personifying common arguments against the value of a liberal education. The Cornificians believe that training is not relevant because rhetorical ability and intellectual acumen are natural gifts, and the study of language and logic does not help one to understand the world. These people want to seem rather than to be wise. Above all, they want to parlay the appearance of wisdom into lucrative careers. John puts up a vigorous defense of the need for a solid training in the liberal arts in order to actually become wise and succeed in the ‘real world.’”

That's something an Italian humanist might write four hundred years later to champion “the new learning” of the day. And that is no coincidence. Le Goff contends that “a number of renaissances, more or less extensive, more or less triumphant,” took place throughout the medieval era -- an elaboration of the argument by the American historian Charles H. Haskins in The Renaissance of the Twelfth Century (1927), a book that influenced scholars without, as Le Goff notes, having much effect on the larger public. The Renaissance, in short, was a renaissance -- one of many -- and in Le Goff’s judgment “the last subperiod of a long Middle Ages.”

So, no bold, clean stroke of the black Magic Marker; more of a watercolor smear, with more than one color in the mix. Le Goff treats the Middle Ages as having a degree of objective reality, insofar as certain social, economic, religious and political arrangements emerged and developed in Europe over roughly a thousand years.

At the same time, he reminds us that the practice of breaking up history into periods has its own history -- deriving, in its European varieties, from Judeo-Christian ideas, and laden with ideas of decline or regeneration. “Not only is the image of a historical period liable to vary over time,” he writes, “it always represents a judgment of value with regard to sequences of events that are grouped together in one way rather than another.”

I'm not entirely clear on how, or whether, he reconciled the claim to assess periodization on strictly social-scientific grounds with its status as a concept rooted in the religious imagination, but it's a good book that leaves the reader with interesting questions.


Article on Mark Zborowski, scholar and spy

Among the passengers disembarking from a ship that reached Philadelphia in the final days of December 1941 was one Mark Zborowski -- a Ukrainian-born intellectual who grew up in Poland. He had lived in Paris for most of the previous decade, studying at the Sorbonne. He was detained by the authorities for a while (the U.S. had declared war on the Axis powers just three weeks earlier, so his visa must have been triple-checked) and then released.

Zborowski's fluency in several languages was a definite asset. By 1944 he was working for the U.S. Army on a Russian-English dictionary; after that he joined the staff of the Institute for Jewish Research in New York, serving as a librarian. And from there the émigré’s career took off on an impressive if not meteoric course.

He joined the Research in Contemporary Culture Project at Columbia University, launched just after World War II by the prominent anthropologists Ruth Benedict and Margaret Mead with support from the Office of Naval Research. Zborowski oversaw an ethnographic study of Central and Eastern European Jewish culture, based on interviews with refugees. It yielded Life Is With People: The Culture of the Shtetl, a book he co-authored in 1952. Drawing on Zborowski’s childhood memories more than he acknowledged and written in a popularizing style, it sold well and remained in print for decades.

The volume’s reputation has taken some hits over the years -- one scholar dubs it “the book that Jewish historians of the region loathe more than any other” -- but Zborowski enjoyed the unusual distinction of influencing a Broadway musical: the song “If I Were a Rich Man” in Fiddler on the Roof was inspired, in part, by a passage in Life Is With People. He later turned to research on cultural differences in how pain is experienced and expressed, culminating in his book People in Pain (1969). Once again his published work got mixed reviews in the professional journals, while the author himself enjoyed a kind of influence that citation statistics do not measure: a generation of medical anthropologists studied with him at the Pain Institute of Mt. Zion Hospital in San Francisco. He died in 1990.

If the details just given represented an honest account of Mark Zborowski’s life, he would now be remembered by scarcely anyone except specialists working in his fields of interest. The narrative above is all factually correct, to the best of my knowledge. But it omits an abundance of secrets. Some were revealed during his lifetime, but even they come wrapped in the mystery of his motives.

The fullest account now available is “Mark ‘Etienne’ Zborowski: Portrait of Deception” by Susan Weissman, a two-part study appearing in the journal Critique. Weissman, a professor of politics at Saint Mary’s College in Moraga, Calif., published the first half in 2011 and expected the second to follow shortly, though in fact it will appear in print only later this year. (Both can be downloaded in PDF from her page at Academia.edu.)

Etienne was the name Zborowski used while infiltrating anti-Stalinist radical circles in France for the GPU and the NKVD (forerunners of the KGB) during the 1930s, and he continued surveillance on opponents of the Soviet Union during his first few years in the United States.

“He is remembered by his students and colleagues as warm, generous and erudite,” writes Weissman. “Personally he neither stole documents nor directly assassinated people, but he informed Stalin’s teams of thugs where to find the documents or the people they sought. Zborowski infiltrated small leftist circles, made friends with its cadres and then reported on them. He always ratted on his ‘supposed’ friends. He saw [one woman] daily for nearly five years, and she helped him in countless ways. What did he give her in return? Only her survival, something not afforded to other Zborowski ‘friends.’ Once his orders switched and he no longer needed to report on her activities (or that of her husband), Zborowski simply stopped calling this constant friend, who defended him, gave him money and helped him with that precious commodity denied to so many, the visa to the United States.”

Weissman chronicles Etienne’s destructive role among the anti-Stalinist revolutionaries in Europe while also showing that his precise degree of culpability in some operations remains difficult to assess. Important missions were sometimes “nearly sabotaged by conflicting aims and lack of coordination between Soviet espionage teams.” And spy craft is not immune to a kind of office politics: reports to “the center” (intelligence headquarters) were not always accurate so much as aspirational or prudent.

Overviews of Zborowski’s covert life have been available for some time -- among them, his own testimony to a Senate subcommittee on internal security, which was not especially candid. Weissman’s study draws on earlier treatments but handles them critically, and in the light of a wider range of sources than have been brought to bear on his case until now.

Besides material from Stalin-era archives (consulted when she was in Russia during the 1990s) and the decoded Venona intercepts of Soviet cable communications from the 1940s, Weissman obtained court transcripts from Zborowski’s trials for perjuring himself before Congress. (He received a retrial after appealing his first conviction, but lost and served four years in prison.)

She also used the Freedom of Information Act to request the pertinent files from the Federal Bureau of Investigation. There were surveillance reports, of course, and interviews conducted by FBI agents -- with some pages all but entirely blacked out -- but also a piece of evidence about Zborowski that has been hiding in plain sight for 50 years.

The Feb. 28, 1965, issue of the Sunday magazine of The New York Times contained an article called “The Prison ‘Culture’ -- From the Inside.” The author identified himself as an anthropologist (“and as far as I know the first member of my profession to study a prison culture from the inside”) and used the pseudonym “M. Arc.” These seem like pretty clear hints to his identity, yet no one appears to have made the connection until Weissman opened the dossier.

“The article is a scholarly, well-written account of life inside,” she notes, “with a critical look at the criminal justice system … and [it] has been widely cited and reprinted in prison sociology texts.”

Part of his hidden curriculum vitae, then. “True to form,” Weissman writes, “Zborowski put the focus entirely on the subject at hand, revealing virtually nothing of himself.”

And that really is the mystery within the mystery here. It’s difficult to square Professor Zborowski (amiable, conscientious, a little bland, perhaps) with the sinister career of Etienne, a man who made himself the closest friend of Trotsky’s son Leon Sedov and quite possibly set him up for murder. (Afterward he tried to wrangle an invitation to the Russian revolutionary’s compound in Mexico, but another assassin got there first.)

In a conversation with Weissman by phone, I mentioned being both fascinated by her research (mention Trotsky in something and I’ll probably read it) and left puzzled by the figure she portrayed. And puzzled in a troubling way, with no sense of his intentions -- of how he had understood his own actions, whether while carrying them out or across the long years he had to reflect on them.

“While in prison,” she told me, “he kept insisting to the FBI that he was a good citizen. He never expressed remorse. There’s nothing in his papers about his politics, nothing about his own beliefs.” The reader perplexed by Weissman's “portrait of deception” is in the same position as the scholar who investigated him: “He’s a puzzle I couldn’t solve.”

Image Source: 
Wikipedia
Image Caption: 
Mark Zborowski

Review of Dora Apel, "Beautiful Terrible Ruins: Detroit and the Anxiety of Decline"

In 2011, Paul Clemens, a writer from Detroit, published an up-close and personal look at deindustrialization called Punching Out: One Year in a Closing Auto Plant. It recorded the year he spent on a work team hired to dismantle and gut one of the city’s remaining factories. I wrote about the book when it came out, and won’t recycle anything here, but recall a few paragraphs expressing a particular kind of frustration that a non-Detroiter can only sympathize with, not really share.

The cause was a tendency by outsiders -- or a subset of them, at any rate -- to treat the city’s decline as a perverse kind of tourist attraction or raw material for pontification. Clemens had lost all patience with arty photographs of abandoned buildings and pundits’ blatherskite about the “creative destruction” intrinsic to dynamic capitalism. He also complained about the other side of the coin, the spirit of “we’re turning the corner!” boosterism. “No Parisian is as impatient with American mispronunciation,” he wrote, “no New Yorker as disdainful of tourists needing directions, as is a born-and-bred Detroiter with the optimism of recent arrivals and their various schemes for the city’s improvement.”

Dora Apel, a professor of art history and visual culture at Wayne State University, has, in effect, gathered everything that dismays and offends Clemens between the covers of Beautiful Terrible Ruins: Detroit and the Anxiety of Decline (Rutgers University Press).

She calls Detroit “the quintessential urban nightmare in a world where the majority of people live in cities.” But nightmare imagery can be seductive, making Detroit “a thriving destination for artists, urban explorers, academics, and other curious travelers and researchers who want to experience for themselves industrial, civic, and residential abandonment on a massive scale.”

That lure is felt most keenly by people who, after the “experience,” enjoy the luxury of going home to someplace stable, orderly, and altogether more pleasant. It’s evident Apel finds something ghoulish about taking pleasure from a scene of disaster, “feeding off the city’s misery while understanding little about its problems, histories, or dreams.” But the aesthetic appeal of ruins -- the celebration of old buildings crumbling picturesquely, of columns broken but partly standing, of statuary fractured and eroded by time -- goes back at least to the 18th century, and it can’t be reduced to mere gloating. The author makes a brief but effective survey of “ruin lust,” a taste defined by “the beautiful and melancholic struggle between nature and culture,” as well as the feelings of contrast between ancient and modern life that ruins could evoke in the viewer, in pleasurable ways. She also points out how, in previous eras, this taste often involved feelings of national superiority, as with well-off travelers enjoying the sight of another country’s half-demolished architecture. (At least a tinge of gloating, in that case.)

It’s not difficult to recognize classical elements of the ruins aesthetic in "The Tree" by James D. Griffioen, one of a number of images reproduced in the book. Griffioen took the photograph in the Detroit Public Schools Book Depository, where a sapling has taken root in the mulch created by a layer of decomposing textbooks -- an almost schematic case of “the beautiful and melancholic struggle between nature and culture.” But Apel underscores the differences between the 21st-century mode of “ruination” and the taste cultivated in earlier periods. For one thing, “modernist architecture refuses the return of culture to nature in the manner of ancient ruins in large part because the building materials of concrete, steel, and glass do not delicately crumble in the picturesque way that stone does.”

More importantly, though, the fragments aren’t poking up from some barely imaginable gulf in time and culture: Detroit was, in effect, the world capital of industrial society within living memory. In his autobiography, published in the early 1990s, then-Mayor Coleman Young wrote: “In the evolutionary urban order, Detroit today has always been your town tomorrow.” The implications of that thought are considerably more gloomy than they were even 20 years ago. One implication of imagery such as "The Tree" is that it offers not a reminder of the recent past so much as a glimpse of the ruins of the not-too-distant future.

Photography is not the only cultural register in which the fascination with contemporary ruins makes itself evident. There is “urban exploration,” for example: a subculture consisting (it seems) mainly of young guys who trespass on ruined property to take in the ambience while also enjoying the dangers and challenges of moving around in collapsing architecture. Apel also writes about artists in Detroit who have colonized depopulated areas both to reclaim them as living space and to incorporate the ruins into their creative work.

The effects are not strictly local: “the borders between art, media, advertising, and popular culture have become increasingly permeable,” Apel writes, “as visual imagery easily ranges across these formats and as people produce their own imagery on websites and social media.” And the aestheticized ruination of Detroit feeds into a more widespread (even global) “anxiety of decline” expressed in post-apocalyptic videogame scenarios, survivalist television programs, zombie movies, and so on. Not that Detroit is the inspiration in each case, but it provides the most concrete, real-world example of dystopia.

“As the largest deteriorating former urban manufacturing center,” Apel writes, contemporary Detroit is a product of an understanding of society in which “rights are dependent on what people can offer to the state’s economic well-being, rather than vice-versa,” and in which “the lost protection of the state means vastly inadequate living conditions and the most menial and unprotected forms of labor in cities that are divested of many of their social services and left to their own devices.”

Much of the imagery analyzed in Beautiful Terrible Ruins seems to play right along with that social vision. The nicely composed photographs of crumbling buildings are usually empty of any human presence, while horror movies fill their urban landscapes with the hungry undead -- the shape of dreaded things to come.


Review of Adam Mack, "Sensing Chicago: Noisemakers, Strikebreakers and Muckrakers"

The most distracting thing about costume dramas set in any period before roughly the turn of the 20th century -- in my experience, anyway -- is the thought that everything and everyone on screen must have smelled really bad. The most refined lords and gentry on Wolf Hall did not bathe on anything we would regard today as a regular basis.

No doubt there were exceptions. But until fairly recently in human history, even the most fastidious city dweller must have grown accustomed to the sight of human waste products from chamber pots that had been emptied in the street. (And not just the sight of it, of course.) Once in a while a movie or television program will evince something of a previous era’s ordinary grunge, as in The Return of Martin Guerre or Deadwood, where almost everything looks soiled, fetid and vividly uncomfortable. But that, too, is exceptional. The audience for costume drama is often looking for charm, nostalgia or escapism, and so the past usually wears a deodorant.

The wider public may not have heard of it, but a “sensory turn” among American historians has made itself felt in recent years -- an attention, that is, to the smells, tastes, textures and sounds of earlier periods. I refer to just four senses, because the importance of sight was taken for granted well before the turn. In their more polemical moments, sensory historians have even referred to “the tyranny of the visual” within their discipline.

That seems a little melodramatic, but point taken: historians have tended to scrutinize the past using documents, images, maps and other artifacts that chiefly address the eye. Coming in second as the organ of perception most likely to play a role in historical research would undoubtedly be the ear, thanks to the advent of recorded sound. The remaining senses tie for last place simply because they leave so few traces -- which, in any case, are not systematically preserved the way audiovisual materials are. We have no olfactory or haptic archives; it is difficult to imagine a library of flavors.

Calls to overcome these obstacles -- to analyze whatever evidence could be found about how everyday life once sounded, smelled, felt, etc. -- came from American historians in the early 1990s, with a few pioneers at work in Europe even before that. But the field of sensory history really came into its own over the past decade or so, with Mark M. Smith’s How Race is Made: Slavery, Segregation and the Senses (University of North Carolina Press, 2006) and Sensing the Past: Seeing, Hearing, Smelling, Tasting and Touching in History (University of California Press, 2007) being among the landmarks. Smith, a professor of history at the University of South Carolina, also convened a roundtable on the sensory turn published in the September 2008 issue of The Journal of American History. A number of the contributors are on the editorial board of the Studies in Sensory History series published by the University of Illinois Press, which launched in 2011.

The series’ fifth and most recent title is Sensing Chicago: Noisemakers, Strikebreakers and Muckrakers by Adam Mack, an assistant professor of history at the School of the Art Institute of Chicago. Beyond the monographic focus -- it covers about fifty years of the city’s history -- the book demonstrates how much of the sensory field of an earlier era can be reconstructed, and why doing so can be of interest.

Overemphasis on the visual dimension of an urban landscape “mirrors a set of modern cultural values that valorize the eye as the barometer of truth and reason,” we read in the introduction, “and tend to devalue the proximate, ‘lower’ senses as crude and less rational.” Having thus recapitulated one of sensory history’s founding premises, the author wastes no time before heading to one site that must have forced its way deep into the memory of anyone who got near it in the 19th century: the Chicago River.

“A bed of filth,” one contemporary observer called it, where manure, blood, swill and unusable chunks of carcass from the slaughterhouses ended up, along with human sewage and dead animals -- all of it (an editorialist wrote) “rotting in the sun, boiling and bubbling like the lake of brimstone, and emitting the most villainous fumes,” not to mention drawing clouds of flies. A letter writer from 1862 mentions that the water drawn from his kitchen hydrant contained “half a pint or more of rotten fish.” Many people concluded that it was safest just to drink beer instead.

Laws against dumping were passed and commissions appointed to investigate the problem, for all the good it did. The poorest people had to live closest to the river, so disgust at the stench combined in various ways with middle- and upper-class attitudes towards them, as well as with nativist prejudices.

The horrific odor undermined efforts to construct a modern, rationally organized city. Imposing a grid of streets on the landscape might please the eye, but smell didn’t respect geometry. The same principle applied to the Great Fire of 1871, the subject of Mack’s next chapter. The heat and sheer sensory overload were overwhelming, and the disaster threw people from all walks of life together in the streets in a way that made social status irrelevant, at least for a while. The interplay between social hierarchy and sensory experience (exemplified in references to “the roar of the mob”) is the thread running through the rest of the book. Thinking of the “‘lower’ senses as crude and less rational” -- to quote the author’s phrase again -- went along with assumptions about refinement or coarseness as markers of class background.

The sources consulted by the author are much the same as any other historian might use: newspapers, civic records, private or otherwise unpublished writings by long-forgotten people, such as the recollections of the Great Fire by witnesses, on file at the Chicago History Museum. The contrast is at the level of detail -- that is, the kinds of detail the historian looks for and interprets. Perhaps the next step would be for historians to enhance their work with direct sensory documentation.

A prototype might be found in the work of John Waters, who released one of his movies in Odorama. Audience members received cards with numbered scratch-and-sniff patches, which they consulted when prompted by a message on the screen.

On second thought, it was difficult enough to read Mack’s account of the Chicago River in the 19th century without tickling the gag reflex. Olfactory realism might push historical accuracy farther than anyone really wants it to go.


Essay on teaching the history of the Confederacy

The South is home for me, but to my students in Minnesota, it’s an exotic place from which I am an ambassador. So when Dylann Roof massacred congregants at Emanuel African Methodist Episcopal Church in Charleston, S.C., last month and students began asking me about the killings and the debate they reignited over the Confederate flag, I did not know whether they sought my analysis as a scholar of the Confederacy and its legacies or my feelings as a transplanted Southerner. My uncertainty deepened because the questions came between semesters, from men and women who had taken courses with me last spring and would do so again in the fall. Did the timing of the questions change my relationship to the people who asked them, and therefore inform which part of me -- the professor or the person -- answered?

The difference between my answers depends on whether I want my students to embrace or reject the polemic through which we discuss the Confederacy, its cause and its symbols, ascribing to them either virulent hatred or regional pride and nostalgia. In a “Room for Debate” feature on June 19, The New York Times pitted former Georgia Congressman Ben Jones’s views of the flag as “A Matter of Pride and Heritage” against three authors who emphasized the flag’s postwar uses as a banner for Jim Crow violence, reactionary resistance to integration and civil rights, and the most obdurate hate groups in the contemporary United States. Governor Nikki Haley invoked a similar framing in her speech calling for the South Carolina legislature to remove the flag from the state capitol grounds. The governor presented the flag’s dual meanings on an almost equal footing; which interpretation a person chose, she implied, depended on their race. For white people, the flag meant honoring the “respect, integrity and duty” of Confederate ancestors -- “That is not hate, nor is it racism,” she said of that interpretation -- while “for many others … the flag is a deeply offensive symbol of a brutally oppressive past.”

In asking the South Carolina legislature to remove the flag from the statehouse grounds, the governor posed the meaning of the Confederate flag as a choice, and she refused to pick sides because she understood the sympathies of those in both interpretive camps. Because many of those who honor the state’s Confederate past neither commemorate nor act out of hate, in the governor’s logic they are not wrong -- merely out of sync with the political needs of 2015.

As a person, I want my students to take sides in that polemic, to know that Confederate “heritage” is the wrong cause to celebrate in any context. I want my students to know that the Confederacy was created from states that not only embraced slavery but, as Ta-Nehisi Coates has demonstrated beyond refutation, proudly defined their political world as a violent, diabolical contest for racial mastery. I want them to understand that the Civil War rendered a verdict on secession and, in the words of historian Stephanie McCurry, on “a modern pro-slavery and antidemocratic state, dedicated to the proposition that all men were not created equal.”

I want them to scrutinize, as John Coski has done in his excellent book The Confederate Battle Flag: America’s Most Embattled Symbol, the flag’s long use by those who reject equal citizenship. I want my (overwhelmingly white) students to grasp why the flags of the Confederacy in their many iterations -- on pickup trucks, college campuses and statehouse grounds -- tell African-Americans that they are not, and cannot be, equal citizens. I want them to feel the imperative in the words of President Obama’s eulogy for Reverend Clementa Pinckney, in which the president claimed that only by rejecting the shared wrongs of slavery, Jim Crow and the denial of civil rights can we strive for “an honest accounting of America’s history; a modest but meaningful balm for so many unhealed wounds.”

As a historian, I want more. I don’t want my students to simply choose sides in a polemic between heritage and hate; rather, I hope they will interrogate the Confederacy’s white supremacist project on more complex terms. A simple dichotomy of heritage or hate misses something essential about both the Confederacy and the social construction of racism: then as now, you don’t need to hate to be a racist. Many Confederate soldiers held views that cohered perfectly with the reactionary, violent and indeed hateful lens through which Dylann Roof sees race. After the Battle of Petersburg, Major Rufus Barrier celebrated the “slaughter … in earnest” of black soldiers and relished how “the blood ran in streams from their worthless carcasses.”

But others, like Confederate officer Francis Marion Parker, grounded their commitment to white supremacy not in jagged words of hate, but in the softer tones of family. Explaining his reasons for going to war in a letter to his wife and children, Parker promised that “home will be so sweet, when our difficulties are settled and we are permitted to return to the bosom of our families, to enjoy our rights and privileges” -- that is, slaveholding -- “under the glorious flag of the Confederacy.”

I want my students to see that men and women of differing temperaments and qualities supported the Confederacy’s white supremacist project and justified their support through a variety of ethics, appeals and emotions. I want them to overcome rhetorical paradigms that allow modern-day defenders of Confederate heritage to divorce the character of the men who fought for the “Lost Cause” from the cause itself. I want them to think critically about how otherwise honorable, courageous men as well as vicious, hate-filled racists came to embrace a cause informed, in the words of Confederate Vice President Alexander Stephens, by “the great truth that the Negro is not the equal of the white man, that slavery, subordination to the superior race, is his natural and normal condition.”

I hope my students will draw a bit from both of my answers, bringing careful scrutiny of the past into dialogue with the urgency of the present. As they do so, I hope they will think a bit about what historical meaning is and why history demands their scrutiny. If history becomes a mere servant of contemporary “truthiness,” reduced to selective anecdotes deployed as weapons in polarizing debates, then we are merely choosing camps in a contest of identities. Such one-dimensional choices leave space for those who equate the Confederacy with nostalgia and a kind of inherited pride to use the Confederate flag as shorthand for who they are and where they come from without any mention of race or white supremacy. Yet if historical interpretation remains antiquarian and refuses to speak to the present, it leaves us self-satisfied in the illusion that we have transcended the people and societies we study. One day generations yet unborn will scrutinize us and find us wanting, too. If we critique the people of the past and the choices they made not only with an eye to distancing ourselves from their worst extremes but also with a sense of how easy, how normal and how justifiable unequal citizenship can appear to be, the tragedy in Charleston and the history it invokes may teach a resonant lesson.

David C. Williard is assistant professor of history at the University of Saint Thomas, in Minnesota.
 

Image Source: 
Wikimedia Commons

Review of Edna Greene Medford, "Lincoln and Emancipation"

Reading the Emancipation Proclamation for the first time is an unforgettable experience. Nothing prepares you for how dull it turns out to be. Ranking only behind the Declaration of Independence and the Constitution in its consequences for U.S. history, the document contains not one sentence that has passed into popular memory. It was the work, not of Lincoln the wordsmith and orator, but of Lincoln the attorney. In fact, it sounds like something drafted by a group of lawyers, with Lincoln himself just signing off on it.

Destroying an institution of systematic brutalization -- one in such contradiction to the republic’s professed founding principles that Jefferson’s phrase “all men are created equal” initially drew protests from slave owners -- would seem to require a word or two about justice. But the proclamation is strictly a procedural document. The main thrust comes from an executive order issued in late September 1862, “containing, among other things, the following, to wit: ‘That on the first day of January, in the year of our Lord one thousand eight hundred and sixty-three, all persons held as slaves within any State or designated part of a State, the people whereof shall then be in rebellion against the United States, shall be then, thenceforward, and forever free….’”

Then -- as if to contain the revolutionary implications of that last phrase -- the text doubles down on the lawyerese. The proclamation itself was issued on the aforesaid date, in accord with the stipulations of the party of the first part, including the provision recognizing “the fact that any State, or the people thereof, shall on that day be, in good faith, represented in the Congress of the United States by members chosen thereto at elections wherein a majority of the qualified voters of such State shall have participated, shall, in the absence of strong countervailing testimony, be deemed conclusive evidence that such State, and the people thereof, are not then in rebellion against the United States.”

In other words: “If you are a state, or part of a state, that recognizes the union enough to send representatives to Congress, don’t worry about your slaves being freed right away and without compensation. We’ll work something out.”

Richard Hofstadter got it exactly right in The American Political Tradition (1948) when he wrote that the Emancipation Proclamation had “all the moral grandeur of a bill of lading.” It is difficult to believe the same author could pen the great memorial speech delivered at Gettysburg a few months later -- much less the Second Inaugural Address.

But to revisit the proclamation after reading Edna Greene Medford’s Lincoln and Emancipation (Southern Illinois University Press) is also a remarkable experience -- a revelation of how deliberate, even strategic, its lawyerly ineloquence really was.

Medford, a professor of history at Howard University, was one of the contributors to The Emancipation Proclamation: Three Views (Louisiana State University Press, 2006). Her new book is part of SIUP’s Concise Lincoln Library, now up to 17 volumes. Medford’s subject overlaps with topics covered by earlier titles in the series (especially the ones on race, Reconstruction and the Thirteenth Amendment) as well as with works such as Eric Foner’s The Fiery Trial: Abraham Lincoln and American Slavery (Norton, 2010).

Even so, Medford establishes her own approach by focusing not only on Lincoln’s ambivalent and changing sense of what he could and ought to do about slavery (a complex enough topic in its own right) but also on the attitudes and activities of a heterogeneous and dispersed African-American public with its own priorities.

For Lincoln, abolishing the institutionalized evils of slavery was a worthy goal but not, as such, an urgent one. As of 1860, his primary concern was that it not spread to the new states. After 1861, it was to defeat the slaveholders’ secession -- but without making any claim to the power to end slavery itself. He did support efforts to phase it out by compensating slave owners for manumission. (Property rights must be respected, after all, went the thinking of the day.) His proposed long-term solution for racial conflict was to send the emancipated slaves to Haiti, Liberia, or someplace in Central America to be determined.

Thanks in part to newspapers such as The Weekly Anglo-African, we know how free black citizens in the North responded to Lincoln, and it is clear that some were less than impressed with his antislavery credentials. “We want Nat Turner -- not speeches,” wrote one editorialist; “Denmark Vesey -- not resolutions; John Brown -- not meetings.” Especially galling, it seems, were Lincoln’s plans to reimburse former slave owners for their trouble while uprooting ex-slaves from land they had worked for decades. African-American commentators argued that Lincoln was getting it backward. They suggested that the ex-slaves be compensated and their former masters shipped off instead.

To boil Medford’s succinct but rich narrative down into something much more schematic, I’ll just say that Lincoln’s cautious regard for the rights of property backfired. Frederick Douglass wrote that the slaves “[gave] Mr. Lincoln credit for having intentions towards them far more benevolent and just than any he is known to cherish…. His pledges to protect and uphold slavery in the States have not reached them, while certain dim, undefined, but large and exaggerated notions of his emancipating purpose have taken firm hold of them, and have grown larger and firmer with every look, nod, and undertone of their oppressors.” African-American Northerners and self-emancipating slaves alike joined the Union army, despite all the risks and the obstacles.

The advantage this gave the North, and the disruption it created in the South, changed abolition from a moral or political concern to a concrete factor in the balance of forces -- and the Emancipation Proclamation, for all its uninspired and uninspiring language, was Lincoln’s concession to that reality. He claimed the authority to free the slaves of the Confederacy “by virtue of the power in me vested as Commander-in-Chief, of the Army and Navy of the United States in time of actual armed rebellion against the authority and government of the United States, and as a fit and necessary war measure for suppressing said rebellion.”

Despite its fundamentally practical motivation and its avoidance of overt questions about justice, the proclamation was a challenge to the American social and political order that had come before. And it seems to have taken another two years before the president himself could spell out its implications in full, in his speech at the Second Inaugural. The depth of the challenge is reflected in each week's headlines, though to understand it better you might want to read Medford's little dynamite stick of a book first.

