George Orwell opened one of his broadcasts on the BBC in the early 1940s by recounting how he’d learned history in his school days. The past, as his teachers depicted it, was “a sort of long scroll with thick black lines ruled across it at intervals,” he said. “Each of these lines marked the end of what was called a ‘period,’ and you were given to understand that what came afterwards was completely different from what had gone before.”
The thick black lines were like borders between countries that didn’t know one another’s languages. “For instance,” he explained, “in 1499 you were still in the Middle Ages, with knights in plate armour riding at one another with long lances, and then suddenly the clock struck 1500, and you were in something called the Renaissance, and everyone wore ruffs and doublets and was busy robbing treasure ships on the Spanish Main. There was another very thick black line drawn at the year 1700. After that it was the Eighteenth Century, and people suddenly stopped being Cavaliers and Roundheads and became extraordinarily elegant gentlemen in knee breeches and three-cornered hats … The whole of history was like that in my mind -- a series of completely different periods changing abruptly at the end of a century, or at any rate at some sharply defined date.”
His complaint was that chopping up history and simplistically labeling the pieces was a terrible way to teach the subject. It is a sentiment one can share now only up to a point. Orwell had been an average student; today it would be the mark of a successful American school district if the average student knew that the Renaissance came after the Middle Ages, much less that it started around 1500. (A thick black line separates his day and ours, drawn at 1950, when television sales started to boom.)
Besides, the disadvantages of possessing a schematic or clichéd notion of history are small by contrast to the pleasure that may come later, from learning that the past was richer (and the borders between periods more porous) than the scroll made it appear.
Must We Divide History Into Periods? asked Jacques Le Goff in the title of his last book, published in France shortly before his death in April 2014 and translated by M. B. DeBevoise for the European Perspectives series from Columbia University Press. A director of studies at L'École des Hautes Études en Sciences Sociales in Paris, Le Goff was a prolific and influential historian with a particular interest in medieval European cities. He belonged to the Annales school of historians, which focused on social, economic and political developments over very long spans of time -- although his work also exhibits a close interest in medieval art, literature and philosophy (where changes were slow by modern standards, but faster than those in, say, agricultural technique).
Le Goff’s final book revisits ideas from his earlier work, but in a manner of relaxed erudition clearly meant to address people whose sense of the past is roughly that of young Orwell. And in fact it is that heavy mark on the scroll at the year 1500 -- the break between the Middle Ages and the Renaissance -- that Le Goff especially wants the student to reconsider. (I say “student” rather than “reader” because time with the book feels like sitting in a lecture hall with a memorably gifted teacher.)
He quotes one recent historian who draws the line a little earlier, with the voyage of Christopher Columbus in 1492: “The Middle Ages ended, the modern age dawned, and the world became suddenly larger.” Le Goff is not interested in the date but in the stark contrast that is always implied. Usually the Middle Ages are figured as “a period of obscurity whose outstanding characteristic was ignorance” -- happily dispelled by a new appreciation for ancient, secular literature and a sudden flourishing of artistic genius.
Calling something “medieval” is never a compliment; the image that comes to mind is probably that of a witch trial. By contrast, “Renaissance” would more typically evoke a page from Leonardo da Vinci’s notebooks. Such invidious comparison is not hard to challenge. Witch trials were rare in the Middle Ages, while the Malleus Maleficarum appeared in “the late fifteenth century,” Le Goff notes, “when the Renaissance was already well underway, according to its advocates.”
Given his expertise on the medieval city -- with its unique institutional innovation, the university -- Le Goff makes quick work of demolishing the notion of the Middle Ages having a perpetually bovine and stagnant cultural life. The status of the artist as someone “inspired by the desire to create something beautiful” who had “devoted his life to doing just this” in pursuit of “something more than a trade, nearer to a destiny,” is recognized by the 13th century. And a passage from John of Salisbury describes the upheaval underway in the 12th century, under the influence of Aristotle:
“Novelty was introduced everywhere, with innovations in grammar, changes in dialectic, rhetoric declared irrelevant and the rules of previous teachers expelled from the very sanctuary of philosophy to make way for the promulgation of new systems …”
I can’t say that the name meant anything to me before now, but the entry on John of Salisbury in the Stanford Encyclopedia of Philosophy makes it sound as if Metalogicon (the work just quoted) was the original higher ed polemic. It was “ostensibly written as a defense of the study of logic, grammar and rhetoric against the charges of the pseudonymous Cornificius and his followers. There was probably not a single person named Cornificius; more likely John was personifying common arguments against the value of a liberal education. The Cornificians believe that training is not relevant because rhetorical ability and intellectual acumen are natural gifts, and the study of language and logic does not help one to understand the world. These people want to seem rather than to be wise. Above all, they want to parlay the appearance of wisdom into lucrative careers. John puts up a vigorous defense of the need for a solid training in the liberal arts in order to actually become wise and succeed in the ‘real world.’”
That's something an Italian humanist might write four hundred years later to champion “the new learning” of the day. And that is no coincidence. Le Goff contends that “a number of renaissances, more or less extensive, more or less triumphant,” took place throughout the medieval era -- an elaboration of the argument by the American historian Charles H. Haskins in The Renaissance of the Twelfth Century (1927), a book that influenced scholars without, as Le Goff notes, having much effect on the larger public. The Renaissance, in short, was a renaissance -- one of many -- and in Le Goff’s judgment “the last subperiod of a long Middle Ages.”
So, no bold, clean stroke of the black Magic Marker; more of a watercolor smear, with more than one color in the mix. Le Goff treats the Middle Ages as having a degree of objective reality, insofar as certain social, economic, religious and political arrangements emerged and developed in Europe over roughly a thousand years.
At the same time, he reminds us that the practice of breaking up history into periods has its own history -- deriving, in its European varieties, from Judeo-Christian ideas, and laden with ideas of decline or regeneration. “Not only is the image of a historical period liable to vary over time,” he writes, “it always represents a judgment of value with regard to sequences of events that are grouped together in one way rather than another.”
I'm not entirely clear how, or whether, he reconciled the claim to assess periodization on strictly social-scientific grounds with its status as a concept rooted in the religious imagination, but it's a good book that leaves the reader with interesting questions.
You don’t hear much about the United States being a “postracial society” these days, except when someone is dismissing bygone illusions of the late ’00s, or just being sarcastic. With the Obama era beginning to wind down (as of this week, the president has just under 18 months left in office) American life is well into its post-post-racial phase.
Two thoughts: (1) Maybe we should retire the prefix. All it really conveys is that succession does not necessarily mean progress. (2) It is easy to confuse an attitude of cold sobriety about the pace and direction of change with cynicism, but they are different things. For one, cynicism is much easier to come by. (Often it’s just laziness pretending to be sophisticated.) Lucid assessment, on the other hand, is hard work and not for the faint of spirit.
Naomi Zack’s White Privilege and Black Rights: The Injustice of U.S. Police Racial Profiling and Homicide (Rowman & Littlefield) is a case in point. It consists of three essays plus a preface and conclusion. Remarks by the author indicate it was prepared in the final weeks of last year, with the events in Ferguson, Mo., fresh in mind. But don’t let the title or the book’s relative brevity fool you. The author is a professor of philosophy at the University of Oregon -- and when she takes up terms such as “white privilege” or “black rights,” it is to scrutinize the concepts rather than to use them in slogans.
Despite its topicality, Zack’s book is less a commentary on recent events than part of her continuing effort to think, as a philosopher, about questions of race and justice that are long-standing, but also prone to flashing up, on occasion, with great urgency -- demanding a response, whether or not philosophers (or anyone else) is prepared to answer them.
Zack distinguishes between two ways of philosophizing about justice. One treats justice as an ideal that can be defined and reasoned about, even if no real society in human history ever “fully instantiates or realizes an ideal of justice for all members of that society.” Efforts to develop a theory of justice span the history of Western philosophy.
The other approach begins with injustice and seeks to understand and correct it. Of course, that implies that the philosopher already has some conception of what justice is -- which would seem to beg the question. But Zack contends that theories of justice also necessarily start out from pre-existing beliefs about what it is, which are then strengthened or revised as arguments unfold.
“However it may be done and whatever its subject,” Zack writes, “beginning with concrete injustice and ending with proposals for its correction is a very open-ended and indeterminate task. But it might be the main subject of justice about which people who focus on real life and history genuinely care.”
The philosopher Zack describes may not start out with a theory of what justice is. But that’s OK -- she can recognize justice, paradoxically enough, when it's gone missing.
I wish the author had clarified the approach in the book’s opening pages, rather than two-thirds of the way through, because it proves fundamental to almost everything else she says. She points out how police killings of young, unarmed African-American males over the past couple of years are often explained with references to “white privilege” and “the white supremacist system” -- examples of a sort of ad hoc philosophizing about racial injustice in the United States, but inadequate ones in Zack’s analysis.
Take the ability to walk around talking on the phone carrying a box of Skittles. It is not a “privilege” that white people enjoy, as should be obvious from the sheer absurdity of putting it that way. It is one of countless activities that a white person can pursue without even having to think about it. “That is,” Zack writes, “a ‘privilege’ whites are said to have is sometimes a right belonging to both whites and nonwhites that is violated when nonwhites are the ones who [exercise] it.”
In the words of an online comment the author quotes, “Not fearing the police will kill your child for no reason isn’t a privilege, it’s a right.” The distinction is more than semantic. What Zack calls “the discourse of white privilege” not only describes reality badly but fosters a kind of moral masochism, inducing “self-paralysis in the face of its stated goals of equality.” (She implies that white academics are particularly susceptible to "hold[ing] … progressive belief structures in intellectual parts of their life that are insulated from how they act politically and privately …")
Likewise, “the white supremacist power structure” is a way of describing and explaining oppression that is ultimately incapacitating: “After the civil rights movement, overt and deliberate discrimination in education, housing and employment were made illegal and explicitly racially discriminatory laws were prohibited.” While “de facto racial discrimination is highly prevalent in desirable forms of education, housing and employment,” it does no one any good to assume that “an officially approved ideology of white supremacy” remains embodied in the existing legal order.
None of which should be taken to imply that Zack denies the existence of deep, persisting and tenacious racial inequality, expressed and reinforced through routine practices of violence and humiliation by police seldom held accountable for their actions. But, she says, "What many critics may correctly perceive as societywide and historically deep antiblack racism in the United States does not have to be thoroughly corrected before the immediate issue of police killings of unarmed young black men can be addressed."
She is not a political strategist; her analyses of the bogus logic by which racial profiling and police killings are rationalized are interesting, but how to translate them into action is not exactly clear. In the end, though, justice and injustice are not problems for philosophers alone.
“If you spend much time in libraries,” the late Northrop Frye wrote at the start of an essay from 1959, “you will probably have seen long rows of dark green books with gold lettering, published by Macmillan and bearing the name of Frazer.” These were the collected works of the Victorian classicist and anthropologist Sir James Frazer, author of The Golden Bough (15 volumes) and a great deal else besides.
Frye’s remarks -- originally delivered as a talk on the Canadian Broadcasting Corporation’s radio network -- were aimed at a much broader public than would have read his then-recent book Anatomy of Criticism, which made its author the most-cited name in Anglophone literary studies until at least the early 1980s. (Frye was professor emeritus of English at Victoria College, University of Toronto, when he died in 1991.) He told listeners that it would require “a great many months of hard work, without distractions, to read completely through Frazer.”
And the dedicated person making the effort probably wouldn’t be an anthropologist. The discipline’s textbooks “were respectful enough about him as a pioneer,” Frye wrote, “but it would have taken a Geiger counter to find much influence of The Golden Bough in them.”
And yet Frazer’s ideas about myth and ritual and his comparative approach to the analysis of symbolism exercised an abiding fascination for other readers -- in part through the echoes of them audible in T. S. Eliot’s “The Waste Land,” but also thanks to Frazer’s good sense in preparing an abridged edition of The Golden Bough in one stout volume that it was entirely possible to finish reading in no more than a year.
If you spend much time in libraries these days -- wandering the stacks, that is, rather than sitting at a terminal -- you might have seen other long rows of dark green books with gold lettering, published by the University of Toronto Press and bearing the name of Frye.
The resemblance between The Collected Works of Northrop Frye (in 30 volumes) and the Frazerian monolith is almost certainly intentional, though not the questions such a parallel implies: What do we do with a pioneer whose role is acknowledged and honored, but whose work may be several degrees of separation away from where much of the contemporary intellectual action is? Who visits the monument now? And in search of what?
Part of the answer may be found in Essays on Northrop Frye: Word and Spirit, a new collection of studies by Robert D. Denham, professor emeritus of English at Roanoke College. The publisher named on the title page is Iron Mountain Press of Emory, Va., which appears not to have a website; the listing for the book on Amazon indicates that it is available through CreateSpace Independent Publishing Platform, a print-on-demand service.
Denham has written or edited more than 30 books by or about Frye, including several volumes of notebooks, diaries, letters and works of fiction in the Collected Works, for which he also prepared the definitive edition of Anatomy of Criticism. The second of the three sections in Word and Spirit (as I prefer to call the new book) consists of essays on the Anatomy, examining Frye’s ideas about rhetoric and the imagination and brandishing them in the face of dismissive remarks by Frederick Crews and Tzvetan Todorov.
Frye’s relative decline as a force to be reckoned with in literary theory was already evident toward the end of his life; at this point the defense of Frygian doctrine may seem like a hopelessly arrière-garde action. (“Frygian” is the preferred term, by the way, at least among the Frygians themselves.) But the waning of his influence at the research-university seminar level is only part of the story, and by no means the most interesting part. The continuing pedagogical value of the Anatomy is suggested by how many of Frye’s ideas and taxonomies have made their way into Advanced Placement training materials. Anyone trying to find a way around in William Blake’s poetic universe can still do no better than to start with Frye’s first book, Fearful Symmetry (1947). Before going to see Shakespeare on stage, I’ve found it worthwhile to see what Frye had to say about the play. Bloggers periodically report reading the Anatomy, or Frye’s two books about the Bible and literature, and having their minds blown.
Northrop Frye is the rare case of a literary theorist whose critical prose continues to be read with interest and profit by people who are not engaged in producing more of the stuff. In the talk on Frazer, he noted that The Golden Bough appealed to artists, poets and “students of certain aspects of religion” -- which seems, on the whole, like a fair guess at the makeup of Frye’s own posthumous constituency.
What’s been lacking is the single-volume, one-stop survey of the Frygian landscape. The Collected Works have complicated things -- not just by being vast and intimidating (and too expensive for most individuals to afford) but by adding thousands of pages of unpublished material to the already imposing mass of Frye’s work.
Denham is as responsible for adding new turns to the labyrinth as anyone. He is the scholar dedicated enough to have solved the riddle of the great man’s handwriting. Most of the lectures and papers in Essays on Northrop Frye: Word and Spirit draw on the private papers, which are of considerably more than biographical interest. Frye used his notebooks to think out loud and to explain himself to himself, working out the links among the work he’d published and things he wanted to write.
They reveal elements of his inner life that remained unstated, or at most implicit, in Frye’s public writings -- for example, his studies in Buddhist and Hindu thought. He also explored the whole gamut of esoteric and mystical writings from the Corpus Hermeticum and Nicolas of Cusa (respectable) to Madame Blavatsky and Aleister Crowley (shady but undeniably fascinating) to titles such as The Aquarian Conspiracy and Cosmic Trigger: The Final Secret of the Illuminati (“kook books,” as Frye called them). Connections existed between this material and his scholarship (you can’t study Blake or Yeats for long without picking up some Gnosticism and theosophy) but Frye also needed to understand his own religious beliefs and occasional experiences of the ineffable. He was interested in the cosmological side of the literary imagination, but also compelled to figure out his own place in the cosmos.
The drives were mutually reinforcing. But references to these interests in his published work were few and far between, and often enough too oblique to notice. With Denham’s close knowledge of Frye’s writings (scholarly and subterranean alike) Word and Spirit seems like the book that’s been necessary for some while -- the thread that can take readers into the depths of the Frygian labyrinth. So on those grounds, I can recommend it -- without guaranteeing you’ll find the way back out again.
You can’t judge a book by its neologisms, but the coinages appearing in the first chapter or two of Carl Cederström and André Spicer’s The Wellness Syndrome (Polity) serve as pretty reliable landmarks for the ground its argument covers. We might start with “orthorexia,” which spell-check regards with suspicion, unlike “anorexia,” its older and better-established cousin.
Where the anorexic person avoids food as much as possible, the orthorexic is fixated on eating correctly -- that is, in accord with a strict and punitive understanding of what’s healthy to eat, and in what quantities, as well as what must be avoided as the culinary equivalent of a toxic landfill. It is a sensible attitude turned pathological by anxiety. And in the authors’ interpretation, that anxiety is socially driven: the product of “biomorality,” meaning “the moral demand to be happy and healthy,” as expressed in countless ways in a culture that makes chefs celebrities while stigmatizing the poor for eating junk food.
But diet is only one bailiwick for “wantologists,” somewhat better known as “life coaches,” whose mission it is to “help you figure out what you really want” in life. Cederström is an assistant professor of organizational theory at Stockholm University, while Spicer is a professor of organizational behavior at City University, London. I take it from their account that the wantological professions (there are certification programs) extend beyond one-on-one consulting to include the market in self-improvement and motivational goods and services such as books, workshops and so on. The goal in each case is the combination of physical fitness and positive mental attitude that amounts to an “ideal performance state” for the contemporary employee.
“A recent survey by RAND,” we learn, “found that just over half of U.S. employers with more than 50 staff offer some kind of workplace wellness program,” while 70 percent of companies in the Fortune 200 do so. “In total, U.S. employers spend about $6 billion a year on such programs,” which “are often tied up with employees’ health insurance.”
“Know Yourself, Control Yourself, Improve Yourself” reads one of the chapter subheads, as if to list the slogans from some Orwellian Ministry of Wellness. But where Big Brother ruled through the repression of desire and personal identity, the cultural regime defined by what the authors call “the wellness command” makes every possible concession to individuality and contentment. Indeed, it demands them. Every aspect of life becomes “an opportunity to optimize pleasure and become more productive,” and the experts warn that faking it won’t help: the satisfaction and self-realization must be authentic. We are all the captains of our fates and masters of our souls. Failure to stay healthy and happy -- and flexible enough to adapt to whatever circumstances the labor market may throw at you -- is ultimately a personal and moral failure. So you’d better get some life coaching if you know what’s good for you, and maybe especially if you don’t.
“What is crucial is not what you have achieved,” write Cederström and Spicer, “but what you can become. What counts is your potential self, not your actual self.” The titular syndrome refers to the cumulative strain of trying to respond to all the wellness commands, which are numerous, conflicting and changeable -- a perfect recipe for chronic anxiety, of which an obsession with eating correctly seems like an exemplary symptom. On first reading, I took “orthorexia” to be the authors’ own addition to the language (like “the insourcing of responsibility” and “authenticrat,” per the tendencies described a moment ago) but in fact it turns out to be an unofficial diagnosis in the running for future lists of psychiatric disorders.
The Wellness Syndrome offers, by turns, both a recognizable survey of recent cultural trends and a collage of insights drawn from more original works of social analysis and theory. Much of it will seem more than a little familiar to readers already acquainted with Christopher Lasch’s The Culture of Narcissism, Eve Chiapello and Luc Boltanski’s The New Spirit of Capitalism, Slavoj Zizek’s sundry discussions of the contemporary superego, or any given book by Zygmunt Bauman or Barbara Ehrenreich published in the past twenty years. These works are duly cited but the ideas not pushed in any new direction. The common principle subtending them all is that cynicism about institutions or the possibility of large-scale social change creates a privatized, moralistic ideology that traps people into punitive introspection or the fine-tuning of lifestyles. Unfortunately much of The Wellness Syndrome reads as if such trends began under the administrations of Bill Clinton and Tony Blair.
Alas, no. They were already visible 40 years ago as baby boomers began signing up for weekend explorations in self-discovery with unlicensed therapists who yelled insults at them and wouldn’t let them use the bathroom. Nothing in the new book points to any means or agency capable of changing things in any fundamental way, or even of imagining such a change. Social scientists aren't obliged to be prophets and, of course, they seldom do a very good job when they try; at best they describe and analyze change once it's discernable, not before. But after seven or eight years of shocks and aftershocks from a global financial crisis, it's time for books that do more than put new labels on decades-old problems.
In coining the word utopia, Thomas More was making a pun. The villain of Wolf Hall was, in real life, a learned man who wrote for people who could recognize a joke in Greek when he made one. The island republic of social perfection depicted in his most famous book was a good place (eu-topia), obviously. But it existed only in the imagination: it was also, literally, no place (ou-topia).
Alternating currents of optimism and skepticism crackle in the space between syllables. The ambivalence vanishes with “dystopia,” which, like dysentery (“bad bowels”), has nothing to recommend it. But there is more to dystopia than has been encoded in its etymology. The word usually implies utopia’s evil twin: a social order of perfect oppression, designed to bring the greatest misery to the greatest number.
The places Kate Brown writes about in Dispatches From Dystopia: Histories of Places Not Yet Forgotten (University of Chicago Press) are not all examples of hell on earth, by any means, but each bears the scars of some catastrophe that the visitor is bound to know about before arriving: the ghost town of Chernobyl, for example, or the basement of a hotel in Seattle full of the belongings of Japanese-American residents relocated to internment camps during World War II. The author introduces herself as “a professional disaster tourist,” though her day job is as a professor of history at the University of Maryland, Baltimore County. Her two previous books grew out of research on Russia and Ukraine during the Soviet era. Dispatches From Dystopia pursues many of the same interests while also working reflexively to consider the genres available for writing about place and memory: professional historiography, of course, but also personal narrative and travel writing.
“Many writers presume that the site of action is a given,” she notes, “as if places were neutral containers of human interaction rather than dynamic places in their own right.” At the same time, scholarly prose is often written from the vantage point of the proverbial “man from nowhere.” Make that “person from nowhere,” rather -- anyway, a voice that, while not omniscient, remains as rigorous and impersonal as possible.
“In their quest to explore the human condition,” Brown writes, “historians can hide behind their subjects, using them as a scrim on which to project their own sentiments and feelings. Let me put that another way: in my quest to explore the human condition, I have hidden behind my subjects, using them as a scrim on which to project my own sentiments and feelings. The third-person voice is a very comfortable one in which to reside. Permanently. The intimacy of the first person takes down borders between the author and the subject, borders that are considered by many to be healthy in a profession that is situated between the social sciences and the humanities.”
Such intimacy brings the potential for extreme embarrassment. Brown prefaces the lines just quoted by saying that her hands are sweating as she writes them. Her early ventures into first-person scholarship met with resistance, expressed in well-meant warnings such as, “You won't get a job with that dissertation” and “Other scholars will assign you, but not cite you.” Which is understandable, because other risks besides personal and professional awkwardness can follow from experimentation of the kind Brown undertakes. The existence of “borders between the author and the subject” at least reduces the dangers of twee memoir -- and also of prolonged metaepistemic inquiry (how can the knower know the knower, much less the known?) that scorches the earth with tedium.
So for the first several pages of Dispatches From Dystopia I braced myself, only to find that Brown is the rare case of someone who can incorporate a number of registers of narrative and reflection within the same piece of writing, shifting among them with grace and quiet confidence. Her essays might be called position papers: topographical surveys of historical sites, with the mapmaker’s own itinerary sketched in.
The trips to erstwhile Soviet republics are not, she makes clear, a search for roots. A product of “the industrial heartland of the United States at a time when it was the world’s most prosperous and powerful country,” she is unaware of any German, Jewish or Slavic branches to her family tree: “I could hardly have been born farther from rural, famished, collectivized, heavily politicized, bombed and terrorized Right Bank Ukraine” -- the subject of her first book -- “a place that stands in my mind as the epicenter of 20th-century misery.”
But another essay suggests the advantages of this presumed naïveté. People she met granted the author a place in post-Soviet society “as an honorary child…. If I accepted this role passively, relinquishing my status as an autonomous adult and the critical rationality of a researcher, they often let me in, if fleetingly, for a closer look. By becoming childlike -- susceptible, disabled and dependent -- I became a temporary member of their community, which in the Soviet Union was defined by an understanding of biological vulnerability, mutual interdependence and obligation.”
Other expeditions require different personae. Her trip to what’s left of the city of Chernobyl elicits another kind of identification with people who have been there. Expecting a scene from the opening days of the Gorbachev era -- irradiated but frozen in time -- she finds that everything that can be sold has been hauled off to market: “Even the knobs on the kitchen cabinets were gone. Even the time capsule schoolchildren buried in the 1970s had been looted. (I know because I was hoping to dig it up and loot it myself.)”
Brown’s first-person reflections are embedded in narratives and place descriptions that are more intricate and varied than a reviewer can even begin to suggest, and certain issues and motifs link the essays in ways that would probably reward a second reading. Each piece, like the volume as a whole, is an example of nonfiction that uses the first person, rather than just indulges it. The learned essay and the personal essay are different creatures and attempts to create a hybrid are often problematic at best. But Dispatches From Dystopia proves it can be done.