Evelyn Barish's The Double Life of Paul de Man, from Basic Books, is a scandalous volume, in at least a couple of ways.
At the most obvious level there is the troubling nature, even after all this time, of "the de Man affair" -- the discovery, in 1987, that the preeminent figure among the literary theorists at Yale University had published a substantial body of literary journalism in a Belgian newspaper when it served as a mouthpiece for the Nazis during the occupation. It generated much discussion over the next few years, with very little of it involving people who had ever actually read anything by Paul de Man. It was, in that and many other regards, a formative stage of the culture wars.
Barish, a professor emeritus of English at the City University of New York Graduate Center, offers evidence that de Man's collaboration went further than writing regime-friendly articles on French and German books. Since the early 1990s, she has been digging in archives and interviewing the critical theorist's family members, friends, and enemies. She even tracked down the Dutch translation of Moby Dick that de Man published in his 20s. He reinvented himself quite thoroughly after arriving in the United States in 1948, going from penniless fugitive (with no postsecondary degree) to doctoral candidate in the comparative literature department at Harvard, then on to a role as one of the most sophisticated and influential literary theorists in the era of structuralism and its aftermath.
Barish concentrates almost entirely on de Man's career up to his first appointment, at Cornell University, following completion of his Ph.D. at Harvard in 1960. She doesn't say much about de Man's ideas, which is fortunate; her references to the literary, political, and intellectual contexts of his work seldom inspire confidence. Barish's forte, rather, is as sleuth. And if even half of what The Double Life reports is accurate, de Man picked the wrong Melville novel to translate. The Confidence-Man would have been a lot more apt.
That is a problematic "if," however. It is difficult to express the cumulative frustration of reading a book on matters of such importance containing so many careless mistakes, needless repetitions, and dubious leaps of assumption. Their effect is not to shatter the structure of Barish's argument, like volleys of mortar fire hitting a house. It will stand for a while, until someone else does a better job. But it has termites, leaving it shaky in the meantime.
The story in brief: Paul de Man, born in 1919, grew up in a recently prosperous Belgian family (his father ran a company that manufactured X-ray tables) with its full share of neurotic misery. His mother was prone to prolonged spells of clinical depression and made a number of suicide attempts before succeeding by hanging herself when de Man was 17 years old.
His father was a philanderer, and the boy understandably if unfairly blamed him for her death. He took his uncle, Henri de Man, as a more worthy paternal model. Henri was a leading figure in the Belgian labor movement and very well-connected in political and journalistic circles; he was a major theorist of the non-Marxist wing of European socialism and an adviser to Belgian royalty.
Barish understands much of Paul de Man's early career as shaped by this maelstrom of influences. A brilliant student in his early teens, he entered college in the wake of his mother's suicide and flunked out repeatedly; he took up something of the playboy lifestyle enjoyed by his father, despite hating the man. Uncle Henri came rather early to the self-fulfilling conclusion that Hitler was unstoppable and that Belgium would inevitably fall under German domination. He and his followers were prepared to make the best of it -- and his ambitious but aimless nephew, even more than most.
With the benefit of Henri's pull, Paul joined the staff of the newspaper Le Soir in 1940, months after the German tanks rolled in, and contributed hundreds of book and music reviews. This paper trail shows him to have been an opportunist rather than a true believer (his article criticizing Jewish influence on European literature was half-hearted at most, especially by contrast with the rants published alongside it) but The Double Life quotes passages from reviews that echo Nazi ideas contrasting the vitality of German culture with French decadence.
Nepotism and scheming also enabled him to carve out a niche as adviser to a major book distributor. By his mid-20s, de Man seems to have been as well-integrated into the collaboration's cultural apparatus as anyone could be; had some of the office-politics skulduggery de Man engaged in worked out, he might have climbed even higher. (Blocked career mobility can be a blessing at times: two of the men he worked under were condemned to death after the war.) All the while, de Man maintained contacts with friends involved in the resistance. Once the war ended, of course, it turned out that everyone had fought in the resistance. De Man could even name the unit he had "joined."
His real-life activities in the mid-1940s sound altogether shadier. He started a publishing company in the usual way -- gathering investors, commissioning books and translations, etc. -- but used its assets as his own personal cash machine until there was nothing left but debt. When he departed for New York in 1948 de Man was facing a prison sentence of five years for embezzlement.
The ease with which de Man escaped this past entirely once in the U.S. -- or shook it off for a while at least -- will astonish anyone habituated to today's norms of surveillance and databanks. His wife and children went to Argentina to await him, but de Man seems to have decided that chapter of his life was closed. He took a job stocking books at a Barnes & Noble store while ingratiating himself with the literary and intellectual circles around Partisan Review and Dwight Macdonald's magazine politics.
With the novelist and critic Mary McCarthy he formed a close friendship (whether or not "with benefits" being a matter of some debate among the biographer's sources) that led to a stint at Bard College, teaching French literature. In front of the classroom, he finally found his element: students were enraptured by how he sounded out the ironies and paradoxes of a poem. He was exceptionally discreet about his earlier history, but much less so about his current behavior: he managed to run up considerable debts and to impregnate a co-ed.
They were properly married, in good time -- but only after a bigamous period, during which de Man's other family, the one that had been waiting in Argentina, showed up on the doorstep one day.
I know it all sounds complicated, but really, this is the streamlined version of the story. It also involves lawyers, doctored documents, French surrealism, and countless unpaid and otherwise unhappy landlords. There are poison-pen letters and visits from the Immigration and Naturalization Service. Paul de Man supplements his stipend by translating articles for a magazine run by a professor named Henry Kissinger. Plus a tree falls on de Man during a thunderstorm. And then there was the difficulty of getting anyone in comp lit at Harvard to take Heidegger seriously in the 1950s. The biographer suggests that de Man's life grew more settled once he became a professor.
But then, it would have to be.
A ripping yarn! Convoluted as the story is, it's somehow easier to believe than was the news about de Man's wartime writings when it first broke, more than 25 years ago. People who spent a significant part of the 1980s reading and rereading Blindness and Insight and Allegories of Reading -- the two books de Man published during his lifetime -- thought of him as being like his prose: both magisterial and sublimely ironic. It was painfully difficult, almost impossible really, to understand how a thinker so canny about the implications and complications of literature could ever have lent his support to a system of such brutal simple-mindedness.
One strategy of response was exemplified by Jacques Derrida, who stood up for his friend in an essay suggesting that de Man's article on the Jews in European literature was actually a very subtle -- a very, very subtle -- deconstruction of Nazi discourse. This was an example of the procedure known as "polishing a turd." It did not prove helpful.
The Double Life of Paul de Man cuts through the knot by seeing his life and work as a response to his family and milieu. He grew up with an emotionally fragile mother who clung to him desperately, then left him to find her lifeless body hanging in the attic. He absorbed all of his father's vices but none of the bourgeois virtues; money ran through his fingers like water. The family member he admired most (and, we learn, repeatedly claimed was his real father) was a political opportunist with fascistic leanings who taught by example that career advancement by any means necessary was acceptable.
He learned these lessons all too well. His later work in critical theory, in the biographer's estimation, rationalized the idea that words have no meaning and you can interpret things just about any way you want.
Except, he didn't. Besides relying on clueless polemics as a substitute for reading his work, Barish has cultivated some strange ideas about de Man and his influence. We read that de Man created "a new philosophy, a way of looking at the world that redefined America's point of view." She calls him a "linguistic philosopher," and says that he was "known by some as the 'father of deconstruction,' although he said the term was coined by Jacques Derrida." (Actually she makes the same point about the patrimony of "deconstruction" again later, as if there were some ground for doubt in the matter, which there isn't. Derrida came up with it.)
As if afraid we'll drop the book unless it is about a titan, the biographer reaches to the heights of overstatement. "Influential in both the academic world and the broader social one," she also states, "de Man wielded more influence on intellectual ideas than any other voice here or abroad."
Barish is, of course, quite right that de Man's influence made him influential. (And even vice versa.) But the statement is otherwise not even remotely true. At the peak of his renown, de Man's readership consisted almost entirely of professors and graduate students in literature programs (comparative and otherwise) along with the occasional ambitious undergraduate. He played no role at all in non-academic sectors of the public sphere. In part that is because, after his early journalism, he avoided discussing contemporary political matters of any kind. He made no grand pronouncements about Society or Truth -- not even to deny that it was possible to make grand pronouncements about them. Nor was he a "linguistic philosopher." He was a critic of romantic and post-romantic literature. He wrote about what language does, or can do, when it operates in certain specific locations known as literary texts.
To find such confused statements about de Man's role at the start of a book about his life does not inspire trust. I kept on reading but found it impossible not to be distracted by countless examples of what can only be called outright sloppiness. The author repeats descriptions or characterizations of people almost word for word from one place to the next. We learn of the newspaper Le Soir ("The Evening") that during the occupation, Belgian patriots called it Le Soir volé ("The Stolen Evening") and from that point on every single mention of the paper calls it Le Soir (volé), as if that were its actual title.
It's eccentric at best. Thoughts are not always developed: Erik Erikson's discussion of "alienated man" had some effect on the biographer's understanding of de Man, since she nods in its direction a number of times, but whatever insight it offered as a key to de Man is never worked out. Her references to Bourdieu's sociology of culture are equally murky, and not a little compromised by her apparent belief that his term "habitus" means something like "social network."
Much more disappointing from a professor emeritus is her attribution of Keats's phrase about the world as a "vale of soul-making" to Milton. I didn't keep track of all the book's anachronisms, but here's one memorable example: When de Man lies about having written a master's thesis on Henri Bergson's understanding of time, Barish explains that Bergson's work became fashionable in the 1940s because Heidegger and Sartre wrote about it. (Just explaining why that's wrong would take another couple of paragraphs.)
We get many, many references to what de Man "must have thought" about something, and also confident statements about what others would have known, or not known, about de Man himself. A little of that sort of thing goes a long way. It is a fair guess that, in 1948, not many people in the U.S. would have made a connection between Paul de Man and Belgian political history, since Belgium ranks somewhere behind Romania in the American awareness of Europe -- slightly ahead of Liechtenstein, though that's arguable. But she is on shaky ground in making the same assumption about the New York intellectuals de Man met. Many had been radicals during the Depression and were perfectly well aware of his uncle Henri.
By the time the biographer speculates that Mary McCarthy expected de Man to marry her after getting her pregnant -- for which there is simply no evidence -- it's not altogether clear what genre The Double Life of Paul de Man falls into.
It's readable, but is it reliable? About innumerable small things, no, it isn't; that leaves me dubious about the author's judgment regarding larger matters. Some years back, Ortwin de Graef, the scholar who unearthed de Man's collaborationist writings, published a book covering the same period called Serenity in Crisis: A Preface to Paul de Man 1939-1960, but it's a monographic study. A definitive biography of Paul de Man would combine de Graef's depth of understanding with Barish's narrative zip, but it will probably be a long time before that happens.
A couple of weeks ago I decided, after prolonged dithering, to rent space in the digital warehousing district known as "the cloud." One of my laptops held at least five years' worth of material -- digital page proofs for books, JSTOR downloads, extensive photographic documentation of the lives of our cats, etc. -- running to about 14,000 files, or more than 50 gigabytes. Having all of it in one place seemed to tempt fate.
It also meant that use of my digital archive was restricted to times when that rather clunky laptop happened to be convenient. The biggest advantage of storing a file in the cloud is being able to retrieve it on any computer or e-reader that has web access. The savings in exasperation alone are considerable. A feeling of creeping senility kicks in when you end up with two or three copies of a paper that you probably already downloaded, but can't remember for sure (so just to be safe...) or spend part of a trip to the library gathering the same citations you collected a few years ago.
The one disadvantage -- in case anyone else out there has a similar digital hoarding problem -- is that first you have to upload everything, and it can take a while. The task does not require much attention. But even with sending batches of a hundred files or more at a time, it took a long weekend. That doesn't count the labor of sorting and labeling the files and weeding out duplicates, which, like housekeeping, is an ongoing process that never really ends.
After this long march into the paperless future, my study ought to look as aesthetically spare as an Ikea store display -- not crowded with cardboard boxes full of documents from projects both in progress and in limbo. But I'm not there yet and probably never will be. With a scanner and a few more weekends, all the files could be rendered into PDF. For that matter, some of the material that took me years to locate, and not a few bucks to acquire, can now be downloaded in that format for free.
It's the same text, of course, yet somehow not the same document. The PDF lacks the aura of the original: the constant, lingering reminder that, in the past, readers held this specific document in their hands, focused attention on it for their own particular reasons, and decided that it was worth keeping.
Contact with the original document enriches the experience of reading -- thickening it with added layers of historicity. That said, it's also convenient to have a digital version of it on hand, to annotate or to share. But by the time I finished reading Lisa Gitelman's new book Paper Knowledge: Toward a Media History of Documents (Duke University Press), even the humble PDFs downloaded on a JSTOR binge began to seem interesting in their own right as a variety of social and cultural artifact.
Gitelman, a professor of English and of media, culture, and communication at New York University, finds the contrast between print culture and digital culture much less compelling than a series of developments from the past 150 years conditioning how we understand documents of whatever variety, whether published with ink or in bytes. My hunch going in was that the author would give a fair bit of space to one more rehearsal and critique of Foucault's treatment of the concepts of document and archive in The Archaeology of Knowledge. The eyes fairly glaze at the prospect.
Instead, Gitelman practices a kind of conceptual archeology without obeisance to the master, in an argument that stands well on its own.
To sum it up all too quickly, then: Discussions of print culture typically concern published matter of a few general kinds, such as books, pamphlets, magazines, and newspapers -- in short, mass-produced texts through which authors communicate with an audience.
But another category accounted for up to a third of the output of printers in the United States by the end of the 19th century: the "job printing" done for the government, industry, and small businesses, providing them with batches of application forms, tickets, order books, rent receipts, posters, and so on.
This layer of "print culture" was part of the basic infrastructure of modern bureaucracy and of advanced capitalism -- as essential to modernity as the circulation of books and magazines was in creating the "public sphere" (Jürgen Habermas) or the "imagined community" of the nation-state (Benedict Anderson). The concept of "author" hardly applied to the documents turned out by job printing, and they didn't typically have "readers," either, certainly not in the sense a newspaper did. But they were integral to everyday life -- and with the passing of time, they could become historical evidence, the raw material of scholarship.
Here the analysis begins to spin out a couple of threads that, by turns, twist together and move at odd angles to each other. Gitelman goes on to trace the efforts of academics in the 1920s and '30s to develop standards for making scarce primary sources available to the scholarly community (using emerging tools such as microfilm) while also establishing standards for cataloging and citing documents circulating through non-print modes of reproduction (for example, carbon copy or the hectograph).
Marketing of the Xerox machine in the early 1960s originally stressed its usefulness as a replacement for job printing. But by the end of the decade, copy shops were sprouting up around college campuses, precisely to meet the need for small-run reproduction of scholarly materials that American learned societies had anticipated in earlier decades.
By the time you reach the book's final chapter, on the rise of PDF, the relationship between the history of ground-level print culture and that of its Ivory Tower analog appears linked in so many suggestive ways that the advent of digital culture seems like just one part of an intricate pattern. Most of the stimulation of the book comes from Gitelman's narration and juxtaposition of developments across several decades, which unfortunately can't be captured in paraphrase.
It's the first of the author's books I have read, but it won't be the last.
The South Carolina House of Representatives on Monday twice refused to reverse a $52,000 cut to the College of Charleston's budget -- a cut added by a legislative committee to punish the college for assigning Fun Home, a well-regarded memoir by a lesbian, to freshmen, the Associated Press reported. Lawmakers said that they wanted to send a message about the selection of the book.
The college responded to questions from Inside Higher Ed about the vote by releasing this statement from President P. George Benson: "Any university education must include the opportunity for students to engage controversial ideas. Our students are adults, and we will treat them as such at the College of Charleston. As one of the oldest universities in the United States, the College of Charleston is committed to the principle of academic freedom. Faculty, not politicians, ultimately must decide what textbooks are selected and how those materials are taught. Any legislative attempt to tie institutional funding to what books are taught, or who teaches them, threatens the credibility and reputation of all South Carolina public universities."
Two people who died at a San Francisco nursing home on Monday night appear to have been victims of a murder-suicide; they were mother and daughter, though few other details have yet been released. A police officer and "self-confessed gun nut" in Dallas extolled the qualities of his new shotgun in a video posted online a few days before using it to kill his wife and then himself last week. It's reported that his jealousy was stoked by her Facebook socializing. The lack of evident motive makes even more horrific the scene in a Chicago suburb, also last week. After killing his parents and his 5-year-old nephew, a man set the house on fire, then shot himself.
These events all occurred during the short time it took me to read The Perversion of Virtue: Understanding Murder-Suicide by Thomas Joiner, a professor of psychology at Florida State University, just published by Oxford University Press. The author estimates that around 2 percent of suicides in the United States are accompanied by the murder of at least one other person. It averages out to slightly more than two murder-suicides per day.
Nearly 90 percent of "ordinary" murders are committed by men, who also make up most (at least 75 percent) of the body count. With murder-suicide, the figures are significantly different: the perpetrators are male more than 90 percent of the time, while 75 percent of the victims are female. "Both murder per se and suicide per se involve firearms between 55 percent and 70 percent of the time," the author indicates. "The rate in murder-suicide is considerably higher, with some studies returning rates approaching 100 percent."
But there is the very rare exception, such as the man who killed his wife with an injection of cyanide before swallowing some himself. In this case, it was a matter of convenience: "He was a jeweler, and jewelers frequently use cyanide for their wares." Firearms and poison alike can be used in both stages of murder-suicide, while the man who killed his wife and son with a baseball bat three years ago couldn't exactly turn the weapon back on himself. Yet he "did nevertheless arrange that he be bludgeoned to death," Joiner writes; "he placed himself in front of an oncoming passenger train."
An article about the apparent murder-suicide of a man and woman in Cleveland last month reported: "Police have not said which of the two victims they believe was murdered. They also have not revealed why they believe the deaths are the result of a murder-suicide." I have not been able to find more recent news about the case, but Joiner's book makes a confident guess possible on both points.
The author is a prominent specialist in the study of suicidal behavior, and his goal in The Perversion of Virtue is to create "a comprehensive yet parsimonious typology" for what he calls "true murder-suicide." He excludes cases in which a murderer commits suicide to avoid punishment after the attempt to escape has failed, or still rarer instances of the suicide causing someone else's death by accident (say, a pedestrian killed by a building-jumper). In murder-suicide proper, the perpetrator's decision to kill himself is the primary factor. All else follows from it, through a morbid logic in which the thought of the victim(s) continuing to live is "the final barrier to suicide ... in the perpetrator's mind." The resolution to kill himself "necessitates, through an appeal to virtue, the death of at least one other person."
Virtue seems a peculiar word to find in this context, but it is the key to the book's four-compartment typology, defined by the venerable higher goods of mercy, justice, duty, and glory. The perpetrator of murder-suicide considers the death of the other(s) as required by at least one, and possibly two, of the four virtues. The act entails "a perverted and horribly distorted version of [virtue] to be sure," says Joiner, who also indicates that the decision is always a product of mental illness. From the perspective of anyone but the killer, a murder-suicide compelled by the demands of justice is simply a matter of revenge: the abusive parent or the ex-spouse's infidelity damaged the suicidal person so badly that life is unbearable, but even more unbearable is the idea of them getting away with it.
Conversely, the murder may be committed as an act of violent mercy: a way to spare the victim (or victims) suffering in the wake of the suicide, as when parents in a suicide pact also kill their children. Not altogether distinct from such mercy killings are cases in which the perpetrator feels responsible for a severely ill or otherwise incapacitated person, so that killing them is a duty to be performed before committing suicide.
Finally, and the hardest of the four to regard with sympathy, is murder-suicide as a quest for glory. The primary example Joiner considers is the Columbine killers, who hoped to exceed Timothy McVeigh's death toll, and might have, had their bombs worked. The carnage of Jonestown and Heaven's Gate might also be relevant examples of murder-suicide pursued in the interest of their leaders' heroic self-concept, which to anyone else just looks like grandiosity. Orchestrating mass death was as close to glory as they ever got.
Parsimony, too, is a virtue, though more of the intellectual than moral variety. Having narrowed the scope of the term "murder-suicide," with stress on the suicidal impulse as its driving force, Joiner takes an inductive leap by suggesting a four-part typology of the rationales perpetrators create for their violent actions. Near the end of the book, he points out that the virtues of mercy, justice, duty, and glory can be further reduced to two categories: "one, combining mercy and duty, in which feelings of care and empathy for others are high (if distorting) and another, combining justice and glory, in which callousness and carelessness predominate." But he stops short of pushing any further toward schematism. And a good thing too. Like any virtue, parsimony gone wrong becomes a vice.
Just what value does the taxonomy itself have? Joiner suggests that it could be useful in talking to patients considered potentially violent. People with plans for suicide can be extremely reluctant to reveal much about themselves, but a carefully delivered question about some aspect of the four virtues might be useful in assessing their state of mind.
For the lay reader, there's a certain relief at learning some kind of order or intelligibility can be found amidst all the mayhem. If, in addition, the book prevents even one more horror of the kind it describes, it will have served its purpose.
The men who established the republic were no plaster saints of Red State moral uplift. Only one of the half-dozen figures Thomas A. Foster writes about in Sex and the Founding Fathers: The American Quest for a Relatable Past (Temple University Press) would escape denunciation by the Traditional Values Coalition if the Founders were around today.
Accusations of adultery or of fathering children out of wedlock (or both) were made against George Washington, Thomas Jefferson, Benjamin Franklin, and Alexander Hamilton; the last two admitted the truth of the charges. Gouverneur Morris managed to draft the Constitution between rounds of frequent, strenuous fornication -- exercise he pursued despite having a severely mangled right arm and amputated left leg.
Only the tightly wound John Adams seems to have escaped any hint of scandal. By all evidence, he and Abigail were strictly monogamous and not averse to finger-wagging at the other Founders' morals -- especially Franklin's, which were particularly relaxed. Besides writing a notorious essay on selecting a mistress, Franklin lived with a common-law wife; later, he conducted a good deal of his work as ambassador to France either in bed with well-born Parisian ladies or trying to get them there.
He was also broad-minded in ways that would be fodder for cable TV news today. He seems to have been on friendly terms with one Chevalier d'Eon, a French diplomat who preferred to dress in women's clothing. Poor Richard's ventriloquist was, as it's put nowadays, straight but not narrow.
Tabloid history? No, though much innuendo about the Founders did appear in frankly sensationalist publications of the day. (Negative campaigning goes way back.) Foster, an associate professor of history at DePaul University, is innocent of any muckraking intent. Everything in Sex and the Founding Fathers is a well-established part of the historical record, and in the case of Jefferson's relationship with his slave Sally Hemings, you'd have to have spent the last 20 years on a desert island not to have heard about it by now.
The author isn't interested in revealing the character or psychology of the early American statesmen. Rather, the book is a metahistory (not that Foster uses such jargon) of how their sex lives and their public roles were understood across the past 200 years or so. The biography of a major political figure is itself a political act. Historians and others writing about the Founders have dealt with their peccadilloes in different ways over time, the shifts in emphasis and judgment reflecting changes in the national political culture.
George Washington, for example, seems the most austerely virtuous of the country's early leaders, thanks especially to the moralizing fables of Parson Weems. Recent biographies suggest that he had a number of romantic relationships, consummated and otherwise, before marrying Martha. Writers of historical fiction depict the six-foot-three, athletically built military man as exerting powerful animal magnetism upon the colonial womenfolk. (Like Fabio, but with wooden teeth.) In real life, Washington addressed passionate letters to a married woman. If no further improprieties occurred, it was not for want of trying.
Foster notes the tendency to assume that earlier images of the first president were "disembodied" idealizations which have "only recently been humanized." But the record is more nuanced: "Even the earliest images emphasize both his domestic life and his military and government successes," Foster writes, with some 19th-century biographies and paintings "establish[ing] Washington as the romantic man" as well as "head of a prosperous household." But on that last point, one fact was somewhat problematic: Martha, who was a widow when they met, had a number of children by her first husband but never conceived with George.
"No early account hides the fact that he had no children of his own," Foster notes. "But 19th-century writers do not dwell on this aspect of his life, leaving some readers to their own devices to determine this aspect of his private family life." Biographers in the Victorian era "could not anticipate that readers would ever expect an answer to the very personal question of why he had no children."
Refusing to acknowledge the question did not make it go away, however. The lack of progeny was a seeming defect in Washington's status as embodiment of masculine ideals. One answer to the problem was sentimental: The couple could be depicted as blissfully compatible yet saddened by their plight, even without any evidence of it. ("Americans," Foster remarks, "have never hesitated to speak definitively about the loves and inner lives of the Founders, despite a lack of documentation.") Unfortunate as the situation was, Washington finally transcended it by becoming "father of his country." Another solution was to deny that Washington's virility was compromised at all, by claiming that he had an illegitimate son by the widow of one of his tenant farmers. See also the rumor that Washington died from a cold he caught "from leaping out a window, pants-less, after a romantic encounter with an 'overseer's wife.'"
No other figure in Sex and the Founding Fathers occupies so markedly paternal a role in public life, but in each case Foster brings out the complex and tightly knit relationship between sexual and political life. Even with Benjamin Franklin -- whose flirtatiousness is well-known, as is his earthy advice about the benefits of dating older women -- the author finds aspects of the record that add some nuance to the familiar portrait. I never appreciated just how disturbing a figure he was to his countrymen in the 19th century, when a senator struck his name from a list of candidates for a proposed national hall of fame on these grounds:
"Dr. Franklin's conduct of life was that of a man on a low plane. He was without idealism, without lofty principle, and one side of his character gross and immoral.... [His letter] on the question of keeping a mistress, which, making allowances for the manner of the time, and all allowance for the fact that he might have been in jest, is an abominable and wicked letter; and all his relation to women, and to the family life, were of that character."
Abominable? Well, he wasn't a hypocrite, and that's always a risky thing not to be. Consider also Alexander Hamilton. When accused of financial improprieties involving public funds, he denied it but admitted to having had a fling with a married woman whose husband then tried to blackmail him. "He chose to discuss the affair, in print, publicly, and in the greatest of documented detail to save his public honor," writes Foster. "He was not divorced. His wife did not denounce him. [George] Washington publicly supported him, as did others."
For a long time, biographers treated the matter evasively. They airbrushed the details out of his portrait as much as possible. Nowadays, Foster says, we get "warts-and-all hagiography -- ones that present failings only to dismiss them or have them overshadowed by an overarching theme of national greatness." Either way, he argues, the statesmen of the early republic stand apart from more recent politicians embroiled in sex scandals in one important way. Our contemporary lotharios can skulk off the public stage after a while, while the Founders never can. Their dirty linen hangs out for everyone to see, forever.
As Inside Higher Ed reported last week, the newest round of curricular mayhem instigated by Bruce H. Leslie, the chancellor of the Alamo Colleges, is to replace his district's second three-credit humanities course requirement with a class based on The 7 Habits of Highly Effective People. (Leslie might have suggested Machiavelli's The Prince, which seems closer to his style of governance.)
The FranklinCovey Company plans to release a textbook specifically for EDUC 1300, Learning Framework, which will be required for every student taking the course. Perhaps Leslie read the 2013 version from Save Time Summaries, whose motto is "Save Time and Understand More!" The lengthy list of short takes in the Save Time Summaries series includes Malcolm Gladwell's David and Goliath: Underdogs, Misfits, and the Art of Battling Giants, Tom Rath's StrengthsFinder 2.0, and, most puzzling of all, Proof of Heaven, by Eben Alexander III (surely, reading this proof doesn't require saving time -- presumably we have all of eternity). All that remains, for some enterprising individual, is a proposal for "7 Effective Habits for Dummies."
And the new core course can't be too intense; workplace seminars on the book generally run around two days, and after all, as Colleen Flaherty reported for Inside Higher Ed, Leslie's inspiration was a kindergarten class that he visited, where one young scholar shook his hand. Chancellor Leslie, in fact, seems fixated on the handshake, something even my mother's succession of poodles has mastered.
It came up again in his rationale for revising the core: According to Flaherty, “Leslie said the proposed course is a measured response to calls ... to ensure that students graduate with ‘soft skills’ -- leadership, knowing how to shake a hand, how to manage time effectively -- and from his own personal experience. Several years ago, Leslie realized that some graduates hardly looked him in the eye or knew how to shake his hand... .” Perhaps they just weren’t all that into him.
As for eye contact, that can happen just as easily in a humanities course as in a class centered on a self-help book. I've encountered it hundreds of times: maybe it's sparked by a line from a Seamus Heaney poem, or the final chapter of a novel by Zadie Smith, or a passage from Narrative of the Life of Frederick Douglass, an American Slave. It's the look of learning, the look that says, "I get it."
Perhaps in the chancellor’s dream course there will still be time for a short poem or a snippet from history or a mini lesson in how to say “Hello!” in different languages -- in weekly “Show and Tell” sessions, scheduled in between nap and snack times.
And there may be additional reading. While the logical companion text would seem to be All I Really Need to Know I Learned in Kindergarten, other equally frightening supplemental possibilities include Who Moved My Cheese? and Fish! My recommendation is Napoleon Hill's Think and Grow Rich, if only because it includes the word Think in the title.
Instructors at the Alamo Colleges are limited in terms of selecting their own texts, but they might also want to sneak in Ben Franklin's Autobiography and F. Scott Fitzgerald's The Great Gatsby, if only for Gatsby's Franklin-like self-improvement list, or what the little boy who so inspired Leslie on his fateful visit to that kindergarten class called his "data book."
Time permitting, students might also benefit from reading Langston Hughes's "I, Too," a concise and precise lesson in perception; Shakespeare's Hamlet, a reminder of the dangers of procrastination; and Kurt Vonnegut Jr.'s God Bless You, Mr. Rosewater, a reminder of what highly effective people know: "Be kind." Or, as Mr. Rosewater actually said, "There's only one rule that I know of, babies -- God damn it, you've got to be kind."
The Save Time Summaries version of 7 Habits cut the original accompanying anecdotes, in the interests of time and space; this isn’t surprising, coming from a venture that also includes Jodi Picoult’s The Storyteller on its list, with the promise of “From Start to Finish -- 20 minutes!”
Now Chancellor Leslie's lose/lose plan (see the discussion of Habit #4 of highly effective people) will do something similar for the students in his district. They'll just have to wait for grad school curriculums in medicine, law, and business to find another course on the value of storytelling.
Planners of those programs have recognized that storytelling -- both the telling and the listening -- is important. Narratives convey information and facilitate learning. Besides, it's helpful to have a good tale to go along with that firm handshake.
But you don’t have to take the word of curriculum planners at Columbia University or Saint Louis University or Penn State University on the significance of storytelling -- just ask any five-year-old.
Carolyn Foster Segal is professor emerita of English at Cedar Crest College. She currently teaches at Muhlenberg College.
In one of those cases where satire cannot trump cold hard fact, the power brokers and heavy thinkers who gathered at an Alpine resort in Davos, Switzerland, for the World Economic Forum last month expressed great concern about the danger that growing inequality poses to social stability everywhere. As well they might.
Strictly speaking, "widening income disparities" was only one of 10 issues flagged by the Forum's Outlook on the Global Agenda 2014 report, along with "a lack of values in leadership" and "the rapid spread of misinformation online." But a couple of concerns on the list -- "persistent structural unemployment" and "the diminishing confidence in economic policies" -- were variations on the same theme. Two or three other topics were related to income disparity only a little less directly.
In case you didn't make it to Davos last month (my invitation evidently got lost in the mail this year ... as it has every year, come to think of it), another gathering this summer will cover much of the same ground. The 18th World Congress of the International Sociological Association -- meeting in Yokohama, Japan, in mid-July -- has as its theme "Facing an Unequal World: Challenges for Global Sociology." The scheduling of their events notwithstanding, it was the sociologists who were really farsighted about the issue of growing inequality, not the "Davos men." The ISA announced the theme for its congress as early as December 2010.
And the conversation in Japan is sure to be more focused and substantive. A lot of business networking goes on during the World Economic Forum. By some accounts, the topic of inequality figured more prominently in the news releases than in actual discussions among participants. It's almost as if all of Bono's efforts at Davos were for nought.
Available a solid six months before the sociologists put their heads together in Yokohama, Goran Therborn's The Killing Fields of Inequality (Polity) ought to steer the public's thinking into deeper waters than anything that can be reached with a reductive notion like "widening income disparities." Money provides one measure of inequality, but so do biomedical statistics, which record what Therborn, a professor emeritus of sociology at the University of Cambridge, calls "vital inequality." (Income disparities fall under the heading of "resource inequalities," along with disparities in access to nutrition, education, and other necessities of life.)
A third, less quantifiable matter is "existential inequality," which Therborn defines as "the unequal allocation of personhood, i.e., of autonomy, dignity, degrees of freedom, and of rights to respect and self-development." A big-tent concept of Therborn's own making, existential inequality covers the limitations and humiliations imposed by racism, sexism, and homophobia but also the experience of "people with handicaps and disabilities or just the indigent overlorded by poorhouse wardens or condescending socio-medical powerholders," among others.
While analytically distinct, the three forms of inequality tend to be mutually reinforcing, often in perfectly understandable but no less miserable ways: "Nationwide U.S. surveys of the last decade show that the lower the income of their parents, the worse is the health of the children, whether measured in overall health assessment, limitations on activity, school absence for illness, emergency ward visits, or hospital days."
The differences in health between the offspring of well-off and low-income parents "have been measured from the child's age of two, and the differentials then grow with age." A study of mortality rates among men in Central and East European countries shows a pattern of higher education corresponding to a longer life; men with only a primary education not only died earlier but were more prone to longstanding illnesses. (The patterns among women were comparable "but differentials are smaller, less than half the male average.")
Such inequalities within countries look small compared to those between countries, of course -- and Therborn piles up the examples of so many varieties of inequality from such diverse places that it becomes, after a while, either numbing or unbearable. Generalization is hazardous, but the pattern seems to be that a considerable variety of inequalities, both inter- and intranational, has sharpened over the past 30 years or so. Not even the author's own country of origin, Sweden -- so long the promised land for social democrats -- has been spared. Therborn's study of income developments in the Stockholm Metropolitan area between 1991 and 2010 showed that "the less affluent 80 percent of the population saw their income share decline, while the most prosperous 10 percent had their share augmented from 25 to 32 percent."
Furthermore, the share of the income that top tenth earned from playing the Stockholm Stock Exchange grew 282 percent over the same period. In Sweden as elsewhere, "the top side of intra-national inequality is driven primarily by capital expansion and concentration, and that at the bottom by (politically alterable) policies to keep the poor down and softened up to accept anything."
It seems unlikely that the CEOs, financiers, and politicians at Davos ever had it put to them quite like that. But Therborn seems equally unhappy with his own discipline, which he thinks has somehow managed to dodge thinking about inequality as such.
"Among the fifty odd Research Committees of the International Sociological Association," he writes, "there is not one focused on inequality." The closest approximation is the one on "Social Stratification," which he says "has mainly been interested in intergenerational social mobility."
That mobility having been, for the most part, upwards. But the distance from the bottom of society to its top verges ever more on the dystopian. In a rare flourish, Therborn invokes the alternative: "the positive lure of enlightened societies governed by rational and inclusive deliberation, where nobody is outcast or humiliated, and where everybody has a chance to develop his/her abilities."
To reach it, or even to move in that direction, implies a battle. "Nobody knows how it will end," he concludes. "Which side will you be on?"
I don't think he's asking just the people who will be there in Yokohama this summer.