In one of those cases where satire cannot trump cold hard fact, the power brokers and heavy thinkers who gathered at an Alpine resort in Davos, Switzerland, for the World Economic Forum last month expressed great concern about the danger that growing inequality poses to social stability everywhere. As well they might.
Strictly speaking, "widening income disparities" was only one of 10 issues flagged by the Forum's Outlook on the Global Agenda 2014 report, along with "a lack of values in leadership" and "the rapid spread of misinformation online." But a couple of concerns on the list -- "persistent structural unemployment" and "the diminishing confidence in economic policies" -- were variations on the same theme. Two or three other topics were related to income disparity only a little less directly.
In case you didn't make it to Davos last month (my invitation evidently got lost in the mail this year ... as it has every year, come to think of it), another gathering this summer will cover much of the same ground. The 18th World Congress of the International Sociological Association -- meeting in Yokohama, Japan, in mid-July -- has as its theme "Facing an Unequal World: Challenges for Global Sociology." The scheduling of their events notwithstanding, it was the sociologists who were really farsighted about the issue of growing inequality, not the "Davos men." The ISA announced the theme for its congress as early as December 2010.
And the conversation in Japan is sure to be more focused and substantive. A lot of business networking goes on during the World Economic Forum. By some accounts, the topic of inequality figured more prominently in the news releases than in actual discussions among participants. It's almost as if all of Bono's efforts at Davos were for nought.
Available a solid six months before the sociologists put their heads together in Yokohama, Goran Therborn's The Killing Fields of Inequality (Polity) ought to steer the public's thinking into deeper waters than anything that can be reached with a reductive notion like "widening income disparities." Money provides one measure of inequality, but so do biomedical statistics, which record what Therborn, a professor emeritus of sociology at the University of Cambridge, calls "vital inequality." (Income disparities fall under the heading of "resource inequalities," along with disparities in access to nutrition, education, and other necessities of life.)
A third, less quantifiable matter is "existential inequality," which Therborn defines as "the unequal allocation of personhood, i.e., of autonomy, dignity, degrees of freedom, and of rights to respect and self-development." A big-tent concept of Therborn's own making, existential inequality covers the limitations and humiliations imposed by racism, sexism, and homophobia but also the experience of "people with handicaps and disabilities or just the indigent overlorded by poorhouse wardens or condescending socio-medical powerholders," among others.
While analytically distinct, the three forms of inequality tend to be mutually reinforcing, often in perfectly understandable but no less miserable ways: "Nationwide U.S. surveys of the last decade show that the lower the income of their parents, the worse is the health of the children, whether measured in overall health assessment, limitations on activity, school absence for illness, emergency ward visits, or hospital days."
The differences in health between the offspring of well-off and low-income parents "have been measured from the child's age of two, and the differentials then grow with age." A study of mortality rates among men in Central and East European countries shows a pattern of higher education corresponding to a longer life; men with only a primary education not only died earlier but were more prone to longstanding illnesses. (The patterns among women were comparable "but differentials are smaller, less than half the male average.")
Such inequalities within countries look small compared to those between countries, of course -- and Therborn piles up the examples of so many varieties of inequality from such diverse places that it becomes, after a while, either numbing or unbearable. Generalization is hazardous, but the pattern seems to be that a considerable variety of inequalities, both inter- and intranational, has sharpened over the past 30 years or so. Not even the author's own country of origin, Sweden -- so long the promised land for social democrats -- has been spared. Therborn's study of income developments in the Stockholm Metropolitan area between 1991 and 2010 showed that "the less affluent 80 percent of the population saw their income share decline, while the most prosperous 10 percent had their share augmented from 25 to 32 percent."
Furthermore, the share of the income that top tenth earned from playing the Stockholm Stock Exchange grew 282 percent over the same period. In Sweden as elsewhere, "the top side of intra-national inequality is driven primarily by capital expansion and concentration, and that at the bottom by (politically alterable) policies to keep the poor down and softened up to accept anything."
It seems unlikely that the CEOs, financiers, and politicians at Davos ever had it put to them quite like that. But Therborn seems equally unhappy with his own discipline, which he thinks has somehow managed to dodge thinking about inequality as such.
"Among the fifty odd Research Committees of the International Sociological Association," he writes, "there is not one focused on inequality." The closest approximation is the one on "Social Stratification," which he says "has mainly been interested in intergenerational social mobility."
That mobility having been, for the most part, upwards. But the distance from the bottom of society to its top verges ever more on the dystopian. In a rare flourish, Therborn invokes the alternative: "the positive lure of enlightened societies governed by rational and inclusive deliberation, where nobody is outcast or humiliated, and where everybody has a chance to develop his/her abilities."
To reach it, or even to move in that direction, implies a battle. "Nobody knows how it will end," he concludes. "Which side will you be on?"
I don't think he's asking just the people who will be there in Yokohama this summer.
The Indian arm of Penguin Random House has agreed to pull from the market all copies of a University of Chicago scholar's 2009 book on Hinduism that came under attack from some conservatives in the country, The Wall Street Journal reported. The book, "The Hindus: An Alternative History," by Wendy Doniger, the Mircea Eliade Distinguished Service Professor of the History of Religions at Chicago, was withdrawn as part of a settlement (obtained by the Journal) with a nationalist group that had complained about the book. In a statement, Doniger said she was "deeply troubled by what it foretells for free speech in India in the present, and steadily worsening, political climate."
Nothing sharpens memory quite like regret, so I cannot help noting the anniversary of a tossed-off phrase that has come back to haunt me many times over the past 10 years.
In early 2004, I began writing an occasional series of two- or three-paragraph squibs on the latest publications and doings of the Slovenian thinker Slavoj Žižek for The Chronicle of Higher Education, where it ran under the title "Žižek Watch." In the subhead for one such mini-article, I referred to him as "the Elvis of cultural theory." The expression took wings and has been repeated on more occasions than any sane person could track. (As of this writing, it gets 79,000 returns from Google.)
The phrase will outlive me. Last year it appeared in an article in the journal Critical Inquiry, as well as in a Canadian dissertation on the concept of totalitarian evil in the work of Hannah Arendt. Someone will eventually write a book using it as a title. Remembering the line always makes me cringe, as if from mild food poisoning. For the most salient quality of "the Elvis of cultural theory" -- judged, by any standard, as a characterization of Žižek's work or career -- is its near-perfect meaninglessness, verging on hopeless and absolute stupidity.
Unless you know the inside joke, anyway. By my count, roughly two people in the world are in on it. So to mark the anniversary, it is time finally to put the backstory on the record.
The idea for "Žižek Watch" came from my editor at the time, Richard Byrne, an estimable playwright and cultural journalist with family roots in the Balkans. These days Rich is at the helm of the University of Maryland Baltimore County's UMBC Magazine, of which he is the founding editor. We shared a fascination with Žižek's close but complex relationship with the Slovenian post-punk band Laibach and the avant garde movement around it. Given the pace of his output (two or three books a year, just in English) and the growing frequency with which he had begun appearing in odd corners of the mass media, it felt like a matter of time before he graced The National Enquirer, or at least Weekly World News.
So it was that, through a chain of associations, "Žižek Watch" alluded -- very much in passing -- to the definitive song about the improbable ubiquity of a tabloid phenomenon: "Elvis is Everywhere" by Mojo Nixon & Skid Roper.
And the rest is, if not history, at least a decade-long lesson in the sliding of the signifier across the greased skids of digital-age publicity.
A footnote in one article from 2005 did trace "the Elvis of cultural theory" back to its first appearance, albeit without identifying the origins of the phrase as such. But by now, context is irrelevant. The expression has long since escaped meaning. And even though nobody seems to get it, does not the very circulation of my remark participate in what Žižek identifies as the "mystery" of jokes -- that they seemingly appear "all of a sudden out of nowhere," produced by "the anonymous symbolic order" through "the very unfathomable contingent generative power of language"?
So writes Elvis, or somebody, in the introduction to Žižek's Jokes (Did you hear the one about Hegel and negation?), published by MIT Press. It is an anthology of the theorist's shtick, not an analysis of it. The cover describes it as "contain[ing] every joke cited, paraphrased, or narrated in Žižek's work in English (including some in unpublished manuscripts), including different versions of the same joke that make different points in different contexts." The sources of the collected passages are given in the book's endnotes, followed by a brief yet oddly repetitive afterword by a novelist and songwriter from Scotland who lives in Japan and writes under the pen name Momus.
The claim to be exhaustive is difficult to credit, and so is the rationale offered for its existence: "The larger point being that comedy is central to Žižek's seriousness." Along with his frequent digressions into popular culture, Žižek's use of jokes has lent his books an appearance of accessibility that accounts for his fame with a broad audience. But that quality is misleading. Žižek practices a form of what Freud called "wild psychoanalysis," with contemporary culture as the analysand. The remarks, quoted earlier, about the free-floating and anonymous nature of jokes are just Žižek's paraphrase of a point made in Jokes and Their Relation to the Unconscious, where Freud interpreted the erotic and aggressive drives manifested through manipulation of the funny bone.
By spelling that out, I've just told you more about why "comedy is central to Žižek's seriousness" than Žižek's Jokes ever does. In the afterword, Momus speculates that "the joke has become for Žižek what algebra is for his old ally and rival Badiou: the most concise way Žižek knows how to sum up a universal situational shape." The idea might well be developed further, preferably by someone who knows that Badiou's interest is in formalized set theory rather than algebra. But as formulated it is more a gesture than an insight.
A gesture serving mainly to distract attention from two striking things about the book. The first is that Žižek's Jokes makes unavoidably obvious something that it was still possible to overlook 10 years ago: the dynamic role of cut-and-paste in Žižekian production.
Žižek once said that his completed theoretical edifice, spanning several volumes, would amount to a Summa Lacanica rivaling Aquinas's Summa for both scope and cohesion. But along the way, he has met the growing demand for his work from the editors of books, magazines, and newspapers by tearing off suitably sized chunks of whatever manuscript he had in progress. Sometimes he tweaked things to make it appear like a freestanding essay or topical news commentary. And sometimes he did not, though publication was almost certain either way. (I know of one case where the author of a book tried, without success, to have the introduction commissioned from Žižek removed since it had nothing to do with the volume in question.)
Over time, reading Žižek became an experience in déjà vu, with passages from one volume reappearing in others or, in one case, twice in the same book. Žižek's Jokes takes this to a new level. He wrote nothing new for it. Even his two-page introduction consists of one long paragraph from an earlier book. It is a remarkable accomplishment and I do not imagine he will be able to surpass it.
The other striking feature of Žižek's Jokes is how grim the experience of reading it quickly proves to be. In accord with Freudian principles, the jokes revolve almost entirely around sex and/or aggression, often involving racist or misogynist sentiments. All of which is fine when they appear as specimens in a cultural critique -- where they might even elicit a laugh, given the incongruity of seeing them in a context where Hegel or Heidegger have set the terms for analysis. But running through them one after another, in the service of no argument, is deadening. It ceases to be shocking. It just seems lame. Maybe he should be known as "the Jay Leno of cultural theory"? (If, you know, Leno had Tourette's.)
Of course it's also possible that Žižek has a hidden agenda -- that he's sick of being considered hilarious by people who aren't really interested in Hegel, et al., and so has decided to destroy that reputation in the most efficient way possible. And I'm not even joking about that. It makes a certain amount of sense.
Two great models of eloquence in the English language are The Book of Common Prayer and the translation of the Bible usually called the King James Version. A memorable passage that appears in both volumes crossed my mind while thinking about a couple of recent works of social criticism. (It also happens that Princeton University Press recently brought out The Book of Common Prayer: A Biography by Alan Jacobs, a professor of humanities at Baylor University, which a couple of readers have highly recommended.)
The text in question appears a couple of times in the New Testament as part of what's usually called "the Lord's Prayer." The Book of Common Prayer, the older of the two volumes, renders one line of the prayer as "Forgive us our trespasses, as we forgive those who trespass against us." The KJV rendering says, "Forgive us our debts, as we forgive our debtors."
To my ear, "trespasses" works better rhythmically, and it expresses the notion of "sin" or "offense" in a slightly more elegant manner. By contrast, "debt" or "debtor" expresses the same thought in a blunt and harsh way, and even conjures the old cartoon image of St. Peter recording good and evil deeds in a big ledger at the gates of heaven. Puzzled by the contrast, I consulted an extremely literal translation by J.N. Darby -- a Victorian Biblical scholar of uncompromising severity -- who suggests that "debt" is indeed what the original text says.
Around the time Darby was working on his translation, Friedrich Nietzsche fleshed out an argument about the interrelationship among guilt, debt, and memory. Bringing up an atheist philosopher pretty well guarantees someone is now offended. But The Genealogy of Morals spells out in bleak and somewhat lurid terms a point left implicit in the prayer: The debtor is at the mercy of the creditor, who has the right (or at least the power) to inflict suffering -- even bloody revenge -- when payment is not made.
Whatever else it may signify, the brutal connotations of "debt" make forgiveness sound much more demanding and consequential than "trespass" would imply. (Awkward recollection: Learning the prayer as a little kid, I pictured God being unhappy that people were ignoring a sign on His lawn.)
Homo economicus never spent all that much time on moral accounting. But at least the old bourgeois virtues included restraint and a residual belief that self-interest was justified insofar as it served a larger good. The issues that concern Andrew Ross in his new book Creditocracy (discussed in last week's column) unfold in a world where debt itself is a kind of demigod, answerable to no higher power of any kind -- and certainly not to the state.
As the example of credit-default swaps on subprime mortgages in the go-go '00s made clear, the alchemists of finance are able to create profitable investment opportunities out of the risk (i.e., the degree of likelihood) of non-repayment -- making possible the creation of enormous fortunes from loans that cannot be repaid, at least not in full. That is but one link in a complex chain of debt-creation. Should the speculative bubble burst, the job of preventing economic meltdown falls to the government (which already has its own deficits, of course) at whatever risk to allocations for education, infrastructure, etc.
Add to it an average household debt that, Ross notes, grew from 43 percent of gross domestic product in 1980 to 97 percent in 2008 -- across three decades of stagnating wages. Throughout that period, 60 percent of income gains went to the country's wealthiest 1 percent -- a trend that changed dramatically when the economic crisis hit. Since then, 95 percent of income gains have gone to that debt-creating (if not job-creating) sliver.
David J. Blacker, a professor of philosophy of education and legal studies at the University of Delaware, characterizes the situation with a simple image in The Falling Rate of Learning and the Neoliberal Endgame (Zero Books):
"Imagine a casino in which you play with the house money and if you win you get to keep all the winnings to yourself, whereas if you lose, the house covers your bets. The literally astronomical public sums required to continue this arrangement for the minutest percentage of the population is the proximal cause of the squeeze on public resources. Schoolchildren, the poor, the sick, the disabled, the elderly etc., must all sacrifice so elites no longer have to undergo the risks that are officially supposed to be inherent in their role as fearless capitalist risk-takers. ..." But genuine competition and risk are reserved "for small businesses and other little people like private and public sector employees."
Ross responds to the debt-driven status quo by challenging a whole series of moral reflexes that have traditionally accompanied debt: the feelings of obligation and culpability, of shame and implied weakness, that the prayer rendered in the King James translation takes as a given. When access to socially necessary goods (particularly higher education) is restricted or undermined by an economy making debt all but inescapable for countless people, someone ought to feel guilty when students default on their loans -- just not the students themselves. The next step is to call for large-scale fiscal disobedience: a social movement of millions of people pledging to default on their student loans. On the far side of that and other radical confrontations with the debt machine, Ross conceives the possibility of morally sound, humanely responsible systems of finance, based on communitarian social forms. Not utopia, perhaps, but a long way from here.
Massive default is a strategy I find it easier to admire, or at least to daydream about, than to recommend. It is not impossible that a million people might make such a pledge. Carrying out the action is another matter -- and if only a fraction see it through, the result is bound to be martyrdom of an uninspiring and ineffectual kind. In any case, I have no student debt to default on in solidarity, and calling for others to do so would be a case of telling them, "Let's you and him go fight."
Like Creditocracy, David Blacker's book was written in the wake of Occupy Wall Street. But where Ross occasionally sounds like Pierre-Joseph Proudhon -- with his vision of a mutualist society of small producers, exchanging goods and services with a new form of money that doesn't promote inequality -- Blacker thinks along much more classically Marxist lines. The predatory forms of financial speculation that led to the crisis five years ago will not be regulated out of existence, nor are they deviations or tumors growing on a fundamentally healthy economy. The casino will keep rewarding the high rollers when they win and shaking the rest of society down when they lose. Such investment in manufacture as continues to be made will need workers with skills and the capacity to adapt to technological developments -- but ever fewer of them.
Most of the population will be an object for social control, rather than schooling proper. At some level most of us sense this already, making the whole notion of "education as investment in the future" an ever more problematic principle. Blacker has written probably the gloomiest book I have read in years, but in some ways it seems like a practical one. He is not a survivalist. He thinks pedagogy still has a role, provided it's geared to understanding the dire probabilities and finding ways to respond to them. It helps that Blacker is a sharp and forceful writer, giving his analysis something of the vividness and urgency of an Old Testament prophet delivering warnings that nobody really wants to hear.
About 600,000 books from the library of the University of Missouri at Columbia -- stored at an off-campus facility -- have been damaged by mold, The Columbia Daily Tribune reported. The university plans to remove the mold from some of the books, but the high cost of that process (about $3 per book) probably means that all of the books can't be saved.
The golden age of unsolicited credit-card applications ended about five years ago. It must have been a relief at the post office. At least ten envelopes came each week -- often with non-functioning replica cards enclosed, to elicit the anticipatory thrill of fresh plastic in the recipient’s hot little hand.
For a while, I would open each envelope and carefully shred anything with my name on it, lest an identity thief go on a shopping spree in my name. But at some point I gave up, because there were just too many of them. Besides, any identity thief worth worrying about enjoyed better options than trash-diving for unopened mail.
Something started happening circa 2006 or ’07. More and more often, the very envelopes carried wording to the effect that approval for a new card was a formality, so act now! With the benefit of hindsight, this reads as a last surge of economic acceleration before the crash just ahead. But at the time, I figured that credit-card companies were growing desperate to grab our attention, since many of us were throwing the offers away without a second glance.
The two alternatives -- turbocharged consumerism on the one hand, the depleted willingness (or capacity) of consumers to take on more debt, on the other -- are not mutually exclusive. It was subprime mortgages rather than overextended credit cards that brought the go-go ’00s to an early end, but each was a manifestation of the system Andrew Ross writes about in Creditocracy and the Case for Debt Refusal (OR Books).
Ross, a professor of social and cultural analysis at New York University, was active in Occupy Wall Street, and Creditocracy bears a few traces of the movement, both in its plainspoken and inclusive expressions of anger (this I like) and its redeployment of old anarcho-syndicalist ideas (that, not so much).
One commonplace account of the near-collapse of the world financial system in 2008 is that it was the product of consumer hedonism at its most irresponsible. It was just deserts for people playing Xbox on jumbo flat-screen TVs in subprime-mortgaged houses they shouldn't be in. Whatever the limits of its explanatory power, this interpretation allows for a pleasing discharge of moralistic aggression. Hence its popularity. The most familiar argument opposing it places the blame, rather, on bankers, brokers, and other criminals “too big to jail.” It was they who were greedy and short-sighted, not average people.
Besides the more obvious similarities, what these explanations share is an implication that the disaster could have been avoided with some self-discipline and the understanding that hyperbolic discounting is a very bad habit.
Ross leans in the anti-plutocratic direction, but he proves ultimately less interested in the morality of anyone’s decisions than he is in the framework that permits, or demands, those decisions in the first place. The system he calls “creditocracy” turns out debt as fast and efficiently as Detroit once did automobiles, and just as profitably:
“Financiers seek to wrap debt around every possible asset and income stream,” he writes, “ensuring a flow of interest from each…. [T]he tipping point for a creditocracy occurs when ‘economic rents’ – from debt-leveraging, capital gains, manipulation of paper claims through derivatives and other forms of financial engineering – are no longer merely supplementary sources of income, but have become the most reliable and effective instrument for the amassing of wealth and influence.”
At that level of description, Ross has simply given a new name to what Rudolf Hilferding, writing a hundred years ago, called “finance capital.” But what Hilferding had in mind was the merger of banking and industrial capitalism – the marriage of big money and big factories, with monopoly presiding. Creditocracy, by contrast, “goes small,” insinuating itself into every nook and cranny of life. The relationship between creditor and debtor takes many different shapes, some more overt than others.
When you take out a student loan or a mortgage, your submission to the financial system is more or less deliberate, and in any event explicit. It runs deeper, and proves less purely voluntary, if you have to use credit cards in lieu of unemployment insurance. The credit relationship is much more efficiently disguised if it takes the form of an unpaid internship – the “exchange” of your time and skills for intangible and impossible-to-quantify “credit” toward a future job, if you’re lucky.
And if that doesn’t pan out, you might end up working in one of the less desirable positions at Walmart or Taco Bell, among other corporations that banks have persuaded, Ross writes, “to pay their employees with prepaid debit cards that are only lightly regulated.” The banks then “charge the users fees to make ATM withdrawals and retail purchases, along with inactivity fees for using their cards. Almost all of these are minimum or subminimum wage employees, compelled to fork over a fee to enjoy their paycheck." (The practice was described in a New York Times article a few months ago.)
In next week’s column, I’ll consider Ross’s analysis of how the impact of creditocracy on education amounts to a ruthless exploitation, not just of present-day society, but of the future. We’ll also take a look at the comparable argument in a new book called The Falling Rate of Learning and the Neoliberal Endgame (Zero Books) by David J. Blacker, a professor of philosophy of education and legal studies at the University of Delaware.
Until then, I’ll sign off by mentioning that someone has just sent me an application for a $40,000 line of credit. This must be evidence of that “recovery” one reads about. If so, we’re in real trouble.
Ties between libraries and their institutions' university presses are growing, according to a survey released Tuesday by the Association of American University Presses. A report issued with the survey results praises this collaboration, but urges both parties to work to avoid duplication of services and to coordinate their activities.
The nine muses are a motley bunch. We’ve boiled them down into a generic symbol for inspiration: a toga-clad young woman, possibly plucking a string instrument. But in mythology they oversaw an odd combination of arts and sciences. They were sisters, which allegorically implies a kinship among their fields of expertise. If so, the connections are hard to see.
Six of them divvied up the classical literary, dramatic, and musical genres – working multimedia in the cases of Erato (inventor of the lyre and of love poetry) and Euterpe (who played the flute and inspired elegiac songs and poems). The other three muses handled choreography, astronomy, and history. That leaves an awful lot of creative and intellectual endeavor completely unsupervised. Then again, it’s possible that Calliope has become a sort of roaming interdisciplinary adjunct muse, since there are so few epic poets around for her to inspire these days.
An updated pantheon is certainly implied by Peter Charles Hoffer’s Clio Among the Muses: Essays on History and the Humanities (New York University Press). Clio, the demi-goddess in charge of history, is traditionally depicted with a scroll or a book. But as portrayed by Hoffer -- a professor of history at the University of Georgia – she is in regular communication with her peers in philosophy, law, the social sciences, and policy studies. I picture her juggling tablet, laptop and cellphone, in the contemporary manner.
Ten years ago Hoffer published Past Imperfect, a volume assessing professional misconduct by American historians. The book was all too timely, appearing as it did in the wake of some highly publicized cases of plagiarism and fraud. But Hoffer went beyond exposé and denunciation. He discussed the biases and sometimes shady practices of several well-respected American historians over the previous 200 years. By putting the recent cases of malfeasance into a broader context, Hoffer was not excusing them; on the contrary, he was clearly frustrated with colleagues who minimized the importance of dealing with the case of someone like Michael Bellesiles, a historian who fabricated evidence. But he also recognized that history itself, as a discipline, had a history. Even work that seemed perfectly sound might be shot through with problems only visible with the passing of time.
While by no means a sequel, Clio Among the Muses continues the earlier book’s effort to explain that revisionism is not a challenge to historical knowledge, but rather intrinsic to the whole effort to establish that knowledge in the first place. “If historians are fallible,” Hoffer writes, “there is no dogma in history itself, no hidden agenda, no sacred forms – not any that really matter – that are proof against revision… Worthwhile historical scholarship is based on a gentle gradualism, a piling up of factual knowledge, a sifting and reframing of analytical models, an ongoing collective enterprise that unites generation after generation of scholars to their readers and listeners.”
Hoffer’s strategy is to improve the public’s appreciation of history by introducing it to the elements of historiography. (That being the all-too-technical term for the history of what historians do, in all its methodological knottiness.) One way to do so would be through a comprehensive narrative, such as Harry Elmer Barnes offered in A History of Historical Writing (1937), a work of terrific erudition and no little tedium. Fortunately Hoffer took a different route.
Clio Among the Muses instead sketches the back-and-forth exchanges between history and other institutions and fields of study: religion, philosophy, law, literature, and public policy, among others. Historians explore the topics, and use the tools, created in these other domains. At the same time, historical research can exert pressure on, say, how a religious scripture is interpreted or a law is applied.
Clio’s dealings with her sisters are not always happy. One clear example is a passage Hoffer quotes from Charles Beard, addressing his colleagues at a meeting of the American Historical Association in 1933: “The philosopher, possessing little or no acquaintance with history, sometimes pretends to expound the inner secret of history, but the historian turns upon him and expounds the secret of the philosopher, as far as it may be expounded at all, by placing him in relation to the movement of ideas and interests in which he stands or floats, by giving to his scheme of thought its appropriate relativity.”
Sibling rivalry? The relationships are complicated, anyway, and Hoffer has his hands full trying to portray them. The essays are learned but fairly genial, and somehow not bogged down by the fundamental impossibility of what the author is trying to do. He covers the relationship between history and the social sciences -- all of them -- in just under two dozen pages. Like Evel Knievel jumping a canyon, he knew the odds and went ahead with it anyway, and you have to respect that.
But then, one of Hoffer’s remarks suggests that keeping one’s nerve is what his profession ultimately requires:
“Historical writing is not an exercise in logical argument so much as an exercise in creative imagination. Historians try to do the impossible: retrieve an ever-receding and thus never reachable past. Given that the task is impossible, one cannot be surprised that historians must occasionally use fallacy – hasty generalization, weak analogy, counterfactual hypotheticals, incomplete comparisons, and even jumping around in past time and space to glimpse the otherwise invisible yesteryear.”
And if they did not do so, we’d see very little of it at all.