Books

"The Costs of Publishing Monographs," a report from Ithaka (essay)

Nearly a month has passed since the release of “The Costs of Publishing Monographs: Toward a Transparent Methodology,” a document prepared by Ithaka S+R, the consulting and research division of Ithaka. (Ithaka is also associated with JSTOR, the scholarly journals repository.) The report seems not to have drawn much attention outside the ranks of the Association of American University Presses, which is odd. It ought to be of some interest to the larger constituency of those who buy, read and/or write scholarly books.

If you mention the price of academic-press books to people who’ve never purchased one, the effect is akin to a cartoon character with eyeballs popping out and exclamation marks hovering in the air, with a thought balloon reading, “What a racket!” (On one occasion I heard it said aloud.) The dismay will usually cool off some as you explain how the specialist nature of scholarly publications tends to preclude economies of scale. A small audience means short press runs, yielding high per-unit costs. That’s not the whole story, of course, but it often suffices to explain why, say, a slender new book interpreting Moby Dick might cost five times as much as a Melville biography thick enough to serve as a doorstop -- and why no one in the family has purchased Aunt Louise’s book, even if they’re proud she got tenure for it.

The authors of the new Ithaka report mention a ballpark estimate of the expense to a press of preparing a scholarly book for publication (not printing, just getting it to that point) that has been bandied about over the past couple of years: $20,000. It’s problematic, but let’s imagine, for the sake of argument, that it costs that much to prepare and to print a monograph, and that every single one of its 400 copies is sold. In that case the absolute lowest wholesale price of a single volume has to be $50, just to break even. Many trade publishers would consider a print run 10 times that size to be small -- with each copy selling at a much lower price while still making a profit. It’s not that trade presses are models of efficiency that scholarly presses ought somehow to emulate -- not at all. They resemble one another about as much as an ostrich egg and a cannonball do, and the differences cannot be tinkered away.

Ithaka’s researchers collected information on the expenses involved in bringing out 382 books from the arts, humanities and social sciences published by 20 American university presses during their 2014 fiscal year. The data assembled were granular -- drawn from the sort of in-house bookkeeping each department (editorial, production, marketing, etc.) had to do while handling each title. Some expenses are more discretely defined than others. The cost of sending a manuscript out for copyediting, for example, is not too hard to determine; just look at the invoice. Calculating the fraction of an acquisition editor’s salary that went into a given book seems more difficult -- besides which there are the overhead expenses of clerical labor, rent, tech support and so on, some of them provided by the hosting university.

The 20 presses surveyed range from small presses (averaging roughly 11 employees publishing 46 titles per year, with an annual revenue from books of under $1.5 million) to powerhouses (circa 82 employees, 253 titles and more than $6 million annual revenue). They are segmented into four size categories, with five presses each, and with some effort made to reflect geographical diversity and varying publishing foci (monographs, journals, regional titles).

In short, it must be one hell of a spreadsheet -- and the researchers establish three ways of defining cost per book to reflect the varying impacts of staff time, overhead expense and institutional support. One effect of the analysis is that the figure of $20,000 per book in preparation expenses goes right out the window: the study “yielded a wide range of costs per title, from a low of $15,140 to a high of $129,909, and the range of costs is wide both within and across groups.” Taking in the varying ways of assessing the expenses of almost 400 titles, the researchers find that the average cost per monograph is between $28,747 (using the minimal baseline) and not quite $40,000 (factoring in indirect overhead expenses). It bears repeating that this is not the final cost of publishing: printing, binding and warehousing monographs of the predigital sort would entail additional expense.

The Ithaka report focuses, rather, “on the costs of producing the first digital copy of ‘a high-quality digital monograph.’” For that to be the benchmark -- rather than the traditional hardback monograph -- is in keeping with the expectation that scholarship be made available in open-access form, as both federal mandates and the emerging academic ethos increasingly demand.

For scholarly publishing to meet the standards of quality established over the past century will require continued investment in the kinds of intensive, skilled labor that university presses foster. How to meet that demand while simultaneously developing ways of funding open-access publishing remains to be worked out. Ithaka S+R’s report doesn’t underestimate the difficulties; it just reminds us that the problem is on the agenda, or needs to be. Otherwise, the shape of things to come in scholarly publishing could get very messy -- and not in an especially creative way.

Essay on Umberto Eco

“One of the most profoundly exciting moments of my life,” Gertrude Stein recalled in a lecture at Columbia University in the mid-1930s, “was when at about 16 I suddenly concluded that I would not make all knowledge my province.” It is one of her more readily intelligible sentences, but I have never been able to imagine the sentiment it expresses. Why “profoundly exciting”? To me it sounds profoundly depressing, but then we’re all wired differently.

Umberto Eco, who died last week at the age of 84, once defined the polymath as someone “interested in everything, and nothing else.” (Now that’s more like it!) The formulation is paradoxical, or almost: the twist comes from taking “nothing else” to mean “nothing more.” It would be clearer to say that polymaths are “interested in everything, and nothing less,” but also duller. Besides, the slight flavor of contradiction is appropriate -- for Eco is describing an attitude of mind condemned to tireless curiosity and endless dissatisfaction, first of all with its own limits.

Eco’s work has been a model and an inspiration for this column for almost 30 years now, which is about 20 more than I’ve been writing it. The seed was planted by Travels in Hyperreality, the first volume of his newspaper and magazine writings to appear in English. Last year “Intellectual Affairs” celebrated the long-overdue translation of Eco’s book of sage advice on writing a thesis. An earlier essay considered the public dialogues that he and Jürgen Habermas were carrying on with figures from the Vatican. And now -- as if to make a trilogy of it -- saying farewell to Eco seems like an occasion to discuss perhaps the most characteristic quality of Eco’s mind: its rare and distinctive omnivorousness.

Eco himself evidently restricted his own comments on polymathy to that one terse definition. I must be garrulous by contrast but will try to make only two fairly brief points.

(1) As his exchange of open letters with Cardinal Carlo Maria Martini, the former archbishop of Milan, indicated, Eco was a lapsed but not entirely ex-Catholic: one who no longer believed but -- for reasons of personal background and of scholarly expertise as a medievalist -- still carried much of the church’s cultural legacy inside himself. His first book, published in 1956, was a study of St. Thomas Aquinas’s aesthetics that began as a thesis written “in the spirit of the religious worldview” of its subject. And the encyclopedic range and dialectical intricacies of the Angelic Doctor’s Summa Theologica never lost their hold on Eco’s imagination.

“Within Thomas's theological architecture,” Eco wrote in an essay in 1986, “you understand why man knows things, why his body is made in a certain way, why he has to examine facts and opinions to make a decision, and resolve contradictions without concealing them, trying to reconcile them openly …. He aligned the divergent opinions [of established philosophers and theologians], clarified the meaning of each, questioned everything, even the revealed datum, enumerated the possible objections, and essayed the final mediation.”

Eco regarded the Summa’s transformation into an authoritative statement of religious doctrine as nothing less than a disaster. In the hands of his successors, “Thomas's constructive eagerness for a new system” degenerated into “the conservative vigilance of an untouchable system.” Eco was -- like Étienne Gilson and Alasdair MacIntyre, among others -- part of the 20th-century rediscovery of Aquinas as the builder of a dynamo rather than the framer of a dogma. And there’s no question but that the medieval theologian exemplified “an interest in everything, and nothing else.”

(2) In the early 1960s, Eco was invited to participate in an interdisciplinary symposium on “demythicization and image” in Rome, along with an impressive array of philosophers, theologians, historians and classical scholars. Among them would be Jesuit and Dominican monks. He felt an understandable twinge of anxiety. “What was I going to say to them?” he recalled thinking. Remembering his enormous collection of comic books, Eco had a flash of inspiration:

“Basically [Superman] is a myth of our time, the expression not of a religion but of an ideology …. So I arrive in Rome and began my paper with a pile of Superman comics on the table in front of me. What will they do, throw me out? No sirree, half the comic books disappeared; would you believe it, with all the air of wishing to examine them, the monks with their wide sleeves spirited them away ….”

The anecdote might be used as an example of Eco’s interest in semiotics: the direction his work took after establishing himself as a medievalist. Comic books, Leonardo da Vinci paintings, treatises in Latin on demonology … all collections of signs in systems, and all potentially analyzable. Nor was his conference presentation on Superman the end of it. Not much later, Eco published an essay about the world of Charlie Brown called "On 'Krazy Kat' and 'Peanuts.'"

But in fact those two papers were written before Eco’s turn to semiotics -- or semiology, if you prefer -- in the late 1960s. (The one on Peanuts reads as being influenced by Sartre, as much as anyone else.) Eco’s attitude towards mass media and popular culture was never one either of slumming or of populist celebration. Nor was it a matter of showing off the power and sharpness of cool new theoretical tools by carving up otherwise neglected specimens. He took it as a given that cartoons, movies and the crappy books issued by Italy’s vanity-publishing racket were -- like theological speculation or political conflict -- things that merited analysis and critique or that could become so, given interesting questions about them.

At the end of his remarks on Aquinas 30 years ago, Eco tried to imagine how the author might conduct himself if suddenly returned to life. Of course there’s no way to judge the accuracy of such a thought experiment’s results, but Eco’s conclusion seems like a personal creed: “He would realize that one cannot and must not work out a definitive, concluded system, like a piece of architecture, but a sort of mobile system, a loose-leaf Summa, because in his encyclopedia of the sciences the notion of historical temporariness would have entered …. I know for sure that he would take part in celebrations of his work only to remind us that it is not a question of deciding how still to use what he thought, but to think new things.”

And, Eco might have added, how to avoid settling for less than everything your mind might drive itself to understand.

[Image: Umberto Eco (Roberto Serra - Iguana Press / Getty Images)]

Article on Antonin Scalia's most-cited law journal article

In October 1985 -- not quite a year before Antonin Scalia took his seat on the U.S. Supreme Court -- the California Law Review published a paper by Fred R. Shapiro called “The Most-Cited Law Review Articles.” Nothing by Scalia was mentioned, and no surprise. He had published a bit of legal scholarship, of course (including a paper in The Supreme Court Review in 1978) but overall his paper trail was fairly thin and unexceptional, which proved a definite advantage in getting the nominee through the Senate hearings without drama.

As for Shapiro's article, it reflected the arrival of a new quantification mind-set about assessing legal scholarship. Culling data concerning some 180 journals, Shapiro (now an associate librarian and lecturer in legal research at the Yale Law School) tabulated and ranked the 50 most influential law review articles published between 1947 and 1985. Or, at least, the 50 most often cited in other law review articles, since he did not count citations in judicial opinions or interdisciplinary journals. At the time, Shapiro described the effort as “somewhere between historiography and parlor game,” but it established him as, in the words of a later law review article, “the founding father of a new and peculiar discipline: ‘legal citology.’”

Shapiro revisited the project in 1996 with a paper that was broader in scope (it included the interdisciplinary “law and ____” journals) and also more fine grained, listing the top 100 “Most-Cited Law Review Articles of All Time” but also identifying the most-cited articles published in each year between 1982 and 1991. The second time around, he stressed the historiographic significance of his findings over any parlor-game aspect. “The great legal iconoclast and footnote-hater, Fred Rodell, missed the point,” wrote Shapiro. “Yes, footnotes are abominations destroying the readability of legal writing, but they proliferate and become discursive because they are where the action is.”

In the meantime, Scalia gave a lecture at Harvard University in early 1989 that appeared in the fall in the University of Chicago Law Review. It had a definite impact. By 1996, Shapiro included Scalia’s “The Rule of Law as a Law of Rules” in the list of the most-cited articles from 1989. It was in fourth place -- flanked, a bit incongruously, by Richard Delgado’s “Storytelling for Oppositionists and Others: A Plea for Narrative” (third) and Joan C. Williams’s “Deconstructing Gender” (fifth). Updating the study once more in 2012, Shapiro and his co-author Michelle Pearse placed Scalia’s “The Rule of Law as a Law of Rules” on its list of the most-cited law-review articles of all time, at number 36. By then, Delgado’s paper was in 68th place, while Williams was not on the list at all.

So much for the late justice’s place in the annals of legal citology. (Wouldn’t it make more sense to call this sort of thing “citistics”?) Turning to “The Rule of Law as a Law of Rules” itself, it soon becomes clear that its impact derives at least as much from the author's name as from the force of Scalia's argument. If written by someone not sitting in the highest court in the land, it would probably have joined countless other papers of its era in the usual uncited oblivion. That said, it is also easy to see why the paper has been of long-term interest, since it is a succinct, lucid and remarkably uncombative statement of basic principles by the figure responsible for some of the Supreme Court’s most vigorous and pungent dissents.

Scalia takes his bearings from a dichotomy he finds expressed in Aristotle’s Politics: “Rightly constituted laws should be the final sovereign; and personal rule, whether it be exercised by a single person or a body of persons, should be sovereign only in those matters on which law is unable, owing to the difficulty of framing general rules for all contingencies, to make an exact pronouncement.”

Scalia assumes here that the reader or listener will know that Aristotle writes this in the context of a discussion of democracy, in which laws are created by those elected to “the court, and the senate, and the assembly” by the many, in keeping with a well-made constitution (rather than issued by monarchs, priests or tyrants). Official policy and decisions must, in turn, follow the body of established and “rightly constituted law.” Anything else would amount to a usurpation of power.

Aristotle’s point would apply to anyone in office, but Scalia is concerned with the authority of judges, in particular. For their part, upholding the law means restraint in determining how it is applied: judges should keep the exercise of their own discretion as minimal as possible. Aristotle allows, and Scalia concurs, that at times it is not clear just how a law ought to be applied. In that case a judge’s decision must be made “on the basis of what we have come to call the ‘totality of the circumstances’ test,” in Scalia’s words.

Sometimes it can't be helped, but Scalia implies that curbs are necessary, lest judges feel an incentive to discover gray areas requiring them to exercise their discretion. “To reach such a stage,” he writes, “is, in a way, a regrettable concession of defeat -- an acknowledgment that we have passed the point where ‘law,’ properly speaking, has any further application.” It is “effectively to conclude that uniformity is not a particularly important objective with respect to the legal question at issue.” And when a higher court reviews a lower one’s decision, Scalia treats appealing to the totality of circumstances as even less acceptable. An appellate decision should draw out and clarify the general principles embodied in the law that apply in the particular case.

“It is perhaps easier for me than it is for some judges to develop general rules,” Scalia writes, “because I am more inclined to adhere closely to the plain meaning of a text.”

What's striking about his formulation is not that Scalia takes a position in the debate between originalism and “living Constitution”-alism, but that he spells out an important assumption. Not only is the “plain meaning” of a law clearly decipherable from the words of its text (once we’ve looked up, if necessary, any unfamiliar expressions from the era when it was written) but so are the rules for determining its principles and for applying the law. The Constitution is like a cake mix with the instructions right there on the box. And if a given concept is not used or defined there -- “privacy,” for instance, to name one that Scalia regarded as unconstitutional, or at least nonconstitutional -- then its use is ruled out.

“If a barn was not considered the curtilage of a house in 1791 or 1868,” Scalia writes, “and the Fourth Amendment did not cover it then, unlawful entry into a barn today may be a trespass, but not an unconstitutional search and seizure. It is more difficult, it seems to me, to derive such a categorical general rule from evolving notions of personal privacy.”

The distinction is clear and sharply drawn, however blunt the hermeneutic knife Scalia is wielding. But the example also displays one of the great weaknesses of this approach, spelled out by David A. Strauss in the University of Chicago Law Review some years later: “Even if one can determine what the original understanding was, there is the problem of applying it to radically new conditions: Is a barn in the rural nation of 1791 to be treated as equivalent to, say, a garden shed in 21st-century exurbia?”

Furthermore, the clearly formulated principle in a law can be rendered null and void by those who want only the narrowest construction of “original intent.” In his magnum opus, Reading Law: The Interpretation of Legal Texts (2012), co-authored with Bryan A. Garner, Scalia quoted Joseph Story’s Commentaries on the Constitution of the United States (1833) on the value of preambles in understanding the significance and intended effect of a law: “The preamble of a statute is a key to open the mind of the makers, as to the mischiefs, which are to be remedied, and the objects, which are to be accomplished by the provisions of the statute.” As fellow Reagan judicial appointee Richard A. Posner pointed out when he reviewed Reading Law, an obvious instance would be the Second Amendment: “A well regulated Militia, being necessary to the security of a free State …” The preamble spells out that the amendment is, in Posner’s words “not about personal self-defense, but about forbidding the federal government to disarm state militias.” If it matters that the Constitution never explicitly identifies a right to privacy, then the complete lack of any reference to a right to individual gun ownership seems at least as conspicuous a silence.

Posner notes that when Scalia did mention the preamble in one decision, it was dismissive. Sometimes you “adhere closely to the plain meaning of a text,” it seems, and sometimes you just wish it would go away.

The skyrocket ascent of Scalia’s paper is easy to understand: whatever you think of the ideas, they are clearly and at times forcefully expressed, and “The Rule of Law as a Law of Rules” provided a glimpse into at least part of that enigmatic entity known as “the mind of the Supreme Court.” Absent that, its interest is likely to be chiefly historical or biographical. Other cards will take its place in the parlor game of citation and influence.

Review of "Hell Is a Very Small Place: Voices From Solitary Confinement"

No Exit by Jean-Paul Sartre involves three characters who are condemned to spend their afterlives together in a shabby but not especially uncomfortable room -- “condemned” because it’s the Devil himself who brings them together. Evidently some kind of infernal algorithm has been used to select the group, designed to create optimal misery. Sartre’s dialogue is quite efficient: we soon learn the broad outlines of their time on Earth and how it was shaped by wishful thinking and self-deception.

We also see how quick they are to recognize one another’s vulnerabilities. Any given pair of characters could find a mutually satisfying way to exploit each other’s neuroses. But there’s always that third party to disrupt things, rubbing salt into old wounds while inflicting fresh ones.

In a moment of clarity, one of them finally recognizes that they are damned and utters Sartre’s best-known line: “Hell is other people.” Quoting this is easy, and misunderstanding it even easier. It sounds like a classical expression of self-aggrandizing misanthropy. But the figures on stage did not wander over from the pages of an Ayn Rand novel. They are not sovereign egos, imposed upon by the demands of lesser beings whose failings they repudiate with lofty scorn.

On the contrary, Sartre’s characters are driven by a desperate and insurmountable need to connect with other people. They crave intimacy, acceptance, reciprocity. They also seek to dominate, manipulate, yield to or seduce one another, which would be difficult enough if they weren’t trying to do more than one at the same time. The efforts fail, and the failures pile up. Things grow messy and frustrating for all parties involved. Hell is other people, but the torment is fueled by one’s own self.

That insight rings even more true in the wake of Hell Is a Very Small Place: Voices From Solitary Confinement (The New Press), an anthology edited by Jean Casella, James Ridgeway and Sarah Shourd. I doubt anyone meant the title as an allusion to No Exit. The activists, scholars and prisoners contributing to the book document a place much darker and more brutal than Sartre imagined -- but akin to it, in that the damned are condemned, not to lakes of fire, but to psychic torture so continuous that it seems eternal. (Casella and Ridgeway are co-founders of Solitary Watch, while Shourd is a journalist who spent 410 days in solitary confinement while imprisoned in Iran.)

A few basic points: solitary confinement initially had humane intentions. Quaker reformers in the early American republic were convinced that prisoners might benefit from a period of reckoning with their own souls, which would come readily in isolation from the evil influence of low company. If so, they would reform and return to society as productive members. More secular versions of this line of thought also caught on. Unfortunately it did not work in practice, since prisoners tended to emerge no better for the experience, when not driven insane. By the turn of the 20th century, the practice was being phased out, if not eliminated, as ineffective and dangerous.

Now, my sense from reading around in JSTOR is that, from about 1820 on, whenever the issue of imprisonment came up, the eyes of the world turned to the United States. Other countries had a similar rise and fall of confidence in solitary confinement over the years. But the practice took on a new life in America starting in the 1980s. The aim of reforming prisoners was no longer a factor. Solitary confinement -- warehousing prisoners alone in a cell for 23 to 24 hours a day, minimizing contact with one another and with the outside world -- permitted mass incarceration at reduced risk to prison guards.

In 2011, Juan E. Méndez, the United Nations Human Rights Council’s Special Rapporteur on Torture and Cruel, Inhuman and Degrading Treatment or Punishment, issued a report on the use of prolonged solitary imprisonment around the world, with “prolonged” meaning more than 15 days. Administrators and government officials rejected his request to inspect isolation units in American prisons. In his contribution to Hell Is a Very Small Place, Méndez writes that the best estimate for the population of those in solitary confinement in the United States at any given time is 80,000 people, “but no one knows that for sure.” The personal accounts by prisoners in the book show that confinement American-style is more than “prolonged.” It can go on for years, and in some cases, for decades.

An isolation cell is sometimes called “the Box,” and the experience of living in one for months and years on end makes it sound like being buried alive. In the chapter “How to Create Madness in Prison,” Terry Kupers describes the symptoms that appear during a long stretch. The prisoner in isolation “may feel overwhelmed by a strange sense of anxiety. The walls may seem to be moving in on him (it is stunning how many prisoners in isolated confinement independently report this experience) …. The prisoner may find himself disobeying an order or inexplicably screaming at an officer, when really all he wants is for the officer to stop and interact with him a little longer than it takes for a food tray to be slid through the slot in his cell door. Many prisoners in isolated confinement report it is extremely difficult for them to contain their mounting rage ….”

But that is far from the extreme end of the spectrum, which involves psychotic breaks, self-mutilation and suicide. In isolation, time no longer passes through the usual cycle of hours, weekdays, months. The damaged mind is left to pick at its own scabs for what might as well be an eternity.

Hell Is a Very Small Place proves fairly repetitious, though it could hardly be otherwise. Reading the book leaves one with the horrible feeling of being overpowered by routines and forces that will just keep running from the sheer force of momentum. Last year, President Obama called for an extensive review and reform of prison conditions, and last month, he issued a ban on the solitary confinement of juveniles in the federal prison system. So that’s the good news, for however long it may last. But consider the enormous obstacle to change represented by the sunk cost of millions or billions of dollars spent to erect Supermax prisons -- let alone the businesses (and lobbyists) who depend on more of them being built.

Anyone housed in solitary for a while would have to envy the characters in No Exit. They have more room (a “box” is typically somewhere between 6 by 9 feet and a luxurious 8 by 10) and they have each other, like it or not. Sartre’s hell is imaginary; it exists only to reveal something about the audience. The idea of burying people alive in concrete tombs degrades the society that has turned it into reality. The phrase “solitary confinement of juveniles in the federal prison system” alone is the sign of something utterly unforgivable.

Review of Edward H. Miller, "Nut Country: Right-Wing Dallas and the Birth of the Southern Strategy"

Trying to explain recent developments in the American presidential primaries to an international audience, a feature in this week’s issue of The Economist underscores aspects of the political landscape common to both the United States and Europe. “Median wages have stagnated even as incomes at the top have soared,” the magazine reminds its readers (as if they didn’t know and had nothing to do with it). “Cultural fears compound economic ones” under the combined demographic pressures of immigration and an aging citizenry.

And then there’s the loss of global supremacy. After decades of American ascent, “Europe has grown used to relative decline,” says The Economist. But the experience is unexpected and unwelcome to those Americans who assumed that the early post-Cold War system (with their country as the final, effectively irresistible superpower) represented the world’s new normal, if not the End of History. The challenges coming from Putin, ISIS and Chinese-style authoritarian state capitalism suggest otherwise.

Those tensions have come to a head in the primaries with the campaigns of Donald Trump and Bernie Sanders: newcomers to their respective parties whose rise seemed far-fetched not so long ago. To The Economist’s eyes, their traction is, if not predictable, then at least explicable: “Populist insurgencies are written into the source code of a polity that began as a revolt against a distant, high-handed elite.”

True enough, as far as it goes. But the analysis overlooks an important factor in how “anti-elite” sentiment has been channeled over the past quarter century: through “anti-elitist” tycoons. Trump is only the latest instance. Celebrity, bully-boy charisma and deep pockets have established him as a force in politics, despite an almost policy-free message that seems to take belligerence as an ideological stance. Before that, there was the more discreet largess of the Koch brothers (among others) in funding the Tea Party, and earlier still, Ross Perot’s 1992 presidential campaign, with its folksy infomercials and simple pie charts, which drew almost a fifth of the popular vote. In short, “revolt against a distant, high-handed elite” may be written into the source code of American politics; the billionaires have the incentives and the means to keep trying to hack it.

If anything, even Perot was a latecomer. In the opening pages of Nut Country: Right-Wing Dallas and the Birth of the Southern Strategy (University of Chicago Press), Edward H. Miller takes note of a name that’s largely faded from the public memory: H. L. Hunt, the Texas oilman. Hunt was probably the single richest individual in the world when he died in 1974. He published a mountain of what he deemed “patriotic” literature and also funded a widely syndicated radio program called Life Line. All of it furthered Hunt’s tireless crusade against liberalism, humanism, racial integration, socialized medicine, hippies, the New Deal, the United Nations and sundry other aspects of the International Communist Conspiracy, broadly defined. ("Nut country" is how John F. Kennedy described Dallas to the first lady a few hours before he was killed.)

Hunt’s output was still in circulation when I grew up in Texas a few years after his death, and reading it has meant no end of déjà vu in the meantime: the terrible ideas of today are usually just the terrible ideas of yesterday, polished with a few updated topical references. Miller, an assistant teaching professor of history at Northeastern University Global, reconstructs the context and the mood that made Dallas a hub of far-right political activism between the decline of Joseph McCarthy and the rise of Barry Goldwater -- a city with 700 members of the John Birch Society. A major newspaper, The Dallas Morning News, helped spur the sales of a book called God, the Original Segregationist by running excerpts. Cold War anti-Communism mutated into a belief that the United States and the Soviet Union were in the process of being merged under the direction of the United Nations, in the course of which all reference to God would be outlawed. John F. Kennedy was riding roughshod over American liberties, bypassing Congress and establishing a totalitarian dictatorship in which, as H. L. Hunt warned, there would be “no West Point, no Annapolis, no private firearms -- no defense!”

An almost Obama-like dictatorship, then. Needless to say, these beliefs and attitudes are still with us, even if many of the people who espoused them are not.

Miller identifies two tendencies or camps within right-wing political circles in Dallas during the late 1950s and early ’60s. “Moderate conservatism” was closer to established Republican Party principles of free enterprise, unrelenting anti-Communism and the continuing need to halt and begin rolling back the changes brought by the New Deal. Meanwhile, “ultraconservatism” combined a sense of apocalyptic urgency with fear of all-pervasive subversion and conspiracy. A reader familiar with recent laments about the state of the Republican Party -- that it was once a much broader tent, with room for even the occasional liberal -- might well assume that Miller’s moderate conservatives consisted of people who liked Ike, hated Castro and otherwise leaned a bit to the right wing of moderation, as opposed to ultraconservative extremism.

That assumption is understandable but largely wrong: Miller’s moderates were much closer to his ultras than either was to, say, the Eisenhower who sent federal troops to Little Rock, Ark. (Or as someone the author quotes puts it, the range of conservative opinion in Dallas was divided between those who wanted to impeach Supreme Court Justice Earl Warren and those who wanted to hang him.)

Where the moderates and the ultras ultimately combined to create something more durable and powerful than either could have been separately was in their opposition to the Civil Rights movement and their realignment of the segregationist wing of the Democratic Party. I’ll come back to Miller’s argument on this in a later column, once the primary season has progressed a bit. Suffice it to say that questions of realignment are looking a little less antiquarian all the time.

Essay on David Bowie

There’s a special rung of hell where the serious and the damned writhe in agony, gnashing their teeth and cursing their fate, as they watch an endless marathon of historical documentaries from basic cable networks. Their lidless eyes behold Ancient Aliens, now in its tenth season, and High Hitler, which reveals that the Führer was a dope fiend. The lineup includes at least one program about the career of each and every single condemned soul in the serial-killer department, which is a few rungs down.

In the part of the inferno where I presumably have reservations, a lot of the programming concerns the history of rock music. With each cliché, a devil pokes you, just to rub it in. The monotonous predictability of each band’s narrative arc (struggle, stardom, rehab, Hall of Fame) is just part of it, since there are also the talking-head commentaries, interspersed every few minutes, by people unable to assess any aspect of the music except through hyperbole. Each singer was the voice of the era. Every notable guitarist revolutionized the way the instrument was played -- forever. No stylistic innovation failed to change the way we think about music, influencing all that followed.

Even the devils must weary of it, after a while. It probably just makes them meaner.

Here on earth, of course, such programming can be avoided. Choose to watch Nazi UFOs -- an actual program my TiVo box insists on recording every time it runs -- and you really have no one to blame but yourself.

But David Bowie’s death earlier this month left me vulnerable to the recent rerun of a program covering most of his life and work. Viewing it felt almost obligatory: I quit keeping track of Bowie’s work in the early 1980s (a pretty common tendency among early devotees, the near-consensus opinion being that he entered a long downturn in creativity around that time) so that catching up on Bowie’s last three decades, however sketchily, seemed like a matter of paying respects. It sounds like his last few albums would be worth a chance, so no regrets for watching.

Beyond that, however, the program offered only the usual insight-free superlatives -- echoes of the hype that Bowie spent much of his career both inciting and dismantling. Bowie had a precocious and intensely self-conscious relationship to mass media and spectacle. He was, in a way, Andy Warhol’s most attentive student. That could easily have led Bowie down a dead end of cynicism and stranded him there, but instead it fed a body of creative activity -- in film and theater as well as music -- that fleshed out any number of Oscar Wilde’s more challenging paradoxes. (A few that especially apply to Bowie’s career: “To be premature is to be perfect.” “One should either be a work of art or wear a work of art.” “Man is least himself when he talks in his own person. Give him a mask, and he will tell you the truth.”) There must be a whole cohort of us who lived through early theoretical discussions of postmodernism and performativity while biting our tongues, thinking that an awful lot of it was just David Bowie, minus the genius.

“Genius” can be a hype word, of course, but the biggest problem with superlatives in Bowie’s case isn’t that they are clichéd but that they’re too blunt. Claim that Bowie invented rock stardom, as somebody on TV did, for example, and the statement is historically obtuse while also somehow underestimating just how catalytic an impact he had.

As noted here in a column some months ago, Bowie is not among the artists David Shumway wrote about in Rock Star: The Making of Musical Icons from Elvis to Springsteen (Johns Hopkins University Press, 2014). And yet one aspect of Bowie’s career often taken as quintessential, his tendency to change appearances and styles, actually proves to be one of the basic characteristics of the rock star’s cultural role, one established well before his Thin White Duke persona rose from the ashes of Ziggy Stardust. Context undercuts the hype.

Elsewhere, in an essay for the edited collection Goth: Undead Subculture (Duke, 2007), Shumway acknowledges that Bowie did practice a kind of theatricalization that created a distinctive relationship between star and fan: the “explicit use of character, costume and makeup … moved the center of gravity from the person to the performance” in a way that seemed to abandon the rock mystique of authenticity and self-expression in favor of “disguising the self” while also reimagining it.

“His performances taught us about the constructedness of the rock star and the crafting of the rock performance,” Shumway writes. “His use of the mask revealed what Dylan’s insistence on his own authenticity and Elvis’s swagger hid.”

At the same time, Bowie’s decentered/concealed self became something audiences could and did take as a model. But rather than this being some radical innovation that transformed the way we think about rock forever (etc.), Shumway suggests that Bowie and his audience were revisiting one of the primary scenes of instruction for 20th-century culture as a whole: the cinema.

Bowie “did not appear to claim authenticity for his characters,” Shumway writes. “But screen actors do not claim authenticity for the fictional roles they play either. Because he inhabits characters, Bowie is more like a movie star than are most popular music celebrities. In both cases the issue of the star’s authenticity is not erased by the role playing, but made more complex and perhaps more intense.”

That aptly describes Bowie’s effect. He made life “more complex and perhaps more intense” -- with the sound of shattering clichés mixed into the audio track at unexpected moments. And a personal note of thanks to Shumway for breaking some, too.

[Image: Memorial to David Bowie (Getty Images)]

Book review of Hugh Pennington's "Have Bacteria Won?" (essay)

Last month came the unwelcome if not downright chilling news that the antibiotic of last resort -- the most powerful infection fighter in the medical arsenal -- is now ineffective against some new bacterial strains. If, like me, you heard that much and decided your nerves were not up to learning a lot more, then this might be a good time to click over to see what else looks interesting in the Views section menu. There’s something to be said for deliberate obliviousness on matters that you can’t control anyway.

Hugh Pennington’s Have Bacteria Won? (Polity) is aimed straight at the heart of a public anxiety that has grown over the past couple of decades. The author, an emeritus professor of bacteriology at the University of Aberdeen, is clearly a busy public figure in the United Kingdom, where he writes and comments frequently on medical news for the media. A number of recent articles in British newspapers call him a “leading food-poisoning expert,” but that is just one of Pennington’s areas of expertise. Besides contributing to the professional literature, he has served on commissions investigating disease outbreaks and writes “medico-legal expert witness reports” (he says in the new book) on a regular basis.

The fear resonating in Pennington’s title dates back to the mid-1990s. Coverage of the Ebola outbreak in Zaire in 1995 seemed to compete for attention with reports of necrotizing fasciitis (better known as “that flesh-eating disease”), which inspired such thought-provoking headlines as “Killer Bug Ate My Face.”

Pennington refers to earlier cases of food contamination that generated much press coverage -- and fair enough. But it was the ghastly pair of hypervirulent infections in the news 20 years ago that really raised the stakes of something else that medical researchers were warning us about: the widespread overuse of antibiotics. It was killing off all but the most resilient disease germs. An inadvertent process of man-made natural selection was underway, and the long-term consequences were potentially catastrophic.

But now for the good news, or the nonapocalyptic news, anyway: Pennington makes a calm assessment of the balance of forces between humanity and bacteria and, without being too Pollyannaish about it, suggests that unpanicked sobriety would be a good attitude for the public to cultivate, as well.

The history of medical advances in the industrialized world has, he argues, had unexpected and easily overlooked side effects. Now we live, on average, longer than our ancestors. But we also die for different reasons, with mortality from infection no longer being high on the list. The website of the Centers for Disease Control and Prevention makes the point sharply with a couple of charts: apart from a spike during the influenza pandemic following the First World War, death from infectious disease fell in the United States throughout most of the 20th century. Pennington’s point is that we find this trend throughout the modernized world, wherever life expectancy increased. Medical advances, including the development of antibiotics, played a role, but not in isolation. Improved sanitation and increased agricultural output were also part of it.

“There is a pattern common to rich countries,” Pennington notes. “The clinical effects of an infection become much less severe long before specific control measures or successful treatments become available. Their introduction then speeds up the decline, but from a low base. An adequate diet brings this about.”

So death from infectious disease went from being a terrible fate to something practically anomalous within two or three generations. (To repeat, we’re talking about the developed world here: both prosperity and progress impose blinders.) And when serious infectious disease becomes rare, it also becomes news. “From time to time,” Pennington says, “the media behave like a cheap refracting telescope, focusing on an object of interest but magnifying it with a good deal of aberration and fuzziness at the edges because of the poor quality of their lenses.”

Lest anyone think that the competitive shamelessness of the British tabloid press has excessively distorted Pennington’s outlook, keep in mind that CNN once had a banner headline reading, “Ebola: ‘The ISIS of Biological Agents?’” Nor does he demonize the mass media, as such. “Sometimes the journalistic telescope finds hidden things that should be public,” he writes -- giving as an example how a local newspaper identified and publicized an outbreak of infectious colitis at an understaffed and poorly run hospital in Scotland.

Have Bacteria Won? is packed with case histories of outbreaks from the past 60 or 70 years. Each is awful enough in its own right to keep the reader from feeling much comfort at their relative infrequency, and Pennington’s message certainly isn’t that disease can be eradicated. Powerful and usually quite effective techniques exist to prevent or minimize bacterial contamination of food and water, and we now have systematic ways to recognize and treat a wider range of infections than would have been imaginable not that long ago. But systems fail (he mentions several cases of defective pasteurization equipment causing large-scale outbreaks) and bacteria mutate without warning. “Each microbe has its own rules,” Pennington writes. “Evolution has seen to that.”

We enjoy some advantage, given our great big brains, especially now that we have the tools of DNA sequencing and ever-increasing computational power. "This means," Pennington writes, "that tracking microbes, understanding their evolution and finding their weaknesses gets easier, faster and cheaper every day." Given reports that the MCR-1 gene found in antibiotic-impervious bacteria can move easily between micro-organisms, any encouraging word is welcome right about now.

But Pennington's analysis also implies that the world's incredible and even obscene disparities in wealth are another vulnerability. "An adequate diet" for those who don't have it seems like something all that computational power might also be directed toward. Consider it a form of preventative medicine.

Scholarly database JSTOR sees growth in ebooks program

The scholarly database JSTOR, recognizing its role as a starting point for research, sees major growth in its ebook program.

Essay on Wikipedia's fifteenth anniversary

Wikipedia came into the world 15 years ago today -- and, man, what an ugly baby. The first snapshot of it in the Internet Archive is from late March of 2001, when Wikipedia was already 10 weeks old. At that point, it claimed to have more than 3,000 pages, with an expressed hope of reaching 10,000 by the end of summer and 100,000 at some point in the not unimaginably distant future. The first entries were aspirational, at best. The one about Plato, for example, reads in its entirety: “Famous ancient Greek philosopher. Wrote that thing about the cave.”

By November -- with Wikipedia at 10 months old -- the entry on Plato was longer, if not more enlightening: you would have learned more from a good children’s encyclopedia. Over the next several months, the entry grew to a length of about 1,000 words, sometimes in classically padded freshman prose. (“Today, Plato's reputation is as easily on a par with Aristotle's. Many college students have read Plato but not Aristotle, in large part because the former's greater accessibility.”) But encouraging signs soon began to appear. A link directed the reader to supplementary pages on Platonic realism, for example. As of early 2006, when Wikipedia turned five years old, the main entry on Plato had doubled in length, with links to online editions of his writings. In addition, separate pages existed for each of the works -- often consisting of just a few sentences, but sometimes with a rough outline of the topics to be covered in a more ambitious entry somewhere down the line.

The aspirations started to look more serious. There were still times when Wikipedia seemed designed to give a copy editor nightmares -- as in 2003, when someone annotated the list of dialogues to indicate: “(1) if scholars don't generally agree Plato is the author, and (2) if scholars don't generally disagree that Plato is not the author of the work.”

Yet it is also indicative of where the site was heading that before long some volunteer stepped in to unclog that passage's syntactical plumbing. The site had plenty of room for improvement -- no denying it. On the other hand, improvements were actually happening, however unsystematically.

The site hit its initial target of 100,000 pages in early 2003 -- at which point it began to blow up like a financial bubble. There were not quite one million pages by the fifth anniversary of its founding and 3.5 million by the tenth. Growth has slowed of late, with an average of about 300,000 pages being added annually over the past five years.

I draw these figures from Dariusz Jemielniak’s Common Knowledge? An Ethnography of Wikipedia (Stanford University Press, 2014), which also points out how rapidly the pace of editorial changes to articles began to spike. Ten million edits were made during Wikipedia’s first four years. The next 10 million took four months. From 2007 on, the frequency of edits stabilized at a rate of 10 million edits per seven or eight weeks.

We could continue in this quantifying vein for a while. As with the Plato entry finding its center of gravity after a long period of wobbly steps, the metrics for Wikipedia tell a story of growth and progress. So does the format’s worldwide viability: Wikipedia is now active in 280 languages, of which 69 have at least 100,000 entries. It all still seems improbable and inexplicable to someone who recalls how little credibility the very concept once had. (“You can edit this page right now! … Write a little (or a lot) about what you know!”) If someone told you in 2002 that, in 10 years, the Encyclopædia Britannica would suspend publication of its print edition -- while one of the world’s oldest university presses would be publishing material plagiarized from Wikipedia, rather than by it -- the claim would have sounded like boosterism gone mad.

That, or the end of civilization. (Possibly both.) What’s in fact happened -- celebrate or mourn it as you will -- has been a steady normalization of Wikipedia as it has metamorphosed from gangly cultural interloper into the de facto reference work of first resort.

In large measure, the transformation came about as part of what Siva Vaidhyanathan has dubbed “the Googlization of everything.” Wikipedia entries normally appear at or near the top of the first page of the search engine’s results. After a while, the priority that the Google algorithm gives to Wikipedia has come to seem natural and practically irresistible. At this point, having a look at Wikipedia is usually quicker and easier than deciding not to (as someone once said about reading the comic strip “Nancy”).

Another sign of normalization has been the development of bibliographical norms for citing Wikipedia in scholarship. It signals that the online reference work has become a factor in knowledge production -- not necessarily as a warehouse of authoritative information but as a primary source, as raw material, subject to whatever questions and methods a discipline may bring to bear on it.

In the case of that Plato entry, the archive of changes over time would probably be of minimal interest as anything but a record of the efforts of successively better informed and more careful people. But Wikipedia’s role as a transmitter of information and an arena for contesting truth claims make its records a valuable source for people studying more recent matters. Someone researching the impact of the Sandy Hook Elementary School shootings, for example, would find in the Wikipedia archive a condensed documentation of how information and arguments about the event appeared in real time, both in its immediate aftermath and for years afterward.

I've been reading and writing about Wikipedia for this column for most of its lifespan, and it won't be five years before there's occasion to do so again. There's plenty more to say. But for now, it seems like Professor Wikipedia should get the last word.

Is this the best acknowledgment section of a scholarly book?

Blog post about a scholar's anti-thank-you has lots of people talking.
