For thousands of years, the treatment of illness involved some combination of superstition, trial-and-error, inadvertent torture, and just plain dumb luck. A treatment counted as effective if the patient survived.
Whatever doctors were doing before the middle of the 19th century, it’s hard to consider it medicine – certainly not by the standards of the 150 years or so since the advent of local anesthesia. Physicians have understood the principles of antisepsis for about as long, while aspirin was synthesized only in 1897. We are spoiled. Not for us the clinical technique of Theodoric of York – the medieval practitioner Steve Martin used to play in a "Saturday Night Live" sketch – whose treatment for every condition included the draining of excess blood.
Comforting one patient’s mother, Theodoric reminds her of how far the healing arts had advanced as of 1303: “Why, just 50 years ago, they thought a disease like your daughter's was caused by demonic possession or witchcraft,” he tells her. “But nowadays we know that Isabelle is suffering from an imbalance of bodily humors, perhaps caused by a toad or a small dwarf living in her stomach.”
Insofar as Theodoric and his colleagues brought any book-learning to their practice, it came from the pen of the Greco-Roman philosopher and physician Galen, born circa 130 A.D. in what is now Turkey. Susan P. Mattern, a professor of history at the University of Georgia, does not exaggerate in calling her biography The Prince of Medicine: Galen in the Roman Empire (Oxford University Press). His posthumous career was, if anything, one of almost regal authority.
An earned authority, just to be clear about it. Galen had an ego the size of the Empire itself. He never tired of pointing out the abject ignorance of other physicians, and was prone to quoting tributes by illustrious people to his own erudition and skill. It is not charming. But Mattern shows Galen to have been tireless as both a practitioner and a scholar -- and his output of treatises, case histories, popular textbooks, and editions of Hippocrates and other medical authors was astounding. “The most modern edition of his corpus runs to 22 volumes, including about 150 titles,” writes Mattern, “and is one-eighth of all the classical Greek literature that survives.”
Tireless in his efforts to accumulate, compare, and synthesize the recorded medical knowledge of previous centuries, Galen also conducted a great deal of anatomical research (including animal vivisection) to test the theories of the day. In his 20s, he did the second-century equivalent of a residency as the physician-on-call to the gladiators of his hometown. A standard treatment for open wounds was to bathe them in hot water, followed by a plaster made of boiled flour, which Galen reports as totally ineffective – crippling, when not lethal. By contrast, he “soaked linen cloths in wine and placed the folds on the wounds,” the biographer says, “covering the cloths in soft sponges which he moistened day and night.” Whether or not the technique saved the lives of all the gladiators in his care during his first year (so Galen claimed), he clearly understood the antiseptic property of alcohol.
Opening a practice in Rome, he distinguished himself in the constant, cutthroat battle of reputation among physicians, both for his skill in diagnosis and treatment and for his erudition. He seems to have been acutely sensitive to changes in a patient’s pulse and body temperature. “Long before laboratory testing,” Mattern writes, “he examined urine and feces, sweat, sputum,” and “vomit, pus, and blood for color, texture, viscosity, and sediment.” Galen’s case histories show he “scrutinized his patients’ faces for signs such as change in skin color or the shrunken eyes of wasting or extreme dehydration.” And he knew how to interview patients about their history and symptoms with more finesse than you can expect from one HMO that I could name.
Mattern stresses that Galen was also an exemplary product of Hellenistic culture – urbane, deeply familiar with Greek philosophy and literature as well as the medical authors, and capable of writing in either a matter-of-fact or a high-flown style as the circumstances required.
She notes that we have no evidence that Galen bothered to learn the language of the Empire. He wrote in Greek and did not cite Latin authors, and his reputation took root among the aristocracy, for which familiarity with Greek was the sine qua non of intellectual sophistication. Medical science in particular was a fashionable topic, and a number of Galen’s works were composed as memoranda or instructional guides for the amateurs in his entourage.
The audience for Galen’s work was not limited to the scroll-buying public. He had also learned the arts of oratory and debate practiced by the sophists -- and his encounters with other physicians were brutal rhetorical showdowns, as were his lectures on anatomy, during which slaves held down monkeys and livestock as Galen and his opponents cut them open to demonstrate their points.
Much of it was done in the street. At one point while reading the book, I got an image of the Prince of Medicine crossing paths with the adherent of some medical philosophy he opposed -- the Empiricists, say, or the Dogmatists -- and performing dissection battles while surrounded by their respective crews (students, patients, slaves, aristocratic fanboys, etc.) as well as random passers-by. Like break-dancing, in togas, with gallons and gallons of blood.
It did not hurt Galen’s reputation that he had inherited considerable wealth and could refuse payment for his services. He disdained the very idea of practicing the divine art of Asclepius (the god of medicine who had visited Galen’s father in a dream to provide vocational guidance on the boy’s future) for money. This, too, had its rhetorical side: it meant he could cast aspersions on doctors with more worldly motivations than pursuit of the pure good of healing.
In his writings, Galen expressed reluctance at being summoned by the emperor, Marcus Aurelius, to serve as court physician. He accepted, of course (it was an offer he couldn’t refuse) and was also retained by the succeeding emperors. Mattern suggests that there may have been more to his professed misgivings than thinly disguised braggadocio. Whatever the boost in prominence, the position also meant less autonomy.
Self-aggrandizing as Galen’s devotion to medicine may have been at times, the biographer calls him “a surprisingly pious man,” fascinated by the “cumulative and overpowering evidence of an intelligence at work” in the animals he dissected in hopes of understanding human anatomy. This made him one of the more appealing pagan thinkers for adherents of the three major monotheistic religions, whose scholars translated his works into Hebrew, Latin, and Arabic. A number of his works have survived only because Islamic scholars translated them, although original Greek texts may yet resurface. As recently as 2007, Galen’s treatise “On the Avoidance of Pain” – long presumed lost in any language – turned up in a Greek monastery.
His biographer points out how unfortunate it is that Galen never challenged the medical value of bloodletting – a practice that continued for so long that historians have wondered whether it might have had therapeutic value in treating something, though it could prove fatal when done with a shaky hand.
“While it is perhaps wrong to blame him for failing to break from a tradition that his followers, including the great physicians of the Islamic Middle East and of the European Renaissance and Enlightenment mostly did not question,” writes Mattern, “one wishes he had turned his scorn on this therapy” instead of on the “three-day fasting cure” promoted by some of his peers. “For while Galen did not invent bloodletting, he had the power to consign it to oblivion.”
Maybe he did change his mind, but history lost the memo?
At a time when many question the relevance of history, it is noteworthy that the U.S. Supreme Court case that prohibited the federal government from undercutting a state’s decision to extend "the recognition, dignity and protection" of marriage to same-sex couples hinged on arguments advanced by professional historians.
Rarely have historians played as important a role in shaping the outcome of a public controversy as in the same-sex marriage cases. Legal, family, women's, and lesbian and gay historians provided key evidence on which U.S. v. Windsor ultimately turned: that the Defense of Marriage Act (DOMA) represented an unprecedented and improper federal intrusion into a domain historically belonging to the states. As Justice Kennedy affirmed, "the federal government, through our history, has deferred to state law policy decisions with respect to domestic relations."
But historical scholarship did more than substantiate a single pivotal argument. It framed the majority’s broader understanding of marriage as an evolving institution and helped convince five justices that opposition to same-sex marriage is best understood as part of a long history of efforts to deprive disfavored groups of equal rights and benefits. In the end, the majority opinion hinged on "the community’s ... evolving understanding" of marriage and of equality and the majority’s recognition that DOMA imposed "a disadvantage, a separate status, and so a stigma upon all who enter into same-sex marriages made lawful by the unquestioned authority of the states."
Briefs filed with the Supreme Court by the American Historical Association and the Organization of American Historians demonstrated that far from being a static institution, marriage has profoundly changed its definition, roles, and functions, and that today's dominant marital ideal, emphasizing emotional intimacy, has nothing to do with gender. Currently, marriage's foremost public function is to distribute benefits, such as those involving health insurance, Social Security, and inheritance, making it all the more valuable for same-sex couples.
Furthermore, these briefs showed that, contrary to the widely held assumption, marriage has never been defined solely by its procreative function. Marriage was justified on multiple grounds. Especially important were the notions that marriage contributed to social stability and provided care for family members. No American state ever forbade marriage to those too old to bear children.
Without reducing the legal history of marriage to a Whiggish, Progressive, or linear narrative, the historians showed that two broad themes characterize the shifting law of marriage in the United States. The first is the decline of coverture, the notion that a married woman's identity is subsumed in her husband's. A second theme is the overturning of earlier restrictions about who can marry whom.
Slowly and unevenly, American society has abolished restrictions on marriage based on people's identity. As recently as the 1920s, 38 states barred marriages between whites and blacks, Chinese, Filipinos, Japanese, Indians, "Malays," and "Mongolians." It was not until 1967 in Loving v. Virginia, the Supreme Court decision that threw out a Virginia ban on black-white marriages, that racial and ethnic restrictions were outlawed.
At the same time, there has been an ongoing legal struggle to recognize women as full rights-bearers within marriage. Instead of seeing their identity subsumed in their husband's -- the notion that spouses cannot testify against one another was originally rooted in this principle -- women gradually attained the right to sue, control their own wages, and manage their separate property.
Perhaps the most powerful recent symbols of this shift are prosecutions for marital rape and elimination of the presumption that a husband is head of the household for legal purposes. Opposition to the liberalization of marriage, the historians demonstrated, has rested on historical misconceptions and upon animus, rooted in ethnocentrism and religious sectarianism.
Marriage today bears scant resemblance to marriage even half a century ago, when the male breadwinner family prevailed and dual-earner and single-parent households were far rarer than today. The contemporary notion of marriage as an equal, gender-neutral partnership differs markedly not only from the patriarchal and hierarchical ideals of the colonial era, but from the notion of complementary spousal roles that predominated during the age of companionate marriage, from the 1920s into the mid-1960s.
Change, not continuity, has been the hallmark of the history of marriage. Even before the 20th century, marriage underwent certain profound transformations. Landmarks in this history included:
Enactment of the first Married Women's Property laws in the 1830s and 1840s, which established women's right to control property and earnings separate and apart from their husbands.
Passage of the first adoption laws in the mid-19th century, allowing those unable to bear children to rear a child born to other parents as their own.
Increased access to divorce, beginning with judicial divorce supplanting legislative divorce.
The criminalization of spousal abuse starting in the 1870s.
Marriage's persistence reflects its adaptability. DOMA represented an unprecedented federal attempt to fix the definition of marriage and impose this definition upon the states and their inhabitants. Specifically, DOMA represented a federal effort to prohibit lesbian and gay Americans from securing the same civil rights and benefits available to other citizens. DOMA stigmatized a specific group of Americans and represented federal discrimination based on a particular religious point of view. In Justice Kennedy’s ringing words: "The federal statute is invalid, for no legitimate purpose overcomes the purpose and effect to disparage and to injure those whom the state, by its marriage laws, sought to protect in personhood and dignity."
History, in the same-sex marriage controversy, was not simply "preface" -- an interesting but ultimately insignificant detail in cases involving equal treatment under law. History lay bare a series of dangerously misleading assumptions -- above all, the notion that same-sex marriage deviates from a timeless, unchanging marital norm.
Steven Mintz, professor of history at the University of Texas at Austin and the author of Domestic Revolutions: A Social History of American Family Life and Huck’s Raft: A History of American Childhood, signed the American Historical Association brief.
At the recent dedication of the $500 million George W. Bush Presidential Center at Southern Methodist University, President Clinton called it "the latest, grandest example of the eternal struggle of former presidents to rewrite history." In 2004, the Clinton Center and Foundation stunned with its more than $200 million price tag, and less than a decade later Bush has doubled that when the endowment for the Bush Institute is counted. When the Barack Obama center opens around 2020, perhaps on the campus of the University of Chicago, could it be the first billion-dollar presidential center? Possibly. A total of $1.4 billion was raised for Obama’s two successful presidential campaigns, and so for a center dedicated to his final campaign for a better place in history, it seems likely that he’ll surpass previous records.
Although the final decision on the location of the Obama center is probably a couple of years away, professors and administrators at the University of Chicago (where he once taught) and the University of Hawaii (where his mother studied and his sister taught) are thinking about what it might mean if it lands on their campus. Chicago State University also wants to be considered. For universities, presidential centers present both opportunities and significant costs and challenges. Academics should consider carefully before getting into a bidding war over a presidential library, and weigh how much these centers promote spin in addition to scholarship.
Prime campus real estate is sometimes sacrificed for these presidential temples, which, although they house valuable historical records impartially managed by the National Archives, also have museums that high school students who have passed the Advanced Placement U.S. History test would likely find biased, as well as foundations or institutes that have agendas that the host university does not control.
Clinton was right in saying that these centers are attempts by former presidents to write their own history and polish their reputations. And to a significant degree they work. President Carter’s reputation was tarnished when he left office in 1981, but as The New York Times put it in a nearly prescient headline in 1986: "Reshaped Carter Image Tied to Library Opening" — and today, Carter is one of the more respected former presidents.
But Clinton exaggerated when he said that the struggle by former presidents to remake their images stretches back to the beginning of American history. Until the 20th century, former presidents rarely even wrote memoirs, and the first president to have a presidential library run by the federal government was Franklin D. Roosevelt. The Roosevelt Library, which opened on his estate at Hyde Park, New York, in 1941, was modest compared with succeeding presidential libraries. Its initial cost was about $7 million in today’s dollars, but critics still accused FDR of building a "Yankee pyramid." There was more than a grain of truth in the charge. When FDR first saw Egypt’s pyramids, he said, "man’s desire to be remembered is colossal." Although what Roosevelt said may not be true for everyone, it certainly was true for FDR and his successors.
Most succeeding presidential libraries dwarf FDR’s: The Harry S. Truman Library in Independence, Missouri, evokes Queen Hatshepsut’s Temple in Egypt and was the first to feature a full-scale Oval Office replica (something copied by most of the others), while the Dwight D. Eisenhower Library in Abilene, Kansas, is a complex of buildings with a park that takes up an entire city block.
The first president to affiliate his library with a university was President Kennedy. JFK envisioned his library on the grounds of his alma mater, Harvard University. After Kennedy’s death some at Harvard decided they didn’t like the idea of common tourists on their campus (99 percent of the visitors to presidential libraries are tourists, and only 1 percent are researchers), and architecture critic Ada Louise Huxtable lampooned their fear of "Goths overwhelming the intelligentsia." Harvard did establish the Kennedy School of Government, but the Kennedy Library itself was located on a campus of the University of Massachusetts, on a spectacular site overlooking Boston harbor.
The Kennedy Library was also the first to have a "starchitect," when Jackie Kennedy chose I.M. Pei — who later designed the East Building of the National Gallery of Art, as well as the expansion of the Louvre — to design her husband’s memorial. Originally, the Kennedy Library was going to be a large pyramid with the top cut off — representing JFK’s tragically truncated achievement — but eventually that plan was scrapped, and Pei reimagined that design as the glass pyramid at the Louvre. Pei’s final design for The Kennedy Library and Museum was a futuristic glass, steel, and concrete edifice that still looks like it could be used in a Star Trek movie.
President Lyndon Johnson, with Lady Bird Johnson’s help, also hired a star architect for his monument to himself. Gordon Bunshaft of the famous Skidmore, Owings, and Merrill firm had designed such modernist icons as Yale University’s beautiful Beinecke Library with its translucent marble walls. Bunshaft’s design for the Johnson Library on the campus of the University of Texas at Austin has, as Ada Louise Huxtable wrote, "a Pharaonic air of permanence" that "puts Mr. Johnson in the same class as some Popes and Kings who were equally receptive clients for architects with equally large ideas." The Johnson Library looks like a cross between an Egyptian pylon temple and a space-age bureaucracy.
We could talk about award-winning architect James Polshek’s design for the Clinton Center, or the renowned Robert A. M. Stern’s imposing design for the Bush Center at SMU, but you get the idea. All presidents since FDR have an edifice complex. Becoming a patron of a huge architectural project dedicated to yourself is one of the perks of being an Imperial Ex-President. Another perk is becoming a museum curator. Initially, the exhibits in presidential libraries are campaign commercials in museum form, designed with a lot of help from the former president. Eventually these exhibits become more balanced and complete, but it’s usually 30-50 years after a president leaves office before the National Archives installs decent exhibits. The former president and many of his supporters need to die before their power to spin subsides.
Supporters of presidential libraries hail their archives, with their raw materials of history open to scholars, journalists, and even school kids. But these records would be available anyway, because by law they are owned by the American people and must be impartially administered and released by the National Archives. If a president didn’t have a presidential library, the records would be housed in an equally accessible facility (probably in Washington); it just wouldn’t be so architecturally grandiose.
It was Jimmy Carter who first morphed the presidential library into a presidential center. The Carter Center, which is next to but administratively separate from the Carter Library and Museum in Atlanta, has been so effective at living up to its mantra of "Waging Peace. Fighting Disease. Building Hope" that President Carter won the Nobel Peace Prize in 2002. But Carter has also generated considerable controversy over the years because of his views on Israel. If the Carter Center had been located on the campus of nearby Emory University (with which it is loosely affiliated) that institution’s reputation might have been affected, but since the Carter Center is geographically separate from Emory the university was largely shielded.
There is not as much shielding for SMU from former President Bush and his views on such issues as enhanced interrogation techniques. The Bush Institute was inspired in part by the Hoover Institution on the campus of Stanford University, which is considered one of the nation’s leading conservative think tanks. The Hoover Institution has long offered a platform for high-profile Republicans such as George Shultz, Condoleezza Rice, and Donald Rumsfeld.
The Hoover Institution is to a large degree administratively separate from Stanford, and so although it effectively leverages the prestige of its host university to expand its influence, Stanford does not have a corresponding control over it. It’s possible that President Obama will seek a similar arrangement with a host university for a future Obama Center, or whatever he might choose to call it.
And the bottom line here is the bottom line: Although the price tag for the actual building of the Bush Library, Museum, and Institute was a cool quarter of a billion dollars, an equal amount was raised to endow the Bush Institute. And Bush and his supporters will continue their aggressive fund-raising for the foreseeable future, which could push the Bush Center's ultimate cost, and its influence, into the billion-dollar range sometime in the next decade or two.
When President Johnson helped found the LBJ School of Public Affairs at the University of Texas at Austin, he gleefully anticipated breaking what he called "this goddamned Harvard" hold on top government positions. But like the Kennedy School of Government at Harvard, the Johnson School is run by its university, not by a self-perpetuating board largely independent of the university that seeks, in part, to enhance the reputation of the president whose name is on the building. In other words, as presidential centers have evolved and grown they have become a better and better deal for former presidents, but it’s less certain that they are a good deal for the universities that might host them.
What would make a presidential center a better deal for a university and the public? For the sake of the 99 percent of visitors who will come to the future Obama museum as tourists, the host university should encourage some of its history professors to help create exhibits with rigorous content. That content should be of a quality that would actually help future high school students pass the relevant portion of a future AP U.S. history test, rather than serving up a museum of spin.
For a future Obama foundation or institute, it would be worthwhile for the university to have a significant number of faculty members from a variety of departments on the governing board. The university should have more than token input into a foundation that will be a big player on campus for many decades, perhaps even centuries. For, as some have noted, these presidential centers have become the American equivalent of the temples and tombs of the pharaohs. If professors, students, and the general public are to be more than bystanders or even would-be political worshippers, the host university needs to negotiate for the best interests of not just the university but the American public. Universities should not simply acquiesce to the desire that Clinton spoke of (only half-jokingly) that presidents have to rewrite their own history in self-glorifying memorials.
And President Obama himself would need to be involved in the process of reforming the presidential center. He has to a degree already taken on this role, for on his first full day in office in 2009 he revoked President Bush’s infamous Executive Order 13233, which restricted access to presidential records for political reasons. Obama and the university he partners with should continue this work so that presidential centers cease to remind us of the lines of the poem by Percy Shelley: "My name is Ozymandias, King of Kings; Look on my Works, ye Mighty, and despair!"
“Before the Freedom of Information Act,” Henry Kissinger told a gathering of diplomats in Turkey in March 1975, “I used to say at meetings, ‘The illegal we do immediately; the unconstitutional takes a little longer.’ But since the Freedom of Information Act, I'm afraid to say things like that.”
Not that afraid, obviously. The Machiavellian quip got a laugh at the time, according to the official transcript -- and clearly it merits a spot in any future collection of familiar quotations, alongside Kissinger’s remark about power being the ultimate aphrodisiac. For now, it serves as the epigraph to a press release from WikiLeaks announcing the opening of the Public Library of U.S. Diplomacy, with its first collection consisting of more than 1.7 million diplomatic cables from 1973 to ’76.
All of the material was routinely (if belatedly) declassified after 25 years, per U.S. law, and has been available from the National Archives and Records Administration. WikiLeaks made the collection searchable and is “housing” it on servers presumably beyond the reach of Big Brother. Now they can’t be reclassified.
As announcements from WikiLeaks go, it’s all fairly underwhelming. But it does make an important revelation -- however unintentional -- by reminding the public that three years have passed since the group last made a world-shaking release of information. The leaks, it seems, have been plugged. Secret documents are staying secret. Even the most ardent admirer of Bradley Manning will be understandably reluctant to share his fate. While it is too soon to pronounce WikiLeaks dead, it does appear to be in a coma.
Castronovo, a professor of English and American studies at the University of Wisconsin at Madison, links the “Cablegate” of 2010 to a Revolutionary War-era incident through the concept of “a new kind of network actor” distinct from “the traditional person of liberal democracy.” The case in question was the Thomas Hutchinson affair of 1773, when letters by the governor of the Massachusetts Bay Colony somehow found their way into the hands of the Sons of Liberty, who then circulated them via newspaper and pamphlet.
Hutchinson had borne the brunt of serving His Majesty during the Stamp Act riots a few years earlier, and was in office during the Boston Massacre. In his correspondence he referred to the need for “abridgement of what are called English liberties" among the unruly colonial subjects, which was just so much gasoline on the fire.
The source of the leak was one Benjamin Franklin, colonial postmaster. Franklin later insisted that this ethical lapse was committed in an attempt (alas! unsuccessful) to reduce American hostility towards Parliament and the Crown by documenting that the real source of trouble was someone much lower in the chain of command. Castronovo treats this claim with greater suspicion than have some historians -- and not just because Franklin was such a master of irony, pseudonymous commentary, and the fake-out.
Franklin was also a node in multiple correspondence networks, and understood perfectly well how porous they could be. Alongside the official channels of communication between Court and colony, there were informal but durable long-distance connections among merchants, officials, publishers, and so on. A letter by someone within such a network tended to have, so to speak, an implicit “cc” or “bcc” field.
“More significant than the sending and receipt of private letters between individuals,” writes Castronovo, the activity of these epistolary networks “encompassed a range of public activities, including the recitation of letters aloud, the printing of handwritten letters in newspapers, the transmission of pamphlets, and the sending of circular letters by local governments....” Such communications might be “opened by third parties and forwarded without permission, shared in social circles and reprinted in newspapers.”
By transmitting Hutchinson’s letters to figures within his own circles who were in contact with the more hot-headed American revolutionary circles, Franklin was creating a political weapon against the authorities. He was, in effect, both a whistleblower and Julian Assange at the same time.
Having put it that way, however, I must immediately backtrack to say that the analogy is not Castronovo’s point at all. “At issue,” he writes, “is how communication spreads and metastasizes, how ideas proliferate and take root, how views and opinions propagate themselves.”
The network in each case – epistolary or digital – is not just a medium or tool that individuals use to communicate or act. In it, rather, “individual agency becomes unmoored from stable locations and is set adrift along an interconnected web of tendril-like links and nodes.” This is a perspective derived from the work of Bruno Latour, among others. It rejects the familiar way of thinking of society as consisting of distinct individuals who interact and so create networks. Instead -- to put things one way – it’s networks all the way down. Society emerges from a teeming array of networks that overlap and intersect, that get knotted together or fray with use.
Franklin’s catalytic intervention in the American crisis of 1773 was as effective as it was by virtue of his ability to channel communication from one network to another. And it was effective because it was done quietly; he advanced the revolutionary process involving “a public interlinked and excited by expressions of dissent” without making himself known. “In a perhaps uncharacteristic move,” Castronovo says, “Franklin refuses to occupy the center [of public discussion], instead preferring to sit back in the shadows where, after all, the shadowy work of espionage gets done.”
But the state – however much it may use networks of its own – insists on ascribing public action to individuals possessing stable and legible identities. By 1774, the Privy Council knew about Franklin’s role in the matter and summoned him to a hearing in London, where he was denounced, in humiliating terms, for more than an hour.
Bradley Manning, of course, faces worse – while the coiner of that witticism about operating illegally and unconstitutionally has never endured the consequences of his actions. What does that imply for a Latourian theory of social ontology? I don’t know, but it surely demonstrates that not all networks are equal before the law.
Once it would have been possible to jump right into a discussion of Michael Gordin’s The Pseudoscience Wars: Immanuel Velikovsky and the Birth of the Modern Fringe (University of Chicago Press) with the reasonable assumption that readers would have at least a nodding acquaintance with the maverick psychoanalyst’s ideas.
But today -- as Gordin, a professor of history at Princeton University, notes -- few people under the age of 50 will recognize Velikovsky’s name, much less know of his theory of the traumatic impact of cosmic catastrophes on human history. It was a heated topic for discussion in the 1970s. I recall seeing a poster for a meeting of Chaos and Chronos, a student organization dedicated to Velikovskian matters that once had clubs on many U.S. college campuses. This was as late as 1980 or ’81. (Which only corroborates Gordin’s point, for I am approaching the half-century mark at an alarming speed.)
So, first, a lesson in now-dormant controversy.
Although he published several other books during his lifetime, plus a few more posthumously, Velikovsky presented his core argument in a volume called Worlds in Collision (1950). It was an attempt to formulate the key to all mythologies, or at least an explanation of some of the more striking stories and beliefs of antiquity. Drawing on sources both classical and obscure, he showed that cultures all over the world preserved narratives in which the world passed through incredible catastrophes: the earth shook, the heavens darkened, the sun stood still, floods wiped out society, fire or stones or both fell from the sky, and so on. The cultures that preserved the tales explained the events as a manifestation of God’s wrath at humanity, or as the consequence of gods’ behavior toward one another.
An orthodox Freudian, Velikovsky had no use for Jung’s nebulous ideas about archetypes in the collective unconscious. His theory was more concrete, if no less strange. The far-flung legends all made sense as distorted accounts of a series of astronomical anomalies beginning circa 1500 B.C.E., when (he argued) a huge mass of matter broke off the planet Jupiter and spun off into space. It passed dangerously close to Earth a couple of times before eventually settling into orbit as the planet we now know as Venus.
Its comet-like transit through the solar system generated a series of events, both in outer space and here below, that continued for the better part of a thousand years. Earth and proto-Venus came near enough to affect each other’s orbits, and that of Mars as well. Bewildered by the strange things happening in the sky, mankind endured terrestrial upheaval on an incredible scale -- tectonic disasters, weird weather, and shifts of the planet’s axis, for example.
Once, when proto-Venus came close to Earth, its atmosphere permeated our own long enough to precipitate a fluffy, snow-like substance made of hydrocarbons. And so it came to pass that the Israelites received the manna falling from heaven that the Lord did send to nourish them.
Well, it’s a theory, anyway.

Harper’s magazine ran an article about Velikovsky’s book in advance of its publication. Other, less sober publications followed suit, presenting Worlds in Collision as demonstrating the literal (albeit distorted) truth of events recorded in the Bible. The response by scientists was less enthusiastic, to put it mildly. The word “crackpot” tended to come up. Velikovsky’s interdisciplinary erudition impressed them only as evidence that he was profoundly ignorant in a number of fields. The American people would be dumber for reading the book, and so on.
Upon seeing the early publicity for Velikovsky’s book, some scientists were so disgusted that they wrote to Velikovsky’s publisher, Macmillan, to complain. Worlds in Collision had been listed in the firm’s catalog as a scientific work. The letter-writers considered this disgraceful, and warned of the potential damage to the press’s reputation in the scientific community. After a few university science departments canceled their meetings with Macmillan’s textbook salesmen, the publisher became alarmed and sold its right to the book to Doubleday.
Velikovsky was unhappy about this, but the deal was hard on Macmillan as well. At its peak, Worlds in Collision was selling a thousand copies a week, despite being a rather pricey hardback. The backstage furor soon died down, as did public interest in Velikovsky’s claims. By 1951, his ideas must have seemed as if they would have no more of a future than the other big fad of the previous years, L. Ron Hubbard’s Dianetics. (The scholarly literature seems to have overlooked this bit of synchronicity, though I’m sure there is a master’s thesis in it for somebody.)
The Worlds in Collision affair might have been forgotten entirely if not for a special issue of the journal American Behavioral Scientist devoted to the whole matter, published in 1963. The contributors were interested not so much in Velikovsky’s ideas as in how scientists had responded to them – with peremptory dismissals based on the Harper’s article, emotional rhetoric, and behind-the-scenes pressure on his publisher. It amounted to censorship and the repression of ideas – the assertion of scientific authority against a theory, despite the lack of serious engagement with the book itself.
Velikovsky and Albert Einstein both lived in Princeton, N.J., during the 1950s, and Velikovsky could quote remarks from the physicist’s letters and conversation suggesting that Worlds in Collision was at least interesting and worthy of a hearing. This was by no means the only thing Einstein had to say. Gordin quotes a number of occasions when Einstein described Velikovsky as “crazy” -- and clearly he regarded the man as a pest, at times. But it's not difficult to imagine why the most famous Jewish immigrant in postwar America might develop sympathetic feelings for someone of a comparable background who seemed to be facing unfair persecution. Besides, they could speak German together. That counted for a lot.
In any case, their friendship also made it easier to argue that Velikovsky just might be too far ahead of his time. In the mid-1960s, students at Princeton University formed a discussion group on Worlds in Collision, and Velikovsky himself spoke there – the first of what became many lectures to packed halls. Given the spirit of the time, having been rejected and anathematized by the scientific establishment was, in its own way, a credential. Among young people, Velikovsky enjoyed the special authority that comes when mention of one’s ideas is sufficient to annoy, very noticeably, one’s professors.
In 1972, the editors of Portland State University’s student magazine Pensée turned it into a forum defending and developing Velikovsky’s ideas. Papers were peer-reviewed, sort of: they were submitted to scholars and scientists for vetting, though most of the reviewers were sympathetic to Velikovsky (and, it sounds like, also contributors). Pensée’s first all-catastrophism issue clearly met a need. It had to be reprinted twice and sold 75,000 copies, after which the journal’s circulation settled down to a still-remarkable 10,000 to 20,000 copies per year.
And if any more evidence of his status as countercultural eminence were needed, the American Association for the Advancement of Science held a Velikovsky symposium at its annual conference in February 1974. The most famous participant was the astronomer Carl Sagan, who challenged the author’s supposed scientific evidence for the cosmic-catastrophe scenario at considerable length. Velikovsky and his supporters were angry that all of the invited speakers were critical of his work. But the organizers invited Velikovsky himself to respond, which he did, also at considerable length. The symposium may not have vindicated Velikovsky, but it gave him an unusually prominent place at the table.
He died in 1979, and five years later Henry Bauer (now an emeritus professor of chemistry and science studies at Virginia Tech) published Beyond Velikovsky: The History of a Public Controversy (University of Illinois Press). It was the first book-length analysis of the whole saga and, for a long time, the last. Most of the secondary literature on Velikovsky appearing since his death resembles the material about him published during his lifetime, in that it is polemical, for or against. The one published biography of Velikovsky that I know of, drawing on his own memoirs, is by his daughter.
So Gordin’s The Pseudoscience Wars belongs to the fairly small number of studies that do not simply pour the old controversies into new bottles. In that regard, the title is something of a fake-out. The author doesn’t treat Velikovsky’s catastrophism as a variety of pseudoscience. He is dubious about the concept, both because it is applied to too many phenomena that don’t share anything (what do astrology, cold fusion, biorhythms, and the study of how ancient astronauts shaped human evolution really have in common?) and because no one has established an epistemological “bright line” to distinguish science-proper from its pretenders. The word’s real significance lies in its use in shoring up the authority of those who use it. Calling something pseudoscience is more profoundly delegitimizing than calling it bad science.
Happily, the author spends only a little time on Sociology of Deviance 101-type labeling theory before getting down to the altogether more compelling labor of using archival material that was unavailable to Bauer 30 years ago -- especially the theorist’s personal papers, now in Princeton’s collection. Velikovsky was something of a packrat. If he ever parted with a document, it cannot have been willingly. The Pseudoscience Wars fills in the familiar outline of his career and controversy, as sketched above, with an abundance of new detail as well as insight into what Gordin calls “the development of Velikovskian auto-mythology.”
We learn, for one thing, that tales of a deliberate campaign of letter-writing and well-organized pressure on Macmillan through the threat of a boycott have little evidence to back them up. Accounts treating Velikovsky as an American Galileo typically suggest that his opponents wanted to prevent his ideas from receiving any hearing at all – that they were, in principle if not in method, book-burners.
But the existing documents suggest that the scientific community was chiefly troubled at seeing Worlds in Collision issued under the full authority of a major science and textbook publisher. A number of scientists responded by exerting pressure on Macmillan, but Gordin says the letters “were disorganized, uncoordinated, and threatened different things – some not to buy books, some not to referee manuscripts, others not to write them.”
Textbooks represented up to 70 percent of the publisher’s revenue, so professorial displeasure “had to be taken seriously. Macmillan could not afford to call it a bluff.” Once Worlds in Collision was sold to Doubleday (a trade publisher) scientists were content to mock the author’s grasp of geology, chemistry, celestial mechanics, etc. – or simply to ignore the book altogether.
Velikovsky converted the episode into a kind of moral capital, and Gordin demonstrates how shrewdly he and his admirers used it to build a scientific counter-establishment -- what one might otherwise call a full-scale pseudoscience. The analysis requires a number of detours, heading into territory where intellectual historians seldom venture – as in the sad tale of Donald Wesley Patten, author and publisher of The Biblical Flood and the Ice Epoch (1966), who ultimately proved too Velikovskian for the fundamentalists, and vice versa.
For a long time it seemed as if no one could go beyond Beyond Velikovsky. Gordin's book does not replace the earlier study, which remains an interesting and valuable book, and certainly worth the attention of anyone trying to decide whether to explore the terrain in more detail. But The Pseudoscience Wars puts the catastrophist’s ideas and aura into a wider and thicker context of ideas, people, and institutions -- a remarkable array, spanning from the Soviet genetics debates of the 1940s to today's fractious niche of (please accept my sincere apology for this next word) post-Velikovskyism.
Speaking of which, let me end with a prediction. While reading Gordin, it crossed my mind that the scenario of upheaval in Worlds in Collision might well speak to the sense of how precarious our little ball in space really is. Americans are not a thrifty people, but we do tend to recycle our cultural phenomena, and if there is one 20th-century idea that seems a likely candidate for 21st-century revival, it is probably catastrophism.