His reputation will never recover from that unfortunate business in Salem, but Cotton Mather deserves some recognition for his place in American medical history. He was the anti-vaccination movement’s first target.
The scene was Boston in 1721. Beginning in April, a smallpox epidemic spread from a ship anchored in the harbor; over the course of a year, it killed more than 840 people. (Here I’m drawing on Kenneth Silverman’s excellent The Life and Times of Cotton Mather, winner of the Pulitzer Prize for biography in 1985.) In the course of his pastoral duties, Mather preached the necessary funeral sermons, but he was also a corresponding member of the Royal Society of London for Improving Natural Knowledge. The Puritan cleric had been keenly interested in medical issues for many years before the epidemic hit. He knew of a treatment, discussed in the Society’s journal, in which a little of the juice from an infected person’s pustule was scratched into the skin of someone healthy. It warded off the disease itself, somehow. The patient might fall ill for a short while, but would be spared the more virulent sort of infection.
Two months into the epidemic, Mather prepared a memorandum on the technique to circulate among area doctors, one of whom decided to go ahead with a trial run on three human guinea pigs. All survived the experiment, and in a remarkable show of confidence Mather had his son Samuel inoculated. (Mather himself had contracted smallpox in 1678, so was already immune.)
News of the procedure and its success became public just as the epidemic was going from worrying to critical, but not many Bostonians found the developments encouraging. The whole idea seemed absurd and dangerous. One newspaper mocked the few supporters of inoculation for giving in to something “like the Infatuation Thirty Years ago, after several had fallen Victims to the mistaken notions of Dr. M____r and other clerics concerning Witchcraft.”
Still more unkind was the person or persons responsible for trying to bomb Mather’s house. The bomb failed to go off, but the accompanying note made the motive clear: “You dog, damn you, I’ll inoculate you with this….”
The colonial era falls outside the purview of Vaccine Nation: America’s Changing Relationship with Vaccination (University of Chicago Press) by Elena Conis, an assistant professor of history at Emory University, who focuses mainly on the 20th century, especially its last four decades. The scene had changed drastically since Mather's day. Knowledge lagged behind technique: pioneering though early vaccination advocates were, they had no sound basis for understanding how inoculation worked. And the “natural philosophy” of Mather’s era was nowhere near as institutionalized or authoritative as its successors, the sciences, became in the 19th century.
By the point at which Vaccine Nation picks up the story -- with John F. Kennedy announcing what would become the Vaccination Assistance Act of 1962 -- both the nation-state and the field of biomedical research were enormous and powerful, and linked up in ways that Conis charts in detail. “If the stories herein reveal just one thing,” she writes, “it is that we have never vaccinated for strictly medical reasons. Vaccination was, and is, thoroughly infused with our politics, our social values, and our cultural norms.”
Be that as it may, the strictly medical reasons were compelling enough. The Act of 1962 was a push to make the Salk vaccine -- which between 1955 and 1961 had reduced the number of new polio cases from 30,000 to under 900 -- available to all children. This seems like progress of a straightforward and verifiable sort, with the legislation being simply the next step toward eradicating the disease entirely. (As, indeed, it effectively did.)
But in Conis’s account, the fact that JFK announced his support for a vaccination program on the anniversary of Franklin Delano Roosevelt’s death was more than a savvy bit of framing. The reference to FDR, “the nation’s most famous polio victim and survivor,” also “invoked the kind of bold, progressive Democrat [JFK] intended to be.” It positioned his administration as sharing something with “the nation’s impressive biomedical enterprise and its recent victory against a disease that had gripped Americans with fear in the 1940s and 1950s.”
It was technocratic liberalism at its most confident -- a peak moment for the belief that scientific expertise might be combined with far-sighted government to generate change for the common good. And it’s all pretty much downhill from there: Vaccine Nation is, in large part, the story of an unraveling idea of progress. True, scientists developed new vaccines against measles, diphtheria, rubella, and other diseases. But at the same time, the role of federal power in generating “public awareness and acknowledgement of a set of health threats worth avoiding” came into question. So did public trust in the authority of medical science and practice.
The erosion was, in either case, a drawn-out process. A couple of instances from Conis’s narrative will have to suffice as examples. One was the campaign against mumps. The military lost billions of man-hours to the highly contagious disease in the course of the two world wars. But a vaccine against mumps developed in the 1940s was left on the shelf when peace came. Mumps went back to being treated as a childhood ailment, rather than a disease with an associated cost.
But the postwar baby boom created a new market of parents susceptible to warnings about the possible (if very rare) long-term side-effects of getting mumps in childhood. Messages about the responsibility to immunize the kids were targeted at mothers in particular, stressing that the possible danger from contracting mumps made prevention more urgent than statistics could ever measure.
The logic of that appeal -- “Why risk a danger that you can actively avoid?” -- applied in principle to any disease for which a vaccine could be manufactured, and by the 1970s, early childhood meant having a cocktail of them shot into the arm on a regular basis. Then came the great swine flu scare of ’76. The government warned of an impending crisis, stockpiled a vaccine for it, and began immunizing people -- especially the elderly, who faced the greatest risk.
The epidemic never hit, but the vaccine itself proved fatal to a number of people and may have been the cause of serious medical problems for many more. All of this occurred during the last months of Gerald Ford’s administration, though it has somehow become associated with the Carter years. There is no historical basis for the link, but it has the ring of truthiness. The whole debacle seemed to refute JFK’s vision of science and the state leading the march to a safer and healthier future.
The largely unquestioned confidence in vaccination was perhaps a victim of its own success. Insofar as nearly everyone was immunized against several diseases, any number of people suffering from a medical problem could well believe that the shots had somehow caused it or made them susceptible. And in some cases there were grounds for the suspicion. There were also cases of inoculation inducing the disease it was supposed to prevent, as well as allergic reactions to substances in the vaccine.
But Conis sees the rise of an anti-vaccination mood less as a direct response to specific problems than as a byproduct of countercultural movements. Feminists challenged the medical profession’s unilateral claim of authority, and some women took the injunction to protect children by immunizing them and turned it on its head. If they were responsible for avoiding the risk, however slight, of preventable childhood illnesses, then they were equally responsible for avoiding the dangers, however unlikely, posed by vaccines.
Another strain of anti-vaccinationist thinking was an offshoot of environmental awareness. While industrial society polluted the air and water, heedless of the effects, medicine was pumping chemicals and biological agents into the smaller ecosystem of the human body.
Similar concerns had been expressed by opponents of vaccination in the late 19th and early 20th centuries -- though without much long-term effect, particularly given the effectiveness of immunization in preventing (even obliterating) once-terrifying diseases. Conis depicts anti-vaccinationists of more recent times as more effective and better-established.
Besides the feminist and ecological critiques, there is the confluence of anti-government politics and new media. Supporters of vaccination once downplayed the issue of side effects, but it’s an area that demands -- and is receiving -- serious medical investigation.
In places, Vaccine Nation suggests that the critics and opponents have made points worthy of debate, or at least raised serious concerns. And that may be true. It would almost have to be, the real question being one of degree.
But even with that conceded, many of the arguments the author cites are … well, to be nice about it, unpersuasive. “DISEASE IS NOT SOMETHING TO BE CURED,” says one vintage anti-vaccinationist tract revived in the 1980s. “IT IS A CURE.” The cause of illness? “Excess poisons, waste matters, and incompatible food” -- but not, most emphatically, germs.
“Did you know,” asks another figure Conis quotes, “that when immunity to disease is acquired naturally, the possibility of reinfection is only 3.2 percent? If the immunity comes from a vaccination, the chance of reinfection is 80 percent.” In a footnote, Conis indicates that the source of these fascinating statistics “is unclear.” That much, I bet, is true.
Poor old Cotton Mather’s thinking combined superstition and enlightened reason. They can and do mix. But not in a statement such as “DISEASE IS NOT SOMETHING TO BE CURED. IT IS A CURE." The good reverend would have dismissed that as little more than ignorance and magical thinking -- and rightly so.
Some years ago I met a woman who owned a large calico cat bearing a certain resemblance to Queen Victoria: stout, regal, disapproving. She had enjoyed her mistress’s undivided attention as a kitten; the second cat, joining them a few years later, proved easy to dominate. But the large male primate who began coming to the apartment on some evenings was another matter. I appeared incapable of taking a hint, and she was not amused.
Before long I was spending most evenings there. Oaf though I may have been, I did get a feeling of being disdained, at best, and could imagine the older cat taking the woman aside to say, through unhappy looks and feline telepathy, “This guy has got to go.” If things ever reached that point, the woman held her ground -- and reader, I married her. (The cat with less seniority had in the meantime grown fond of me, which may have helped.)
The situation de-escalated before reaching the stage depicted in Octave Tassaert’s Die eifersüchtige Katze, one of the paintings reproduced in Jealousy (Yale University Press) by Peter Toohey, a professor of classics at the University of Calgary. The author roams across several cultures, media, and disciplines in his investigation of the green-eyed passion. In literature, jealousy tends to resemble a kind of madness, and it usually becomes part of the daily news only after escalating into lethal violence. But Tassaert’s canvas presents the emotion in one of its more comic expressions.
Painted circa 1860, “The Jealous Cat” depicts a love triangle of sorts. We see a woman sprawled on her bed in dishabille -- with her lover in, let’s say, close proximity, still clothed for the most part but with his pants below his knees. (A coat hangs on a nearby chair, not draped over the back but thrown on it at an angle suggesting haste.) At first glance he appears to be standing. But the angle of his legs and the way one arm seems to be swinging upward -- and the startled expression on his face as he looks over his shoulder -- all suggest he has just bolted upright. Just behind him, and a little lower, you see the creature giving the painting its title: a jealous cat, stretching up to sink both claws into the man’s exposed buttocks.
“They’re obviously more tempting than the uninspiring ball of string left by the chair,” Toohey deadpans. The painting itself is humorous, but it raises perennial questions about emotion: Do animals have emotions, or is that just anthropomorphizing? And if they do experience feelings that in a human would be understood as emotion, how similar to ours are they?
Animals’ inability to self-report their own mental states makes any answer more or less unverifiable, and we are in the same position regarding the emotions of the human child in its first few years. What we have with both nonverbal animals and preverbal infants is behavior that looks and sounds like what we associate with happiness, excitement, fear, and perhaps one or two other emotions. But is jealousy among them? The experience of it can be raw and overwhelming, but it responds to a situation that is fairly complex. “The foundation stone of jealousy,” writes Toohey, “is triangular”: the product of a situation “usually [involving] two people and some form of possession, animate or inanimate.” The classic form -- “the clichéd sine qua non of the jealous situation,” as the author puts it -- is the romantic triangle: the jealous party’s claim on the significant other is violated, or at least menaced, by a rival.
Whether or not its brain can process all the elements in play, the jealous feline in Octave Tassaert’s painting has at least determined the fastest and most efficient way to disrupt the situation. A desire to hurt the rival may not be noble, but it’s understandable and reasonably straightforward, especially when the rival is standing right there.
Human beings are prone to making things more complicated. The desire for retribution can target the beloved as well as the rival, and even become more intense -- sometimes to really horrifying extremes. The author cites one case that sounds like the brainchild of an exploitation-movie director trying to outdo the competition: A British man who spent a week beating, strangling, and threatening his girlfriend also tried to fill her ears and eyes with quick-sealing putty. In handing down a prison sentence, the magistrate told him: “You are almost insanely jealous.” Almost?!
Explosions of jealousy -- even of sexual jealousy, by all accounts the most excruciating sort -- usually stop short of mayhem. Toohey notes that there is just enough of a stigma around jealousy to limit how openly we feel comfortable expressing it. At the same time, jealousy is a persistent enough force to make subduing it hellishly difficult, and also irresistible as raw material for art and literature. In Othello, the work most indelibly identified with the experience of jealousy, Shakespeare treats it as a passion that, once ignited, feeds itself, with imagination as the fuel -- even when the grounds for it are entirely false.
Toohey writes of the moment when an individual sees or hears something that ignites the emotion. Even when based in rock-solid fact -- with no Iago whispering baseless insinuations -- the suffering of the jealous person comes mostly from scenes and conversations running in an obsessive loop within the mind. One of the most interesting chapters of Jealousy considers how literary and artistic works present our eyes and ears as the organs that make us vulnerable to the suspicion then elaborated upon within the brain’s theater.
Perhaps that accounts for the bizarre revenge taken by the “almost insanely jealous” man mentioned earlier. And perhaps imagination is the factor distinguishing human jealousy from whatever it is animals feel when faced with rivalry. Our motives are more complex, and our memories are longer. That gives us an evolutionary advantage. But it also opens up wide vistas of potential misery, where the jealous mind is condemned to wander in circles.
Hannah Arendt’s thinking takes an unexpected turn in the final paragraph of The Origins of Totalitarianism -- rescuing a glimmer of hope from the ruins of the 20th century’s final solutions and ends-of-history. Permanent closure is not in our nature as a species. What distinguishes us, rather, is the power to begin. It is “the supreme capacity of man,” she writes; “politically, it is identical with man's freedom.”
As consolations go, this seems wanting. It sounds cold, distant, and much too abstract to be of much comfort -- though by that point, readers will take what they can get. But Arendt’s closing sentence begins to put some flesh on the idea: “This beginning is guaranteed by each new birth; it is indeed every man.” Her reference to birth is literal. Human existence changes -- or at least has the potential to change -- with each new participant. And that is true in a way it cannot be of, say, chimpanzee existence, for our mode of life is defined by language, tools, and institutions. None of them are programmed into the human genome (at most, the potential to assimilate them might be), and yet they are transmitted from one generation to the next. Without the thick exoskeleton of culture and technology, we would be little more than exceptionally vulnerable primates, distinguished by hairlessness and a peculiar tendency to walk upright.
But the exoskeleton does not grow quickly. Our physical birth is, in effect, always premature; it takes several years for the rest of what makes us human to develop. And we are not passive participants in that effort. Arendt elaborated on the idea a few years later in The Human Condition:
“With word and deed we insert ourselves into the human world, and this insertion is like a second birth, in which we confirm and take upon ourselves the naked fact of our original physical appearance.”
Education, then, is an umbilicus, nourishing us throughout the more or less protracted transition from infancy to personhood. Complicating the analogy is the fact that the process has no fixed term. Nature sets limits on the gestation period; culture tends to be much less decisive. Periods of rapid social or technological change make it especially hard to define the skills and competencies required for full, functional maturity.
Robert Pogue Harrison’s Juvenescence: A Cultural History of Our Age (University of Chicago Press) concerns the still more problematic situation that emerges after the flux has gone on for decades. Harrison, the chair of graduate studies in Italian at Stanford University, delivered an earlier version of the book as a series of lectures at the Collège de France.
“An older person has no idea what it means to be a child, an adolescent, or a young adult in 2014,” he writes. “Hence he or she is hardly able to provide any guidance to the young when it comes to their initiation into the public sphere -- the public sphere for which the young must eventually assume responsibility, or pay the consequences if they fail to meet that responsibility. It has yet to be seen whether a society that loses its intergenerational continuity to such a degree can long endure.”
Taken out of context, that sounds like an example of a genre of cultural criticism that might be called “the higher worrying.” With a change of date it could have been published at any point in the last century. (See also Paul Valéry's essay “The Crisis of the Mind”: “We later civilizations, we too know that we are mortal…. We see now that the abyss of history is deep enough to hold us all.”)
Fortunately there is more to Juvenescence than that. Harrison recognizes that a gulf between generations is not the disruption of otherwise stable and healthy cultural patterns. In fact, it is a cultural pattern -- one articulated and reflected on in literature, philosophy, and political thought for at least 2500 years now. And at issue are qualities of mind more complex than anything expressed in stereotypical contrasts between youth (eager, impatient, impulsive, resilient) and age (sober, cautious, sententious, fond of taking naps).
Harrison’s thinking develops in dialogue with Hannah Arendt -- among many others, though her concept of natality, which I sketched earlier, seems especially important for him in Juvenescence. We are born into a particular society that exists before we do, and will presumably continue to do so for some while afterward, but that isn’t eternal or static. It leaves its mark on us (and we on it, to whatever degree). We are affected by its changes.
More to the point, we are part of the changes, even when we are incapable of recognizing them. (Especially then, in fact.) It’s possible to get some perspective on things -- to challenge, or at least evaluate, what we’ve come to accept and expect from the world -- through learning about the past, or formulating questions, or absorbing the stories and other cultural expressions of other people.
Harrison coins the expression heterochronicity to point out that the present is never pure or self-contained. The people around us are being pushed and pulled by senses of the world (including memories and expectations) that can be profoundly different from our own, and from one another. Heterochronicity is the matrix of generational conflict, but Juvenescence explores it through readings of Antigone and King Lear rather than the contrasts between boomers and millennials.
The book is somehow both digressive and closely reasoned, and arguably it owes as much to Giambattista Vico or Stephen Jay Gould as it does to Hannah Arendt. The words “cultural history” appearing in the subtitle are not especially helpful in conveying the quality of the book, which would more accurately be called a meditation -- or, better still, an anatomy. (But then that might be even more misleading in the era of Viagra and cosmetic surgery.) It’s odd and brilliant -- clearly the product of thought given time to ripen.
The National Endowment for the Humanities on Monday announced a new grant program to promote the publication of serious nonfiction, based on scholarly research, on subjects of general interest and appeal. Winners of the grants will receive stipends of $4,200 per month for 6-12 months. A statement from NEH Chairman William D. Adams said: “In announcing the new Public Scholar program we hope to challenge humanities scholars to think creatively about how specialized research can benefit a wider public.”
Last week The New York Times published a reproduction of a poison-pen letter that Martin Luther King Jr. received 50 years ago this month, a few weeks before he accepted the Nobel Peace Prize. A couple of passages in the screed suggest it was accompanied by an audiotape of MLK in a hotel room, indulging in a round of extramarital recreation. King and his circle assumed that J. Edgar Hoover was behind the whole thing -- a reasonable guess, since bugging a hotel room counted as a sophisticated surveillance operation in 1964.
Portions of the letter have been quoted by King’s biographers for years, and Hoover’s animus against King and the rest of the civil rights movement was obvious enough. But in her essay for the Times, Beverly Gage -- the history professor from Yale who found the original draft in the National Archives -- underscores something that only shows through with the whole document in front of you. It might be called an element of psychosexual frenzy.
The note -- prepared by one of Hoover’s agents, but reflecting his own preoccupations regarding King -- purported to be from a disillusioned African-American supporter. MLK’s “alleged lovers get the worst of it,” writes Gage. “They are described as ‘filthy dirty evil companions’ and ‘evil playmates,’ all engaged in ‘dirt, filth, evil and moronic talk.’ The effect is at once grotesque and hypnotic, an obsessive’s account of carnal rage and personal betrayal…. Near the end, it circles back to its initial target, denouncing him as an ‘evil, abnormal beast.’ ”
All in a day’s work at J. Edgar’s FBI. The only thing surprising about the note is the lack of any charge that King was a Communist Party stooge. Hoover’s practice of collecting information on the sex lives of prominent individuals served the perfectly straightforward function of bolstering his personal authority, of course. And it worked: he served as the bureau’s director for almost 50 years, in part because he had the goods in his hands to derail any effort to replace him. But there is also a hint of voyeurism to the director’s “Official/Confidential File.” Blackmail is power -- and power, as someone once said, is the ultimate aphrodisiac.
The director only comes onstage about halfway through Jessica R. Pliley’s Policing Sexuality: The Mann Act and the Making of the FBI (Harvard University Press). It would be excessive to call Hoover a minor figure in the book, but it certainly displaces him from his familiar status as prime mover in the bureau’s history.
Pliley, an assistant professor of women’s history at Texas State University, begins a generation or two before the creation of the Bureau of Investigation in 1908 (the name was changed in 1935), with the stresses and strains of American society in the late 19th century that ultimately gave rise to one of the laws the bureau tried to enforce: the Mann Act, which made it a felony to transport “any woman or girl” across state lines “for the purpose of prostitution or debauchery, or for any other immoral purpose.”
The law, passed in 1910, now seems almost idiomatically peculiar: As with the decision to make alcohol, tobacco, and firearms the purview of a single law-enforcement agency, most people would have a hard time explaining the logic behind it. Pliley traces its roots to a series of moral panics in the United States over the changes induced by the country’s rapid expansion and urbanization. A growing national economy brought with it an expanded market for prostitution -- the horrors of which were summed up by 19th-century reformers as “white slavery.”
That phrase expressed the moral fervor of the abolitionist spirit finding a new cause, while also carrying its share of racial overtones, especially in sensational accounts of blue-eyed girls servicing the lusts of nonwhite customers. The influx of immigrants was another concern. Women finding their way in a new country were especially vulnerable. But there was also the need to protect America's precious bodily fluids from the contaminating influence of foreign cultures, with their deplorably lax moral standards and unwholesomely exotic bedroom practices. (Despite the xenophobia, there was something to the last point. By the 1920s, any bordello trying to keep its clientele had to offer “the French,” i.e., fellatio.)
Urbanization and the automobile multiplied the temptations for other sins of the flesh, as well as the venues for committing them. The danger of a young woman being seduced and abandoned after false promises of marriage became more intense when parents knew that the cad might impregnate her in the rumble seat, then drive off to who knows where.
Pliley devotes most of the first third of her book to building up, layer by layer, a picture of the trends and anxieties of the period -- some of them overblown, but with enough examples from the legal record of women raped and then forced into sexual labor to show that it wasn’t all a matter of yellow journalism.
Pliley also discusses various laws and social campaigns that emerged in response -- efforts to shore up the norms by which sexual activity would be restricted to monogamous, legally married straight couples of the same race, who, while not necessarily born in the U.S., otherwise tried to make themselves as inconspicuous as possible. But measures to reassert control over the American libido were always one or two steps behind the social changes -- and enforcement could never be much more than episodic.
When Representative James R. Mann proposed the White Slave Traffic Act (soon to be known by his name) to Congress in 1910, its odd mandate reflected the effort to patch over some of the existing gaps in terms just broad enough to cover problems that ever-faster means of transportation were bound to create.
It met a little opposition. One Congressman expressed concern that “immoral purposes” was so vague that it might apply to horse racing and chicken fighting. Southern politicians were initially troubled that the law might infringe on states’ rights, but found themselves charged with a lack of concern for the protection of white womanhood, which settled the matter soon enough. President Taft signed the bill into law the day Congress sent it to his desk.
The burden of enforcing the Mann Act soon fell to the Justice Department’s recently formed Bureau of Investigation, which had a small staff and not much precedent for how to proceed. An early investigation seemed like a promising way to crack the organized traffic in prostitution between bordellos in Connecticut, Louisiana, and other states. But it turned out the hookers operated as free agents who traveled from bordello to bordello in a circuit. Customers, madams, and sex workers alike seem to have found it a reasonably satisfactory arrangement.
Pliley points out that the whole “white slavery” discourse rested on the idea that women wanted, more or less by instinct, to establish a monogamous relationship and start a family, and would enter or remain in prostitution only under threat of violence. But the interstate pimp ring that turned out not to exist suggested otherwise. The author shows that a great deal of the caseload for agents in the early decades of the bureau pertained to cases of adultery in which the lovers had fled the state. The aggrieved spouse could charge the adulterous man with violating the Mann Act, despite his paramour being perfectly happy with the situation. She had been taken “across state lines for immoral purposes,” though the investigation usually ended once she had agreed to return to her husband.
Thanks to the Great Depression, the bureau was able to enter the headlines for cases involving bank robbers and gangsters -- and, a bit later, political radicals, as well as professional spies. But Pliley notes that in the late 1930s, Hoover (who joined the bureau in 1919 and became director five years later) reasserted the original understanding of the Mann Act as a measure against prostitution.
“The Bureau investigated only when the right person invited it,” she writes, “a father, a husband, or a male local law enforcement official. When the Bureau considered aggravated cases of sexual exploitation, it almost always conceived of prosecuting these crimes as defending the family (and concomitantly upholding men’s rights to control the sexuality of their dependents) rather than upholding an idea of female sexual sovereignty.”
It seems almost superfluous to mention the other implicit requirement: the man in question had to be white. The author names a few cases in which the complainant was of another color, but it seems that the most agents ever did was to fill out some paperwork, presumably to humor him.
None of this can really be attributed to Hoover, though. He executed the law, and enforced its biases, but they were established well before he joined the Bureau.
Policing Sexuality takes the story up to roughly America’s entry into World War Two, but I think the surveillance of MLK and the vicious letter from 50 years ago take on a new aspect in light of Pliley’s research. She directs our attention away from the director and toward the matrix in which the Bureau took shape. That challenges the habit of regarding the FBI as an institution shaped, and distorted, by Hoover’s personality -- parts of which are expressed in the letter to King, written by a subordinate who knew what his boss wanted.
But the letter also echoes the concerns that Pliley finds in the Mann Act well before Hoover took power. Besides hostility to African-American advancement (one undercurrent of the "white slavery" theme), it expresses a fervent, one might even say deranged, aversion to sex outside of marriage. That Hoover shared these attitudes made him a perfect fit for the job. He thrived in it, and was good at it, although “good” isn't really how it looks from here.
The American Psychological Association will conduct an independent investigation into whether it colluded with the government concerning post-9/11 interrogation practices, The New York Times reported. The investigation appears to have been prompted by new revelations about association staff members' involvement in shaping policies for psychologists involved in interviewing suspected terrorists during the Bush administration. The revelations appear in a new book, Pay Any Price: Greed, Power and Endless War, written by James Risen, an investigative reporter for The Times. The association criticized Risen's reporting last month, but Risen said it didn't refute key claims.
An eloquent commentator once declared that the new communications technology “[had], as it were, assembled all of mankind upon one great plane, where they can see everything that is done and hear everything that is said, and judge of every policy that is pursued at the very moment those events take place.”
A trifle overblown, yes, but it’s held up better than many other rhapsodies and prophecies inspired by new media over the years. Nowadays we have too much perspective to believe that “mankind” can really “see everything that is done and hear everything that is said.” (Only people with access to the NSA servers enjoy that privilege.) But there is no denying the commentator’s clear sense of human experience speeding up -- with news and information moving faster than ever before, so that people would have to adapt, somehow, or else be crushed by the juggernaut of progress.
It happens that the far-sighted analyst here was Lord Salisbury, three-time prime minister of Britain, addressing the founding meeting of the Institution of Electrical Engineers in 1889; the technology in question was the telegraph. Judy Wajcman cites his remark in Pressed for Time: The Acceleration of Life in Digital Capitalism (University of Chicago Press), while criticizing the common idea “that our current ambivalence toward technological change has no precedent.” Wajcman, a professor of sociology at the London School of Economics and Political Science, gives the date as 1899, which is perhaps as much an echo as a typo: Salisbury’s comment sounds a bit like the techno-boosterism and globalization-speak common during the late ‘90s of the more recent century.
But for Wajcman, it’s the overtone of uneasiness that counts -- and she’s undoubtedly right to emphasize it, given the speaker. His Lordship was a rock-ribbed conservative who, it seems, once boiled his principles down to a pithy formula: “Whatever happens will be for the worse, and therefore it is in our interest that as little should happen as possible.” As a political strategy, that, too, sounds curiously familiar and contemporary.
Pressed for Time has at its core a paradox that will have occurred to most readers at some point: On the one hand, the technological innovations that come our way are designed to be efficient; they promise to save time and energy. In principle, the savings should add up, so that we’d have more of each. But scarcely anyone feels that they do add up. If anything, people seem to feel ever more harried.
The situation is genuinely paradoxical, since the technology really does tend to become faster and more efficient, and more Swiss Army Knife-like in near-universal applicability. By rights, we should all be enjoying what Wajcman calls “temporal sovereignty and sufficient leisure time,” and a little more of each all the time. Yet the gizmos and apps are part of the problem, somehow. Indeed it often seems that they are the problem itself -- as if their speed and power set the pace, like a treadmill that accelerates when you walk faster, without ever slowing down if you can’t keep up.
Wajcman cites a study of Blackberry use among “corporate lawyers, venture capitalists, and investment bankers” who said, in interviews, that mobile email “enhance[d] their flexibility, control, and competence as professional workers.” But the seeming increase in personal autonomy canceled itself out through “the unintended consequences of collective use.” In other words, the advantage to an individual of being able to work and communicate whenever and wherever it was possible or convenient “also heightens expectations of availability and responsiveness” from colleagues, who also have continuous connectivity, thereby “reducing [one’s] personal downtime and increasing stress” by “escalating engagement with work at all hours of day and night.”
The “autonomy paradox” (as the researchers called it in a journal article) isn’t just for corporate lawyers, venture capitalists, or investment bankers anymore – or even for Blackberry users, that dwindling breed. It is the way we live now.
But as Wajcman digs into the conundrum, Pressed for Time questions some routine assumptions about technology and culture made by sociologists as well as everyday citizens of modernity. One is the tendency to think that technical innovation induces social change in a fairly linear and one-directional way: a relationship of cause and effect, if not of technological determinism.
Lord Salisbury’s thumbnail assessment of the telegraph is one example. The new communication system allows information to move across vast distances instantaneously, or close enough for the Victorian era. Its social impact (the whole world becoming aware of breaking events in real time) was the direct and almost self-evident realization of the potentials inherent in the technology. The difference between Salisbury’s remark to the engineers and what Wajcman calls “grand, totalizing narratives of postindustrial, information, postmodern, network society” is often one of idiom more than of substance.
The science and technology studies (STS) research informing Pressed for Time, by contrast, focuses on the system of relays and feedback loops through which technological innovation and social life influence each other. Understanding the impact of the telegraph on people’s sense of space and time means also considering another development of that era, long-distance railway travel. In the pre-railroad era, time was set locally: the same moment showing as noon on the clocks in one town or city might be several minutes earlier or later on timepieces a few miles away.
The variation had not been much of a problem until the advent of a regular railway schedule. (Note that nothing in the technology itself made timetables inevitable. But they were essential if the railroad was to serve as a reliable way to get products to market.) The telegraph was an important tool for synchronizing places separated by long distances, with Greenwich mean time eventually bringing “the world within one grid of time,” writes Wajcman, “uprooting older, local ways of marking [the] passage of time.”
We make use of tools, and they return the compliment. The chains of cause and effect are knottier than we habitually assume. But the author’s analysis of the time-pressure paradox also challenges the supposition that technological developments impinge on us all equally, or at least in uniform ways. And there are pretty tangible grounds for arguing that they don’t.
It's possible to sit through many a discussion of time-and-labor-saving devices without more than a passing reference to the washing machine. Somehow a device operating mainly in the domestic sphere – traditionally the responsibility of women, who studies indicate still do two-thirds of the (unpaid) work -- counts as having less social significance than, say, transportation or communications technology. “To most commentators,” Wajcman writes, “the history of housework is the story of its elimination.” But while the washing machine does remove most of the drudgery of cleaning clothes, its effect has been less to reduce the total amount of domestic labor than to change its nature and priorities: less time spent on laundry, more time driving the family vehicle.
The technological developments of the past couple of decades are usually lumped together as “the digital revolution,” though that’s starting to sound quaint. At some point the cumulative effect will make it very difficult to imagine that things could be otherwise. Wajcman delivers one sharp tap after another at the calcified interpretations that surround those changes. It leaves the reader with a clear sense that the paradox of becoming trapped by devices that promise to free us follows, not from the technology itself, but from habits and attitudes that go unchallenged.
The tools we now have probably could be used to shorten the workday for everyone, for example -- but we’d have to want that and make some effort to realize it. Instead, being constantly “on the grid,” overstressed from work, and emotionally available to other people only during designated (and calibrated) “quality time” has become a kind of status symbol. Pressed for Time helps elucidate how things shaped up as they have. It seems less paradoxical than pathological, but Wajcman suggests, rather quietly, that it doesn’t have to be this way.
Our devices grow ever more efficient, but our lives only more hectic. Scott McLemee reviews a book on the paradox of digital temporality.