Tobacco itself is no aphrodisiac, but one of the great tropes of classic American cinema might be called “the precoital cigarette” -- an emblem of desire smoldering on-screen when Humphrey Bogart gazes at Lauren Bacall, or when she exhales after accepting a light.
It was foreplay by proxy, or as much of it as Hollywood once allowed. And 70 years later, the scenes still work. All the gestures of asking for a smoke or offering one -- the moments of sharing a cigarette or plucking it from someone’s lips to make way for a kiss -- still communicate feelings of intimacy and languor, even for audiences that remember seeing the blackened lungs of smokers in health class and have never doubted the surgeon general’s warning.
The students whose behavior Mimi Nichter analyzes in Lighting Up: The Rise of Social Smoking on College Campuses (New York University Press) are in a similarly untenable position. They feel the allure while knowing better. “Young adults have the highest prevalence of smoking of all other age groups,” she notes, “with approximately 35 percent reporting that they currently smoke.”
At the same time, the undergraduates whose rituals and folk culture interest Nichter (a professor of anthropology at the University of Arizona) recognize the stigma attached to smoking. Bogie’s aura has faded. The smoker has become a pariah: unwelcome in restaurants and other public places, a menace to the health of others through secondhand smoke, at best the pitiful dupe of Joe Camel and other shills for Big Tobacco.
So how do they handle the cognitive dissonance? The short answer is that they make a distinction between enjoying a few cigarettes in social situations while in college and really being a smoker, i.e., someone addicted to nicotine for life. The contrast parallels that between social drinking and alcoholism. Social smoking is occasional rather than compulsive, something done in groups, never alone. No stigma need apply.
The comparison works at another level, since most of the social smoking discussed in Lighting Up takes place at parties where cigarettes go with alcohol “like cookies with milk,” to use one sorority girl’s expression.
As an anthropologist, rather than a psychologist, Nichter is ultimately less concerned with the rationalizations for smoking than the group situations and norms in which it is embedded. Besides conducting surveys and drawing on the work of other researchers, she has gathered detailed accounts of social smoking from native informants (freshmen and sophomores) and checked her ethnography by presenting draft chapters to her classes: “Students have told me that my descriptions of student life and smoking and drinking on campus are quite accurate.”
The picture that emerges is in some respects familiar to anyone who has ever been on a college campus in their late teens. Smoking, like drinking, is one of the behaviors perennially available for asserting the adult right to make decisions, which makes it appealing even for those who hadn’t been rebellious enough to try it in high school.
But Nichter’s inquiry also finds a now-common attitude toward social smoking as something to do while in college, but only then. It’s something you can and will quit once in “the real world.” Giving it up sooner would mean losing both a stress reliever and a set of routines useful for sociability. There are benefits to being able to introduce yourself by bumming a cigarette, to step outside for a smoke with friends at a party, and to collect your thoughts before saying anything by pausing to light up.
Nichter’s respondents understood their smoking as “a habit that they engaged in when they chose to, at times when they and others seemed appropriate. ...Being really addicted, defined as ‘needing your cigarettes wherever you are,' was associated with those who were weak of will or had real problems. In contrast, many college students saw themselves as needing to smoke but only in a limited number of contexts.”
It’s not clear from Lighting Up’s otherwise very detailed account just when this cluster of attitudes and behaviors emerged. But occasional remarks by the author suggest that the antismoking public-service announcements of the past 20 years or so had a lot to do with it. Depicting smoking as addictive -- and reminding the public that tobacco companies have done research on how to make their products even more so -- seems to have had the paradoxical effect of encouraging young people to prove themselves able to light up while remaining in control.
But there are problems with such limit-setting efforts. One is that there is no definite threshold at which nicotine becomes addictive. The difference between smoking only at social events on weekends and low-level daily smoking (one or two cigarettes per day) blurs quite rapidly for students who start unwinding on Thursday afternoons.
And while undergraduate social smokers may be prepared to go cold turkey after senior year, current trends make graduation less of a decisive transition point than it once was:
“Many grads today are stepping into an uncertain future, where the prospect of finding a good job in a timely manner is unlikely. Their 20s may be characterized by multiple moves (in and out of their parents’ and friends’ homes) and compounded by multiple stressors, not the least of which is finding oneself in a time of high unemployment and low wages. Moving into adulthood is now an elongated process, as markers of ‘settling down,’ like marriage, edge upward into one’s late 20s, if that. For those who have come to depend on the comfort of cigarettes during their college years, this array of life stressors may make cutting back or quitting more difficult, despite their intentions and understandings of the harms of tobacco.”
Smoking as a deliberate and controlled way to enjoy oneself is completely different from developing a nasty habit tinged with a death wish -- or it can be, for a while. The cigarette companies depend on people overestimating how much time they really have, and they're in no real danger of losing money on that score.
His reputation will never recover from that unfortunate business in Salem, but Cotton Mather deserves some recognition for his place in American medical history. He was the anti-vaccination movement’s first target.
The scene was Boston in 1721. Beginning in April, a smallpox epidemic spread from a ship anchored in the harbor; over the course of a year, it killed more than 840 people. (Here I’m drawing on Kenneth Silverman’s excellent The Life and Times of Cotton Mather, winner of the Pulitzer Prize for biography in 1985.) In the course of his pastoral duties, Mather preached the necessary funeral sermons, but he was also a corresponding member of the Royal Society of London for Improving Natural Knowledge. The Puritan cleric had been keenly interested in medical issues for many years before the epidemic hit. He knew of a treatment, discussed in the Society’s journal, in which a little of the juice from an infected person’s pustule was scratched into the skin of someone healthy. It warded off the disease itself, somehow. The patient might fall ill for a short while, but would be spared the more virulent sort of infection.
Two months into the epidemic, Mather prepared a memorandum on the technique to circulate among area doctors, one of whom decided to go ahead with a trial run on three human guinea pigs. All survived the experiment, and in a remarkable show of confidence Mather had his son Samuel inoculated. (Mather himself had contracted smallpox in 1678, so was already immune.)
News of the procedure and its success became public just as the epidemic was going from worrying to critical, but not many Bostonians found the developments encouraging. The whole idea seemed absurd and dangerous. One newspaper mocked the few supporters of inoculation for giving in to something “like the Infatuation Thirty Years ago, after several had fallen Victims to the mistaken notions of Dr. M____r and other clerics concerning Witchcraft.”
Still more unkind was the person or persons responsible for trying to bomb Mather’s house. The device failed to go off, but the accompanying note made the motive clear: “You dog, damn you, I’ll inoculate you with this….”
The colonial era falls outside the purview of Vaccine Nation: America’s Changing Relationship with Vaccination (University of Chicago Press) by Elena Conis, an assistant professor of history at Emory University, who focuses mainly on the 20th century, especially its last four decades. The scene had changed drastically since Mather’s day. Knowledge once lagged behind technique: pioneering though the early advocates of inoculation were, they had no sound basis for understanding how it worked. And the “natural philosophy” of Mather’s era was nowhere near as institutionalized or authoritative as its successors, the sciences, would become in the 19th century.
By the point at which Vaccine Nation picks up the story -- with John F. Kennedy announcing what would become the Vaccination Assistance Act of 1962 -- both the nation-state and the field of biomedical research were enormous and powerful, and linked up in ways that Conis charts in detail. “If the stories herein reveal just one thing,” she writes, “it is that we have never vaccinated for strictly medical reasons. Vaccination was, and is, thoroughly infused with our politics, our social values, and our cultural norms.”
Be that as it may, the strictly medical reasons were compelling enough. The Act of 1962 was a push to make the Salk vaccine -- which between 1955 and 1961 had reduced the number of new polio cases from 30,000 to under 900 -- available to all children. This seems like progress of a straightforward and verifiable sort, with the legislation being simply the next step toward eradicating the disease entirely. (As, indeed, it effectively did.)
But in Conis’s account, the fact that JFK announced his support for a vaccination program on the anniversary of Franklin Delano Roosevelt’s death was more than a savvy bit of framing. The reference to FDR, “the nation’s most famous polio victim and survivor,” also “invoked the kind of bold, progressive Democrat [JFK] intended to be.” It positioned his administration as sharing something with “the nation’s impressive biomedical enterprise and its recent victory against a disease that had gripped Americans with fear in the 1940s and 1950s.”
It was technocratic liberalism at its most confident -- a peak moment for the belief that scientific expertise might be combined with far-sighted government to generate change for the common good. And it’s all pretty much downhill from there: Vaccine Nation is, in large part, the story of an unraveling idea of progress. True, scientists developed new vaccines against measles, diphtheria, rubella, and other diseases. But at the same time, the role of federal power in generating “public awareness and acknowledgement of a set of health threats worth avoiding” came into question. So did public trust in the authority of medical science and practice.
The erosion was, in either case, a drawn-out process. A couple of instances from Conis’s narrative will have to suffice as examples. One was the campaign against mumps. The military lost billions of man-hours to the highly contagious disease in the course of the two world wars. But a vaccine against mumps developed in the 1940s was left on the shelf when peace came. Mumps went back to being treated as a childhood ailment, rather than a disease with an associated cost.
But the postwar baby boom created a new market of parents susceptible to warnings about the possible (if very rare) long-term side-effects of getting mumps in childhood. Messages about the responsibility to immunize the kids were targeted at mothers in particular, stressing that the possible danger from contracting mumps made prevention more urgent than statistics could ever measure.
The logic of that appeal -- “Why risk a danger that you can actively avoid?” -- applied in principle to any disease for which a vaccine could be manufactured, and by the 1970s, early childhood meant having a cocktail of them shot into the arm on a regular basis. Then came the great swine flu scare of ’76. The government warned of an impending crisis, stockpiled a vaccine for it, and began immunizing people -- especially the elderly, who faced the greatest risk.
The epidemic never hit, but the vaccine itself proved fatal to a number of people and may have been the cause of serious medical problems for many more. All of this occurred during the last months of Gerald Ford’s administration, though it has somehow become associated with the Carter years. There is no historical basis for the link, but it has the ring of truthiness. The whole debacle seemed to refute JFK’s vision of science and the state leading the march to a safer and healthier future.
The largely unquestioned confidence in vaccination was perhaps a victim of its own success. Insofar as nearly everyone was immunized against several diseases, any number of people suffering from a medical problem could well believe that the shots had somehow caused it or made them susceptible. And in some cases there were grounds for the suspicion. There were also cases of inoculation inducing the disease it was supposed to prevent, as well as allergic reactions to substances in the vaccine.
But Conis sees the rise of an anti-vaccination mood less as a direct response to specific problems than as a byproduct of countercultural movements. Feminists challenged the medical profession’s unilateral claim of authority, and some women took the injunction to protect children by immunizing them and turned it on its head. If they were responsible for avoiding the risk, however slight, of preventable childhood illnesses, then they were equally responsible for avoiding the dangers, however unlikely, posed by vaccines.
Another strain of anti-vaccinationist thinking was an offshoot of environmental awareness. While industrial society polluted the air and water, heedless of the effects, medicine was pumping chemicals and biological agents into the smaller ecosystem of the human body.
Similar concerns had been expressed by opponents of vaccination in the late 19th and early 20th centuries -- though without much long-term effect, particularly given the effectiveness of immunization in preventing (even obliterating) once-terrifying diseases. Conis depicts anti-vaccinationists of more recent times as more effective and better-established.
Besides the feminist and ecological critiques, there is the confluence of anti-government politics and new media. Supporters of vaccination once downplayed the issue of side effects, but it’s an area that demands -- and is receiving -- serious medical investigation.
In places, Vaccine Nation suggests that the critics and opponents have made points worthy of debate, or at least raised serious concerns. And that may be true. It would almost have to be, the real question being one of degree.
But even with that conceded, many of the arguments the author cites are … well, to be nice about it, unpersuasive. “DISEASE IS NOT SOMETHING TO BE CURED,” says one vintage anti-vaccinationist tract revived in the 1980s. “IT IS A CURE.” The cause of illness? “Excess poisons, waste matters, and incompatible food” -- but not, most emphatically, germs.
“Did you know,” asks another figure Conis quotes, “that when immunity to disease is acquired naturally, the possibility of reinfection is only 3.2 percent? If the immunity comes from a vaccination, the chance of reinfection is 80 percent.” In a footnote, Conis indicates that the source of these fascinating statistics “is unclear.” That much, I bet, is true.
Poor old Cotton Mather’s thinking combined superstition and enlightened reason. They can and do mix. But not in a statement such as “DISEASE IS NOT SOMETHING TO BE CURED. IT IS A CURE." The good reverend would dismiss that as little more than ignorance and magical thinking -- and rightly so.
Some years ago I met a woman who owned a large calico cat bearing a certain resemblance to Queen Victoria: stout, regal, disapproving. She had enjoyed her mistress’s undivided attention as a kitten; the second cat, which joined them a few years later, proved easy to dominate. But the large male primate who began coming to the apartment on some evenings was another matter. I appeared incapable of taking a hint, and she was not amused.
Before long I was spending most evenings there. Oaf though I may have been, I did get a feeling of being disdained, at best, and could imagine the older cat taking the woman aside to say, through unhappy looks and feline telepathy, “This guy has got to go.” If things ever reached that point, the woman held her ground – and reader, I married her. (The cat with less seniority had in the meantime grown fond of me, which may have helped.)
The situation de-escalated before reaching the stage depicted in Octave Tassaert’s Die eifersüchtige Katze, one of the paintings reproduced in Jealousy (Yale University Press) by Peter Toohey, a professor of classics at the University of Calgary. The author roams across several cultures, media, and disciplines in his investigation of the green-eyed passion. In literature, jealousy tends to resemble a kind of madness, and it usually becomes part of the daily news only after escalating into lethal violence. But Tassaert’s canvas presents the emotion in one of its more comic expressions.
Painted circa 1860, “The Jealous Cat” depicts a love triangle of sorts. We see a woman sprawled on her bed in dishabille -- with her lover in, let’s say, close proximity, still clothed for the most part but with his pants below his knees. (A coat hangs on a nearby chair, not draped over the back but thrown on it at an angle suggesting haste.) At first glance he appears to be standing. But the angle of his legs and the way one arm seems to be swinging upward -- and the startled expression on his face as he looks over his shoulder -- all suggest he has just bolted upright. Just behind him, and a little lower, you see the creature giving the painting its title: a jealous cat, stretching up to sink both claws into the man’s exposed buttocks.
“They’re obviously more tempting than the uninspiring ball of string left by the chair,” Toohey deadpans. The painting itself is humorous, but it raises perennial questions: Do animals have emotions, or is attributing them to animals just anthropomorphizing? And if they do experience feelings that in a human would be understood as emotion, how similar to ours are they?
Animals’ inability to report their own mental states makes any answer more or less unverifiable, and we are in the same position regarding the emotions of the human child in its first few years. What we have with both nonverbal animals and preverbal infants is behavior that looks and sounds like what we associate with happiness, excitement, fear, and perhaps one or two other emotions. But is jealousy among them? The experience of it can be raw and overwhelming, but it responds to a situation that is fairly complex. “The foundation stone of jealousy,” writes Toohey, “is triangular”: the product of a situation “usually [involving] two people and some form of possession, animate or inanimate.” The classic form -- “the clichéd sine qua non of the jealous situation,” as the author puts it -- is the romantic triangle: the jealous party’s claim on the significant other is violated, or at least menaced, by a rival.
Whether or not its brain can process all the elements in play, the jealous feline in Octave Tassaert’s painting has at least determined the fastest and most efficient way to disrupt the situation. A desire to hurt the rival may not be noble, but it’s understandable and reasonably straightforward, especially when the rival is standing right there.
Human beings are prone to making things more complicated. The desire for retribution can target the beloved as well as the rival, and even become more intense – sometimes to really horrifying extremes. The author cites one case that sounds like the brainchild of an exploitation-movie director trying to outdo the competition: A British man who spent a week beating, strangling, and threatening his girlfriend also tried to fill her ears and eyes with quick-sealing putty. In handing down a prison sentence, the magistrate told him: “You are almost insanely jealous.” Almost?!
Explosions of jealousy -- even of sexual jealousy, by all accounts the most excruciating sort -- usually stop short of mayhem. Toohey notes that there is just enough of a stigma around jealousy to limit how openly we feel comfortable expressing it. At the same time, jealousy is a persistent enough force to make subduing it hellishly difficult, and also irresistible as raw material for art and literature. In Othello, the work most indelibly identified with the experience of jealousy, Shakespeare treats it as a passion that, once ignited, feeds itself, with imagination as the fuel -- even when the grounds for it are entirely false.
Toohey writes of the moment when an individual sees or hears something that ignites the emotion. Even when based in rock-solid fact -- with no Iago whispering baseless insinuations -- the suffering of the jealous person comes mostly from scenes and conversations running in an obsessive loop within the mind. One of the most interesting chapters of Jealousy considers how literary and artistic works present our eyes and ears as the organs that make us vulnerable to the suspicion then elaborated upon within the brain’s theater.
Perhaps that accounts for the bizarre revenge taken by the “almost insanely jealous” man mentioned earlier. And perhaps imagination is the factor distinguishing human jealousy from whatever it is animals feel when faced with rivalry. Our motives are more complex, and our memories are longer. That gives us an evolutionary advantage. But it also opens up wide vistas of potential misery, where the jealous mind is condemned to wander in circles.
Hannah Arendt’s thinking takes an unexpected turn in the final paragraph of The Origins of Totalitarianism -- rescuing a glimmer of hope from the ruins of the 20th century’s final solutions and ends-of-history. Permanent closure is not in our nature as a species. What distinguishes us, rather, is the power to begin. It is “the supreme capacity of man,” she writes; “politically, it is identical with man's freedom.”
As consolations go, this seems wanting. It sounds cold, distant, and much too abstract to be of much comfort -- though by that point, readers will take what they can get. But Arendt’s closing sentence begins to put some flesh on the idea: “This beginning is guaranteed by each new birth; it is indeed every man.” Her reference to birth is literal. Human existence changes -- or at least has the potential to change -- with each new participant. And that is true in a way that can’t be with, for example, chimpanzee existence, for our mode is defined by language, tools, and institutions. None of them are programmed into the human genome (at most, the potential to assimilate them might be) and yet they are transmitted from one generation to the next. Without the thick exoskeleton of culture and technology, we would be little more than exceptionally vulnerable primates, distinguished by hairlessness and a peculiar tendency to walk upright.
But the exoskeleton does not grow quickly. Our physical birth is, in effect, always premature; it takes several years for the rest of what makes us human to develop. And we are not passive participants in that effort. Arendt elaborated on the idea a few years later in The Human Condition:
“With word and deed we insert ourselves into the human world, and this insertion is like a second birth, in which we confirm and take upon ourselves the naked fact of our original physical appearance.”
Education, then, is an umbilicus, nourishing us throughout the more or less protracted transition from infancy to personhood. Complicating the analogy is the fact that the process has no fixed term. Nature sets limits to its gestation period; culture tends to be much less decisive. Periods of rapid social or technological change make it especially hard to define the skills and competencies required for full, functional maturity.
Robert Pogue Harrison’s Juvenescence: A Cultural History of Our Age (University of Chicago Press) concerns the still more problematic situation that emerges after the flux has gone on for decades. Harrison, the chair of graduate studies in Italian at Stanford University, delivered an earlier version of the book as a series of lectures at the Collège de France.
“An older person has no idea what it means to be a child, an adolescent, or a young adult in 2014,” he writes. “Hence he or she is hardly able to provide any guidance to the young when it comes to their initiation into the public sphere – the public sphere for which the young must eventually assume responsibility, or pay the consequences if they fail to meet that responsibility. It has yet to be seen whether a society that loses its intergenerational continuity to such a degree can long endure.”
Taken out of context, that sounds like an example of a genre of cultural criticism that might be called “the higher worrying.” With a change of date it could have been published at any point in the last century. (See also Paul Valéry’s essay “The Crisis of the Mind”: “We later civilizations, we too know that we are mortal. … We see now that the abyss of history is deep enough to hold us all.”)
Fortunately there is more to Juvenescence than that. Harrison recognizes that a gulf between generations is not the disruption of otherwise stable and healthy cultural patterns. In fact, it is a cultural pattern -- one articulated and reflected on in literature, philosophy, and political thought for at least 2500 years now. And at issue are qualities of mind more complex than anything expressed in stereotypical contrasts between youth (eager, impatient, impulsive, resilient) and age (sober, cautious, sententious, fond of taking naps).
Harrison’s thinking develops in dialogue with Hannah Arendt -- among many others, though her concept of natality, which I sketched earlier, seems especially important for him in Juvenescence. We are born into a particular society that exists before we do, and will presumably continue to do so for some while afterward, but that isn’t eternal or static. It leaves its mark on us (and we on it, to whatever degree). We are affected by its changes.
More to the point, we are part of the changes, even when we are incapable of recognizing them. (Especially then, in fact.) It’s possible to get some perspective on things -- to challenge, or at least evaluate, what we’ve come to accept and expect from the world -- through learning about the past, or formulating questions, or absorbing stories and other cultural expressions of other people.
Harrison coins the expression heterochronicity to point out that the present is never pure or self-contained. The people around us are being pushed and pulled by senses of the world (including memories and expectations) that can be profoundly different from our own, and from one another. Heterochronicity is the matrix of generational conflict, but Juvenescence explores it through readings of Antigone and King Lear rather than the contrasts between boomers and millennials.
The book is somehow both digressive and closely reasoned, and arguably it owes as much to Giambattista Vico or Stephen Jay Gould as it does to Hannah Arendt. The words “cultural history” appearing in the subtitle are not especially helpful in conveying the quality of the book, which would more accurately be called a meditation -- or, better still, an anatomy. (But then that might be even more misleading in the era of Viagra and cosmetic surgery.) It’s odd and brilliant -- clearly the product of thought given time to ripen.
The National Endowment for the Humanities on Monday announced a new grant program to promote the publication of serious nonfiction, based on scholarly research, on subjects of general interest and appeal. Winners of the grants will receive stipends of $4,200 per month for 6-12 months. A statement from NEH Chairman William D. Adams said: “In announcing the new Public Scholar program we hope to challenge humanities scholars to think creatively about how specialized research can benefit a wider public.”
Last week The New York Times published a reproduction of a poison-pen letter that Martin Luther King Jr. received 50 years ago this month, a few weeks before he accepted the Nobel Peace Prize. A couple of passages in the screed suggest it was accompanied by an audiotape of MLK in a hotel room, indulging in a round of extramarital recreation. King and his circle assumed that J. Edgar Hoover was behind the whole thing -- a reasonable guess, since bugging a hotel room counted as a sophisticated surveillance operation in 1964.
Portions of the letter have been quoted by King’s biographers for years, and Hoover’s animus against King and the rest of the civil rights movement was obvious enough. But in her essay for the Times, Beverly Gage -- the Yale history professor who found the original draft in the National Archives -- underscores something that only shows through with the whole document in front of you. It might be called an element of psychosexual frenzy.
The note -- prepared by one of Hoover’s agents, but reflecting his own preoccupations regarding King -- purported to be from a disillusioned African-American supporter. MLK’s “alleged lovers get the worst of it,” writes Gage. “They are described as ‘filthy dirty evil companions’ and ‘evil playmates,’ all engaged in ‘dirt, filth, evil and moronic talk.’ The effect is at once grotesque and hypnotic, an obsessive’s account of carnal rage and personal betrayal…. Near the end, it circles back to its initial target, denouncing him as an ‘evil, abnormal beast.’ ”
All in a day’s work at J. Edgar’s FBI. The only thing surprising about the note is the lack of any charge that King was a Communist Party stooge. Hoover’s practice of collecting information on the sex lives of prominent individuals served the perfectly straightforward function of bolstering his personal authority, of course. And it worked: he served as the bureau’s director for almost 50 years, in part because he had the goods in his hands to derail any effort to replace him. But there is also a hint of voyeurism to the director’s “Official/Confidential File.” Blackmail is power -- and power, as someone once said, is the ultimate aphrodisiac.
The director comes onstage only about halfway through Jessica R. Pliley’s Policing Sexuality: The Mann Act and the Making of the FBI (Harvard University Press). It would be excessive to call Hoover a minor figure in the book, but it certainly displaces him from his familiar status as prime mover in the bureau’s history.
Pliley, an assistant professor of women’s history at Texas State University, begins a generation or two before the creation of the Bureau of Investigation in 1908 (the name was changed in 1935), with the stresses and strains of American society in the late 19th century that ultimately gave rise to one of the laws the bureau tried to enforce: the Mann Act, which made it a felony to transport “any woman or girl” across state lines “for the purpose of prostitution or debauchery, or for any other immoral purpose.”
The law, passed in 1910, now seems almost idiomatically peculiar: As with the decision to make alcohol, tobacco, and firearms the purview of a single law-enforcement agency, most people would have a hard time explaining the logic behind it. Pliley traces its roots to a series of moral panics in the United States over the changes induced by the country’s rapid expansion and urbanization. A growing national economy brought with it an expanded market for prostitution -- the horrors of which were summed up by 19th-century reformers as “white slavery.”
That phrase expressed the moral fervor of the abolitionist spirit finding a new cause, while also carrying its share of racial overtones, especially in sensational accounts of blue-eyed girls servicing the lusts of nonwhite customers. The influx of immigrants was another concern. Women finding their way in a new country were especially vulnerable. But there was also the need to protect America's precious bodily fluids from the contaminating influence of foreign cultures, with their deplorably lax moral standards and unwholesomely exotic bedroom practices. (Despite the xenophobia, there was something to the last point. By the 1920s, any bordello trying to keep its clientele had to offer “the French,” i.e., fellatio.)
Urbanization and the automobile multiplied the temptations for other sins of the flesh, as well as the venues for committing them. The danger of a young woman being seduced and abandoned after false promises of marriage became more intense when parents knew that the cad might impregnate her in the rumble seat, then drive off to who knows where.
Pliley devotes most of the first third of her book to building up, layer by layer, a picture of the trends and anxieties of the period -- some of them overblown, but with enough examples from the legal record of women raped and then forced into sexual labor to show that it wasn’t all a matter of yellow journalism.
Pliley also discusses various laws and social campaigns that emerged in response -- efforts to shore up the norms by which sexual activity would be restricted to monogamous, legally married straight couples of the same race, who, while not necessarily born in the U.S., otherwise tried to make themselves as inconspicuous as possible. But measures to reassert control over the American libido were always one or two steps behind the social changes -- and enforcement could never be much more than episodic.
When Representative James R. Mann proposed the White Slave Traffic Act (soon to be known by his name) to Congress in 1910, its odd mandate reflected the effort to patch over some of the existing gaps in terms just broad enough to cover problems that ever-faster means of transportation were bound to create.
It met a little opposition. One Congressman expressed concern that “immoral purposes” was so vague that it might apply to horse racing and chicken fighting. Southern politicians were initially troubled that the law might infringe on states’ rights, but found themselves charged with a lack of concern with the protection of white womanhood, which settled the matter soon enough. President Taft signed the bill into law the day Congress sent it to his desk.
The burden of enforcing the Mann Act soon fell to the Justice Department’s recently formed Bureau of Investigation, which had a small staff and not much precedent for how to proceed. One early investigation seemed poised to crack the organized traffic in prostitution between bordellos in Connecticut, Louisiana, and other states. But it turned out the women operated as free agents who traveled from bordello to bordello on a circuit. Customers, madams, and sex workers alike seem to have found it a reasonably satisfactory arrangement.
Pliley points out that the whole “white slavery” discourse rested on the idea that women wanted, more or less by instinct, to establish a monogamous relationship and start a family, and would only enter or remain in prostitution under threat of violence. But the interstate pimp ring that turned out not to exist suggested otherwise. The author shows that a great deal of the caseload for agents in the bureau’s early decades involved adultery where the lovers had fled the state. The aggrieved spouse could charge the adulterous man with violating the Mann Act, despite his paramour being perfectly happy with the situation. She had been taken “across state lines for immoral purposes,” though the investigation usually ended once she had agreed to return to her husband.
Thanks to the Great Depression, the bureau was able to enter the headlines for cases involving bank robbers and gangsters -- and, a bit later, political radicals, as well as professional spies. But Pliley notes that in the late 1930s, Hoover (who joined the bureau in 1919 and became director five years later) reasserted the original understanding of the Mann Act as a measure against prostitution.
“The Bureau investigated only when the right person invited it,” she writes, “a father, a husband, or a male local law enforcement official. When the Bureau considered aggravated cases of sexual exploitation, it almost always conceived of prosecuting these crimes as defending the family (and concomitantly upholding men’s rights to control the sexuality of their dependents) rather than upholding an idea of female sexual sovereignty.”
It seems almost superfluous to mention the other implicit requirement: the man in question had to be white. The author names a few cases in which the complainant was of another color, but it seems that the most agents ever did was to fill out some paperwork, presumably to humor him.
None of this can really be attributed to Hoover, though. He executed the law, and enforced its biases, but they were established well before he joined the Bureau.
Policing Sexuality takes the story up to roughly America’s entry into World War II, but I think the surveillance of King and the vicious letter from 50 years ago take on a new aspect in light of Pliley’s research. She directs our attention away from the director to the matrix in which the bureau took shape. That challenges the habit of regarding the FBI as an institution shaped, and distorted, by his personality -- parts of which are expressed in the letter to King, written by a subordinate who knew what the director wanted.
But the letter also echoes the concerns that Pliley finds in the Mann Act well before Hoover took power. Besides hostility to African-American advancement (one undercurrent of the "white slavery" theme), it expresses a fervent, one might even say deranged, aversion to sex outside of marriage. That Hoover shared these attitudes made him a perfect fit for the job. He thrived in it, and was good at it, although “good” isn't really how it looks from here.