Books

Essay on Watergate and Cass R. Sunstein's 'Impeachment: A Citizen's Guide'

From the perspective of a fourth-grader, Watergate was a real nuisance. You would settle down in front of the TV in late afternoon to watch Star Trek reruns (or whatever) only to find congressional hearings instead, with boring old guys going on and on about … what? For the longest time, it wasn't clear if even the adults knew. Something to do with plumbers?

And it seemed endless. At this distance, I find it hard to believe barely 15 months passed between the start of the hearings in May 1973 and Richard Nixon's resignation in August 1974. As with a building remembered from childhood, it felt a lot bigger, and indeed my memory of the Watergate era is more of a place than a period. The space was crowded: certain names (G. Gordon Liddy, Archibald Cox, Deep Throat) were always in the news, and the demand that Nixon turn over his tape recordings in particular made for a heavy, brooding atmosphere.

It was a crisis -- though by the standards of today's 24-hour news cycle, the pace was glacial. In time it became obvious that the one thing at stake, fundamentally, was whether or not the president was telling the truth. And by 1975, my jaded sixth-grade self could look back with embarrassment at how naïve I had been in having ever assumed that he was.

That cynicism was not especially precocious. With hindsight, I think Gerald Ford made it inevitable. Shortly after Nixon resigned and Ford was sworn in, the new president assured us that "our long national nightmare [was] over," with the whole experience going to show that the Constitution worked. One month later, he gave Nixon a pardon, and the question of whether a quid pro quo was involved was moot. From then on, it seemed like common sense to assume that political figures and institutions were untrustworthy, if not totally corrupt, until proven otherwise.

The Pew Research Center's recent report on U.S. public opinion concerning the government shows that this was indeed a common assessment at the time and one that Americans have, over the long run, taken ever more as a given. Analyzing polling data gathered between 1958 and 2017, the study shows that confidence in Washington peaked in 1964, during LBJ's first year in office, when 77 percent of respondents trusted the government to do the right thing just about always or most of the time, but had fallen to 62 percent by the time he left. The decline became even more precipitous over the Nixon years and slowed only a little after that.

There have been upticks and even sustained upturns, particularly during the Reagan and Clinton administrations. But a graph of public opinion over the past half century makes public trust in the government look something like the bounce of a rubber ball -- never returning to its pre-Watergate level, and standing at 20 percent as of early April.

Since then, President Donald Trump's firing of FBI Director James Comey in May has set in motion what the comedian John Oliver calls “Stupid Watergate” -- that is, “a scandal with all the potential ramifications of Watergate, but where everyone involved is stupid and bad at everything.” Leaving matters of quality control aside, what I've noticed over the past few months is not so much historical parallels as an echo of the experience described earlier: the sense of being inundated by news that it would be nice to tune out for a while but that you can't not pay attention to for very long. Or of being stuck in a maze, with no certainty that there is a way out.

Cass R. Sunstein's Impeachment: A Citizen's Guide (Harvard University Press) never discusses the current president specifically; in fact, it never mentions Trump at all, unless you count a newspaper headline cited in the notes. The author is a professor at Harvard Law School, and the book itself is pretty didactic. He is not trying to rally the public to any cause apart from his belief that "the impeachment clause was among the most important parts of the Constitution" and "a kind of unused key that might unlock the whole republic."

That said, the volume's first wave of readers will skew anti-Trump and will probably read A Citizen's Guide as promising a do-it-yourself handbook. If so, they might be advised to turn directly to chapter seven, in which Sunstein gives 21 examples of presidential actions that might stir up calls for impeachment, each accompanied by a brief analysis of whether or not it constitutes an impeachable offense. The earlier chapters explain the history and principles involved in making those calls, but it might be better to pick up an intuitive sense of the issues by starting with the case studies. Also worth an early look is chapter eight, on the 25th Amendment's provisions for removing a president rendered "unable to discharge the powers and duties of his office."

Readers eager to get Stupid Watergate over with will not find a how-to manual in those two chapters. On the contrary, Sunstein leaves you with a good understanding of the difficulties involved in removing a president by impeachment and a sense that doing so under the 25th Amendment is unlikely this side of the commander in chief falling into a coma. At the same time, the book is a tribute to the Founding Fathers' wisdom in providing for a remedy in case someone who is vicious, lawless and unfit should somehow end up in power. What they could not foresee, of course, is that popular distrust of the government might culminate in the election of someone who made his own nihilism a campaign platform. From here, it is hard to see the exit.

Review of R. Alexander Bentley and Michael J. O’Brien's 'The Acceleration of Cultural Change: From Ancestors to Algorithms'

In the 2010 film The Social Network, Sean Parker, the first president of Facebook, invokes the spirit of manifest destiny, 21st-century style: “We lived on farms, then we lived in cities, and now we are going to live on the internet!”

To be clear, this piece of dialogue comes from an actor playing Sean Parker, speaking from an Oscar-winning screenplay by Aaron Sorkin -- who probably came up with the line, as it does not appear in Ben Mezrich’s book The Accidental Billionaires (the film’s source material). In any case, it sounds like the quintessential visionary hype of a Silicon Valley entrepreneur -- and in a recent interview, the real-life Parker expressed misgivings about ever indulging in social-media boosterism.

"The thought process that went into building these applications," he said, "was all about: 'How do we consume as much of your time and conscious attention as possible?'" The solution was to create “a social-validation feedback loop” by “exploiting a vulnerability in human psychology” at the neurochemical level: “a little dopamine hit every once in a while” from seeing that “someone liked or commented on a photo or a post or whatever.” When a network “grows to a billion or two billion people … it literally changes [people’s] relationship with society, with each other … God only knows what it's doing to our children's brains."

Well, it’s a little late now! one thinks. You might’ve thought of that before you went and destroyed human civilization. The image of hardy pioneers forging a new life on the frontier does not serve as a plausible analogy for the future to be extrapolated from his remarks. The line at a methadone clinic -- narrowly focused but comfortably numb -- might be a better fit.

When the interview with Parker made the news last week, I was in the middle of reading The Acceleration of Cultural Change: From Ancestors to Algorithms by R. Alexander Bentley and Michael J. O’Brien, published by MIT Press. The authors take a long view of cultural change in which the more recent developments -- social networks, memes, big data -- come into perspective as extreme cases of the creative and disruptive potentials of our tool-oriented species. (Some of our primate cousins are able to use tools, but we are uniquely dedicated to doing so, as well as dependent on that ability.)

Bentley is a professor of anthropology at the University of Tennessee and O’Brien a professor of history at Texas A&M University, San Antonio. In collaboration, they prove fairly insouciant about disciplinary boundaries -- and willing to go wherever the muse of popularization leads them, including a couple of pages spent on Gilligan’s Island.

Trimming back the digressions and abstracting from the examples, we find an argument along roughly the following lines.

An important study in the early 1990s found a strong relationship between the size of primate brains (in particular, of the neocortex) and the number of members in the social groups typical of their species. On the basis of that correlation, and using the fossil remains of early hominids, it was possible to estimate that our distant ancestors tended to live in bands of about 150 members -- a figure called “Dunbar’s number,” after its discoverer, who identified it “as being the typical limit of real -- defined as meaningful -- social relationships that a person will have.” Decades later, studies have found that the pool of friends or followers on social media with whom an individual has significant or sustained interactions tends to come in at Dunbar’s number, more or less.

Putting that finding to the side for a moment, it’s obvious that our ancestors eventually began to form social groups many orders of magnitude greater in scale than the original Dunbar number-sized cohorts. The latter had developed weapons for hunting that, the archaeological evidence suggests, remained unchanged for hundreds of thousands of years -- a pace that, the authors note, was literally slower than the movement of glaciers.

“If the tools had been constrained by brainpower,” they write, “we would expect changes in parallel with brain size, yet we don’t see them. Alternatively, maybe hominids needed larger groups for technological change to occur. Yet this applies to more complex technologies, where it pays to learn from the expert in the group … In the case of Pleistocene stone tools, probably every individual could knap a hand ax without necessarily learning from an expert -- if there even was one.”

The point here is not to choose a single cause that would explain the quantum leap but to identify the factors (increasing brainpower, group size, technical innovation, training) that reinforced each other as early human society began growing beyond the Dunbar limit. Culture began not just to change but also to evolve: “Evolution means there are different variants transmitted between generations, over which these variants are sorted as some are transmitted more frequently than others.”

For most of human history, the winnowing process tended to favor forms of knowledge and technology that were specific to a place, region or group -- with irrelevant details or inessential changes stripped away and the important elements rendered compact, formulaic and memorable. Innovations can be transmitted, as well, but only after proving themselves and finding a place in a deep order.

“Without the kind of vetting that has long typified cultural transmission,” the authors say, “culture is bound to accumulate a lot of junk.” While oral transmission “prunes away superfluous details, rendering it more learnable and relevant,” nothing of the sort happens with, say, a viral video: it “gets copied identically millions of times without being streamlined by the transmission process and actually accumulates more junk in the form of comments and metadata.”

Furthermore, the act of looking something up online generates data that search engines use to calculate the value or pertinence of information for others who might do a similar search: “Within a shallow time depth, algorithms guide human followers like schools of herring, using popularity as a beacon.”

At the same time, the size of communities of meaningful contact among individuals hovers somewhere around Dunbar’s number. The authors are not alarmists or pessimists; they see the situation as a challenge, not a disaster. “To anticipate the future of cultural evolution,” they write, “think about populations, not individuals, and certainly not yourself. How will variation, transmission and selection be affected? … What wave should we be surfing now, and how will we find the next wave after that?” In any event, don’t ask Alexa.

Review of Steven E. Schier and Todd E. Eberly's 'The Trump Presidency: Outsider in the Oval Office'

Written at top speed and with a bare minimum of punditry, Steven E. Schier and Todd E. Eberly’s The Trump Presidency: Outsider in the Oval Office (Rowman & Littlefield) is the first book from a scholarly press about Trump’s first year in office. Or about the first half of that year, at least, though it felt a lot longer at the time.

The Trump Presidency is an analysis, not a chronology, but a careful reading suggests that it was substantially finished by the time John Kelly became chief of staff at the end of July. You get a sense of the authors working feverishly, borne up by the prospect of getting a little rest before the new semester began. (Schier is professor of political science at Carleton College, and Eberly is an associate professor of political science and public policy at St. Mary's College of Maryland.) And it was a reasonable expectation. The president would be on vacation and Congress in recess in August; the Washington press corps looked forward to getting eight consecutive hours of sleep on a regular basis for the first time since April 2016.

But the defining quality of the Trump era is that it never lets you forget about it for very long. August brought renewed tensions with North Korea, violence between white supremacists and antiracist protesters in Charlottesville, and a string of resignations from presidential advisory councils in protest of Trump apportioning equal blame for that violence “on both sides.” August also saw the departure of two advisers, Steve Bannon and Sebastian Gorka, whose official responsibilities were never clear, but who in practice served as the president’s ideologues in residence. Each claimed to have left voluntarily, though leaks at the time suggested otherwise.

In keeping with the format for citing online material, Schier and Eberly indicate when each item they refer to was accessed. Just a few date from the first half of August, but there is a kind of surge on Aug. 21 and 22, when at last their labors were complete -- seven months after inauguration, not quite to the day.

Individually and in collaboration, the authors have written and edited dozens of books on presidential campaigns and administrations from Bill Clinton's onward. As political scientists, they look for the patterns and regularities beneath the course of events -- and on that point, it bears noting that their focus here is mainly on Trump in office rather than as a candidate. While taking up the issue of whether 2016 was a transformational or realignment election, Schier and Eberly say relatively little about the relationship between Trump’s particular variety of nativism and protectionism, on the one hand, and the development of the Republican Party since the Reagan years, on the other.

Instead, they emphasize the established pattern of “systematic reaction” in which growing voter frustration regularly drives each party in and out of control of the White House and of Congress. Since 1952, there has been only one case of a party holding the White House for more than two terms. With that in mind, Trump’s victory seems like a regular swing of the pendulum -- except for the anomalous fact that it was Hillary Clinton, the “insider,” who won the popular vote by a comfortable margin. The “outsider,” having won the Electoral College majority, took office with what was (and remains) the lowest public-approval rating for a new president since surveys began in the 1930s.

While Trump took office with the Republicans holding unified control of Congress, the advantage was minimal, since he had run as an outsider to the party as well as to Washington. He arrived, the authors say, “with a political victory that shocked the world but produced virtually no political capital for the new president.” And their account of his first months in office might be read as a case study in how someone can manage, against all odds, not to accumulate any. Most of the legislation he signed in his first six months consisted of regulatory rollbacks under the Congressional Review Act; 60 percent of those measures were one page long.

Otherwise, the lack of a working relationship with congressional leaders has obliged him to rely on “the unilateral tools of presidential proclamations, memoranda and executive orders to reverse additional regulations under his direct authority.” Also, tweeting. Every chapter of The Trump Presidency opens with one or two items from his Twitter feed, and the authors quote them to document various points. As a medium, Twitter has allowed him to carry on a really impressive range of feuds with individuals, institutions, nations and provisions of the United States Constitution. None of which, by Schier and Eberly’s account, earns him much credit toward the conduct of political business.

A particularly interesting section of the book looks at Trump through the distinction (borrowed from the political scientist Paul Quirk) between two categories of presidential leadership: the self-reliant and the minimalist. The former, exemplified by Franklin D. Roosevelt, “assumes significant responsibility for all decision making” and “requires a substantial degree of skill as well as a clear understanding of how government and policy work” -- or at least “the ability to ingest and digest significant amounts of information in order to make decisions.” As a description of Trump, this seems wide of the mark.

Trump might be assumed to fall under the other heading. The minimalist is “not required to be particularly well versed in the specifics of governing or policy” and “acts very much as a chairman of the board responsible for setting general policies and goals.” The example suggested is Ronald Reagan, known for “delegating to his cabinet and other key personnel the responsibility of determining specifics.” But successfully delegating responsibility would seem to preclude saying your secretary of state is “wasting his time” with diplomacy or holding a grudge against the attorney general and making sure everyone hears about it.

The authors say, per their colleague Quirk, that the contemporary presidency is too complex a job for either the self-reliant or the minimalist type to do adequately. Trump, they write, “embodies elements of the minimalist and self-reliant models.” They express hope that he will acquire what Quirk terms "strategic competence" -- a blend of "self-discipline, a knowledge of one's limitations and abilities, and a willingness to accept the advice and counsel of those who may hold the expertise he lacks." Nothing in the book suggests that anything of the sort is possible; the hope stands as an example of wishfulness at its most desperate.

Editors discuss new book on diversity in Christian higher education

Editors discuss new book about a push by many Christian colleges to diversify their institutions.

Review of Roderick A. Ferguson’s ‘We Demand: The University and Student Protests’

The assertive, present-tense title makes Roderick A. Ferguson’s We Demand: The University and Student Protests (University of California Press) sound like a manifesto or an organizing manual. Likewise, with the book’s opening salutation to “you, the student who believes that we can or should do better than the world that we’ve inherited … in which people are thrown into a chasm full of dangers, cruelties and inequalities,” the implied audience is likely to be in search of advice about how to turn conviction into action -- or an informed account of how others have done so in the past. Ideally, both.

We Demand provides neither, although the author (a professor of American studies at the University of Illinois at Chicago) obviously wants to be helpful and encouraging in other ways. The book is neither long nor opaque, but the exact nature of what it brings to the activist’s card table is not immediately clear. I will venture a guess on that score later, but first an outline of the major points.

We Demand makes a historical argument about the student protests of the 1960s and ’70s but spends little time on particular issues, events or movements of that period. (References to student activism before or after that period are even more perfunctory.) Instead, Ferguson is concerned with those protests as part of a wider -- and ongoing -- challenge to the order of American life by those previously excluded or marginalized: “communities made up of immigrants, people of color, women, indigenous people, queers, transgender persons and disabled people.” (Besides American studies, Ferguson is also professor of gender and sexuality studies and African-American studies.)

The university was not just one of the tables at which these communities were demanding a place. With its countless links to political, economic, military, scientific and other institutions, the postwar American university is a site of public life par excellence -- giving student protest far more potential social impact than the number of participants alone could register.

“In this context,” Ferguson writes, “the makeup of university knowledge, faculty hires and student admittance takes on both political and intellectual importance … Plainly put, when students challenged the university, they were calling for a new social and intellectual makeup of the university and for a new social order in the nation at large.”

But hegemony can’t be dismantled in a day, or even a decade -- and the aftermath of the late-1960s peak in social upheaval was a series of efforts to delegitimize or suppress student protest, including through violence. Ferguson’s reading of the Nixon-era “Report of the President’s Commission on Campus Unrest” (1970) and other documents traces how university administrations responded to pressures from outside to contain the disruptions. University presidents started lobbying state legislatures to permit them to set up on-campus police departments; most colleges and universities with more than 2,500 students now have one. Diversity became an area for administrative specialization -- something to be managed, rather than a challenge to orderly functioning.

The product, the author says, was “an institution that dramatically transformed itself from a simple and straightforward academic enterprise into an administrative system that has become more and more state-like, with apparatuses that try to ensure order by both persuasion and force.”

Hence the order of things that “you” (the concerned student addressed at the beginning of the book and again at the end) find on campuses now -- on not quite the 50th anniversary of the spring semester of 1968 that shook the world. “You will more than likely graduate with not only a degree but a financial debt that will probably follow you for years to come,” the author says, though few will need reminding.

The level of social tension that Ferguson understands to have driven the student protests of an earlier era seems to ratchet up continuously. The status quo Ferguson describes, however, amounts to Dystopian U -- managed and surveilled too well for rebels to get much traction.

But We Demand is not an apathetic book by any means. The references to student protests of the recent past are fleeting but hopeful. And it has implications that are easy to overlook -- at least they escaped my attention on first reading. For Ferguson’s understanding of earlier campus militancy and of its containment rests on a clear sense of the university as a crucial part of the social machine -- a spot where cogwheels mesh and circuits connect. Student movements are distinctly positioned to be able to identify, in his words, “the connections among systems of power that [arise] between academy, government and corporation.” A little disruption sometimes goes a long way. Full awareness of the connections is worth cultivating. Once that awareness reaches a certain intensity, the demands will pretty much issue themselves.

Author discusses how racism is perpetuated in elite colleges

Author discusses how college diversity programs can result in students overattributing success to factors like merit and hard work, while ignoring systemic or institutional problems.

Author discusses his new book on why liberal arts majors make great employees

Author discusses his new book about why those who major in liberal arts disciplines -- and the humanities in particular -- make great employees.

A university president discusses her new book on how colleges can prepare students for success

Bentley University president discusses her new book on preparing students for success.

Review of Nathan Kravis's 'On the Couch: A Repressed History of the Analytic Couch From Plato to Freud'

Quite a few naked women fill the pages of Nathan Kravis's On the Couch: A Repressed History of the Analytic Couch From Plato to Freud (MIT Press). Even those depicted as fully clothed tend to be in conspicuous dishabille or sunk in languor, if not half-asleep. A number are mythological figures; none are patients. The painter or photographer (or, in the case of the oldest images, mosaic maker or relief carver) gazes at them full-on -- unlike the psychoanalyst, who normally sits perpendicular to the couch, with an ear turned to the analysand.

This arrangement, established by Freud himself, turned into one of the definitive protocols of “orthodox” analysis (along with 50-minute sessions conducted four or five times a week) as well as the premise of countless New Yorker cartoons. That a piece of furniture has become practically synonymous with psychoanalysis seems odd given how little the founder said about the couch. And when he did, as in the paper “On Beginning the Treatment” (1913), it’s clear that the practice originally had “a personal motive, but one which others may share, namely: ‘I cannot put up with being stared at by other people for eight hours a day (or more).’”

OK, but still: Why a couch? For as Kravis points out, having the patient sit in a chair, suitably angled, would presumably do the trick just as well. Freud refers to having the patient assume a reclining position as a “ceremonial” aspect of the treatment. Someone unimpressed by psychoanalysis’s claim to the status of a science might well take this as an inadvertent admission that the technique is grounded in ritual, not research. In that case, the unconscious and the superego are mythological figures, too, just like Venus and Cupid.

The efficacy and centrality of the couch have come under scrutiny within the analytic community itself, and Kravis’s lavishly illustrated book contributes to that effort. The author, a clinical professor of psychiatry at Weill Cornell Medical College, is also a supervising analyst at the Columbia University Center for Psychoanalytic Training and Research. He notes that many analysts now regard the couch as “nonessential” and “no guarantor of an analytic process,” and even “a relic of a more authoritarian era, a power play on the part of the analyst that unnecessarily regresses or infantilizes the analysand.”

On the Couch neither endorses those criticisms nor categorically rejects them. Kravis treats both automatic conformity to tradition and a dogmatic rejection of it as instances of “a frozen, rigid, doctrinaire or overly detached stance on the part of the analyst rather than an effort to sense what’s going on [in treatment] and adapt accordingly.” The challenge is to comprehend how the couch entered the analytic tradition in the first place -- following some 2,500 years of development as part of material culture.

The need for a piece of furniture designed for comfort during the hours one spends neither toiling nor sleeping was not especially urgent for most of humanity through most of its history. “If today the bed is associated primarily with sleep and sex,” writes Kravis, mentioning two matters of great psychoanalytic interest, “in earlier centuries it was strongly associated with grandeur and privilege, just as the couch was associated with ease and luxury in Greco-Roman culture.” The couch was more status symbol than convenience: the wellborn and prosperous would dine and socialize on their couches -- inspiring envy and, when possible, imitation by those lower in the social hierarchy. Among the early images that Kravis presents is a funerary sculpture depicting a woman reclining in the company of her children and a slave who, of course, stand. A number of depictions of the Last Supper less well-known than Leonardo da Vinci’s portray Jesus and the disciples lying on their sides on couches around a table.

The fall of the Roman Empire brought “the decline of reclining dining,” to borrow the author’s once-in-a-lifetime phrase, but imagery from later periods continued to associate the couch with luxury and social position. It also provided room for the pleasures of reading and conversation.

“Newly emerging ideals of comfort were becoming inseparable from notions of social intimacy,” Kravis writes. “These are among the changes that provided the cultural conditions necessary for the eventual emergence of psychotherapy in general, and psychoanalysis in particular.”

Finally, war casualties and the spread of tuberculosis in the 19th century spurred the mass production of recliners and daybeds, including furniture that could be adjusted to provide “as many gradations as possible between sitting and lying.” Freud trained and practiced as a doctor while this market was growing, and it seems significant that the German word he used for couch (or “sofa,” in the earliest English translation) literally means “resting” or “calm” bed, with connotations of the sanatorium or “rest cure” rather than the bourgeois drawing room.

“For it even to become thinkable to lie down in the presence of another person for the purposes of talking to him or her,” writes Kravis, “there had to be an evolution in attitudes toward the private and the social reflected in the history of recumbence -- its social meanings and contexts.” And ultimately, it is the combination of intimacy and distance associated with analysis that he wants to preserve, whether or not the couch facilitates it in a given case. The analytic couch has a rich cultural heritage, which it is possible to defend without succumbing to a fetish.

But one of the articles in his bibliography is a reminder that it is not the public who need persuading. “The Couch as Icon” by Ahron Friedberg and Louis Linn appeared in The Psychoanalytic Review five years ago. The authors did a literature review of “over 400 papers on the usage of the couch in analysis.” And while they found no real consensus on its value or effects, clinical reports suggested that with some patients, the lack of eye contact could be a problem or even dangerous. A depressed person with limited social contacts “may come to an analytic hour and find his loneliness reinforced,” for example. Someone dealing with trauma or early loss can find the experience not just alienating but seriously damaging.

At the same time, Friedberg and Linn reported that a considerable number of their colleagues continued to think of the “orthodox” arrangement as the gold standard of the profession. It’s what they went through and what their training analysts did before them. And patients who do not benefit from it have, in effect, failed the legacy, not vice versa. On the Couch is an interesting and attractive perspective on the roots of an analytic tradition, but parts of that tradition sound downright compulsive.

Review of Debora Diniz's 'Zika: From the Brazilian Backlands to Global Threat'

For the moment, anyway, the subtitle of Debora Diniz’s Zika: From the Brazilian Backlands to Global Threat (Zed, distributed by University of Chicago Press) looks like the warning sign, glimpsed in a rearview mirror, for a danger no longer on the road ahead.

At least the Biomedical Advanced Research and Development Authority (part of the U.S. Department of Health and Human Services) seems to think the menace is behind us. On Sept. 1, the French pharmaceutical manufacturer Sanofi’s website announced that BARDA’s latest “assessment of all Zika-related projects they are funding” had downgraded the priority of creating a vaccine. Work would continue, Sanofi said, “to a point where development would be indefinitely paused but could be restarted if the epidemic re-emerges.”

Decision making at that level is stratospherically removed from the world that Diniz, a professor of bioethics at the University of Brasilia, set out to document in her historical and ethnographic study. Much of the fieldwork was done in “the Northeast” of Brazil -- a region much less urbanized than the southern half of the country, which is, she notes, the “site of the top universities and research centers.” When Zika was reported in the Americas for the first time in 2015, it was in the Northeast, in a virulent strain that proved to be especially dangerous to pregnant women. The number of infants born with microcephaly increased a staggering 2,023 percent over the previous year.

Both the arrival of the virus itself and its role in the birth defect were first announced “in the lilting accent of the Northeast,” as Diniz puts it, by “clinicians and practitioners of bedside medicine -- where the focus is on the doctor-patient relationship -- many of whom were unknown to the public at large or the academic community.” But by the time the World Health Organization declared a public health emergency of international concern in February 2016, an official narrative had taken shape that stressed the contributions of established medical authorities, rather than the front-line health-care providers.

Funny how that works. More is at issue than regional pride. The initial study showing the Zika virus in the amniotic fluid of women whose babies had congenital malformations was done by a doctor who -- in part because of her bedside rapport -- was able to persuade the mothers to donate the bodies for research. The overwhelming majority of such mothers “were also poor Northeasterners, many of them farmworkers, many black and brown, women whose faces and biographies are usually all but invisible in Brazil’s socially stratified world.”

Nor did that change when the Ministry of Health issued its press release announcing confirmation of the link between the virus and microcephaly. It managed to erase both the doctor’s role and the generosity of the anguished mothers. The medical institute that confirmed the doctor’s findings -- as goes almost without saying -- was credited by name.

“This immediate replication of the reigning social stratification within Brazilian science should come as no surprise,” Diniz writes, “because a single event cannot in itself be expected to undermine unequal patterns of resource distribution.” Social and intellectual capital takes care of its own.

Diniz’s book was first published in Brazil last year, with the dates in her citations suggesting the manuscript was done about three months before the director-general of the World Health Organization declared an end to the state of emergency last November. A second outbreak of Zika in Brazil had been less catastrophic in its impact; the country was on the warpath against the mosquito that had been the virus’s major vector. And in early 2017, WHO affirmed the “need to manage Zika not on an emergency footing, but in the same sustained way we respond to other established epidemic-prone pathogens.”

An inhabitant of one of the highly industrialized or metropolitan regions of the world is apt to take this narrative arc as bending toward progress. But the most recent Situation Report issued by WHO states that the disease “continues to spread geographically to areas where competent vectors are present,” and it lists 61 countries showing “new introduction or reintroduction [of Zika] with ongoing transmission.”

And a recent New Yorker article points out that “the geographic distribution of Aedes aegypti, the mosquito species that transmits Zika, continues to spread into unexpected parts of North America and Europe.” Among them is Washington, where it “appears to have survived four consecutive winters.” So hardy a virus and so competent a vector will probably be menacing us for a while yet.
