“All eras in a state of decline and dissolution are subjective,” said Goethe in a moment of sagely grumbling about the poets and painters of the younger generation, who, he thought, mistook wallowing in emotion for creativity. “Every healthy effort, on the contrary, is directed from the inward to the outward world.”
I didn’t make the connection with Svend Brinkmann’s book Stand Firm: Resisting the Self-Improvement Craze until a few days after writing last week’s column about it. One recommendation in particular from the Danish author’s anti-self-help manual seems in accord with Goethe’s admonition. As Brinkmann sees it, the cult of self-improvement fosters a kind of bookkeeping mentality. We end up judging experiences and relationships “by their ability to maximize utility based on personal preferences -- i.e. making the maximum number of our wishes come true.” The world becomes a means to the ego’s narrow ends, which is no way to live.
Besides offering a 21st-century guide to the Stoic ethos of disinvestment in the self, Brinkmann encourages the reader to rediscover the world in all its intrinsic value -- its fundamental indifference to anybody’s mission statement. How? By spending time in museums and forests:
“A museum is a collection of objects from the past (near or distant), e.g. art or artifacts that say something about a particular era or an aspect of the human experience. Obviously, you learn a lot from a museum visit -- but the greatest joy lies in just reveling in the experience with no thought of how to apply the knowledge and information. In other words, the trick is to learn to appreciate things that can’t be ‘used’ for some other function....
Similarly, a walk in the woods gives us a sense of being part of nature and an understanding that it shouldn’t be seen as consisting of resources that exist merely to meet human needs and desires. ... There are aspects of the world that are good, significant, and meaningful in their own right -- even though you derive nothing from them in return.”
Making similar points from a quite different angle is The Usefulness of Useless Knowledge by Abraham Flexner (1866-1959), the founding director of the Institute for Advanced Study, in an edition from Princeton University Press with a long introduction by the institute’s current director, Robbert Dijkgraaf.
The essay giving the book its title first appeared in Harper’s magazine in October 1939 -- a few months into the New York World’s Fair (theme: The World of Tomorrow) and just a few weeks into World War II. “I [am] pleading for the abolition of the word ‘use,’” Flexner wrote, “and for the freeing of the human spirit.” It must have seemed like one hell of a time for such an exercise. But the essay’s defense of the Ivory Tower was tough-minded and far-sighted, and Dijkgraaf’s introduction makes a case for Flexner as a major figure in the history of the American research university whose contribution should be remembered and revived.
The germ of The Usefulness of Useless Knowledge was a memorandum Flexner wrote as executive secretary of the General Education Board of the Rockefeller Foundation in 1921. The principles it espouses were also expressed in his work bringing Albert Einstein and other European academic refugees to the Institute at Princeton in the early 1930s. The essay defends “the cultivation of beauty ... [and] the extension of knowledge” as “useless form[s] of activity, in which men [and, as he acknowledges a few sentences earlier, women] indulge because they procure for themselves greater satisfactions than are otherwise available.”
But the impact of Flexner’s argument does not derive primarily from the lofty bits. He stresses that the pursuit of knowledge for its own sake has in fact shown itself already to be a powerful force in the world -- one that the ordinary person may not be able to recognize while swept up in “the angry currents of daily life.” The prime exhibits come from mathematics (Maxwell’s equations or Gauss’s non-Euclidean geometry took shape decades before practical uses could be found for them), though Flexner also points to the consequential but pure curiosity-driven work of Michael Faraday on electricity and magnetism, as well as Paul Ehrlich’s experiments with staining cellular tissue with dye.
“In the end, utility resulted,” Flexner writes, “but it was never a criterion to which [researchers’] ceaseless experimentation could be subjected.” Hence the need for institutions where pure research can be performed, even at the expense of pursuing ideas that prove invalid or inconsequential. “[W]hat I say is equally true of music and art and of every other expression of the untrammeled human spirit,” he adds, without, alas, pursuing the point further.
The untrammeled human spirit requires funding in any case. Although written towards the end of the Great Depression -- and published ten years to the month after the stock market crash -- The Usefulness of Useless Knowledge reads like a manifesto for the huge expansion of higher education and of research budgets in the decades to follow.
Flexner could point to the Institute for Advanced Study with justified pride as an example of money well-spent. He probably corrected the page proofs for his essay around the same time Einstein was writing his letter to President Roosevelt, warning that the Germans might be developing an atomic bomb. And as Robbert Dijkgraaf reminds us in his introduction, another Flexner appointee was the mathematician John von Neumann, who “made Princeton a center for mathematical logic in the 1930s, attracting such luminaries as Kurt Gödel and Alan Turing.” That, in turn, led to the invention of an electronic version of something Turing had speculated about in an early paper: a machine that could be programmed to prove mathematical theorems.
“A healthy and balanced ecosystem would support the full spectrum of scholarship,” Dijkgraaf writes, “nourishing a complex web of interdependencies and feedback loops.” The problem now is that such a healthy and balanced intellectual ecosystem depends on a robust economy in which considerable amounts of money are directed to basic research -- without any pressing demand for a return on investment. “The time scales can be long,” he says, “much longer than the four-year periods in which governments and corporations nowadays tend to think, let alone the 24-hour news cycle.”
That would require a culture able to distinguish between value and cost. Flexner’s essay, while very much a document from eight decades ago, still has something to say about learning the difference.
In lists of winners of the Nobel Prize for Literature, an asterisk sometimes appears next to the name of the entry for 1964. That year Jean-Paul Sartre declined the award because, among other things, a writer must “refuse to let himself be transformed into an institution.” The refusal cannot be called all that effective, in part because Sartre already was an institution (on an international scale to which, so far as I know, no author today really compares) and in part because the Swedish Academy did not give the award to anyone else that year. He remains on the list, marked as a sore winner.
That same year, a future Nobel laureate issued his third and fourth albums, The Times They Are a-Changin’ and Another Side of Bob Dylan. The second title in particular hints at the ambivalence that the songwriter formerly known as Robert Zimmerman was beginning to feel toward his most ambitious creation -- to wit, “Bob Dylan,” a persona shaped in part through his own borrowings from various folk-music legends (especially Woody Guthrie) and in part by the felt need of segments of the American public for someone to embody the voice of his generation. In acquiring an audience, he took on the weight of its expectations and demands. (Reasonable and otherwise: Dylan had what in the 1960s were not yet known as stalkers.) “By many accounts, he’d shed his boyish charm and had become moody, withdrawn and dismissive of those who either stood in his way or who wanted something from him,” writes Andrew McCarron in Light Come Shining: The Transformations of Bob Dylan (Oxford University Press). In public he sometimes had to wear a disguise, just to be left alone.
A connection can be drawn between Sartre and Dylan not only through their shared Nobel status (almost a coincidence, given the literature committee’s caprice in recent years) but because Light Come Shining belongs to a genre to which Sartre devoted a great deal of attention over the years: the psychobiography. Indeed, McCarron’s whole perspective on Dylan’s life and work shows the influence of concepts from Sartre’s “existential psychoanalysis,” especially that of the project. McCarron, who heads the religion, philosophy and ethics department at Trinity School in New York City, draws on quite a few more recent developments in psychology. But the Sartrean component is central enough -- and nowadays unusual enough -- to be striking.
Psychobiography in this sense should not be confused with the hunt for formative family relationships, childhood traumas, personal secrets, etc.: the sort of diagnosis at a distance, licensed or otherwise, practiced by many if not most biographers over the past century. It combs the available information about a subject’s life -- especially his or her own recollections and interpretations of things -- not for symptoms or concealed truths but, McCarron writes, for “the themes and structures of a life narrative that shed light on the mind and life-world behind the story.” An inaccurate memory or an outright lie may prove more revealing than what it distorts: “Appropriating, embellishing, misrepresenting, fantasizing, projecting and contradicting are all par for the course within the narrative realm. … The psychological truth that a given story conveys is considerably more valuable from a study of lives perspective than its historical truth.” The search is for the deep pattern in how the subject has understood life and tried to steer it (accurately or not) in certain directions. The psychobiographer is interested in “what [someone] succeeds in making of what he has been made,” as Sartre put it in a passage McCarron quotes.
Bob Dylan has been famous for his massive changes of direction, both in songwriting style (folk to rock to country, on to every permutation thereof) and personal identity. Early in his career he claimed to have been a carny and a hobo, among other things, and his interviews across the decades have often been performances, deflecting questions as much as answering them. Even more dramatic than his shift from anti-war and civil rights balladeer to introspective surrealist -- with the two albums from 1964 marking the transition -- was Dylan’s conversion to Christianity in the late 1970s. For a while his concerts became confrontational, both in his refusal to play old songs and in his impromptu fire-and-brimstone preaching. Whatever his religious affiliation now, the proselytizing phase did not last. He’s had his quota of marital and romantic drama and career downturns. Light Come Shining was finished before Dylan received the Nobel, and it’s possible he has not seen his last metamorphosis.
The psychobiographer, then, faces an excess of material with Dylan, not to mention more than 50 years of investigation, speculation and exegesis by obsessive fans. McCarron sifts through it and finds “variations on a repetitive plotline” coming to the fore with particular clarity at a number of points: “I have lost my sense of identity and purpose. I feel anxious and vulnerable to death and destruction. I turn to the songs and artists of my youth for guidance. I feel a redeemed sense of self and purpose. I reflect upon the change and understand it as the process of developing into who I’m supposed to be.”
One case of anxious and unmoored feelings was Dylan’s sense of being crushed by celebrity circa 1964 -- a period culminating in his motorcycle crash in 1966. (If that’s what really happened, rather than a stint in rehab, for which there seems to be more evidence.) McCarron identifies similar phases of great personal strain in the late 1970s and ’80s, followed by, respectively, his religious conversion and the major revival of creative powers evident in his songwriting in the 1990s. At each turn, he escaped desolation row by reconnecting with his musical roots: the blues, gospel, Western swing, the sounds of New Orleans, the memory of seeing Buddy Holly a few days before his death.
“All of Sartre’s studies of lives reveal features characteristic of traditional religious narratives,” wrote Stuart L. Charmé in Meaning and Myth in the Study of Lives: A Sartrean Perspective (University of Pennsylvania Press, 1984). And that makes sense insofar as what the psychobiographer looks for in a subject’s life is a kind of private mythology: the self’s innermost sense of its origins and its course. (As mentioned earlier, Sartre calls this a project; perhaps “projectile” also fits, since there’s a definite sense of movement, of throwing, or being thrown, into the future.)
If what McCarron identifies as Dylan’s psychobiographical bedrock might also be called a story of death and resurrection, that’s not necessarily because of the songwriter’s midlife experience of being “born again” and driven to evangelize. A great deal of the music that Dylan loves and immerses himself in echoes biblical language and themes, and it turns out that any number of songs about worldly pleasures and follies were written by performers who did a bit of preaching, too. Dylan absorbed musical traditions so deeply that they became part of himself, then projected them forward, in constant awareness that -- in a lyric that McCarron oddly never cites -- “he not busy being born is busy dying.”
Attributing human characteristics to animals -- as with Henri the Cat, the existentialist feline -- is anthropomorphism, plain and simple. But the word is perhaps less suitable when the creatures in question are monkeys or apes. Anthropomorphizing disregards the vast difference between an animal’s world and our own. Watching primates is another matter.
Not that the gap is smaller, but it’s tangible and fascinating in its own right. Projecting human qualities onto primates can boomerang: we are close enough on the evolutionary tree to make every point of anatomical or behavioral resemblance a challenge to our egocentricity as a species. From a certain angle, it probably looks like we’re just a species of jumped-up chimpanzee.
Two camps have formed in the study of how intelligence evolved, according to Julia Fischer’s Monkeytalk: Inside the Worlds and Minds of Primates, published in Germany five years ago and now out in translation from the University of Chicago Press. One camp takes human beings as “the analytical point of departure” and “seeks to discover which other animal groups share competencies” with us. The anthropocentric researcher then goes in search of “a plausible explanation … for when a particular trait emerged in the course of evolution.”
In contrast, what Fischer calls the “evolutionary-ecological approach” starts out from an understanding of intelligence as one aspect of how animals engage with and adapt to their environment, raising questions about how “various species solved similar problems in the course of evolution” and what circumstances foster the power to learn or to generalize from experience. (Or, conversely, what factors might inhibit that power.)
Drawing on her own work in the field and the lab as well as that of other researchers, Fischer considers it “most productive to incorporate both perspectives” -- the anthropocentric and the evolutionary-ecological -- “to develop a comprehensive understanding of animal intelligence” and of primates especially. But my impression is that she inclines more to the evolutionary-ecological camp: much of the book reflects on her observation of three species (the Barbary macaque and two kinds of baboon) in different environments, and Fischer keeps the reader aware of the natural fit between behavioral pattern or social structure and immediate issues such as predator threats and food availability.
Fischer’s recollections of field research (where “strong nerves, grit and oftentimes a morbid sense of humor are essential”) and descriptions of monkey behavior are highly engaging. The account of babysitting among Barbary macaques is especially vivid and memorable. A male will snatch a newborn (not necessarily his own progeny) from its mother for use as a status symbol and icebreaker with the guys. Then:
He can more confidently approach another male and engage in mutual grooming than if he approaches alone. When two male Barbary macaques sit together holding an infant, they often engage in a peculiar ritual, lifting the baby up high, nuzzling it and thoroughly inspecting it. They chatter their teeth, smack their lips and emit deep grunting sounds. Sometimes they will bask in the afterglow, calmly remaining beside each other, while at other times one of the males will brusquely snatch the infant up and rush off to repeat the ritual with another male.
Eventually the baby gets hungry -- and thus less amusing -- whereupon it is returned to its mother. From observation of chacma baboons, Fischer found that at the age of 10 weeks, youngsters did not respond to recordings of baboon calls. By four months, they did pay attention, without regard for what kind of call it was. And two months after that, “They reacted clearly to alarm calls and had learned to ignore contact calls, save for those produced by their mothers.” A learning process had transpired, though Fischer notes it is difficult for researchers to work out just how it happens in the wild.
Monkeytalk reports on findings concerning three dimensions of the primate mind: social behavior, cognition and communication. One of the arguments Fischer considers is “that intelligence has arisen as a consequence of life in complexly structured groups”; the other, “that intelligence and communicative ability are intimately interconnected.”
From our limb of evolutionary development, it’s tempting to consider them as all inextricably linked. The anthropocentrist would insist on a third link: one between communicative ability and social complexity, which work together like pistons in the engine of human cognition. (See Kenneth Burke’s “Definition of Man” for another formulation of this idea.) But from Fischer’s review of the evidence, the connections are much more loosely imbricated than we might think:
Primate intelligence is not limited to the social domain. Primates competently interpret objects and events in their physical surroundings and draw correct inferences about them -- or at least they do when the pertinent stimuli are not too misleading …. Yet indirect evidence and “invisible” causal connections remain completely alien to them. … While intelligence is tied to a rich representation of the social world, it by no means entails a sophisticated system of communication. At the same time, primates are evidently capable of perceiving the subtlest differences in the signaling behavior of their fellows and investing those nuances with distinctive meaning. In addition, they make use of, and adaptively respond to, a variety of information sources, such as contextual clues and signals.
Only on the final page (not counting acknowledgments and other apparatus) does Fischer make the reader fully aware of two very dark clouds hanging over the progress of knowledge concerning our fellow primates. One is that long-term research -- while necessary, since most species have long life spans -- is difficult given the scarcity of long-term funding. The other is that a majority of species are now endangered, and many are on the verge of extinction. Monkeytalk certainly leaves you with a sense of how deep that loss will run.
A monograph of long gestation, Peter J. Spiro’s At Home in Two Countries: The Past and Future of Dual Citizenship (NYU Press) is clearly not aimed at the readership of Americans who are considering an exit strategy right about now. A number of handbooks are already available, should that be your interest.
The author, a professor of law at Temple University, is more concerned with the logic of dual citizenship -- its evolution as a juridical concept and a practical option over the past 300 years or so -- than with the logistics involved in obtaining it. That said, Spiro notes that he and his children, while all born and residing in the United States, now also hold European passports. It’s a reminder of his larger point: that the tide of globalization in recent decades has turned dual citizenship from an anomalous and potentially dangerous condition into something almost commonplace -- or at least no big deal. Whether it will remain that way is another question.
The historical narrative in At Home in Two Countries has a fairly well-demarcated beginning, middle and end -- with each phase defined by how much strain dual citizenship places on the relationship between the individual and the nation-state. (Also by the potential for conflict it creates between the nation-states involved, but let’s leave that to the side for a moment.)
In the beginning, everything was reasonably straightforward. You were not the citizen of a nation-state but the subject of a sovereign. God had placed you in your respective positions -- tying you together on this earth for what were, presumably, good reasons that, in any case, were not up for discussion. It was “not in the power of any private subject to shake off his allegiance, and to transfer it to a foreign prince,” as the U.K.’s House of Lords declared in 1747, nor could “any prince, by naturalizing and employing a subject of Great Britain … dissolve the bonds of allegiance between subject and crown.”
Implicit in such an official statement of the doctrine of perpetual allegiance is the reality that it was being violated in practice. And within 30 years came the virtually unthinkable developments in the American colonies, where British subjects began “shak[ing] off … allegiance” to their sovereign without “transferr[ing] it to a foreign prince” but to their own republic instead.
Emigration was a constant drain on the sovereign’s human capital -- especially on military resources, since it provided a way to avoid conscription. So a variant of the doctrine of perpetual allegiance remained in effect even after the secular nation-state took over from divinely installed royalty. Becoming the naturalized citizen of another country did not necessarily bring an end to the expectation that you should meet the motherland’s obligations and obey its laws. Nor would your children be exempt. That could make visiting family in the old country a risky enterprise. Dual citizenship of this sort was involuntary and unintentional, and it had potentially grave diplomatic consequences if the government of an individual’s adopted country tried to intervene.
The legal and political fights so occasioned throughout the 19th and early 20th centuries make for the most interesting pages in At Home in Two Countries. Laws and treaties took shape that made expatriation, naturalization and election (i.e., the choice of nationality by someone born to parents of different citizenships) more routine and less volatile -- as much as that was possible, anyway, amid wars and international tensions.
But the other side of this stabilizing trend was -- at least, until fairly recently -- a strong sense that dual citizenship itself was something to be avoided and prevented as much as possible. At best it would be a temporary condition, to be cured with the proper paperwork and no delay.
“On the one hand,” Spiro writes, “dual nationals represented a potential spark in the tinderbox, as issues relating to their protection or responsibility for their actions could readily escalate into interstate conflict. On the other hand, in a world premised on the fact of some level of interstate conflict, dual nationals could only be presumed to do an adversary’s bidding from within.”
In the United States, the peak of what Spiro calls “the consensus opprobrium” regarding dual citizenship came in the early 1950s, with Cold War nerves at their most taut. The timing is interesting, because it coincides with a rapid decline of the issue driving much of the 19th-century debate: the concern with foreign sovereigns trying to conscript naturalized citizens traveling abroad. It was no longer a problem routinely facing the American diplomatic corps, and by the 1960s, European and Latin American countries had adopted conventions to end it as a source of friction among themselves.
“As states stopped fighting over dual nationals,” Spiro says, “there was much less incentive to combat the status.” What followed was the slow and uneven normalization of dual citizenship, as some countries ceased to require emigrants to renounce citizenship upon naturalizing elsewhere and others reaped benefits from absorbing immigrants who maintained their birthright citizenship. (“To the extent that a renunciation requirement deters naturalization,” writes Spiro, “society’s loss from the reduced rate of naturalization plainly overshadows the benefits of enforced renunciation.”)
So from the era of perpetual allegiance (in which dual citizenship was more or less a contradiction in terms) to the long decades of reducing the strains of expatriation and naturalization (when dual citizenship became an anomaly to avoid), we’ve reached the epoch of high globalization, with dual citizenship an established if not quite ubiquitous mode of transnational life. With dual citizenship “normalized as an incident of globalization,” Spiro devotes a chapter to the case for “the emergence of an articulated, protected right to the status” recognized by international law.
Here the author hits a note of expectancy that implies something almost historically inevitable: the result of forces moving in identifiable directions. From epoch to epoch, the individual gains power in determining his or her status vis-à-vis instituted authorities. At the same time, conflict among those authorities tends to subside. Nationalism will grow kinder and gentler, to be replaced in time by a higher stage of cosmopolitan citizenship, as envisioned by Immanuel Kant or Thomas Friedman, albeit in somewhat different ways.
It will take much work and goodwill, but there’s no reason why things can’t keep moving forward in a virtuous circle. The potential for retrogression is not really part of the scenario, which figures the normal global citizen of the future as someone choosing among citizenships -- rather than as a refugee without the option of claiming a single one, caught between nationalisms out for blood. In Spiro’s long-term perspective, the evolution of dual citizenship seems destined to keep on advancing, while at the moment it feels like we are at the edge of something, possibly a cliff.
“As nearly all scholars recognize,” we read in an article published in Presidential Studies Quarterly in 1983, “there is no apprenticeship or training an individual may obtain in preparation for the presidency. There is no convenient book or guide which provides a detailed step-by-step analysis of the requirements and demands of the office.”
How true! An acquaintance with the Constitution would surely be helpful, but it’s not as if you have to pass a test on it -- even one with simple questions, such as “Would requiring Muslims to register with the government follow the First Amendment (a) to the letter, (b) in spirit or (c) not at all?” (It’s surprising how far you can get in public life without being able to answer that one correctly.)
But the whole point of the paper just quoted -- “On ‘Becoming’ President of the United States: The Interaction of the Office with the Office Holder” by Robert E. Denton Jr. -- is that coping with the lack of an orientation handbook is just one of the simultaneous, urgent and inflexible demands the incoming chief executive must master, beginning almost immediately. The author is a professor of communications (and head of the department) at Virginia Tech, with a special interest in the “symbolic dimensions of the American presidency,” to borrow the title of the first of his more than two dozen books.
His vita shows that Denton has been analyzing presidential communications more or less in real time since the first Reagan administration, when “On ‘Becoming’ President” appeared. One of his earliest publications, it proves especially interesting just now -- despite having been written long before official speeches and press conferences were joined by such message-delivery formats as the tweet.
“On ‘Becoming’ President” takes its bearings from symbolic interactionism: a school of thought at the intersection of sociology and psychology, and well established even then. Its defining insight -- drawn largely from the American pragmatist philosophers, especially George Herbert Mead -- is that communication between human beings always involves considerably more than the content of a message. We also take in cues about one another’s roles, statuses, expectations and so on -- an ongoing process of learning to see oneself from other people’s vantage points.
They are doing so at the same time, of course. It can get complicated, even when the roles, beliefs and shared expectations are all reasonably clear or well established. Arguably the symbolic-interactionist researcher and the novelist or filmmaker each tries to depict and analyze the range of communicative multitasking constantly underway in life.
The Oval Office emerges as the scene where symbolic interactions of global consequence take place that are conditioned by “expectations and functions of the office [that] are often competing, conflicting and contradictory.” In addition to the president’s constitutionally specified roles (chief of state, chief executive, chief diplomat, chief legislator and commander in chief), another “five extraconstitutional roles must be recognized: chief of [his] party, protector of the peace, manager of prosperity, world leader and voice of the people.” (Denton culls these roles from the poli-sci literature of the day; the references are given in his article.)
Occupancy of the office itself confers a great deal of persuasive force in exercising any given role. But it often requires playing a number of them simultaneously, and while a certain amount of authority may be delegated, the ultimate responsibility cannot. Denton also underscores the constant burden of “vast and complex” public expectation, both to make good on promises and to exhibit a suitable combination of leadership traits and personal morality.
“Many attitudes about the presidency stem from messages received in childhood about the virtues of various presidents,” Denton writes. “Studies continually find that the president is ordinarily the first public official to come to the attention of young children. Long before children are informed about the specific functions of the presidency, they view individual presidents as exceptionally important and benign.”
He mentions researchers who found children attributing to the president qualities of “honesty, wisdom, helpfulness” and related virtues. (All of the studies Denton cites were conducted before the mid-1970s, but comparable findings appear in a book on child psychology from 2005.)
The symbolic-interactionist approach would emphasize not only presidential roles and duties (as established by the Constitution or tradition) or the pressure of public expectations (still tinged with hero fantasies from childhood, perhaps) but also the inner experience of “adopting and adapting the self to the actions of others” through years of public life. The political learning curve “is adaptive,” Denton writes, “resulting from the capacity to change self depending on political environment, beliefs, values and expectations.”
Implied by Denton’s remarks on what he calls the “political self” is some normative sense of a successful candidate’s personality and career: a self conditioned by the experience of political action and debate, informed by some modeling of another’s leadership, and skilled at anticipating the impact of both words and deeds. The tempered political self -- so understood -- will presumably be as well prepared as anyone can be to incorporate “the trappings, powers and prerogatives of the presidency” into itself. And he suggests that the process is not without its risks, even then.
Our majestic treatment of presidents causes status inequality, inflation of self-concept and distorted perception of external events. Such exposure manifests distortion of social comparison processes, ‘overidentification’ with the office and misinformed decisions …. Presidents are constantly pressured to misrepresent or distort themselves to various national constituencies. Such a continual pressure causes further misrepresentations, erosion of truth norms and self-delusion.
It appears that the author had Richard Nixon in mind as the worst-case scenario, although Nixon had more than 20 years of political experience (including one previous presidential campaign) before taking office. In any event, Denton’s paper is something to chew on this week -- and to choke down in the months ahead.
American politics in the age of Donald Trump may yet make armchair psychopathologists of us all. The stream-of-consciousness quality of the candidate’s speeches now becomes a factor in governance. In the wee hours, while most of us sleep, the president-elect tweets. Stephen Dedalus’s description of history as “a nightmare from which I am trying to awake” feels less literary by the hour.
And so the public is compelled to play analyst: armed with diagnostic checklists and extensive Wikipedian training, we try to categorize his personality (as narcissistic, borderline, histrionic, etc.) in hopes that an adequate label might provide some hint of what to expect over the next four years. It won’t, of course, although the odds that a State of the Union speech will address the president’s penis size have increased considerably.
On a more substantial matter, it’s obvious that Trump’s affinity for the conspiratorial mind-set goes beyond a mutual appreciation of talk-show host Alex Jones. It forms the bedrock of Trump’s very existence as a political figure. His aspiration to something greater than mere celebritydom began in earnest only when he made himself a major player in the pseudo-controversy over President Obama’s birth certificate. (That racist melodrama assumed, even if it did not always emphasize, the existence of shadowy forces conspiring to put a Kenyan Muslim into office for their own un-American reasons.)
The penchant of Trump and some prominent figures in his entourage to resort to conspiratorial tropes seems like yet more evidence for the perennial value of Richard Hofstadter’s “The Paranoid Style in American Politics” (1964). And while I find Michael Paul Rogin’s critique of Hofstadter persuasive, there is no denying the essay’s almost irresistible quotability. Some passages sound as if the historian were making a summary of the themes appealing to the president-elect’s base:
“America has been largely taken away from them and their kind, though they are determined to try to repossess it and to prevent the final destructive act of subversion. The old American virtues have already been eaten away by cosmopolitans and intellectuals; the old competitive capitalism has been gradually undermined by socialistic and communistic schemers; the old national security and independence have been destroyed by treasonous plots, having as their most powerful agents not merely outsiders and foreigners but major statesmen who are at the very centers of American power.”
Add complaints about political correctness for seasoning, and the reader would have no reason to think this passage is not from a report on the 2016 election.
And that is, in a way, Hofstadter’s point. He identifies the paranoid style as a recurrent if not permanent strain in American political thought and rhetoric -- but also as weaker, and less effective, over the long run, than its durability might imply. It appeals to established but aggrieved groups imagining themselves to be “the real America” under threat from change. Then the immigrants and upstarts become established, new demagogues emerge to exploit their discontent, and the whole thing starts again.
Rob Brotherton’s bookSuspicious Minds: Why We Believe Conspiracy Theories (Bloomsbury Sigma), originally published in late 2015, now appears in paperback as the Inauguration Day bleachers go up near the White House. While not a commentary on the Trump ascendancy, its timing may skew the reader’s attention in that direction even so.
The author, a psychologist and science writer, is more concerned than Hofstadter with the particular cognitive processes involved in the conspiratorial mentality. Rather than pointing to a paranoid mood that ebbs and flows with political currents, Brotherton treats conspiracy theories as part of a continuum of patterns of thought and behavior that are extremely common and not, for the most part, paranoid.
Much of it comes down to pattern recognition (the brain’s incessant but not always reliable drive to find order) combined with a tendency to overestimate the validity or completeness of the available information. Brotherton writes, “When we’re uninformed -- and we’re all ignorant about a lot of things -- our brain indiscriminately uses whatever is at hand to plaster over the intellectual blind spot.” The author adduces a number of lab experiments showing this, including research that suggests cognitive strain tends to heighten the capacity to imagine structure where none exists.
“By painting conspiracism as some bizarre psychological tic that blights the minds of a handful of paranoid kooks,” he writes, “we smugly absolve ourselves of the faulty thinking we see so readily in others. But we’re doing the same thing as conspiracists who blame all of society’s ills on some small shadowy cabal. And we’re wrong. Conspiracy thinking is ubiquitous, because it’s a product, in part, of how all of our minds are working all the time.”
This is persuasive, up to a point. But somewhere far beyond that point are whole milieus of people whose pattern-recognition software got stuck in the conspiratorial program and can’t be reset. There's David Icke, for one, an internationally famous author who believes that most political, social and cultural changes of recent decades are the work of shape-shifting interdimensional reptile people. (Icke makes Alex Jones sound like Walter Cronkite.)
Between Hofstadter’s cyclical rise and fall of paranoid politics and Brotherton’s rather genial vision of everyone being conspiracy-minded at one time or another, it’s almost possible to imagine the next few years as something other than cataclysmic. But I’m not entirely persuaded. Suppose this is just the beginning. After all, we still have no idea where the incoming administration stands on shape-shifting interdimensional reptile people. The president-elect hasn’t even uttered the words “shape-shifting interdimensional reptile people.” What is he trying to hide?