Attributing human characteristics to animals -- as with Henri the Cat, the existentialist feline -- is anthropomorphism. But the word is perhaps less suitable when the creatures in question are monkeys or apes. Anthropomorphizing disregards the vast difference between an animal’s world and our own. Watching primates is another matter.
Not that the gap is smaller, but it’s tangible and fascinating in its own right. Projecting human qualities onto primates can boomerang: we are close enough on the evolutionary tree to make every point of anatomical or behavioral resemblance a challenge to our egocentricity as a species. From a certain angle, it probably looks like we’re just a species of jumped-up chimpanzee.
Two camps have formed in the study of how intelligence evolved, according to Julia Fischer’s Monkeytalk: Inside the Worlds and Minds of Primates, published in Germany five years ago and now out in translation from the University of Chicago Press. One camp takes human beings as “the analytical point of departure” and “seeks to discover which other animal groups share competencies” with us. The anthropocentric researcher then goes in search of “a plausible explanation … for when a particular trait emerged in the course of evolution.”
In contrast, what Fischer calls the “evolutionary-ecological approach” starts out from an understanding of intelligence as one aspect of how animals engage with and adapt to their environment, raising questions about how “various species solved similar problems in the course of evolution” and what circumstances foster the power to learn or to generalize from experience. (Or, conversely, what factors might inhibit that power.)
Drawing on her own work in the field and the lab as well as that of other researchers, Fischer considers it “most productive to incorporate both perspectives” -- the anthropocentric and the evolutionary-ecological -- “to develop a comprehensive understanding of animal intelligence” and of primates especially. But my impression is that she inclines more to the evolutionary-ecological camp: much of the book reflects on her observation of three species (the Barbary macaque and two kinds of baboon) in different environments, and Fischer keeps the reader aware of the natural fit between behavioral pattern or social structure and immediate issues such as predator threats and food availability.
Fischer’s recollections of field research (where “strong nerves, grit and oftentimes a morbid sense of humor are essential”) and descriptions of monkey behavior are highly engaging. The account of babysitting among Barbary macaques is especially vivid and memorable. A male will snatch a newborn (not necessarily his own progeny) from its mother for use as a status symbol and icebreaker with the guys. Then:
He can more confidently approach another male and engage in mutual grooming than if he approaches alone. When two male Barbary macaques sit together holding an infant, they often engage in a peculiar ritual, lifting the baby up high, nuzzling it and thoroughly inspecting it. They chatter their teeth, smack their lips and emit deep grunting sounds. Sometimes they will bask in the afterglow, calmly remaining beside each other, while at other times one of the males will brusquely snatch the infant up and rush off to repeat the ritual with another male.
Eventually the baby gets hungry, making it less amusing, whereupon it is returned to the mother. From observation of chacma baboons, Fischer found that at the age of 10 weeks, youngsters did not respond to recordings of baboon calls. By four months, they did pay attention, without regard for what kind of call it was. And two months after that, “They reacted clearly to alarm calls and had learned to ignore contact calls, save for those produced by their mothers.” A learning process had transpired, though Fischer notes it is difficult for researchers to work out just how it happens in the wild.
Monkeytalk reports on findings concerning three dimensions of the primate mind: social behavior, cognition and communication. One of the arguments Fischer considers is “that intelligence has arisen as a consequence of life in complexly structured groups”; the other, “that intelligence and communicative ability are intimately interconnected.”
From our limb of evolutionary development, it’s tempting to consider them as all inextricably linked. The anthropocentrist would insist on a third link: one between communicative ability and social complexity, which work together like pistons in the engine of human cognition. (See Kenneth Burke’s “Definition of Man” for another formulation of this idea.) But from Fischer’s review of the evidence, the connections are much more loosely imbricated than we might think:
Primate intelligence is not limited to the social domain. Primates competently interpret objects and events in their physical surroundings and draw correct inferences about them -- or at least they do when the pertinent stimuli are not too misleading …. Yet indirect evidence and “invisible” causal connections remain completely alien to them. … While intelligence is tied to a rich representation of the social world, it by no means entails a sophisticated system of communication. At the same time, primates are evidently capable of perceiving the subtlest differences in the signaling behavior of their fellows and investing those nuances with distinctive meaning. In addition, they make use of, and adaptively respond to, a variety of information sources, such as contextual clues and signals.
Only on the final page (not counting acknowledgments and other apparatus) does Fischer make the reader fully aware of two very dark clouds hanging over the progress of knowledge concerning our fellow primates. One is that long-term research -- while necessary, since most species have long life spans -- is difficult given the scarcity of long-term funding. The other is that a majority of species are now endangered, and many are on the verge of extinction. Monkeytalk certainly leaves you with a feeling of the depths that loss will mean.
A monograph of long gestation, Peter J. Spiro’s At Home in Two Countries: The Past and Future of Dual Citizenship (NYU Press) is clearly not aimed at the readership of Americans who are considering an exit strategy right about now. A number of handbooks are already available, should that be your interest.
The author, a professor of law at Temple University, is more concerned with the logic of dual citizenship -- its evolution as a juridical concept and a practical option over the past 300 years or so -- than with the logistics involved in obtaining it. That said, Spiro notes that he and his children, while all born and residing in the United States, now also hold European passports. It’s a reminder of his larger point: that the tide of globalization in recent decades has turned dual citizenship from an anomalous and potentially dangerous condition into something almost commonplace -- or at least no big deal. Whether it will remain that way is another question.
The historical narrative in At Home in Two Countries has a fairly well-demarcated beginning, middle and end -- with each phase defined by how much strain dual citizenship places on the relationship between the individual and the nation-state. (Also by the potential for conflict it creates between the nation-states involved, but let’s leave that to the side for a moment.)
In the beginning, everything is reasonably straightforward. You were not the citizen of a nation-state but the subject of a sovereign. God had placed you in your respective positions -- tying you together on this earth for what were, presumably, good reasons that, in any case, were not up for discussion. It was “not in the power of any private subject to shake off his allegiance, and to transfer it to a foreign prince,” as the U.K.’s House of Lords declared in 1747, nor could “any prince, by naturalizing and employing a subject of Great Britain … dissolve the bonds of allegiance between subject and crown.”
Implicit in such an official statement of the doctrine of perpetual allegiance is the reality that it was being violated in practice. And within 30 years came the virtually unthinkable developments in the American colonies, where British subjects began “shak[ing] off … allegiance” to their sovereign without “transfer[ring] it to a foreign prince” but to their own republic instead.
Emigration was a constant drain on the sovereign’s human capital -- especially on military resources, since it provided a way to avoid conscription. So a variant of the doctrine of perpetual allegiance remained in effect even after the secular nation-state took over from divinely installed royalty. Becoming the naturalized citizen of another country did not necessarily bring an end to expectation that you should meet the motherland’s obligations and obey its laws. Nor would your children be exempt. That could make visiting family in the old country a risky enterprise. Dual citizenship of this sort was involuntary and unintentional, and it had potentially grave diplomatic consequences if the government of an individual’s adopted country tried to intervene.
The legal and political fights so occasioned throughout the 19th and early 20th centuries make for the most interesting pages in At Home in Two Countries. Laws and treaties took shape that made expatriation, naturalization and election (i.e., the choice of nationality by someone born to parents of different citizenships) more routine and less volatile -- as much as that was possible, anyway, amid wars and international tensions.
But the other side of this stabilizing trend was -- at least, until fairly recently -- a strong sense that dual citizenship itself was something to be avoided and prevented as much as possible. At best it would be a temporary condition, to be cured with the proper paperwork and no delay.
“On the one hand,” Spiro writes, “dual nationals represented a potential spark in the tinderbox, as issues relating to their protection or responsibility for their actions could readily escalate into interstate conflict. On the other hand, in a world premised on the fact of some level of interstate conflict, dual nationals could only be presumed to do an adversary’s bidding from within.”
In the United States, the peak of what Spiro calls “the consensus opprobrium” regarding dual citizenship came in the early 1950s, with Cold War nerves at their most taut. The timing is interesting, because it coincides with a rapid decline of the issue driving much of the 19th-century debate: the concern with foreign sovereigns trying to conscript naturalized citizens traveling abroad. It was no longer a problem routinely facing the American diplomatic corps, and by the 1960s, European and Latin American countries adopted conventions to end it as a source of friction among themselves.
“As states stopped fighting over dual nationals,” Spiro says, “there was much less incentive to combat the status.” What followed was the slow and uneven normalization of dual citizenship, as some countries ceased to require emigrants to renounce citizenship upon naturalizing elsewhere and others reaped benefits from absorbing immigrants who maintained their birthright citizenship. (“To the extent that a renunciation requirement deters naturalization,” writes Spiro, “society’s loss from the reduced rate of naturalization plainly overshadows the benefits of enforced renunciation.”)
So from the era of perpetual allegiance (in which dual citizenship was more or less a contradiction in terms) to the long decades of reducing the strains of expatriation and naturalization (when dual citizenship became an anomaly to avoid), we’ve reached the epoch of high globalization, with dual citizenship an established if not quite ubiquitous mode of transnational life. With dual citizenship “normalized as an incident of globalization,” Spiro devotes a chapter to the case for “the emergence of an articulated, protected right to the status” recognized by international law.
Here the author hits a note of expectancy that implies something almost historically inevitable: the course Spiro identifies moves in a recognizable direction. From epoch to epoch, the individual gains power in determining his or her status vis-à-vis instituted authorities. At the same time, conflict among those authorities tends to subside. Nationalism will grow kinder and gentler, to be replaced in time by a higher stage of cosmopolitan citizenship, as envisioned by Immanuel Kant or Thomas Friedman, albeit in somewhat different ways.
It will take much work and goodwill, but there’s no reason why things can’t keep moving forward in a virtuous circle. The potential for retrogression is not really a part of the scenario. It figures the normal global citizen of the future as someone choosing among citizenships -- rather than as a refugee without the option of claiming a single one, caught between nationalisms out for blood. In Spiro’s long-term perspective, the evolution of dual citizenship seems destined to keep on advancing, while at the moment it feels like we are at the edge of something, possibly a cliff.
“As nearly all scholars recognize,” we read in an article published in Presidential Studies Quarterly in 1983, “there is no apprenticeship or training an individual may obtain in preparation for the presidency. There is no convenient book or guide which provides a detailed step-by-step analysis of the requirements and demands of the office.”
How true! An acquaintance with the Constitution would surely be helpful, but it’s not as if you have to pass a test on it -- even one with simple questions, such as “Would requiring Muslims to register with the government follow the First Amendment (a) to the letter, (b) in spirit or (c) none of the above?” (It’s surprising how far you can get in public life without being able to answer that one correctly.)
But the whole point of the paper just quoted -- “On ‘Becoming’ President of the United States: The Interaction of the Office with the Office Holder” by Robert E. Denton Jr. -- is that coping with the lack of an orientation handbook is one of the simultaneous, urgent and inflexible demands of which the incoming chief executive must demonstrate mastery, beginning almost immediately. The author is a professor of communications (and head of the department) at Virginia Tech, with a special interest in the “symbolic dimensions of the American presidency,” to borrow the title of the first of his more than two dozen books.
His vita shows that Denton has been analyzing presidential communications more or less in real time since the first Reagan administration, when “On ‘Becoming’ President” appeared. One of his earliest publications, it proves especially interesting just now -- despite having been written long before official speeches and press conferences were joined by such message-delivery formats as the tweet.
“On ‘Becoming’ President” takes its bearings from symbolic interactionism: a school of thought at the intersection of sociology and psychology, and well established even then. Its defining insight -- drawn largely from the American pragmatist philosophers, especially George Herbert Mead -- is that communication between human beings always involves considerably more than the content of a message. We also take in cues about one another’s roles, statuses, expectations and so on -- an ongoing process of learning to see oneself from other people’s vantage points.
They are doing so at the same time, of course. It can get complicated, even when the roles, beliefs and shared expectations are all reasonably clear or well established. Arguably the symbolic-interactionist researcher and the novelist or filmmaker each tries to depict and analyze the range of communicative multitasking constantly underway in life.
The Oval Office emerges as the scene where symbolic interactions of global consequence take place that are conditioned by “expectations and functions of the office [that] are often competing, conflicting and contradictory.” In addition to the president’s constitutionally specified roles (chief of state, chief executive, chief diplomat, chief legislator and commander in chief), another “five extraconstitutional roles must be recognized: chief of [his] party, protector of the peace, manager of prosperity, world leader and voice of the people.” (Denton culls these roles from the poli-sci literature of the day; the references are given in his article.)
Occupancy of the office itself confers a great deal of persuasive force in exercising any given role. But it often requires playing a number of them simultaneously, and while a certain amount of authority may be delegated, the ultimate responsibility cannot. Denton also underscores the constant burden of “vast and complex” public expectation, both to make good on promises and to exhibit a suitable combination of leadership traits and personal morality.
“Many attitudes about the presidency stem from messages received in childhood about the virtues of various presidents,” Denton writes. “Studies continually find that the president is ordinarily the first public official to come to the attention of young children. Long before children are informed about the specific functions of the presidency, they view individual presidents as exceptionally important and benign.”
He mentions researchers who found children attributing to the president qualities of “honesty, wisdom, helpfulness” and related virtues. (All of the studies Denton cites were conducted before the mid-1970s, but comparable findings appear in a book on child psychology from 2005.)
The symbolic-interactionist approach would emphasize not only presidential roles and duties (as established by the Constitution or tradition) or the pressure of public expectations (still tinged with hero fantasies from childhood, perhaps) but also the inner experience of “adopting and adapting the self to the actions of others” through years of public life. The political learning curve “is adaptive,” Denton writes, “resulting from the capacity to change self depending on political environment, beliefs, values and expectations.”
Implied by Denton’s remarks on what he calls the “political self” is some normative sense of a successful candidate’s personality and career: a self conditioned by the experience of political action and debate, informed by some modeling of another’s leadership, and skilled at anticipating the impact of both words and deeds. The tempered political self -- so understood -- will presumably be as well prepared as anyone can be to incorporate “the trappings, powers and prerogatives of the presidency” into itself. And he suggests that the process is not without its risks, even then.
Our majestic treatment of presidents causes status inequality, inflation of self-concept and distorted perception of external events. Such exposure manifests distortion of social comparison processes, ‘overidentification’ with the office and misinformed decisions …. Presidents are constantly pressured to misrepresent or distort themselves to various national constituencies. Such a continual pressure causes further misrepresentations, erosion of truth norms and self-delusion.
It appears that the author had Richard Nixon in mind as the worst-case scenario, although Nixon had more than 20 years of political experience (including one previous presidential campaign) before taking office. In any event, Denton’s paper is something to chew on this week -- and to choke down in the months ahead.
American politics in the age of Donald Trump may yet make armchair psychopathologists of us all. The stream-of-consciousness quality of the candidate’s speeches now becomes a factor in governance. In the wee hours, while most of us sleep, the president-elect tweets. Stephen Dedalus’s description of history as “a nightmare from which I am trying to awake” feels less literary by the hour.
And so the public is compelled to play analyst: armed with diagnostic checklists and extensive Wikipedian training, we try to categorize his personality (as narcissistic, borderline, histrionic, etc.) in hopes that an adequate label might provide some hint of what to expect over the next four years. It won’t, of course, although the odds that a State of the Union speech will address the president’s penis size have increased considerably.
On a more substantial matter, it’s obvious that Trump’s affinity for the conspiratorial mind-set goes beyond a mutual appreciation of talk-show host Alex Jones. It forms the bedrock of Trump’s very existence as a political figure. His aspiration to something greater than mere celebritydom began in earnest only when Trump made himself a major player in the pseudo-controversy over President Obama’s birth certificate. (That racist melodrama assumed, even if it did not always emphasize, the existence of shadowy forces conspiring to put a Kenyan Muslim into office for their own un-American reasons.)
The penchant of Trump and some prominent figures in his entourage to resort to conspiratorial tropes seems like yet more evidence for the perennial value of Richard Hofstadter’s “The Paranoid Style in American Politics” (1964). And while I find Michael Paul Rogin’s critique of Hofstadter persuasive, there is no denying the essay’s almost irresistible quotability. Some passages sound as if the historian were making a summary of the themes appealing to the president-elect’s base:
“America has been largely taken away from them and their kind, though they are determined to try to repossess it and to prevent the final destructive act of subversion. The old American virtues have already been eaten away by cosmopolitans and intellectuals; the old competitive capitalism has been gradually undermined by socialistic and communistic schemers; the old national security and independence have been destroyed by treasonous plots, having as their most powerful agents not merely outsiders and foreigners but major statesmen who are at the very centers of American power.”
Add complaints about political correctness for seasoning, and the reader would have no reason to think this passage is not from a report on the 2016 election.
And that is, in a way, Hofstadter’s point. He identifies the paranoid style as a recurrent if not permanent strain in American political thought and rhetoric -- but also as weaker, and less effective, over the long run, than its durability might imply. It appeals to established but aggrieved groups imagining themselves to be “the real America” under threat from change. Then the immigrants and upstarts become established, new demagogues emerge to exploit their discontent, and the whole thing starts again.
Rob Brotherton’s book Suspicious Minds: Why We Believe Conspiracy Theories (Bloomsbury Sigma), originally published in late 2015, now appears in paperback as the Inauguration Day bleachers go up near the White House. While not a commentary on the Trump ascendancy, its timing may skew the reader’s attention in that direction even so.
The author, a psychologist and science writer, is more concerned than Hofstadter with the particular cognitive processes involved in the conspiratorial mentality. Rather than pointing to a paranoid mood that ebbs and flows with political currents, Brotherton treats conspiracy theories as part of a continuum of patterns of thought and behavior that are extremely common and not, for the most part, paranoid.
Much of it comes down to pattern recognition (the brain’s incessant but not always reliable drive to find order) combined with a tendency to overestimate the validity or completeness of the available information. Brotherton writes, “When we’re uninformed -- and we’re all ignorant about a lot of things -- our brain indiscriminately uses whatever is at hand to plaster over the intellectual blind spot.” The author adduces a number of lab experiments showing this, including research that suggests cognitive strain tends to heighten the capacity to imagine structure where none exists.
“By painting conspiracism as some bizarre psychological tic that blights the minds of a handful of paranoid kooks,” he writes, “we smugly absolve ourselves of the faulty thinking we see so readily in others. But we’re doing the same thing as conspiracists who blame all of society’s ills on some small shadowy cabal. And we’re wrong. Conspiracy thinking is ubiquitous, because it’s a product, in part, of how all of our minds are working all the time.”
This is persuasive, up to a point. But somewhere far beyond that point are whole milieus of people whose pattern-recognition software got stuck in the conspiratorial program and can’t be reset. There’s David Icke, for one, an internationally famous author who believes that most political, social and cultural changes of recent decades are the work of shape-shifting interdimensional reptile people. (Icke makes Alex Jones sound like Walter Cronkite.)
Between Hofstadter’s cyclical rise and fall of paranoid politics and Brotherton’s rather genial vision of everyone being conspiracy-minded at one time or another, it’s almost possible to imagine the next few years as something other than cataclysmic. But I’m not entirely persuaded. Suppose this is just the beginning. After all, we still have no idea where the incoming administration stands on shape-shifting interdimensional reptile people. The president-elect hasn’t even uttered the words “shape-shifting interdimensional reptile people.” What is he trying to hide?
On first reading the title of Timothy Recuber’s Consuming Catastrophe: Mass Culture in America’s Decade of Disaster (Temple University Press), my guess was that it would be about the 1970s -- that is to say, the era of my childhood, when movies like Earthquake, The Poseidon Adventure, The Towering Inferno and The Hindenburg were the talk of the playground. Besides the disaster movies (which were a genre unto themselves, for a few years), there were best-selling books and TV fare of similar ilk.
It was all pretty formulaic -- even ritualistic. The strains of numerous crises in public life (Watergate, the oil embargo and inflation, plus aftershocks from the 1960s) were translated into the language of blockbuster melodrama. The spectacular disaster on the screen or the page enacted a kind of miniature social implosion, its destructive force revealing the inner strengths or vices of the characters who had to face it. Various embodiments of evil or dumb authority would perish. Survivors of the disaster would reunite with their families or reconnect with their values.
The genre’s chief weakness was that the supply of viable disaster scenarios was not unlimited. The point of exhaustion came, as I recall, with a made-for-TV movie-of-the-week involving a swarm of killer bees. In retrospect, the whole period looks like one big anxiety disorder. Ronald Reagan never appeared in a disaster movie, but his election in 1980 probably owed something to the genre insofar as the public could imagine him guiding it to safety through all the debris.
In Consuming Catastrophe, Recuber, a visiting assistant professor of communication at Hamilton College, has another period and variety of spectacle in mind: the real-world disasters from the first decade of this century (Sept. 11, Hurricane Katrina, the Virginia Tech shootings, the BP offshore oil spill, the near collapse of the financial system in 2008), rather than symptomatic fictions churned out as entertainment.
The contrast is also one of levels of immediacy and saturation of the public attention. Very few news stories of 40 years ago unfolded with the intensity and duration of real-time coverage that has become the norm -- even when the occasion is something considerably less wrenching than a disaster. This tends to create a public sense of somehow participating in an event, rather than just being informed about it. The potentials and limits of that participation are the focus of much of Recuber’s interest.
The widest frame of his perspective takes in German sociologist and philosopher Jürgen Habermas’s argument that newspapers and magazines were foundational elements of the public sphere of information and reasoned debate that could challenge policies and opinions that derived their force only from established authority or the inertia of tradition. Besides the political and economic issues normally associated with Habermas’s understanding of the public sphere, Recuber notes that “disasters, crises, misfortunes and the suffering of distant others were central topics of discussion there, although [its] literate publics frequently disagreed about the moral and ethical acceptability of such macabre subjects.” The classic instance would be the Lisbon earthquake of 1755 (see this column from 2005, on the disaster’s sestercentennial).
Recuber quotes Adam Smith on what is involved in a sympathetic response to others’ misfortune: “The compassion of the spectator must arise altogether from the consideration of what he himself would feel if he was reduced to the same unhappy situation, and, what perhaps is impossible, was at the same time able to regard it with his present reason and judgment.” This seems carefully balanced -- a synthesis of much public-sphere argument, no doubt. But it is also demanding. It implies some obligation to find an effective means to alleviate the suffering as well as to determine if any part of it was preventable. Sympathy, to use the preferred 18th-century term, was not just a personal emotional response but also a communal force. It held society together and could, if strengthened, improve it.
Fast-forward two centuries and a few decades, and we find the contradictory and perverse situation that Recuber describes in a series of case studies. Means of communication exist that can expand our powers of sympathy and our capacity to intervene to reduce suffering -- and they do sometimes, but in problematic ways. It’s not just that the intensity and pervasiveness of media coverage of disasters can induce what’s become known as “compassion fatigue.” That is certainly a factor, but Recuber emphasizes the more subtle and insidious role of what he calls “the empathic gaze.”
Where sympathy means an awareness of another’s unhappiness as something that can and should be alleviated, empathy, in the author’s usage, “refers to an intersubjective understanding of the other’s plight devoid of the obligation to intervene.” It is a relationship to the other’s suffering that is of a “more passive, vicarious character.” The capacity for empathy is much praised in the contemporary literature of self-help and personnel management. Certainly it’s preferable to the psychopathic indifference which, of late, increasingly seems like the other main option on offer. But in Recuber’s estimation it rests content with having reached a secure but passive position vis-à-vis suffering, if not a rather morbidly sensationalistic variety of pity.
My impression is that Recuber, far from chastising us as a generation of moral ghouls feasting on disaster, actually regards sympathy as our original or default mode of moral perspective (rather as some 18th-century thinkers did). His case studies of disasters from 2001 to 2010 are, in effect, accounts of sympathy being frustrated, exploited or otherwise short-circuited in diverse ways by the channels into which the media directs it.
One example stands out in particular and will stick in my memory. It concerns the April 2007 massacre at Virginia Tech, which left 32 dead, followed by the suicide of the gunman, Seung-Hui Cho. Cho sent a multimedia package explaining himself to NBC Nightly News, portions of which were shown on the program two days after the shootings. “We are sensitive to how all of this will be seen by those affected,” the news anchor said, “and we know that we are in effect airing the words of a murderer here tonight.”
No one could accuse him of lacking empathy, anyway; empathy can discharge its responsibilities simply by announcing itself. “The statement was an oddly unbalanced one,” Recuber comments, “… seemingly missing a second half that explained what the benefits of broadcasting the manifesto were to be and why they outweighed the concerns of ‘those affected.’ Such a statement never came.”
But of course not! It’s not as if being “sensitive to how all of this will be seen by those affected” compelled the network to spare them anything. Those of us watching disaster movies in the 1970s were on higher moral ground: the entertainment was brainless but at least it involved no disregard for real suffering.
“We think about this as a studio apartment,” says one member of a couple living in a single-room occupancy. In better days the unit was a motel room. To see it as a studio apartment is a triumph of imagination and will, as the speaker is fully aware. “We have to,” she continues, “’cause if we continue to realize where we’re at in life, we would spiral into a massive depression. And the housekeeping we do would not get done …. I call it home and I cry on the thought of losing it ’cause this is all I have.”
Christopher P. Dum, an assistant professor of sociology at Kent State University, is careful to protect the identities of the ethnographic subjects he spoke to in researching Exiled in America: Life on the Margins in a Residential Hotel (Columbia University Press). In exchange he has been given access to some extraordinarily precarious and fragile domestic spaces -- dwellings that will seem to most readers just slightly more stable than a homeless shelter or living out of your car. But that is, by Dum’s reckoning, a blinkered view. The squalor is real and inescapable; what’s harder to see from a distance is the residents’ effort to find, or create, some kind of order in seriously damaged lives.
The author lived at the hotel he calls the Boardwalk for a year as fieldwork for his dissertation, gradually overcoming the residents’ (understandable) suspicion that he worked for law enforcement and intriguing some with the prospect of having their stories told. The hotel originally drew Dum’s attention while he was investigating the difficulties of registered sex offenders in finding housing. (In the public mind, “registered sex offender” has come to mean pedophile, although the label applies equally to those convicted of exhibitionism, soliciting prostitutes or even, believe it or not, public urination.)
Also living at the Boardwalk during the author’s stay were recently released ex-prisoners and people with a range of mental-health issues, physical disabilities or substance-abuse problems, often in combination. It sounds like a population guaranteed to create even greater chaos than the sum of its dysfunctions. What Dum found instead is an emergent and fragile community of what he characterizes as “social refugees … impelled to relocate within their own country of citizenship because of the influence of social context and/or social policy.”
From studies of migration he adopts the notion of push and pull factors to discuss the two strong forces shaping life at the hotel. One is the overwhelming power of stigma: the area’s largely middle-class public “viewed motel residents as belonging to one or several devalued groups,” with the sex offenders among them being especially contaminating and marginalizing. Driving past the Boardwalk and yelling that its inhabitants were child molesters seems to have been a local pastime.
Membership in stigmatized groups pushed residents away from mainstream society and toward the Boardwalk, which in turn pulled them to its “sustaining habitat” -- a term from urban sociology akin to the real-estate mantra “location, location, location.” Boardwalk residents had ready access to bus stops, cheap food, a Laundromat and other “opportunit[ies] to engage in the same type of consumer relations that characterized the lives of their [better-off] detractors.” There also seems to have been some comfort in knowing that their landlord owned another nearby property called Park Place. (I’d guess that the real names of these motels were just as overblown as their Monopoly stand-ins.) Park Place was cheaper, in more serious disrepair and had a reputation for violence among the tenants. “They start drinking Milwaukee’s Best in the morning,” one Boardwalker explains, “and that makes them get wily.”
However uninviting Boardwalk might look from the author’s photographs, it’s hardly the worst place where life could leave you. Some tenants Dum interviewed managed to establish a degree of stability that included employment (one aspect of a sustaining habitat) as well as a certain amount of interior decoration. They felt a responsibility to care for other residents, particularly those with severe mental disorders. An informal but exacting code of etiquette governed the sharing of cigarettes, food and intoxicants. A certain amount of sexual jealousy and trash talking was inevitable, as was the occasional round of threats or punches, but Dum indicates that he heard of very little theft or predation. “It’s like any other community,” one resident told him, “it’s just people trying to get along.”
At the same time, even success in carving out livable circumstances could leave residents feeling trapped. Treating one’s room as a studio apartment entailed more than psychological strain. Rent “did not guarantee heat, air-conditioning, a fridge, kitchen or even drinkable water,” Dum writes. “Offsetting these conditions exacted so much material cost to buy fans, space heaters, refrigerators, microwaves and bottled water that once residents settled at the hotel they found it very hard to leave. They struggled to make monthly, even weekly, rent payments, and because of this, putting down a security deposit and first month’s rent for an apartment was nearly impossible.”
The author’s fieldwork turned out to coincide with the final phase of the motel’s existence. After years of code violations -- largely unnoticed by city inspectors for the simple reason that they often didn’t even show up -- the local government forced the closing of Boardwalk and Park Place. Exiled in America is on the whole an exemplary piece of social reportage and analysis, but while reading it I often wondered if calling his interview subjects “refugees” might not be pushing it. As it turns out, most of them found out they were being evicted a few hours before the deadline. So, yes, refugees.