Books

An academic press sues a librarian, raising issues of academic freedom

A university librarian finds himself sued for questioning the quality of an academic press.

New book challenges the idea that professors don't care about teaching

Research from the University of Washington shows professors to be self-critical about their teaching and constantly struggling to improve it.

Flat World's shift in gears and what it means for open textbook publishing

Flat World Knowledge will no longer publish versions of its textbooks at no charge. How big a setback does the company's change represent for the 'open' movement?

Author discusses his new book on vision, values and higher education

Author discusses new book about the importance of vision and values in higher education.

Essay on Ilan Stavans's "I Love My Selfie" and German researchers' paper "The Selfie Paradox"

Andy Warhol’s prediction about fame merits the occasional update. One that popped into my head not long ago after crossing paths with a gaggle of tourists holding their cellphones at arm’s length and smiling: “In the future, everyone will take a selfie every 15 minutes.”

After launching this random thought into the world via social media, I realized almost immediately that it wasn’t much of a prophecy. A poll in 2013 found that almost every third picture taken by someone between the ages of 18 and 24 was a selfie. The following year, participants in a Google developers’ conference heard that the users of one type of cellphone were snapping 93 million selfies per day. My reworking of Warhol’s point might not literally describe the status quo now, but it could certainly be taken for evidence of aging, as in fact my friends were not long in pointing out.

No longer a fad though not a tradition quite yet, the selfie is one of those cultural phenomena that almost everyone can recognize as probably symptomatic -- the result of social, psychological and technological forces too inexorable to escape but too troubling to think about for very long. (Other examples: reality television, sex robots, cars that drive themselves.)

Even the most ardent or compulsive selfie taker must have moments of uneasiness at how tightly the genre knots together self-expression and self-obsession, leaving not much room for anything else. A recent paper in the journal Frontiers in Psychology identifies a selfie-specific form of ambivalence unlikely to go away. It is called “The Selfie Paradox: Nobody Seems to Like Them Yet Everyone Has Reasons to Take Them. An Exploration of Psychological Functions of Selfies in Self-Presentation.”

More on that shortly. But first, a quick look at a book with a more compact and less literal title, I Love My Selfie, by the critic and essayist Ilan Stavans (Duke University Press). A few of the author’s selfies appear in the book, along with reproductions of self-portraits by Rembrandt, van Gogh and Warhol, but it would be an irony-impaired reader indeed who took him to be making any claim to equivalence. The book’s spirit is much closer to that of the Puerto Rican multimedia artist Adál Alberto Maldonado, whose work appears throughout its pages and who titled one photo series “Go Fuck Your Selfie: I Was a Schizophrenic Mambo Dancer for the FBI.” The seed for Stavans’s book was the preface he wrote for a collection of photos by Adál, as he prefers to be known. (Stavans is a professor of Latin American and Latino culture at Amherst College.)

“Richard Avedon once said that a portrait is a picture of someone who knows he is being portrayed,” writes Stavans. “… The self-portrait is that knowledge twice over.” Combined with the highly developed skills of a painter or a photographer, that redoubled awareness can reveal more than the creator’s idealized self-image. The late self-portraits of Robert Mapplethorpe, for instance, “emit a stoicism that is frightening … as if his statement was ‘The world around me is falling apart, but I’m still here, a chronicler of my times.’” Adál’s quietly surreal photographs of himself posing with various props are an oblique and sometimes comic reflection on being a Puerto Rican artist obliged to deal with whatever assumptions the viewer may bring to his work.

Selfies, by contrast, are what’s left of the self-portrait after all technique, discipline, talent and challenge are removed from the process. They exist to be displayed -- not to reveal the self but to advertise it. Stavans calls the selfie “a business card for an emotionally attuned world” and describes life in the public sphere of social media as “a mirage, a solipsistic exercise in which we believe we’re connecting with others while in truth we’re just synchronizing with the image we have of them in our mind.”

And as with other forms of advertising, too much truthfulness would damage the brand. Most selfies never go out into the world. “The trash icon in which we imprison them,” Stavans writes, “is the other side of our life, the one we reject, the one we condemn.”

The authors of “The Selfie Paradox,” Sarah Diefenbach and Lara Christoforakos, are researchers in the department of psychology at Ludwig-Maximilians-University in Munich. The participants in their study were 238 individuals between 18 and 63 years of age living in Austria, Germany and Sweden, recruited from email lists and at university events. They were asked about the frequency with which they took selfies and received them from other people, as well as a series of questions designed to elicit information about their personality and their feelings about, and motivations for, taking and viewing selfies.

Not surprisingly, perhaps, people who stated that they were open about their feelings and prone to discussing their accomplishments also tended to enjoy taking selfies. And consistently enough, those inclined to downplay their own successes also tended to report “negative selfie-related affect” -- i.e., were decidedly nonenthusiastic about selfies.

The researchers found broad agreement with the idea that selfies could have unpleasant consequences (inciting derogatory comments, for example) but much less regarding what the positive effects might be. “The only aspect that reached significant agreement,” the researchers found, “was self-staging, i.e., the possibility to use selfies for presenting an intended image to others.” Positive benefits such as expressing independence or connection with others were recognized by far fewer participants. And those who took selfies more often were more likely to identify positive consequences of the activity:

In a way, taking selfies may be a self-intensifying process, where one discovers unexpected positive aspects (besides self-staging) while engaging in the activity and this positive experience encourages further engagement. Nevertheless, the majority showed a rather critical attitude, and among the perceived consequences of selfies, negative aspects clearly predominate.

To put it another way, participants in the study tended to acknowledge that putting a selfie out into the world could backfire -- while the only broadly accepted benefit of a selfie they recognized was that of self-display or self-promotion. Though the researchers do not spell out the connection, these attitudes seem mutually reinforcing. If the most recognized motivation for posting a selfie is to benefit the ego, exposing its vulnerabilities would be an associated danger.

Another of the findings also seems in accord with this logic: participants were likely to explain their reasons for taking and posting selfies as ironic or self-deprecating -- while showing much less tendency to assume that other people were doing the same. They also expressed a preference for others to post more nonselfie photographs.

Indeed, people who reported taking a lot of selfies tended “not to like viewing others’ selfie pictures and rather wish for a higher number of usual photos.” It seems in accord with one of Stavans’s observations: “Looking at a favorite selfie is like entering into a world in which we, and nobody else, exist in an uninterrupted fashion.” At least until Narcissus falls into the pool and drowns.

Review of James Q. Whitman, "Hitler's American Model: The United States and the Making of Nazi Race Law"

Finding himself in prison following the beer-hall fiasco in Munich in 1923, Adolf Hitler had time to put his thoughts about politics and destiny into order, at least as much as that was possible. The United States was part of his grand vision, and not as someplace to conquer.

“The racially pure and still unmixed German has risen to become master of the American continent,” he wrote in Mein Kampf, “and he will remain the master, as long as he does not fall victim to racial pollution.” He was encouraged on the latter score by what he had learned of American immigration policy. With its stated preference for Northern Europeans, its restrictions on those from Southern and Eastern Europe, and its outright exclusion of everyone else, the Immigration Act of 1924 impressed Hitler as exemplary. It manifested, “at least in tentative first steps,” what he and his associates saw as “the characteristic völkisch conception of the state,” as defined in some detail by the Nazi Party Program of 1920.

Revulsion is an understandable response to this little traipse through the ideological sewer, but it is wholly inadequate for assessing the full measure of the facts or their implications. The admiration for American immigration policy expressed in Mein Kampf was not a passing thought on the day’s news (Hitler had been in prison for about two months when Calvin Coolidge signed the act into law) nor a one-off remark. Its place in the full context of Nazi theory and practice comes into view in Hitler’s American Model: The United States and the Making of Nazi Race Law (Princeton University Press) by James Q. Whitman, a professor of comparative and foreign law at Yale Law School.

Many people will take the very title as an affront. But it’s the historical reality the book discloses that proves much harder to digest. The author does not seem prone to sensationalism. The argument is made in two succinct, cogent and copiously documented chapters, prefaced and followed with remarks that remain within the cooler temperatures of expressed opinion (e.g.: “American contract law, for example, is, in my opinion, exemplary in its innovativeness”).

Hitler’s American Model is scholarship and not an editorial traveling incognito. Its pages contain many really offensive statements about American history and its social legacy. But those statements are all from primary sources -- statements about America, made by Nazis, usually in the form of compliments.

“The most important event in the history of the states of the Second Millennium -- up until the [First World] War -- was the founding of the United States of America,” wrote a Nazi historian in 1934. “The struggle of the Aryans for world domination thereby received its strongest prop.” Another German author developed the point two years later, saying that “a conscious unity of the white race would never have emerged” without American leadership on the global stage following the war.

Examples could be multiplied. The idea of the United States as a sort of alt-Reich was a Nazi commonplace, at least in the regime’s early years. But it was not just a matter of following Hitler’s lead. The white-supremacist and eugenicist writings of Madison Grant and Lothrop Stoddard -- among the best-selling American authors of a century ago -- circulated in translation in the milieu that spawned Hitler. (I don’t recall Hannah Arendt mentioning Grant or Stoddard in The Origins of Totalitarianism, oddly enough.) A popular Nazi magazine praised lynching as “the natural resistance of the Volk to an alien race that is attempting to gain the upper hand.” European visitors noted the similarity between the Ku Klux Klan and fascist paramilitary groups like the Brownshirts, and they compared the post-Reconstruction order in the South to the Nazi system.

But the journalistic analogies and propaganda talking points of the day, while blatant enough, don’t convey the depth of American influence on Nazi race law. The claim of influence runs against the current of much recent scholarship arguing that Nazi references to the Jim Crow system were “few and fleeting” and that American segregation laws had little or no impact on the Nuremberg Laws. (At the Nuremberg rally of 1935, the Nazis proclaimed citizenship limited to those “of German blood, or racially related blood” and outlawed marriage or sexual relations between Jews and German citizens.)

While the Nazis did call attention to segregation in the United States -- so the argument goes -- it was to deflect criticism of German policy. The error here, as Whitman sees it, comes from treating the U.S. Supreme Court ruling in Plessy v. Ferguson as the primary or quintessential legal component of racial oppression in the United States, and presumably the one Nazi jurists would have looked to in reshaping German policy. But, according to Whitman, “American race law” in the 19th and much of the 20th century:

sprawled over a wide range of technically distinct legal areas … [including] Indian law, anti-Chinese and -Japanese legislation, and disabilities in civil procedure and election law …. Anti-miscegenation laws on the state level featured especially prominently … [as] did immigration and naturalization law on the federal level ….

Even before the outbreak of World War I, German scholars were fascinated by this teeming mass of American racist law -- with a particular interest in what one of them identified as a new category of “subjects without citizenship rights” (or second-class citizens, to put it another way) defined by race or country of ancestry. By the 1930s, the anti-miscegenation laws in most American states were another topic of great concern. While many countries regarded interracial marriage as undesirable, Nazi jurists “had a hard time uncovering non-American examples” of statutes prohibiting it.

A stenographic transcript from 1934 provides Whitman’s most impressive evidence of how closely Nazi lawyers and functionaries had studied American racial jurisprudence. A meeting of the Commission on Criminal Law Reform “involved repeated and detailed discussion of the American example, from its very opening moments,” Whitman writes, including debate between Nazi radicals and what we’d have to call, by default, Nazi moderates.

The moderates argued that legal tradition required consistency. Any new statute forbidding mixed-race marriages had to be constructed in accord with the one existing precedent for treating a marriage as criminal: the law against bigamy. This would have been a bit of a stretch, and the moderates preferred letting the propaganda experts discourage interracial romance rather than making it a police matter.

The radicals were working from a different conceptual tool kit. Juristic tradition counted for less than what Hitler had called the “völkisch conception of the state,” which demanded Aryan supremacy and racial purity. It made more sense to them to follow an example that had been tried and tested. One of the hard-core Nazis on the commission knew where to turn:

Now as far as the delineation of the race concept goes, it is interesting to take a look at the list of American states. Thirty of the states of the union have race legislation, which, it seems clear to me, is crafted from the point of view of race protection. … I believe that apart from the desire to exclude if possible a foreign political influence that is becoming too powerful, which I can imagine is the case with regard to the Japanese, this is all from the point of race protection.

The lawyers whom Whitman identifies as Nazi radicals seemed to appreciate how indifferent the American states were to German standards of rigor. True, the U.S. laws showed a lamentable indifference to Jews and Gentiles marrying. But otherwise they were as racist as anything the führer could want. “The image of America as seen through Nazi eyes in the early 1930s is not the image we cherish,” Whitman writes, “but it is hardly unrecognizable.”

Review of Tom Nichols, "The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters"

A survey of 7,000 freshmen at colleges and universities around the country found just 6 percent of them able to name the 13 colonies that founded the United States. Many students thought the first president was Abraham Lincoln, also known for “emaciating the slaves.” Par for the course these days, right?

It happens that the study in question was reported in The New York Times in 1943. The paper conducted the survey again during the Bicentennial, using more up-to-date methods, and found no improvement. “Two-thirds [of students] do not have the foggiest notion of Jacksonian democracy,” one history professor told the Times in 1976. “Less than half even know that Woodrow Wilson was president during World War I.”

Reading the remark now, it’s shocking that he was shocked. After 40 years, our skins are thicker. (They have to be: asking the current resident of the White House about Jacksonian democracy would surely be taken as an invitation to reminisce about his “good friend,” Michael.)

The problem with narratives of decline is that they almost always imply, if not a golden age, then at least that things were once much better than they are now. The hard truth in this case is that they weren’t. On the average, the greatest generation didn’t know any more about why The Federalist Papers were written, much less what they said, than millennials do now. The important difference is that today students can reach into their pockets and, after some quick thumb typing and a minute or two of reading, know at least something on the topic.

How to judge all this is largely a question of temperament -- of whether you see their minds as half-empty or half-full. Tom Nichols conveys the general drift of his own assessment with the title of his new book, The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters, published by Oxford University Press. The author is a professor of national security affairs at the U.S. Naval War College and an adjunct professor at the Harvard Extension School.

He sees the longstanding (probably perennial) shakiness of the public’s basic political and historical knowledge as entering a new phase. The “Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laymen, students and teachers” is like a lit match dropped into a gasoline tanker-sized container filled with the Dunning-Kruger effect. (It may seem comical that I just linked to Wikipedia to explain the effect, but it’s a good article, and in fact David Dunning himself cites it.)

Nichols knows better than to long for a better time before technology shattered our attention spans. He quotes Alexis de Tocqueville’s observation from 1835: “In most of the operations of the mind, each American appeals only to the individual effort of his own understanding.” This was basic to Jacksonian democracy’s operating system, in which citizens were, Tocqueville wrote, “constantly brought back to their own reason as the most obvious and proximate source of truth. It is not only confidence in this or that man which is destroyed, but the disposition to trust the authority of any man whatsoever.”

The difference between a self-reliant, rugged individualist and a full-throated, belligerent ignoramus, in other words, tends to be one of degree and not of kind. (Often it’s a matter of when you run into him and under what circumstances.) Nichols devotes most of his book to identifying how 21st-century American life undermines confidence in expert knowledge and blurs the lines between fact and opinion. Like Christopher Hayes in The Twilight of the Elites, he acknowledges that real failures and abuses of power by military, medical, economic and political authorities account for a good deal of skepticism and cynicism toward claims of expertise.

But Nichols puts much more emphasis on the mutually reinforcing effects of media saturation, confirmation bias and “a childish rejection of authority in all its forms” -- as well as the corrosive effects of credential inflation and “would-be universities” that “try to punch above their intellectual weight for all the wrong reasons, including marketing, money and faculty ego.” Unable to “support a doctoral program in an established field,” Nichols says, “they construct esoteric interdisciplinary fields that exist only to create new credentials.”

Add the effect of consumerism and entertainment on the academic ethos, and the result is a system “in which students learn, above all else, that the customer is always right,” creating a citizenry that is “undereducated but overly praised” and convinced that any claim to authoritative knowledge may be effectively disputed in the words of the Dude from The Big Lebowski: “Yeah, well, you know, that’s just, like, your opinion, man.”

As a work of cultural criticism, The Death of Expertise covers a good deal of familiar territory and rounds up the usual suspects to explain the titular homicide. But the process itself is often enjoyable. Nichols is a forceful and sometimes mordant commentator, with an eye for the apt analogy, as when he compares the current state of American public life to “a hockey game with no referees and a standing invitation for spectators to rush onto the ice.”

But one really interesting idea to take away from the book is the concept of metacognition, which Nichols defines as “the ability to know when you’re not good at something by stepping back, looking at what you’re doing, and then realizing that you’re doing it wrong.” (He gives as an example good singers: they “know when they’ve hit a sour note,” unlike terrible singers, who don’t, even if everyone else winces.)

“The lack of metacognition sets up a vicious loop, in which people who don’t know much about a subject do not know when they’re in over their head talking with an expert on that subject. An argument ensues, but people who have no idea how to make a logical argument cannot realize when they’re failing to make a logical argument …. Even more exasperating is that there is no way to educate or inform people who, when in doubt, will make stuff up.”

The implications are grave. In 2015-16, Donald Trump ran what Nichols calls “a one-man campaign against established knowledge,” and he certainly pounded the expertise of most pollsters into the dirt. He is now in a position to turn the big guns on reality itself; that, more than anything else, seems to be his main concern at present. Nichols writes that research on the Dunning-Kruger effect found that the most uninformed or incompetent people in a given area were not only “the least likely to know they were wrong or to know that the others were right” but also “the most likely to try to fake it, and the least able to learn anything.” That has been shown in the lab, but testing now continues on a much larger scale.

Review of Abraham Flexner's 'The Usefulness of Useless Knowledge'

“All eras in a state of decline and dissolution are subjective,” said Goethe in a moment of sagely grumbling about the poets and painters of the younger generation, who, he thought, mistook wallowing in emotion for creativity. “Every healthy effort, on the contrary, is directed from the inward to the outward world.”

I didn’t make the connection with Svend Brinkmann’s book Stand Firm: Resisting the Self-Improvement Craze until a few days after writing last week’s column about it. One recommendation in particular from the Danish author’s anti-self-help manual seems in accord with Goethe’s admonition. As Brinkmann sees it, the cult of self-improvement fosters a kind of bookkeeping mentality. We end up judging experiences and relationships “by their ability to maximize utility based on personal preferences -- i.e. making the maximum number of our wishes come true.” The world becomes a means to the ego’s narrow ends, which is no way to live.

Besides offering a 21st-century guide to the Stoic ethos of disinvestment in the self, Brinkmann encourages the reader to rediscover the world in all its intrinsic value -- its fundamental indifference to anybody’s mission statement. How? By spending time in museums and forests:

“A museum is a collection of objects from the past (near or distant), e.g. art or artifacts that say something about a particular era or an aspect of the human experience. Obviously, you learn a lot from a museum visit -- but the greatest joy lies in just reveling in the experience with no thought of how to apply the knowledge and information. In other words, the trick is to learn to appreciate things that can’t be ‘used’ for some other function....

Similarly, a walk in the woods gives us a sense of being part of nature and an understanding that it shouldn’t be seen as consisting of resources that exist merely to meet human needs and desires. ... There are aspects of the world that are good, significant, and meaningful in their own right -- even though you derive nothing from them in return.”

Making similar points from a quite different angle is The Usefulness of Useless Knowledge by Abraham Flexner (1866-1959), the founding director of the Institute for Advanced Study, in an edition from Princeton University Press with a long introduction by the institute’s current director, Robbert Dijkgraaf.

The essay giving the book its title first appeared in Harper’s magazine in October 1939 -- a few months into the New York World’s Fair (theme: The World of Tomorrow) and just a few weeks into World War II. “I [am] pleading for the abolition of the word ‘use,’” Flexner wrote, “and for the freeing of the human spirit.” It must have seemed like one hell of a time for such an exercise. But the essay’s defense of the Ivory Tower was tough-minded and far-sighted, and Dijkgraaf’s introduction makes a case for Flexner as a major figure in the history of the American research university whose contribution should be remembered and revived.

The germ of The Usefulness of Useless Knowledge was a memorandum Flexner wrote as executive secretary of the General Education Board of the Rockefeller Foundation in 1921. The principles it espouses were also expressed in his work bringing Albert Einstein and other European academic refugees to the Institute at Princeton in the early 1930s. The essay defends “the cultivation of beauty ... [and] the extension of knowledge” as “useless form[s] of activity, in which men [and, as he acknowledges a few sentences earlier, women] indulge because they procure for themselves greater satisfactions than are otherwise available.”

But the impact of Flexner’s argument does not derive primarily from the lofty bits. He stresses that the pursuit of knowledge for its own sake has in fact already shown itself to be a powerful force in the world -- one that the ordinary person may not be able to recognize while swept up in “the angry currents of daily life.” The prime exhibits come from mathematics (Maxwell’s equations or Gauss’s non-Euclidean geometry took shape decades before practical uses could be found for them), though Flexner also points to the consequential but purely curiosity-driven work of Michael Faraday on electricity and magnetism, as well as Paul Ehrlich’s experiments with staining cellular tissue with dye.

“In the end, utility resulted,” Flexner writes, “but it was never a criterion to which [researchers’] ceaseless experimentation could be subjected.” Hence the need for institutions where pure research can be performed, even at the expense of pursuing ideas that prove invalid or inconsequential. “[W]hat I say is equally true of music and art and of every other expression of the untrammeled human spirit,” he adds, without, alas, pursuing the point further.

The untrammeled human spirit requires funding in any case. Although written towards the end of the Great Depression -- and published ten years to the month after the stock market crash -- The Usefulness of Useless Knowledge reads like a manifesto for the huge expansion of higher education and of research budgets in the decades to follow.

Flexner could point to the Institute for Advanced Study with justified pride as an example of money well spent. He probably corrected the page proofs for his essay around the same time Einstein was writing his letter to President Roosevelt, warning that the Germans might be developing an atomic bomb. And as Robbert Dijkgraaf reminds us in his introduction, another Flexner appointee was the mathematician John von Neumann, who “made Princeton a center for mathematical logic in the 1930s, attracting such luminaries as Kurt Gödel and Alan Turing.” That, in turn, led to the invention of an electronic version of something Turing had speculated about in an early paper: a machine that could be programmed to prove mathematical theorems.

“A healthy and balanced ecosystem would support the full spectrum of scholarship,” Dijkgraaf writes, “nourishing a complex web of interdependencies and feedback loops.” The problem now is that such a healthy and balanced intellectual ecosystem is no less dependent on a robust economy in which considerable amounts of money are directed to basic research -- without any pressing demand for a return on investment. “The time scales can be long,” he says, “much longer than the four-year periods in which governments and corporations nowadays tend to think, let alone the 24-hour news cycle.”

That would require a culture able to distinguish between value and cost. Flexner’s essay, while very much a document from eight decades ago, still has something to say about learning the difference. 

Editors discuss new volume on the many fictional portrayals of higher education

Editors discuss their new essay collection on the portrayal of colleges, students and academics -- across genres and eras.

Review of Andrew McCarron’s ‘Light Come Shining: The Transformations of Bob Dylan’

In lists of winners of the Nobel Prize for Literature, an asterisk sometimes appears next to the name of the entry for 1964. That year Jean-Paul Sartre declined the award because, among other things, a writer must “refuse to let himself be transformed into an institution.” The refusal cannot be called all that effective, in part because Sartre already was an institution (on an international scale to which, so far as I know, no author today really compares) and in part because the Swedish academy did not give the award to anyone else that year. He remains on the list, marked as a sore winner.

That same year, a future Nobel laureate issued his third and fourth albums, The Times They Are a-Changin’ and Another Side of Bob Dylan. The second title in particular hints at the ambivalence that the songwriter formerly known as Robert Zimmerman was beginning to feel toward his most ambitious creation -- to wit, “Bob Dylan,” a persona shaped in part through his own borrowings from various folk-music legends (especially Woody Guthrie) and in part by the felt need of segments of the American public for someone to embody the voice of his generation. In acquiring an audience, he took on the weight of its expectations and demands. (Reasonable and otherwise: Dylan had what in the 1960s were not yet known as stalkers.) “By many accounts, he’d shed his boyish charm and had become moody, withdrawn and dismissive of those who either stood in his way or who wanted something from him,” writes Andrew McCarron in Light Come Shining: The Transformations of Bob Dylan (Oxford University Press). In public he sometimes had to wear a disguise, just to be left alone.

A connection can be drawn between Sartre and Dylan not only through their shared Nobel status (something of a coincidence almost, given the literature committee’s caprice in recent years) but because Light Come Shining belongs to a genre to which Sartre devoted a great deal of attention over the years: the psychobiography. Indeed, McCarron’s whole perspective on Dylan’s life and work shows the influence of concepts from Sartre’s “existential psychoanalysis,” especially that of the project. McCarron, who heads the religion, philosophy and ethics department at Trinity School in New York City, draws on quite a few more recent developments in psychology. But the Sartrean component is central enough -- and nowadays unusual enough -- to be striking.

Psychobiography in this sense should not be confused with the hunt for formative family relationships, childhood traumas, personal secrets, etc.: the sort of diagnosis at a distance, licensed or otherwise, practiced by many if not most biographers over the past century. It combs the available information about a subject’s life -- especially his or her own recollections and interpretations of things -- not for symptoms or concealed truths but, McCarron writes, for “the themes and structures of a life narrative that shed light on the mind and life-world behind the story.” An inaccurate memory or an outright lie may prove more revealing than what it distorts: “Appropriating, embellishing, misrepresenting, fantasizing, projecting and contradicting are all par for the course within the narrative realm. … The psychological truth that a given story conveys is considerably more valuable from a study of lives perspective than its historical truth.” The search is for the deep pattern in how the subject has understood life and tried to steer it (accurately or not) in certain directions. The psychobiographer is interested in “what [someone] succeeds in making of what he has been made,” as Sartre put it in a passage McCarron quotes.

Bob Dylan has been famous for his massive changes of direction, both in songwriting style (folk to rock to country, on to every permutation thereof) and personal identity. Early in his career he claimed to have been a carny and a hobo, among other things, and his interviews across the decades have often been performances, deflecting questions as much as answering them. More dramatic even than his shift from anti-war and civil rights balladeer to introspective surrealist -- with the two albums from 1964 marking the transition -- was Dylan’s conversion to Christianity in the late 1970s. For a while his concerts became confrontational, both from his refusal to play old songs and his impromptu fire-and-brimstone preaching. Whatever his religious affiliation now, the proselytizing phase did not last. He’s had his quota of marital and romantic drama and career downturns. Light Come Shining was finished before Dylan received the Nobel, and it’s possible he has not seen his last metamorphosis.

The psychobiographer, then, faces an excess of material with Dylan, not to mention more than 50 years of investigation, speculation and exegesis by obsessive fans. McCarron sifts through it and finds “variations on a repetitive plotline” coming to the fore with particular clarity at a number of points: “I have lost my sense of identity and purpose. I feel anxious and vulnerable to death and destruction. I turn to the songs and artists of my youth for guidance. I feel a redeemed sense of self and purpose. I reflect upon the change and understand it as the process of developing into who I’m supposed to be.”

One case of anxious and unmoored feelings was Dylan’s sense of being crushed by celebrity circa 1964 -- a period culminating in his motorcycle crash in 1966. (If that’s what really happened, rather than a stint in rehab, for which there seems to be more evidence.) McCarron identifies similar phases of great personal strain in the late 1970s and ’80s, followed by, respectively, his religious conversion and the major revival of his creative powers shown in Dylan’s songwriting in the 1990s. At each turn, he escaped desolation row by reconnecting with his musical roots: the blues, gospel, Western swing, the sounds of New Orleans, the memory of seeing Buddy Holly a few days before his death.

“All of Sartre’s studies of lives reveal features characteristic of traditional religious narratives,” wrote Stuart L. Charmé in Meaning and Myth in the Study of Lives: A Sartrean Perspective (University of Pennsylvania Press, 1984). And that makes sense insofar as what the psychobiographer looks for in a subject’s life is a kind of private mythology: the self’s innermost sense of its origins and its course. (As mentioned earlier, Sartre calls this a project; perhaps “projectile” also fits, since there’s a definite sense of movement, of throwing, or being thrown, into the future.)

If what McCarron identifies as Dylan’s psychobiographical bedrock might also be called a story of death and resurrection, that’s not necessarily because of the songwriter’s midlife experience of being “born again” and driven to evangelize. A great deal of the music that Dylan loves and immerses himself in echoes biblical language and themes, and it turns out that any number of songs about worldly pleasures and follies were written by performers who did a bit of preaching, too. Dylan absorbed musical traditions so deeply that they became part of himself, then projected them forward, in constant awareness that -- in a lyric that McCarron oddly never cites -- “he not busy being born is busy dying.”
