
Review of David R. Shumway, "Rock Star: The Making of Musical Icons from Elvis to Springsteen"

Most readers’ first response to David Shumway’s Rock Star: The Making of Musical Icons from Elvis to Springsteen (Johns Hopkins University Press) will be to scan its table of contents and index with perplexity at the performers left out, or barely mentioned. Speaking on behalf of (among others) Lou Reed, Joe Strummer, and Sly and the Family Stone fans everywhere, let me say: There will be unhappiness.

For that matter, just listing the featured artists may do the trick. Besides the names given in the subtitle, we find James Brown, Bob Dylan, the Rolling Stones, the Grateful Dead, and Joni Mitchell – something like the lineup for an hour of programming at a classic rock station. Shumway, a professor of English at Carnegie Mellon University, makes no claim to be writing the history of rock, much less formulating a canon. The choice of artists is expressly a matter of his own tastes, although he avoids the sort of critical impressionism (see: Lester Bangs) that often prevails in rock writing. The author is a fan, meaning he has a history with the music. But his attention extends wider and deeper than that, and it moves in directions that should be of interest to any reader who can get past “Why isn’t _____ here?”

More than a set of commentaries on individuals and groups, Rock Star is a critical study of a cultural category -- and a reflection on its conditions of existence, conditions that are now, arguably, well on their way to disappearing.

The name of the first rock song or performer is a matter for debate, but not the identity of the first rock star. Elvis had not only the hits but the pervasive, multimedia presence that Shumway regards as definitive. Concurring with scholars who have traced the metamorphoses of fame across the ages (from the glory of heroic warriors to the nuisance of inexplicable celebrities), Shumway regards the movie industry as the birthplace of “the star” as a 20th-century phenomenon: a performer whose talent, personality, and erotic appeal might be cultivated and projected in a very profitable way for everyone involved.

The audience enjoyed what the star did on screen, of course, but was also fascinated by the “real” person behind those characters. The scare quotes are necessary given that the background and private life presented to the public were often somewhat fictionalized and stage-managed. Fans were not always oblivious to the workings of the fame machine. But that only heightened the desire for an authentic knowledge of the star.

Elvis could never have set out to be a rock star, of course – and by the time Hollywood came around to cast him in dozens of films, he was already an icon thanks to recordings and television appearances. But his fame was of a newer and more symbolically charged kind than that of earlier teen idols.

Elvis was performing African-American musical styles and dance steps on network television just a few years after Brown v. Board of Education – but that wasn’t all. “The terms in which Elvis’s performance was discussed,” Shumway writes, “are ones usually applied to striptease: for example, ‘bumping and grinding.’ ” He dressed like a juvenile delinquent (the object of great public concern at the time) while being attentive to his appearance, in particular his hair, to a degree that newspaper writers considered feminine.

The indignation Elvis generated rolled up a number of moral panics into one, and the fans loved him for it. That he was committing all these outrages while being a soft-spoken, polite young man – one willing to wear a coat and tails to sing “Hound Dog” to a basset hound on "The Steve Allen Show" (and later to put on Army fatigues, when Uncle Sam insisted) – only made the star power more intense: those not outraged by him could imagine him as a friend.

Elvis was the prototype, but he wasn’t a template. Shumway’s other examples of the rock star share a penchant for capturing and expressing social issues and cultural conflicts in their songs and in how they present themselves, onstage and off. But they do this in very different ways – in the cases of James Brown and Bob Dylan, changing across the length of their careers, gaining and losing sections of their audience with each new phase. The self-reinventions were very public and sometimes overtly political (James Brown's support for Richard Nixon being one example), but they also registered in changes of style and sound. In their day, such changes were sometimes not just reactions to the news but part of it, and part of the conversations people had about the world.

Besides the size of the audience, what distinguishes the rock star from other performers is the length of the career, or so goes Shumway’s interpretation of the phenomenon. But rewarding as the book can be – it put songs or albums I’ve heard a thousand times into an interesting new context – some of the omissions are odd. In particular (and keeping within the timespan Shumway covers) the absence of Jimi Hendrix, Janis Joplin, and Jim Morrison seems problematic. I say that not as a fan disappointed not to find them, but simply on the grounds that each one played an enormous role in constituting what people mean by the term “rock star.” (That includes other rock stars. Patti Smith elevated Morrison to mythological status in her own work, while the fact that all three died at 27 was on Kurt Cobain’s mind when he killed himself at the same age.)

I wrote to Shumway to ask about that. (Also to express relief that he left out Alice Cooper, my own rock-history obsession. Publishers offering six-figure advances for a work of cultural criticism should make their bids by email.)

“My choices are to some extent arbitrary,” he wrote back. “One bias that shaped them is my preference for less theatrical performers as opposed to people such as David Bowie (who I have written about, but chose not to include here) or Alice Cooper.” But leaving out the three who died at 27 “was more than a product of bias. Since I wanted to explore rock stars’ personas, I believed that it was more interesting to write about people who didn’t seem to be playing characters on stage or record. I agree with you about the great influence of Jim Morrison, Janis Joplin, and Jimi Hendrix, but I don’t think their personas have the complexity of the ones I did write about. And, they didn’t figure politically to the degree that my seven did. The main point, however, is that there is lots of work to be done here, and I hope that other critics will examine the personas of the many other rock stars I did not include.”

The other thing that struck me while reading Rock Star was the sense that it portrayed a world now lost, or at least fading into memory. Rock is so splintered now, and the "technology of celebrity" so pervasive, that the kind of public presence Shumway describes might no longer be possible.

“The cause is less the prevalence of celebrity,” he replied, “than the decline of the mass media. Stars are never made by just one medium, but by the interaction of several. Earlier stars depended on newspapers and magazines to keep them alive in their fans’ hearts and minds between performances. Radio and TV intensified these effects. And of course, movie studios and record companies had a great deal of control over what the public got to see and hear. The result was that very many people saw and heard the same performances and read the same gossip or interviews. With the fragmentation of the media into increasingly smaller niches, that is no longer the case. The role of the internet in music distribution has had an especially devastating effect on rock stardom by reducing record companies’ income and the listeners’ need for albums. The companies aren’t investing as much in making stars, and listeners are buying songs they like regardless of who sings them.”

That's not a bad thing, as such, but it makes for a more compartmentalized culture, while the beautiful thing with rock 'n' roll is when it blows the doors off their hinges.

 

 


Essay on study of ebook publishing

A technological visionary created a little stir in the late ’00s by declaring that the era of the paper-and-ink book as dominant cultural form was winding down rapidly as the ebook took its place. As I recall, the switch-off was supposed to be complete by the year 2015 -- though not by a particular date, making it impossible to mark your day planner accordingly.

Cultural dominance is hard to measure. And while we do have sales figures, even they leave room for interpretation. In the June issue of Information Research, the peer-reviewed journal’s founder T.D. Wilson takes a look at variations in the numbers across national borders and language differences in a paper called “The E-Book Phenomenon: A Disruptive Technology.” Wilson is a senior professor at the Swedish School of Library and Information Science, University of Borås, and his paper is in part a report on research into the impact of e-publishing in Sweden.

He notes that the Book Industry Study Group, a publishing-industry research and policy organization, reported last year that ebook sales in the United States grew by 45 percent between 2011 and 2012 – although the total of 457 million ebooks that readers purchased in 2012 still lagged 100 million copies behind the number of hardbacks sold the same year. And while sales in Britain also surged by 89 percent over the same period, the rate of growth for non-Anglophone ebooks has been far more modest.

Often it’s simply a matter of the size of the potential audience. “Sweden is a country of only 9.5 million people,” Wilson writes, “so the local market is small compared with, say, the UK with 60 million, or the United States with 314 million.” And someone who knows Swedish is far more likely to be able to read English than vice versa. The consequences are particularly noticeable in the market for scholarly publications. Swedish research libraries “already spend more on e-resources than on print materials,” Wilson writes, “and university librarians expect the proportion to grow. The greater proportion of e-books in university libraries are in the English language, especially in science, technology and medicine, since this is the language of international scholarship in these fields.”

Whether or not status as a world language is a necessary condition for robust ebook sales, it is clearly not a sufficient one. Some 200 million people around the world use French as a primary or secondary language. But the pace of Francophone ebook publishing has been, pardon the expression, snail-like -- growing just 3 percent per year, with “66 percent of French people saying that they had never read an ebook and did not intend to do so,” according to a study Wilson cites. And Japanese readers, too, seem to have retained their loyalty to the printed word: “there are more bookshops in Japan (almost 15,000 in 2012) than there are in the entire U.S.A. (just over 12,000 in 2012).”

Meanwhile, a report issued not long after Wilson’s paper appeared shows that the steady forward march of the ebook in the U.S. has lately taken a turn sideways. The remarkable acceleration in sales between 2008 and 2012 hit a wall in 2013. Ebooks brought in as much that year ($3 billion) as the year before. A number of factors were involved, no doubt, from economic conditions to an inexhaustible demand for Fifty Shades of Grey sequels. But it’s also worth noting that even with their sales plateauing, ebooks did a little better than trade publishing as a whole, where revenues contracted by about $300 million.

And perhaps more importantly, Wilson points to a number of developments suggesting that the ebook format is on the way to becoming its own, full-fledged disruptive technology. Not in the way that, say, the mobile phone is disruptive (such that you cannot count on reading in the stacks of a library without hearing an undergraduate’s full-throated exchange of pleasantries with someone only ever addressed as “dude”) but rather in the sense identified by Clayton Christensen, a professor of business administration at the Harvard Business School.

Disruption, in Christensen’s usage, refers, as his website explains it, to “a process by which a product or service takes root initially in simple applications at the bottom of a market and then relentlessly moves up market, eventually displacing established competitors.” An example he gives in an article for Foreign Affairs is, not surprisingly, the personal computer, which was initially sold to hobbyists -- something far less powerful as a device, and far less profitable as a commodity, than “real” computers of the day.

The company producing a high-end, state-of-the-art technology becomes a victim of its own success at meeting the demands of clientele who can appreciate (and afford) its product. By contrast, the “disruptive” innovation is much less effective and appealing to such users. It leaves so much room for improvement that its quality can only get better over time, as those manufacturing and using it explore and refine its potentials – without the help of better-established companies, but also without their blinkers. By the time its potential is being realized, the disruptive technology has developed its own infrastructure for manufacture and maintenance, with a distinct customer base.

How closely the ebook may resemble the disruptive-technology model is something Wilson doesn’t assess in his paper. And in some ways, I think, it’s a bad fit. The author himself points out that when the first commercial e-readers went on the market in 1998, it was with the backing of major publishing companies (empires, really) such as Random House and Barnes & Noble. And it’s not even as if the ebook and codex formats were destined to reach different, much less mutually exclusive, audiences. The number of ebook readers who have abandoned print entirely is quite small – in the U.S., about 5 percent.

But Wilson does identify a number of developments that could prove disruptive, in Christensen’s sense. Self-published authors can and do reach large readerships through online retailers. The software needed to convert a manuscript into various ebook formats has become more readily available, and people dedicated to developing the skills could well bring out better-designed ebooks than well-established publishers do now. (Alas, the bar is not high.)

Likewise, I wonder if the commercial barriers to ebook publishing in what Wilson calls “small-language countries” might not be surmounted in a single bound if the right author wrote the right book at a decisive moment. Unlike that Silicon Valley visionary who prophesied the irreversible decline of the printed book, I don’t see it as a matter of technology determining what counts as a major cultural medium. That’s up to writers, ultimately, and to readers as well.


How rumors spread via sloppy citation practices


New article points out that through lazy or fraudulent citations, scholars spread rumors -- at times creating "academic urban legends." The story of spinach and an allegedly misplaced decimal point shows how.

The Chegg-Ingram Partnership

Chegg's new relationship with the Ingram Content Group could be key to Chegg expanding in digital textbook markets, The New York Times reported. Chegg was founded as a textbook-rental business and of late has been pushing to grow in e-texts. The Times noted that a major challenge for Chegg has been buying and distributing print textbooks, and that the deal with Ingram -- which will provide the physical texts -- frees Chegg to expand elsewhere.

 


Book argues that mentoring programs should try to unveil colleges' "hidden curriculum"


Are students evaluated on their academic work, or on how well they navigate the college environment? Both, a recent book argues -- which is why mentoring programs should aim to unmask the "hidden curriculum" for at-risk students.

New book argues that education schools aren't adequately preparing teachers


New book argues that education schools too often neglect teacher training -- leaving it up to teachers to figure things out on their own.

Review of Anna M. Young, "Prophets, Gurus, and Pundits: Rhetorical Styles and Public Engagement"

Many a thick academic tome turns out to be a journal article wearing a fat suit. So all due credit to Anna M. Young, whose Prophets, Gurus, and Pundits: Rhetorical Styles and Public Engagement was published by Southern Illinois University Press this year. Her premise is sound; her line of argument looks promising; and she gets right to work without the rigmarole associated with what someone once described as the scholarly “Yes, I read that one too” tic.

Indeed, several quite good papers could be written exploring the implicit or underdeveloped aspects of her approach to the role and the rhetoric of the public intellectual. Young is an associate professor of communication at Pacific Lutheran University, in Tacoma, Washington. Much of the book is extremely contemporary in emphasis (to a fault, really, just to get my complaint about it out front here). But the issue it explores goes back at least to ancient Rome -- quite a while before C. Wright Mills got around to coining the expression “public intellectual” in 1958, in any case.

The matter in question emerges in Cicero’s dialogue De Oratore, where Young finds discussed a basic problem in public life, then and now. Cicero, or his stand-in character anyway, states that for someone who wants to contribute to the public discussion of important matters, “knowledge of a vast number of things is necessary, without which volubility of words is empty and ridiculous.”

On the other hand -- as Cicero has a different character point out -- mere possession of learning, however deep and wide, is no guarantee of being able to communicate that learning to others. (The point will not be lost on those of you surreptitiously reading this column on your mobile phones at a conference.)

Nobody “can be eloquent on a subject that he does not understand,” says Cicero. Yet even “if he understands a subject ever so well, but is ignorant of how to form and polish his speech, he cannot express himself eloquently about what he does understand.”

And so what is required is the supplementary form of knowledge called rhetoric. The field had its detractors well before Cicero came along. But rhetoric as defined by Aristotle referred not to elegant and flowery bullshit but rather to the art of making cogent and persuasive arguments.

Rhetoric taught how to convey information, ideas, and attitudes by selecting the right words, in the right order, to deliver in a manner appropriate to a particular audience -- thereby convincing it of an argument, generally as a step toward moving it to take a given action or come to a certain judgment or decision. The ancient treatises contain not a little of what would later count as psychology and sociology, and modern rhetorical theory extends its interdisciplinary mandate beyond the study of speech, into all other forms of media. But in its applied form, rhetoric continues to be a skill of skills – the art of using and coordinating a number of registers of communication at the same time: determining the vocabulary, gestures, tone and volume of voice, and so on best-suited to message and audience.  

When the expression “public intellectual” was revived by Russell Jacoby in the late 1980s, it served in large part to express unhappiness with the rhetorical obtuseness of academics, particularly in the humanities and social sciences. The frustration was not usually expressed quite that way. It instead took the form of a complaint that intellectuals were selling their birthright as engaged social and cultural critics in exchange for the mess of pottage known as tenure. The bargain left them stuck in niches of hyperspecialized expertise. There they cultivated insular concerns and leaden prose styles, as well as inexplicable delusions of political relevance.

The public intellectual was a negation of all of this. He or she was a free-range generalist who wrote accessibly, and could sometimes be heard on National Public Radio. In select cases the public intellectual was known to Charlie Rose by first name.

I use the past tense here but would prefer to give the term a subscript: The public intellectual model ca. 1990 was understood to operate largely or even entirely outside academe, but that changed over the following decade, as the most prominent examples of the public intellectual tended to be full-time professors, such as Cornel West and Martha Nussbaum, or at least to teach occasionally, like Judge Richard Posner, a senior lecturer in law at the University of Chicago.

And while the category continues to be defined to some degree by contrast with certain tried-and-true caricatures of academic sensibility, the 2014 model of the public intellectual can hardly be said to have resisted the blandishments of academe. The danger of succumbing to the desire for tenure is hardly the issue it once might have seemed. 

Professor Young’s guiding insight is that public intellectuals might well reward study through rhetorical analysis -- with particular emphasis on aspects that would tend to be missed otherwise. They come together under the heading “style.” She does not mean the diction and syntax of their sentences, whether written or spoken, but rather style of demeanor, comportment, and personality (or what’s publicly visible of it).

Style in Young’s account includes what might be called discursive tact. Among other things, that means the gift of knowing how and when to stop talking, and of listening to another person’s questions attentively enough to clarify them, or even to answer them. The author also discusses the “physiological style” of various public intellectuals – an unfortunate coinage (my first guess was that it had something to do with metabolism) that refers mostly to how they dress.

A public intellectual, then, has mastered the elements of style that the “traditional intellectual” (meaning, for the most part, the professorial sort) typically does not. The public perceives the academic “to be a failure of rhetorical style in reaching the public. He is dressed inappropriately. She carries herself strangely. He describes ideas in ways we cannot understand. She holds the floor too long and seems to find herself very self-important.” (That last sentence is problematic in that a besetting vice of the self-important is that they do not find themselves self-important; if they did, they’d probably dial it down a bit.)

Now, generations of satirical novels about university life have made clear that the very things Young regards as lapses of style are, in fact, perfectly sensible and effective rhetorical moves on their own terms. (The professor who wears the same argyle sweater year-round has at least persuaded you that he would rather think about the possible influence of the Scottish Enlightenment on The Federalist Papers than about the admittedly large holes in it.)

But she longs for a more inclusive and democratic mode of engagement of scholarship with the public – and of the public with ideas and information it needs. To that end, Young identifies a number of public-intellectual character types that seem to her exemplary and effective. “At different times,” she writes, “and in different cultural milieus, different rhetorical styles emerge as particularly relevant, powerful, and persuasive.” And by Young’s count, six of them prevail in America at present: Prophet, Guru, Sustainer, Pundit, Narrator, and Scientist.

“The Prophet is called by a higher power at a time of crisis to judge sinners in the community and outline a path of redemption. The Guru is the teacher who gains a following of disciples and leads them to enlightenment. The Sustainer innovates products and processes that sustain natural, social, and political environments. The Pundit is a subject expert who discusses the issues of the day in a more superficial way via the mass media. The Narrator weaves experiences with context, creating relationships between event and communities and offering a form of evidence that flies below the radar in order to provide access to information.” Finally, the Scientist “rhetorically constructs his or her project as one that answers questions that have plagued humankind since the beginnings….”

The list is presumably not meant to be exhaustive, but Young finds examples of people working successfully in each mode. Next week we'll take a look at what the schema implies -- and at the grounds for thinking of each style as successful.

 


New book argues faculty governance is under threat


New book argues that shared governance is under threat, along with future of American higher education, and professors must take up the fight.

Interview with Stephen Eric Bronner about "The Bigot: Why Prejudice Persists"

A documentary on prison gangs from a few years ago included an interview with a member of the Aryan Brotherhood about his beliefs, though one could easily have guessed them at first sight. It is true that the swastika is an ancient symbol, appearing in a number of cultures and bearing various meanings. As a tattoo, however, it very rarely functions as a good-luck sign or evidence of Buddhist piety. (Well, not for the last 70 years anyway.)

But this Aryan Brotherhood spokesman wanted to make one thing clear: He was not a racist. He didn’t hate anybody! (Nobody who hadn’t earned his hate, anyway.) He simply believed in white separatism as a form of multicultural identity politics. I paraphrase somewhat, but that was the gist of it, and he seemed genuinely aggrieved that anyone could think otherwise. He was, to his own way of thinking, the victim of a hurtful stereotype. People hear “Aryan Brotherhood” and get all hung up on the first word, completely overlooking the “brotherhood” part.

The interviewer did not press the matter, which seemed wise, even with prison guards around. Arguing semantics in such cases accomplishes very little -- and as Stephen Eric Bronner argues in his new book, The Bigot: Why Prejudice Persists (Yale University Press), the bigot is even more driven by self-pity and the need for self-exculpation than by hatred or fear.

“To elude his real condition,” writes Bronner, a professor of political science at Rutgers University, “to put his prejudices beyond criticism and change, is the purpose behind his presentation of self…. But he is always anxious. The bigot has the nagging intuition that he is not making sense, or, at least, that he cannot convince his critics that he is. And this leaves him prone to violence.”

Reminiscent of earlier studies of “the authoritarian personality” or “the true believer,” Bronner combines psychological and social perspectives on the bigot’s predicament: rage and contempt toward the “other” (those of a different ethnicity, religion, sexuality, etc.) is the response of a rigid yet fragile ego to a world characterized not only by frenetic change but by the demands of the “other” for equality. Bronner is the author of a number of other books I've admired, including Of Critical Theory and Its Theorists (originally published in 1994 and reissued by Routledge in 2002) and Blood in the Sand: Imperial Ambitions, Right-Wing Fantasies, and the Erosion of American Democracy (University Press of Kentucky, 2005), so I was glad to be able to pose a few questions to him about his new book by email. A transcript of the exchange follows.

Q: You've taught a course on bigotry for many years, and your book seems to be closely connected -- for example, the list of books and films you recommend in an appendix seems like something developed over many a syllabus. Was it? Is the book itself taken from your lectures?

A: The Bigot was inspired by the interests of my students and my courses on prejudice. Though it isn’t based on the lectures, I tried to organize it in a rigorous way. As Marx once put the matter, the argument rises “from the abstract to the concrete.”

The book starts with a phenomenological depiction of the bigot that highlights his fear of modernity and the rebellion of the Other against the traditional society in which his identity was affirmed and his material privileges were secured. I then discuss the (unconscious) psychological temptations offered by mythological thinking, conspiracy fetishism and fanaticism that secure his prejudices from criticism. Next I investigate the bigot’s presentation of self in everyday life as a true believer, an elitist, and a chauvinist.  

All of these social roles fit into my political analysis of the bigot today who (even as a European neo-fascist or a member of the Tea Party) uses the language of liberty to pursue policies that disadvantage the targets of his hatred.  The suggested readings in the appendix help frame the new forms of solidarity and resistance that I try to sketch.

Q: On the one hand there are various forms of bigotry, focused on hostility around race, gender, sexuality, religion, etc. But you stress how they tend to overlap and combine. How important a difference is there between "targeted" prejudice and "superbigotry," so to speak?

A: Prejudice comes in what I call “clusters.” The bigot is usually not simply a racist but an anti-Semite and a sexist (unless he is a Jew or a woman) and generally he has much to say about immigrants, gays, and various ethnicities. But each prejudice identifies the Other with fixed and immutable traits.

Myths, stereotypes, and pre-reflective assumptions serve to justify the bigot’s assertions. Gays are sexually rapacious; Latinos are lazy; and women are hysterical – they are just like that and nothing can change them. But the intensity of the bigot’s prejudice can vary – with fanaticism always a real possibility. His fears and hatreds tend to worsen in worsening economic circumstances, his stereotypes can prove contradictory, and his targets are usually chosen depending upon the context.

Simmering anti-immigrant sentiments exploded in the United States after the financial collapse of 2007-8; anti-Semites condemned Jews as both capitalists and revolutionaries, super-intelligent yet culturally inferior, cultish yet cosmopolitan; and now Arabs have supplanted Jews as targets for contemporary neo-fascists in Europe. The point ultimately is that bigotry is about the bigot, not the target of his hatred.

Q: You've written a lot about the Frankfurt School, whose analyses of authoritarianism in Germany and the U.S. have clearly influenced your thinking. You also draw on Jean-Paul Sartre's writings on anti-Semitism and, in his book on Jean Genet, homophobia. Most of that work was published at least 60 years ago. Is there anything significantly different about more recent manifestations of prejudice that earlier approaches didn't address? Or does continuity prevail? 

A: Aside from their extraordinary erudition, what I prize in the Frankfurt School and figures like Sartre or Foucault is their intellectual rigor and their unrelenting critical reflexivity. I developed my framework through blending the insights of idealism, existentialism, Marxism, and the Frankfurt School. Other thinkers came into play for me as well. In general, however, I like to think that I too proceeded in relatively rigorous and critical fashion.

In keeping with poststructuralist fashions, and preoccupations with ever more specific understandings of identity, there has been a tendency to highlight what is unique about particular forms of prejudice predicated on race, religion, gender, ethnicity, and the like. The Bigot offers a different approach, but then, most writers are prisoners of their traditions — though, insofar as they maintain their critical intellect, they rattle the cages.

Q: Much of the public understands “bigot” or "racist" mainly as insults, so that the most improbable folks get offended at being so labeled. People hold up pictures of Obama as a witchdoctor with a bone through his nose, yet insist that he's the one who's a racist. Sometimes it's just hypocrisy, pure and simple, but could there be more to it than that? How do you understand all of this?

A: Using the language of liberty to justify policies that disadvantage women, gays, and people of color cynically enables him to fit into a changed cultural and political climate. It is also not merely a matter of the bigot demeaning the target of his prejudice but of presenting himself as the aggrieved party. That purpose is helped by (often unconscious) psychological projection of the bigot’s desires, hatreds, and activities upon the Other.

The persecuted is thereby turned into the oppressor and the oppressor into the persecuted. The bigot’s self-image is mired in such projection. "Birth of a Nation" (1915) -- the classic film directed by D.W. Griffith that celebrates the rise of the KKK -- obsesses over visions of freed black slaves raping white women, even though it was actually white slave owners and their henchmen who were engaged in raping black slave women.

In Europe during the 1920s and 1930s, similarly, anti-Semitic fascists accused Jews of engaging in murder and conspiracy even while their own conspiratorial organizations like the Thule Society in Germany and the Cagoulards in France were, in fact, inciting violence and planning assassinations. Such projection alleviates whatever guilt the bigot might feel and justifies him in performing actions that he merely assumes are being performed by his avowed enemy. Perceiving the threat posed by the Other, and acting accordingly, the bigot thereby becomes the hero of his own drama.

Q: Is there any reason to think prejudice can be "cured" while still at the stage of a delimited and targeted sort of hostility, rather than a full-blown worldview?

A: Fighting the bigot is a labor of Sisyphus. No political or economic reform is secure and no cultural advance is safe from the bigot, who is always fighting on many fronts at once: the anthropological, the psychological, the social, and the political. The bigot appears in one arena only to disappear and then reappear in another.

He remains steadfast in defending the good old days that never were quite so good – especially for the victims of his prejudice. Old wounds continue to fester, old memories continue to haunt the present, and old rumors will be carried into the future. New forms of bigotry will also become evident as new victims currently without a voice make their presence felt.

Prejudice can be tempered (or intensified) through education coupled with policies that further public participation and socioeconomic equality. But it can’t be “cured.” The struggle against bigotry, no less than the struggle for freedom, has no fixed end; it is not identifiable with any institution, movement, or program.  Resistance is an ongoing process built upon the guiding vision of a society in which the free development of each is the condition for the free development of all.


Review of Darin Weinberg, 'Contemporary Social Constructionism: Key Themes'

Like a t-shirt that used to say something you can’t quite read anymore, a piece of terminology will sometimes grow so faded, or be worn so thin, that retiring it seems long overdue. The threadbare expression “socially constructed” is a case in point. It’s amazing the thing hasn’t disintegrated already.

In its prototypical form -- as formulated in the late 1920s, in the aphorism known as the Thomas theorem – the idea was bright and shapely enough: “If men define situations as real, they are real in their consequences.” In a culture that regards the ghosts of dead ancestors as full members of the family, it’s necessary to take appropriate actions not to offend them; they will have a place at the table. Arguments about the socially constructed nature of reality generalize the Thomas theorem: we have access to the world only through the beliefs, concepts, categories, and patterns of behavior established by the society in which we live.

The idea lends itself to caricature, of course, particularly when it comes to discussion of the socially constructed nature of something brute and immune to argumentation like, say, the force of gravity. “Social constructivists think it’s just an idea in your head,” say the wits. “Maybe they should prove it by stepping off a tall building!”

Fortunately the experiment is not often performed. The counterargument from gravity is hardly so airtight as its makers like to think, however. The Thomas theorem holds that imaginary causes can have real effects. But that hardly implies that reality is just a product of the imagination.

And as for gravity -- yes, of course it is “constructed.” The observation that things fall to the ground is several orders of abstraction short of a scientific concept. Newton’s development of the inverse square law of attraction, its confirmation by experiment, and the idea’s diffusion among the non-scientific public – these all involved institutions and processes that are ultimately social in nature.

Isn’t that obvious? So it seems to me. But it also means that everything counts as socially constructed, if seen from a certain angle, which may not count as a contribution to knowledge.

A new book from Temple University Press, Darin Weinberg’s Contemporary Social Constructionism: Key Themes, struggles valiantly to defend the idea from its sillier manifestations and its more inane caricatures. The author is a reader in sociology and fellow at King’s College, University of Cambridge. “While it is certainly true that a handful of the more extravagant and intellectually careless writers associated with constructionism have abandoned the idea of using empirical evidence to resolve debates,” he writes, not naming any names but manifestly glaring at people over in the humanities, “they are a small and shrinking minority.”

Good social constructionist work, he insists, “is best understood as a variety of empirically grounded social scientific research,” which by “turn[ing] from putatively universal standards to the systematic scrutiny of the local standards undergirding specific research agendas” enables the forging of “the tools necessary for discerning and fostering epistemic progress.”

The due epistemic diligence of the social scientists renders them utterly distinct from the postmodernists and deconstructionists, who, by Weinberg's reckoning, have done great damage to social constructionism’s credit rating. “While they may encourage more historically and politically sensitive intuitions regarding the production of literature,” he allows, “they are considerably less helpful when it comes to designing, implementing, and debating the merits of empirically grounded social scientific research projects.”

And that is being nice about it. A few pages later, Weinberg pronounces anathema upon the non-social scientific social-constructionists. They are “at best pseudo-empirical and, at worst, overtly opposed to the notion that empirical evidence might be used to improve our understanding of the world or resolve disputes about worldly events.”

Such hearty enthusiasm for throwing his humanistic colleagues under the bus is difficult to gainsay, even when one doubts that a theoretical approach to art or literature also needs to be “helpful when it comes to designing, implementing, and debating the merits of empirically grounded social scientific research projects.” Such criticisms are not meant to be definitive of Weinberg’s project. A sentence like “Derrida sought to use ‘deconstruction’ to demonstrate how specific readings of texts require specific contextualizations of them” is evidence chiefly of the author’s willingness to hazard a guess.

The book’s central concern, rather, is to defend what Weinberg calls “the social constructionist ethos” as the truest and most forthright contemporary manifestation of sociology’s confidence in its own disciplinary status. As such, it stresses “the crucially important emphases” that Weinberg sees as implicit in the concept of the social – emphases “on shared human endeavor, on relation over isolation, on process over stasis, and on collective over individual, as well as the monumental epistemic value of showing just how deeply influenced we are by the various sociohistorical contexts in which we live and are sustained.”

But this positive program is less in evidence than Weinberg’s effort to close off “the social” as something that must not and cannot be determined by anything outside itself – the biological, psychological, economic, or ecological domains, for example. “The social” becomes a kind of demiurge: constituting the world, then somehow transcending its manifestations.

It left this reader with the sense of witnessing a disciplinary turf war, extended to almost cosmological dimensions. The idea of social construction is a big one, for sure. But even an XXL can be stretched only so far before it turns baggy and formless -- and stays that way for good.


