A documentary on prison gangs from a few years ago included an interview with a member of the Aryan Brotherhood about his beliefs, though one could easily guess at them on first sight. It is true that the swastika is an ancient symbol, appearing in a number of cultures with various meanings. As a tattoo, however, it very rarely functions as a good-luck sign or evidence of Buddhist piety. (Well, not for the last 70 years, anyway.)
But this Aryan Brotherhood spokesman wanted to make one thing clear: He was not a racist. He didn’t hate anybody! (Nobody who hadn’t earned his hate, anyway.) He simply believed in white separatism as a form of multicultural identity politics. I paraphrase somewhat, but that was the gist of it, and he seemed genuinely aggrieved that anyone could think otherwise. He was, to his own way of thinking, the victim of a hurtful stereotype. People hear “Aryan Brotherhood” and get all hung up on the first word, completely overlooking the “brotherhood” part.
The interviewer did not press the matter, which seemed wise, even with prison guards around. Arguing semantics in such cases accomplishes very little -- and as Stephen Eric Bronner argues in his new book, The Bigot: Why Prejudice Persists (Yale University Press), the bigot is even more driven by self-pity and the need for self-exculpation than by hatred or fear.
“To elude his real condition,” writes Bronner, a professor of political science at Rutgers University, “to put his prejudices beyond criticism and change, is the purpose behind his presentation of self…. But he is always anxious. The bigot has the nagging intuition that he is not making sense, or, at least, that he cannot convince his critics that he is. And this leaves him prone to violence.”
Reminiscent of earlier studies of “the authoritarian personality” or “the true believer,” Bronner combines psychological and social perspectives on the bigot’s predicament: rage and contempt toward the “other” (those of a different ethnicity, religion, sexuality, etc.) are the response of a rigid yet fragile ego to a world characterized not only by frenetic change but by the demands of the “other” for equality. Bronner is the author of a number of other books I've admired, including Of Critical Theory and Its Theorists (originally published in 1994 and reissued by Routledge in 2002) and Blood in the Sand: Imperial Fantasies, Right-Wing Ambitions, and the Erosion of American Democracy (University Press of Kentucky, 2005), so I was glad to be able to pose a few questions to him about his new book by email. A transcript of the exchange follows.
Q: You've taught a course on bigotry for many years, and your book seems to be closely connected -- for example, the list of books and films you recommend in an appendix seems like something developed over many a syllabus. Was it? Is the book itself taken from your lectures?
A: The Bigot was inspired by the interests of my students and my courses on prejudice. Though it isn’t based on the lectures, I tried to organize it in a rigorous way. As Marx once put the matter, the argument rises “from the abstract to the concrete.”
The book starts with a phenomenological depiction of the bigot that highlights his fear of modernity and the rebellion of the Other against the traditional society in which his identity was affirmed and his material privileges were secured. I then discuss the (unconscious) psychological temptations offered by mythological thinking, conspiracy fetishism and fanaticism that secure his prejudices from criticism. Next I investigate the bigot’s presentation of self in everyday life as a true believer, an elitist, and a chauvinist.
All of these social roles fit into my political analysis of the bigot today who (even as a European neo-fascist or a member of the Tea Party) uses the language of liberty to pursue policies that disadvantage the targets of his hatred. The suggested readings in the appendix help frame the new forms of solidarity and resistance that I try to sketch.
Q: On the one hand there are various forms of bigotry, focused on hostility around race, gender, sexuality, religion, etc. But you stress how they tend to overlap and combine. How important a difference is there between "targeted" prejudice and "superbigotry," so to speak?
A: Prejudice comes in what I call “clusters.” The bigot is usually not simply a racist but an anti-Semite and a sexist (unless he is a Jew or a woman) and generally he has much to say about immigrants, gays, and various ethnicities. But each prejudice identifies the Other with fixed and immutable traits.
Myths, stereotypes, and pre-reflective assumptions serve to justify the bigot’s assertions. Gays are sexually rapacious; Latinos are lazy; and women are hysterical -- they are just like that and nothing can change them. But the intensity of the bigot’s prejudice can vary -- with fanaticism always a real possibility. His fears and hatreds tend to intensify in worsening economic circumstances, his stereotypes can prove contradictory, and his targets are usually chosen depending upon the context.
Simmering anti-immigrant sentiments exploded in the United States after the financial collapse of 2007-8; anti-Semites condemned Jews as both capitalists and revolutionaries, super-intelligent yet culturally inferior, cultish yet cosmopolitan; and now Arabs have supplanted Jews as targets for contemporary neo-fascists in Europe. The point ultimately is that bigotry is about the bigot, not the target of his hatred.
Q: You've written a lot about the Frankfurt School, whose analyses of authoritarianism in Germany and the U.S. have clearly influenced your thinking. You also draw on Jean-Paul Sartre's writings on anti-Semitism and, in his book on Jean Genet, homophobia. Most of that work was published at least 60 years ago. Is there anything significantly different about more recent manifestations of prejudice that earlier approaches didn't address? Or does continuity prevail?
A: Aside from their extraordinary erudition, what I prize in the Frankfurt School and figures like Sartre or Foucault is their intellectual rigor and their unrelenting critical reflexivity. I developed my framework through blending the insights of idealism, existentialism, Marxism, and the Frankfurt School. Other thinkers came into play for me as well. In general, however, I like to think that I too proceeded in relatively rigorous and critical fashion.
In keeping with poststructuralist fashions, and preoccupations with ever more specific understandings of identity, there has been a tendency to highlight what is unique about particular forms of prejudice predicated on race, religion, gender, ethnicity, and the like. The Bigot offers a different approach, but then, most writers are prisoners of their traditions -- though, insofar as they maintain their critical intellect, they rattle the cages.
Q: Much of the public understands “bigot” or “racist” mainly as insults, so that the most improbable folks get offended at being so labeled. People hold up pictures of Obama as a witch doctor with a bone through his nose, yet insist that he's the one who's a racist. Sometimes it's just hypocrisy, pure and simple, but could there be more to it than that? How do you understand all of this?
A: Using the language of liberty to justify policies that disadvantage women, gays, and people of color cynically enables the bigot to fit into a changed cultural and political climate. It is also not merely a matter of the bigot demeaning the target of his prejudice but of presenting himself as the aggrieved party. That purpose is helped by the (often unconscious) psychological projection of the bigot’s desires, hatreds, and activities upon the Other.
The persecuted is thereby turned into the oppressor and the oppressor into the persecuted. The bigot’s self-image is mired in such projection. "The Birth of a Nation" (1915) -- the classic film directed by D.W. Griffith that celebrates the rise of the KKK -- obsesses over visions of freed black slaves raping white women, even though it was actually white slave owners and their henchmen who were engaged in raping black slave women.
In Europe during the 1920s and 1930s, similarly, anti-Semitic fascists accused Jews of engaging in murder and conspiracy even while their own conspiratorial organizations like the Thule Society in Germany and the Cagoulards in France were, in fact, inciting violence and planning assassinations. Such projection alleviates whatever guilt the bigot might feel and justifies him in performing actions that he merely assumes are being performed by his avowed enemy. Perceiving the threat posed by the Other, and acting accordingly, the bigot thereby becomes the hero of his own drama.
Q: Is there any reason to think prejudice can be "cured" while still at the stage of a delimited and targeted sort of hostility, rather than a full-blown worldview?
A: Fighting the bigot is a labor of Sisyphus. No political or economic reform is secure and no cultural advance is safe from the bigot, who is always fighting on many fronts at once: the anthropological, the psychological, the social, and the political. The bigot appears in one arena only to disappear and then reappear in another.
He remains steadfast in defending the good old days that never were quite so good – especially for the victims of his prejudice. Old wounds continue to fester, old memories continue to haunt the present, and old rumors will be carried into the future. New forms of bigotry will also become evident as new victims currently without a voice make their presence felt.
Prejudice can be tempered (or intensified) through education coupled with policies that further public participation and socioeconomic equality. But it can’t be “cured.” The struggle against bigotry, no less than the struggle for freedom, has no fixed end; it is not identifiable with any institution, movement, or program. Resistance is an ongoing process built upon the guiding vision of a society in which the free development of each is the condition for the free development of all.
The teaching of introductory economics at the college level remains substantially unchanged from the college classroom of the 1950s, more than 60 years ago. The teaching of other introductory courses, from psychology to biology, has changed dramatically -- with new knowledge and, more importantly, new pedagogical techniques. Today's students are also very different: they are not accustomed to sitting through 50-minute lectures, taking detailed notes on material and techniques whose value has yet to be demonstrated to them.
Thus, it is little wonder that more students do not elect introductory economics or, after taking the course, do not take more economics. Grades tend to be lower in introductory economics, discouraging many students from taking additional courses. The concern is a serious one: in today’s complicated world, the design of sound policy requires an understanding of economic principles, yet so many who decide on policy -- particularly voters, since it is they who elect policymakers in democracies -- are frequently ignorant of those principles. Many students do major in economics, but frequently for what is perceived to be enhanced employment prospects in the business world; love of the study of economics does not seem widespread. If a school has an undergraduate business major, the number of economics majors falls precipitously. And fewer and fewer graduates of liberal arts colleges go on to economics Ph.D. programs, which are increasingly populated by very able international students.
Our traditionally underrepresented groups, moreover, are truly underrepresented as students of economics. Women, though more than a majority of today’s college students, still shy away from economics, as shown by a recent study by Professor Claudia Goldin of Harvard University. African Americans and Latinos are also not well-represented in college economics classrooms. Why? The hypotheses, each of which probably has some significance, include alienation from the teaching methods, a lack of role models in the classroom, and difficult material and low grades combined with the additional challenge of being a minority or first-generation student. Normative issues such as poverty and discrimination are also frequently marginalized, reducing the relevance of the course to many students.
Recently I chaired a meeting of faculty members from more than 60 undergraduate economics programs to discuss both Advanced Placement economics and the future of the college introductory course. There was consensus that the course is still structured and taught in a manner consistent with the first edition of Paul Samuelson’s famous and dominant textbook, Economics: An Introductory Analysis, published in 1948 (and destined to continue toward 20 editions!). Texts and the course mirror the major theoretical components of basic microeconomics and macroeconomics. Much has been added (e.g., game theory and rational expectations) with little subtracted (perhaps labor unions and the Keynesian fixed-price case). The result is a rather encyclopedic textbook with 30 or more chapters.
Faculty race through as much of this material as possible. With such breadth of material, depth is frequently sacrificed. Students more frequently memorize than learn. Grades tend to be among the lowest in introductory college courses, and student satisfaction is highly variable. Most distressingly, students are not necessarily learning to think like economists or to understand the power of economics as an explanatory tool for human behavior.
Thus, there is momentum to address the deficiencies of this extraordinarily important introductory course. More faculty are aware of these problems and recognize some lack of student enthusiasm. The College Board has initiated the discussion that I mentioned, with groups convened to look at both introductory microeconomics and introductory macroeconomics. Under the guidance of the respected economist John Siegfried (of Vanderbilt University) and a blue-ribbon committee of university economists, the National Council for Economic Education has developed 20 standards for economic understanding and literacy, applicable to differing levels of education. Textbook companies are now offering customized books so that a faculty member need not present 30-plus chapters to students but can instead select the 10 or 20 that will form the basis of a streamlined course, one in which students can truly learn economic concepts.
With such positive momentum, will the worthy objective of a newly inspired and improved introductory economics course become a reality any time soon? Obstacles remain. Tradition and lethargy can be powerful brakes on new methods and ideas. And a course with less breadth means the elimination of some topics. Which will they be? For some economics faculty, labor markets can be eliminated; for others, labor markets form the heart of microeconomics, and certainly of macroeconomics. Yet the payoff is potentially very high.
From my perspective, students who take introductory economics should complete the course with some understanding of 1) why income inequality exists and how to address it, 2) how negative externalities like pollution should be addressed, 3) how international economic exchanges can be mutually beneficial, 4) what caused our most recent Great Recession, 5) how to address long-term unemployment, and 6) the causes of global inequality. Such a course would be of greater interest to our students in 2014.
In a world full of excruciatingly complex and dangerous problems (from income inequality to environmental degradation), economics as a discipline must be a central player as orderly resolutions are sought. As mentioned, students might actually study in an economics course the causes and consequences of the Great Recession of 2008. Today, such a topic is often treated as too esoteric and not part of the mainstream canon of economics. A generation of students, from varying backgrounds and experiences, should be taught to appreciate and even admire the power and the logic of economic analysis. Parents, students, and voters: you all must help to ensure that this educational opportunity is not lost.
Clark G. Ross is Johnston Professor of Economics and dean of faculty emeritus at Davidson College.
The telltale fast clicks of laptop arrow keys gave away my distracted student from 30 feet away.
So engrossed was he in a 1980s role-playing game that he barely noticed when I leaned in to whisper how entirely inappropriate his behavior was during my digital humanities class at Dartmouth College. As a noted visiting speaker on technology and culture held forth on participatory culture and Wikipedia -- subjects in which my students had expressed an avid interest -- I was shocked to see him and many others openly engaged with their Facebook pages.
Why would he play a game in a class he insisted he enjoyed? He had been playing the game before he went to bed, so when he opened his computer in class, "the game was just there, so I started playing," he explained to me during office hours. He didn't intend to check out from a class he likes.
But he did.
This incident is particularly ironic because in the previous class we had discussed the myth of multitasking and the research of the Stanford University professor Clifford Nass. Now, some class members can handle themselves with their technology; about a third cannot. Multitasking makes us poor learners, studies show. It not only hurts the perpetrators by splitting their focus and attention, but it also hurts those sitting around the multitasker, lowering everyone’s overall performance on each task. While millennials may think they have better multitasking chops than older generations, the data show this assumption to be false. Unfortunately, science tells us the human brain is not built to multitask successfully -- even when people truly believe they are good at it.
As a digital humanities professor, I spend many waking hours communicating and creating on a computer. I deeply understand the exciting possibilities offered by digital tools because I spend a great deal of time designing them. In fact, I’ve designed a class that minimizes lecture time in favor of engaging activities involving mapping, making diagrams, and hosting debates on virtual experiences. I also know there's a time and place to engage with actual people. There is rising concern among faculty members across the country about the need to examine classroom culture regarding technology so that students might actually benefit.
Clearly, the lure of the laptop is too compelling to resist.
Some people might say we should use technology for activities that “flip” the traditional lecture class. In line with this thinking, I have run hands-on activities one to two times a week with great success. During those activities, students are focused and use technology to further learning.
But most days there will come a time when faculty or guest speakers actually speak, or dialogue happens, or provocative points are raised. It is then that students with technology-control issues immediately check out and check into Facebook or online games or shoe shopping. Unless they are directly involved in a hands-on activity for which they will be publicly accountable by the end of class, it is much easier to give in to the pull of technology and lose the experience of direct engagement.
Is this chronic narcissism? Or is this phenomenon a desire to escape the confines and taxing nature of concentration? Does this "checking out to check in" represent an insatiable need for immediate news, born of the "fear of missing out"? In 2013, an international team of researchers designed a way to measure fear of missing out and found that people under age 30 were more affected, as were those who reported low levels of connectedness, competence, and autonomy. This research supports earlier findings that the lonely and the bored are more apt to rely on social media and to feel left out if they miss something.
Yet students also miss out if they cannot listen to and engage with the world in front of them. The presence of technology in a class may not work unless the active learning is active all of the time or, I would suggest, unless our culture changes so that the live, in-person exchange becomes more valuable than whatever the student is doing with his or her own technologies.
In higher education, themes of dialogue, listening, and presence are a core part of the college experience. We know from research that employing "embodied cognition" -- that is, learning with all of the senses -- is a more holistic and effective way to learn. So much of the human experience is reflected in a face-to-face meeting, where body language, expression, tone of voice, and the presence of others create a whole-body experience. If higher education is to continue its mission of supporting experiential learning, we may need to reestablish forms of learning centered in bodily experience and lean a little less on technology to transform ourselves.
We need a culture change to manage our use of technology, to connect when we want to and not because we psychologically depend on it. Enough is enough. We need strategies for unplugging when appropriate to create a culture of listening and of dialogue. Otherwise, $20,000 to $60,000 a year is a hefty entrance fee to an arcade.
Mary Flanagan is a distinguished professor of digital humanities at Dartmouth College and a fellow of The OpEd Project.