This year is the 50th anniversary of Anti-Intellectualism in American Life by Richard Hofstadter, whose greatest achievement, someone once said, was keeping it to just the one volume.
As discussed here a short while ago, the revisionist interpretation of American populism appearing in Hofstadter’s book The Age of Reform (1955) has taken plenty of hits from subsequent historians. He over-generalized on the basis of a (very) narrowly selected pool of primary sources -- and in the final analysis, he wasn’t really writing about the 1890s at all, but rather about his own times, equating the mood and worldview of McCarthyism with that of the agrarian radicals of the People’s Party. Hofstadter was more conscious of the pressure of contemporary affairs in Anti-Intellectualism, which he wrote was “conceived in response to the political and intellectual conditions of the 1950s.”
It was “by no means a formal history,” Hofstadter wrote, “but largely a personal book, whose factual details are organized and dominated by my views.” I take that to be a concession, of sorts, to historians who were finding The Age of Reform problematic. His strength was the essay more than the monograph. A passage such as the following is remarkable for – among other things -- how its urbane diction just barely subdues the remembered experience of dread:
“Of course, intellectuals were not the only targets of McCarthy’s constant detonations -- he was after bigger game -- but intellectuals were in the line of fire, and it seemed to give special rejoicing to his followers when they were hit. His sorties against intellectuals and universities were emulated throughout the country by a host of less exalted inquisitors.”
It is also remarkable for needing only the slightest change of wording to sound uncomfortably applicable to more recent events. The problem lies not with this or that demagogue but with something deeper. Hofstadter spent 400 pages sounding it out. But the American science and science-fiction writer Isaac Asimov condensed it into one sentence of a column for Newsweek in 1980: “The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that ‘my ignorance is just as good as your knowledge.’ ”
Neither as broad in historical scope as Anti-Intellectualism in American Life nor as trenchant as Asimov’s zinger, Aaron Lecklider’s Inventing the Egghead: The Battle Over Brainpower in American Culture (University of Pennsylvania Press) is explicitly framed as a response to another decade – the ‘00s. While challenging Hofstadter’s ideas, Lecklider, an assistant professor of American studies at the University of Massachusetts Boston, follows his lead in responding to a past that, while recent, somehow already seems distinctly periodized.
It was the worst of times, full stop. Figures in the Bush administration were openly contemptuous of “experts,” with all their “knowledge” about “reality.” The Bush-bashers called the president stupid, and his supporters called the Bush-bashers stupid, and there was a TV game show called “Are You Smarter Than a Fifth Grader?” which hinted that the whole country was stupid, and that’s O.K. (It did well in the ratings.) The culture war was fought with the bluntest of weapons -- not between the intelligentsia and the ignorati, but between anti-intellectuals and anti-anti-intellectuals. The latter expression, while clumsy at best, acknowledges something important: anti-anti-intellectual ≠ intellectual. Laughing at a satirical interview with a creationist on “The Daily Show” entails no substantial engagement with the life of the mind.
Much of the edifying conflict was fought out in popular culture and the mass media -- terrain that historians and social critics of Hofstadter’s era, Lecklider argues, at best neglected and at worst regarded as stupefying and regressive. Hence their interpretations of American cultural history tended to be narratives of decline.
In reply, Inventing the Egghead presents a series of case studies from the first six decades of the 20th century in which conflicts over the power and the possession of intellectual capital were fought out in a wide range of popular venues: cartoons, movies, jokes, Tin Pan Alley songs, newsmagazines, posters, popular science journals, handbooks on efficient housekeeping, etc.
The chapters proceed chronologically, from the rechristening of a Coney Island park in 1909 as an institute of science (to skirt blue laws) to Einstein as cult figure, to aspects of the Harlem Renaissance and the New Deal “brain trust,” and on up to the stress-inducing utopia of Oak Ridge and the coining of “egghead” as pejorative. The effect is one of cultural history as collage. Running through all these topics and cultural forms is an uneasy and constantly shifting set of attitudes towards what Lecklider calls “brainpower.”
In the author’s usage “brainpower” means the power to acquire or to stake a claim to knowledge and expertise, whether respected and professionally credentialed or not. Conflicts over who possesses brainpower (and who doesn’t) are continuous. So are disputes over its value and limitations. And that flux comes, in part, from the frequently changing needs of an economy that requires technological advances as well as a steady supply of human brains, serviceable as a factor of production.
In short, there were grounds for ambivalence about brainpower -- for reasons more various and complex than some notion of an unchanging American cultural disposition toward anti-intellectualism. “Competing representations of intelligence” in popular culture, Lecklider writes, “could alternately smash the pretensions of an intellectual elite, position ordinary men and women as smarter than experts, appeal to intellectual culture to validate working-class positions, and dismantle intellectual hierarchies. These representations were often uncomfortable and contradictory, sometimes even self-defeating, particularly when the value of intelligence was diminished in order to level the intellectual playing field.”
But other strains of popular culture -- a number of WPA-era posters promoting libraries, for example -- served to recognize and foster people’s self-respect regarding their own mental capacities.
Earlier I suggested that “Are You Smarter Than a Fifth Grader?” implied the viewer probably wasn’t. On reflection, that may have been too dour a view. Perhaps the title gives the viewer something to which to aspire. Be that as it may, in the representations of brainpower that Lecklider inspects, the expressions of ambivalence tend to contain more hostility or disparagement than respect for self or others. While written, and blurbed, as an alternative to Anti-Intellectualism in American Life, the book ends up seeming like an extensive elaboration and updating of a point Hofstadter made there in passing: "As the demand for the rights of the common man took form in 19th-century America, it included a program for free elementary education, but it also carried with it a dark and sullen suspicion of high culture, as a creation of the enemy."
Some people will bristle at the expression "high culture." They always do. (I'm not overly fond of it.) But that response misses the point, which, again, was put quite plainly by Isaac Asimov a while back. "I believe that every human being with a physically normal brain can learn a great deal and can be surprisingly intellectual," he wrote. "I believe that what we badly need is social approval of learning and social rewards for learning." That is as cheerful a face as can be put on our situation.
This summer, the faculty of Shimer College held a discussion of Jacques Rancière's book The Ignorant Schoolmaster: Lessons in Intellectual Emancipation. In it, he discusses the educational theory and practice of Joseph Jacotot, who claimed that one could teach a subject one didn't even know in the first place. For Jacotot, teaching isn't a matter of expertise, but of determination. It isn't about transmitting knowledge to the student, but about holding students accountable to the material that they are working on.
Though the method Rancière described was more radical than anything we would actually try, the general approach resonates with what we try to do at Shimer, a small liberal arts college in the Great Books tradition. Our classes are all discussion-based, centered on important texts, artworks, and scientific experiments, and the professor serves not to instruct the students but to keep them on task and nudge them in the right direction. A handful of our faculty members have actually taught the whole curriculum, covering the humanities, social sciences, and natural sciences, and all are required to teach in at least two areas.
All of this remained mostly theoretical for me until this semester, however, when I began teaching Humanities 1: Art and Music. I am knowledgeable about fine arts and am a passable classical pianist, but my Ph.D. is in theology and philosophy and the course is the first I've ever taught where the primary object of study was something other than texts. The challenge of the course is to find a way of talking about art that is neither purely impressionistic and personal nor overly technical and scholarly.
The problem is most pronounced with music, where students often express a need for something called “music theory” that will permit them to talk about the experience of music in an intelligent and informed way. For the visual arts, there’s a more immediate intelligibility, given that the majority of the works we discuss in the course are representational (or at least suggestive of representation) — yet even there, students can feel that they don’t know what to say beyond assessing whether the painting represents what it’s supposed to in a way that is somehow “realistic.”
Our approach is to give students a handful of “hooks” that allow them to point out certain aspects of a given artwork. We begin with the format of the class, which is centered on Ovid's Metamorphoses, a work that has inspired artists for generations. As a result of this framing, the majority of works that we study are somehow representational or narrative in form, giving the students a basic orientation. The ready availability of different works on the same subject also gives us the opportunity to highlight the differences between media and the kinds of choices that artists make within the same medium.
On the level of form, we try to give students a ready familiarity with a few basic concepts. In music, the most important goals include being able to hear consonance vs. dissonance (which is fairly intuitive once it’s pointed out), knowing how to talk about melody and harmony, and being broadly familiar with the distinction between major and minor. With painting, we focus on the use of perspective, the interaction of colors, and the composition of the piece as a whole. These tools give them enough to begin thinking about how the expressive content of the artwork can reinforce, complement, or complicate the emotional content of the narrative being portrayed.
By halfway through the semester, I had developed a certain level of confidence in discussing art and music. Yet the syllabus threw me a curve ball when I was required to introduce a new art form by taking the students on an architectural tour in downtown Chicago. Here I felt my ignorance much more acutely, and instead of trying to create my own tour, I asked a senior faculty member to demonstrate the tour he had given the previous year, which I simply repeated.
We had only a short time for the tour, and so I could only point out a handful of extremely basic points. I showed them a few buildings that were built before the skyscraper technique was developed (basically, before that technique allowed the weight to be distributed internally, the outer walls had to be load-bearing) as well as some early skyscrapers. I talked about the ways that the architect can get us to “read” a building — how the eye is drawn upward, how a building can be “capped” with a different design on the top floors, how the base of the building can provide indications of where the entrances are and how the facade can reinforce that. We saw some buildings that were highly ornamented and some that were very stark. We also looked at lobbies in a similar variety of styles. Finally, I tried to point out to them the way buildings interact with each other.
None of this was very advanced, and indeed, I was most often simply pointing out to the students what my colleague had pointed out to me on our tour. Yet the students reported that they had benefited from simply being told to step back and actually look at the buildings and from being given certain rough-and-ready indications of what to look for. Some reported they had never really thought about architecture at all, that it had always faded into the background. Even a more knowledgeable student said that being asked to look at buildings in the context of the cityscape rather than in isolation was a step beyond what he’d done before.
None of this resulted from any special skill I brought to the table — even the mechanical execution of the tour was pretty inept, and I’m known to mumble. (My students strongly discouraged me from pursuing a career as a tour guide.) It was simply a matter of being told to look and being given a few specific things to look at. It made them want to look more closely in the future, as indeed preparing for the tour made me want to look more closely as well.
While it was most pronounced with architecture, I've been learning along with the students throughout the semester. During trips to the Art Institute of Chicago, I've found that my way of looking at paintings has changed. A recent visit to the symphony with my students revealed that I'm getting better at following and thinking about classical music — after the concert, I found that I really wanted to talk about it and even investigate it further, in a way that wouldn’t have been true before. I see similar progress in my students, as they become more and more comfortable with talking about the formal elements of the artworks and relate them in more sophisticated ways to their representational or narrative content. In fact, one of my students who transferred from a local art school claims that she has had more and better discussion of art in our class than she did in art school.
At this point, my reader may be skeptical. Perhaps I am giving students an adequate introduction to the fine arts, making up for my ignorance with my enthusiasm — but wouldn't they be better off with a more knowledgeable professor? In some ways, I'm sure they would. Yet I would turn the tables and point out the disadvantages of having an accomplished expert teach an introductory course. Too often, such classes consist in the delivery of scholarly knowledge that only serves to exacerbate the distance that the students feel from the material itself. Instead of learning how to look at an artwork or listen to a piece of music, students learn how to categorize them: this is early Renaissance, this is Impressionist....
The two skills don't have to be mutually exclusive, but on a practical level, they most often are — and I would rather that my students begin by gaining the confidence to analyze and respond to a work and only then delve into the historical and scholarly background according to their interest. We live in a time when there's no shortage of access to facts, but college may be their one chance to develop a real understanding of how art and music work. From that perspective, my inability to supply “the right answer” or to indulge my students' curiosity about historical trivia that distracts our attention from the work before us counts as an advantage.
This isn't to say, of course, that I must never teach in my own area of expertise. Indeed, my experience as an “ignorant schoolmaster” has already changed the way I think about teaching things within my comfort zone as well. It has pushed me to think more about holding students accountable for the ways they reach their own answers than about how best to give them — or Socratically help them stumble upon — the “right answer.” Even in classes where I bring much more to the table, the focus is and must be the material we're working on together, not all the information I'm bringing from the outside. More than that, though, all that information must be put to the test of the material itself, so that I always have to be open to the possibility that the interpretation I brought to the table is wrong, or at least not the whole story.
The approach I'm describing here goes against many of the deeply engrained habits that academics develop in graduate school and carry over into their teaching. While Rancière and others would cast moral aspersions on the expertise-centered approach to education, I view it more as a failure of imagination. Robert Hutchins, the University of Chicago president whose approach forms the basis for Shimer's curriculum, once said that liberal arts colleges tend to imitate graduate programs because at least graduate programs have a clear idea of what they're doing — namely, producing experts. An undergraduate education, however, neither can nor should achieve that goal. The liberal arts approach in particular provides a unique opportunity to form broad-minded critical and creative thinkers who have the right combination of intellectual boldness and intellectual humility to enter a wide variety of professions and explore many bodies of knowledge. A crucial part of that formation is learning to have the courage to admit one's own ignorance, and I believe students would be better served if faculty members were more commonly called upon to display that same courage.
America's public research universities face a challenging economic environment characterized by rising operating costs and dwindling state resources. In response, institutions across the country have looked toward the corporate sector for cost-cutting models. The hope is that implementing these “real-world” strategies will centralize redundant tasks (allowing some to be eliminated), stimulate greater efficiency, and ensure long-term fiscal solvency.
Recent events at the University of Michigan suggest that faculty should be proactive in the face of such “corporatization” schemes, which typically are packaged and presented as necessary and consistent with a commitment to continued excellence. The wholesale application of such strategies can upend core academic values of transparency and shared governance, and strike at the heart of workplace equity.
Early this month our university administration rolled out the “Workforce Transition” phase of its “Administrative Services Transformation” (AST) plan. From far on high, with virtually no faculty leadership input, 50 to 100 staff members in the College of Literature, Science, and the Arts (LS&A) departments were informed that their positions in HR and finance (out of an anticipated total of 325) would be eliminated by early 2014. Outside consultants, none of whom actually visited individual departments for any serious length of time, reduced these positions to what they imagined as their “basic” functions: transactional accounting and personnel paperwork.
It became clear that many of those impacted constitute a specific demographic: women, generally over 40 years of age, many of whom have served for multiple decades in low- to mid-level jobs without moving up the ranks. A university previously committed to gender equity placed the burden of job cuts on the backs of loyal and proven female employees.
These laid-off employees found little comfort in learning that they would be free to apply for one of 275 new HR or finance positions housed at an off-campus “shared services” center disconnected from the intellectually vital life of the campus.
The resulting plan reveals no awareness of how departments function on an everyday basis. Such “shared services” models start with the presumption that every staff member is interchangeable and every department’s needs are the same. They frame departments as “customers” of centralized services, perpetuating the illusion that the university can and should function like a market. This premise devalues the local knowledge and organic interactions that make our units thrive. Indeed, it dismisses any attribute that cannot be quantitatively measured or “benchmarked.” Faculty members who reject these models quickly become characterized as “change resisters”: backward, tradition-bound, and incapable of comprehending budgetary complexities.
The absence of consultation with regard to the plan is particularly galling given that academic departments previously have worked well with the administration to keep the university in the black. Faculty members are keenly aware of our institution’s fiscal challenges and accordingly have put in place cost-cutting and consolidating measures at the micro level for the greater good.
Worries about departmental discontentment with AST and shared services resulted in increasing secrecy around the planned layoffs. In an unprecedented move, department chairs and administrators were sworn to silence by “gag orders” prohibiting them from discussing the shared services plan even with each other. Perturbed, close to 20 department chairs wrote a joint letter to top university executives expressing their dismay. As one department chair said, "The staff don't know if they can trust the faculty, the faculty don't know if they trust the administration.”
Within a few days, at least five LS&A departments had written collective letters of protest, signed by hundreds of faculty members and graduate students. Over the past few weeks, that chorus of opposition has only intensified as faculty members from all corners of our campus have challenged AST. Some have called for a one- to two-year moratorium and others for an outright suspension of the program.
The outcry against the planned transition reflects the growing rift between departmental units and the central administration at the University of Michigan. Championed as an astute financial fix by a cadre hidden away in the upper-level bureaucracy, the shared-services model is the brainchild of Accenture, an outside consulting firm that the university has also contracted for a multimillion-dollar IT rationalization project.
Caught off-guard by the strong pushback, the administration has issued several messages admitting that its communication strategies around these changes were inadequate, stating that layoffs will be avoided for now, and assuring us that there will be greater consultation and transparency going forward.
While these definitely are hopeful signs, important questions about institutional priorities and accountability have arisen.
Initially, the university’s consultants claimed that AST would yield savings of $17 million. Over time that figure shrank to $5 million, and by some accounts it now is reputed to be as low as $2 million. Yet the university reportedly has already spent at least $3 million on this effort, with even more spending on the horizon.
Where are the cost savings? How much more will the university spend on Accenture and other outside consultants? How will replacing or shifting valued employees, even at lower numbers and salaries, from their departmental homes to what essentially is a glorified offsite “call center” actually enhance efficiency? How can a university ostensibly committed to gender equity justify making long-serving and superb female employees pay the price of AST? What credible proof is there that centralized management will provide any budgetary or administrative benefits to the specialized needs of individual departments?
The implications of these questions are thrown into starker relief when considering that almost to the day of the announced layoffs, the university launched its most ambitious capital campaign, “Victors for Michigan,” with festivities costing more than $750,000 and a goal of raising $4 billion.
Whether or not the collective protest initiated by a critical mass of faculty will result in change or reversal remains to be seen. Nevertheless, the past few weeks have been a wake-up call. Faculty must educate themselves about the basic fiscal operations of the institution in these changing times and reassert their leadership. Gardens, after all, require frequent tending.
Otherwise, we remain vulnerable to opportunistic management consultants seeking to use fiscal crisis as a source of profit. Public institutions that remain under the spell of misleading corporate promises will ultimately save little and lose a great deal.
Anthony Mora is associate professor of American culture and history at the University of Michigan. Alexandra Minna Stern is professor of American culture and history, and a professor of obstetrics and gynecology at the University of Michigan.
For some reason I have become aware that it is possible to take photographs of bass guitar players in mid-performance and, by digital means, to replace their instruments with dogs, so that it then appears the musicians (who very often wear facial expressions suggesting rapture or deep concentration) are tickling the dogs. Yes, yes it is.
I am not proud of this knowledge and did not seek it out, and would have forgotten about it almost immediately if not for something else occupying my attention in the past few days: a couple of new books treating the phenomenon with great and methodical seriousness. Not, of course, the dog-tickling bass player phenomenon as such, but rather, the kind of online artifact indicated by the titles of Karine Nahon and Jeff Hemsley’s Going Viral (Polity) and Limor Shifman’s Memes in Digital Culture (MIT Press).
The authors differentiate between the topics of the two volumes. Despite a common tendency to equate them, memes don’t always “go viral.” Things that do (say, video shot during a typhoon, uploaded while the disaster is still under way) are not always memes. The distinction will be clarified shortly -- and there is indeed some value in defining the contrast. It corresponds to different kinds of behavior or, if you prefer, different ways of mediating social and cultural life by means of our all-but-inescapable digital devices.
Still, the line can be drawn only so sharply. It seems bright and clear when the authors bring their different methods (one more quantitative than qualitative, and vice versa) to the job. I don’t mean that the difference between viral and memetic communication is simply a matter of perspective. It seems to exist in real life. But so does the tendency of the two to blur.
“Virality,” write Nahon and Hemsley in a definition unlikely to be improved upon, “is a social information flow process where many people simultaneously forward a specific information item, over a short period of time, within their social networks, and where the message spreads beyond their own (social) networks to different, often distant networks, resulting in a sharp acceleration in the number of people who are exposed to the message.” (Nahon is an associate professor, and Hemsley a Ph.D. candidate, at the Information School of the University of Washington.)
Here the term “information item” is used very broadly, to cover just about any packet of bytes: texts, photographs, video, sound files, etc. It also includes links taking you to such material. But unlike a computer virus -- an unwanted and often destructive packet -- a message that has “gone viral” doesn’t just forward itself. It propagates through numerous, dispersed, and repeated decisions to pay attention to something and then circulate it.
The process has a shape. Charting on a graph the number of times a message is forwarded over time, we find that the curve for a news item appearing at a site with a great deal of traffic (or a movie trailer advertised on a number of sites) shoots up at high speed, then falls just about as rapidly. The arc is rapid and smooth.
By contrast, the curve for an item going viral is a bit more drawn-out -- and a lot rougher. It may show little or no motion for a while before starting to trend upward (possibly with a plateau or downturn or two) until reaching a point at which the acceleration becomes extremely sharp, heading to a peak, whereupon the number of forwards begins to fall off, more or less rapidly -- with an occasional bounce upward, perhaps, but nothing so dramatic as before.
So the prominently featured news item or blockbuster ad campaign on YouTube shoots straight up, like a model rocket on a windless day, until the fuel (newsworthiness, dollars) runs out, whereupon it stops, then begins to accelerate in the opposite direction. But when something goes viral, more vectors are involved. It circulates within and between clusters of people -- individuals with strong connections to one another. It circulates through the networks, formal or informal, in which those clusters are embedded.
And from there, onward and outward – whether with a push (when somebody with a million Twitter followers takes notice), or a pull (it begins to rank among top search-engine results on a certain topic), or both. The authors itemize factors in play in decisions about whether or not to share something: salience, emotional response, congruence with the person’s values, etc. And their definition of virality as “a social information flow process” takes into account both the horizontal dimension of exchange (material circulating spontaneously among people familiar with one another) and the roles of filtering and broadcasting exercised by individuals and online venues with a lot of social capital.
None of which makes virality something that can be planned, however. “Content that we create can remain stubbornly obscure even when we apply our best efforts to promote it,” they write. “It can also grow and spread with an apparent life and momentum of its own, destroying some people’s lives and bringing fame and fortune to others, sometimes in a matter of days.”
An Internet meme, as Limor Shifman sums things up, is “(a) a group of digital items sharing common characteristics of content, form, and/or stance; (b) that were created with awareness of each other; and (c) were circulated, imitated, and/or transformed via the Internet by many users.”
As with virality, the concept rests on a biological metaphor. Coined by Richard Dawkins in 1976, “meme” began as a quasi-scientific effort to identify the gene-like elements that cause behaviors, cultural patterns, and belief systems to persist, expand, and reproduce themselves over very long periods of time. As reincarnated within cyberculture, the meme is a thing of slighter consequence: a matter of endless variation on extremely tenacious inside jokes, occupying and replicating within the brains of bored people in offices.
Shifman's point that memetic communication (which for the most part involves mimicry of existing digital artifacts with parodic intent and/or "remixing" them with new content) is an exemplary case of Web 2.0 culture seems to me sound, which probably also explains why much in the book may seem familiar even to someone not up on LOLcats studies. Yes, memes are a form of active participation in digital communication. Yes, they can carry content that (whether the meme goes viral or not) questions or challenges existing power structures. I have seen my share of Downfall parody videos, and am glad to know that Bruno Ganz is okay with the whole thing. But every so often that line from Thoreau comes to mind -- "as if we could kill time without injuring eternity" -- and it seems like a good idea to go off the grid for a while.