Author discusses her new book on how colleges can help at-risk students succeed

Author discusses her new book about promoting success of at-risk students.

Essay on George A. Romero, 'godfather of the zombie film'

George A. Romero, the artist who did more than anyone to enrich our nightmares over the past half century, died last week at the age of 77. Working outside Pittsburgh with local talent and a small budget, Romero directed Night of the Living Dead (1968), the horror film that introduced a new element into the genre’s visual and narrative vocabulary: the cannibalistic zombie horde.

The notion of reanimated corpses driven by insatiable hunger clearly owed something to earlier screen terrors. It was akin to Dracula, for one, albeit with a less restricted diet and no lascivious overtones, or even to Frankenstein’s monster, minus the emotional volatility. In the hands of a lesser filmmaker, the new creature would have been a profitable gimmick, at best. And it’s possible that Romero himself only grasped the potential of his own material through an afterthought of casting: Ben, the most cool-headed and intelligent of the characters trapped in a farmhouse under siege by the living dead, was played by Duane Jones, an African-American actor. Nothing in the script referred to race, and in Tony Williams’s collection George A. Romero: Interviews (University Press of Mississippi, 2011), the director mentions that he originally pictured Ben as a white truck driver.

But Jones proved the best actor to try out for the part. (Besides performing on stage and in film, he went on to become a professor of literature and a theater director.) The color-blind casting brought out more fully the implications of an apocalyptic scenario produced during years of massive upheaval. Romero’s characters form a social microcosm; they find refuge in a structure that just barely holds together under the impact of unexpected and unpredictable forces. Without giving too much away, I can say that a happy ending never feels likely. Night of the Living Dead reached the movie screen not quite six months to the day after the assassination of Martin Luther King Jr. -- and the sheriff’s posse pictured in stills at the end of the film represents what was being called, in the new catchphrase of the day, “law and order.”

In an interview with Film Comment upon the release of the sequel, Dawn of the Dead (1978), Romero indicated that he had originally conceived Night as set during the first of three stages in the zombie apocalypse -- the phase in which an “operative society” still exists, “even though there’s a lot of chaos and people don’t know how to handle it.” Dawn of the Dead shows stage two: the living and the undead are “an equal balance, with the outcome undecided.”

The balance was not just one of power. In a move that elevates Dawn from the status of sequel to a well-made film in its own right, Romero sets the action in a shopping mall and depicts humans and zombies as disconcertingly similar. (“Why do they come here?” asks one character, to which another replies, “Some kind of instinct. Memory of what they used to do. This was an important place in their lives.”)

Romero envisioned the final phase of his metanarrative arc many years before he was able to make Day of the Dead (1985). In the Film Comment interview, he sketched “a layered society where the humans are little dictators, down in bomb shelters, and they fight their wars using zombies as soldiers … feeding the zombies, controlling them, and keeping law and order.” If Night was almost inadvertently a commentary on the civil rights struggle, Dawn was much more deliberate in its treatment of consumerism, while Day’s vision of the military-industrial complex ran very much against the mood of the Reagan years.

Romero returned with Land of the Dead in 2005 and released two more zombie titles by 2009 -- films of interest mainly as evidence that the original trilogy both created a genre and set its standard. Romero’s zombies can be reduced to a formula, as countless imitators have shown, but not his wit. Nor, for that matter, his theological undertones, as discussed in this column some years ago.

Upon learning that Romero had passed, I considered trying to compile a brief bibliographical essay on the critical literature concerning zombie films -- plus, if it could be managed, a survey of whatever had been published on Martin (1978), Romero’s original and disquieting take on vampirism. (Surely one of his two or three best films, it tends to be overlooked given his status as “godfather of the zombie movie.”)

But my plan was preposterous. Sarah Juliet Lauro puts it best in her editor’s introduction to Zombie Theory: A Reader, due from the University of Minnesota Press in October: “I soon found that by the time one compiles a reading list and works one’s way, methodically, through it, a whole new crop have sprouted that need to have their heads kicked off. It’s rather like the way they are never not painting the Golden Gate Bridge, or so my grandmother tells me, for as soon as they finish at one end, they have to begin again at the other.”

The Sisyphean nature of the effort also yields diminishing returns. An awful lot of the commentary tends to resemble the zombies themselves, shuffling around in too-familiar circles and emitting similar noises. But Lauro includes a paper that is worth reading at whichever end of the bridge you find yourself: Steven Shaviro’s “Contagious Allegories: George Romero.” Originally a chapter of a book published in 1993, its treatment of the trilogy precisely registers the qualities of both the films themselves and the experience of watching them. He captures the simultaneously horrific and satirical nature of Romero’s zombies:

They are slower, weaker and stupider than living humans; their menace lies in numbers, and in the fact that they never give up. Their slow-motion voracity and continual hungry wailing sometimes appear clownish but at other times emerge as an obsessive leitmotif of suspended and ungratified desire … They continue to participate in human, social rituals and processes -- but only just enough to drain them of their power and meaning. For instance, they preserve the marks of social function and self-projection in the clothes they wear, which identify them as businessman, housewife, nun, Hare Krishna disciple and so on. But this becomes one of the films’ running jokes: despite such signs of difference, they all act in exactly the same way. The zombies are devoid of personality, yet they continue to allude to personal identity. They are driven by a sort of vestigial memory, but one that has become impersonal and indefinite, a vague solicitation to aimless movement.

It suggests both the director’s depth of vision and the critic’s knack for apt characterization that this passage is bound to apply to the viewer, too, in certain moments: stuck in ruts, going through the motions, a little too close to self-parody. Both laughter and fear are suitable responses to this condition, and Romero held up a mirror to it.

Image caption: Theatrical poster for “Night of the Living Dead” (1968)

Review of Mark Stein, 'Vice Capades: Sex, Drugs and Bowling From the Pilgrims to the Present'

Among the less dismal items to appear in my news feed in recent days have been a couple of articles from the International Business Times -- distinguished, in each case, by a combination of words I’d not seen in a headline before. In one, it was “sex robots,” in the other, “snortable chocolate.” After months of “fake news” and “presidential tweets,” a departure from the norm is welcome.

O brave new world, that has such commodities in it! Amazon confirms that snortable chocolate, also called “raw cacao snuff,” is available for purchase. The sex robots are not, as yet, though you can preorder Robot Sex: Social and Ethical Implications, edited by John Danaher and Neil McArthur, due from MIT Press in October. It is a matter of time until the robots themselves are listed, presumably under “appliances.”

Neither product turns up in Mark Stein’s Vice Capades: Sex, Drugs and Bowling From the Pilgrims to the Present (University of Nebraska Press). But they seem to fall into place, readily enough, among the suspect pleasures it catalogs.

A partial list would include alcohol, tobacco, cocaine, dancing, theatergoing, boxing, games of chance and the use of foul or blasphemous language. Also fornication, of course, whether actual or pornographically vicarious. (Sex robots blur that last distinction, while snortable chocolate, a stimulant, falls somewhere between snuff and the illegal sort of nose candy.)

So what do all these indulgences share? I’ve gone back through Vice Capades in search of the author’s explanation -- to no avail. It is a loose, rowdy sort of book -- packed with historical anecdotes but scholarly in neither intention nor method. Stein is a screenwriter and playwright, and one of his earlier books was the basis for American Panic: A History of What Scares Us and Why, a series on the History Channel.

Vice Capades reads like a staging area for another such program, which may explain why he can afford to leave the central term, “vice,” undefined except through a nod to Supreme Court Justice Potter Stewart’s memorable formulation about obscenity: “I know it when I see it.”

The audience is likely to concur -- which sets things up nicely for the ensuing ramble through history. For the problem, of course, is that vice is not solely in the eye of the beholder. The vice of one century, such as bowling or playing cards, can become the wholesome family recreation of another. There is something amusing, but also intriguing, about seeing examples from history of people working themselves up into a righteous frenzy of alarm and condemnation about, say, juggling, which was illegal in some of the early American colonies.

Something really counts as a vice, then, when it is denounced and (to the degree possible) prevented by people in a position to make their judgments stick. That power waxes and wanes over time as different groups make claims to hold authority. When they do, both the vices causing concern and the rationales for proscribing them can change sharply. In 17th-century Boston, juggling looked to clergy like the slippery slope to witchcraft, while Eisenhower-era psychiatrists were convinced that violent comic books would spawn a generation of homicidal maniacs. For all the difference between theological and medical vocabularies, a certain shared urge to secure and strengthen public order is obvious.

Stein’s examples and narratives are a good deal more interesting than his analysis. Protectors of the status quo who feel confident in their cause tend to reveal their own self-interest in blatant ways. Consider the remarks of a temperance crusader from 1917:

“Saloon keepers are frequently the most effective leaders of the new industrial immigrants. There is hardly a drinking place in a large foreign colony which does not have its political club. The brewers do everything possible to create a feeling of antagonism among the units of the new immigration against the ‘Puritanism’ of the Anglo-Saxons.”

Concern with the newcomers’ sobriety, however heartfelt, is pretty clearly mixed up with anxiety over the neighborhood bar as a political hub. And in the case of a few vices, the panic they generated in previous generations is almost unbelievable now. Old warnings about the violent behavior of marijuana addicts are difficult to credit for anyone who has seen a stoner attempt to locate a TV remote control.

Vices tend to be redefined, Stein suggests -- destigmatized, even if not fully legalized -- when the economic profits they generate outstrip their social costs (real or imagined) or the expense of enforcing their prohibition.

However widely that claim can be generalized, it puts the news I mentioned earlier into a certain perspective. Snortable chocolate is new and unregulated -- a “suspect product [with] no clear health value,” in the words of Senator Chuck Schumer. “I can't think of a single parent who thinks it is a good idea for their children to be snorting over-the-counter stimulants up their noses.” Its status as a potential vice is clear.

And likewise with the sex robot, that other affront to lingering Puritan sensibilities. The International Business Times reporter lists any number of worrisome potential side effects of a technology that “it's likely … [will] become more widespread,” such as “greatly reinforc[ing] a pornographic and objectified vision of women” and “potentially decreas[ing] people's understanding of consent, as there is no need to ask for the robot's permission to engage in a sexual act.”

You’d scarcely know that from how the article is framed, though. “How Sex Robots Could Help Heal Society,” the headline reads, with the subhead explaining: “They could treat sexual dysfunction and help people overcome trauma.” The profits on snortable chocolate may not be big enough to overcome suspicion. But erotic robotics sounds halfway free of the presumption of vice already, strange as that may seem.


Author discusses ideas in his new book, 'The Toxic University'

Author discusses his new book, which argues that politicians and “zombie leadership” in higher education are destroying academic values.

Mentioned in YouTube interview, dormant music theory book takes off

A brief mention in an interview by a British artist breathes life and sales into a niche music theory book.

Third survey of university press titles for fall 2017-winter 2018 (essay)

The last fall-books preview noted a cluster of forthcoming studies of what we might call surveillance society and its discontents. The expression “surveillance society” still has a slightly futuristic resonance -- with the sci-fi world of Minority Report (where the authorities not only keep an eye on everything in the present but anticipate individual behavior before it’s even carried out) as its imaginary culmination.

But it’s too late for cautionary tales now. The proactive digital panopticon is up and running, though still, presumably, in the beta version. Also catching my attention while looking over next season’s offerings from university presses were a number of titles zeroing in on social order and the forces assigned to handle its crises. One example is Writing the World of Policing: The Difference Ethnography Makes (University of Chicago Press, October), edited by Didier Fassin, which “brings together an international roster of scholars who have conducted fieldwork studies of law enforcement in disadvantaged urban neighborhoods on five continents.” (All material appearing in quotation marks in this week’s column is taken from publishers’ catalogs.)

While Fassin’s contributors focus on “one of the most problematic institutions in contemporary societies,” Bernardo Zacka takes a wider view in When the State Meets the Street: Public Service and Moral Agency (Harvard University Press, September). Analyzing “the complex moral lives of street-level bureaucrats: the front-line social and welfare workers, police officers and educators who represent government’s human face to ordinary citizens,” he finds them too often compelled to “settle for one of several reductive conceptions of their responsibilities, each by itself pathological in the face of a complex, messy reality.”

Sometimes contradictory or incoherent norms are what create the complex mess in the first place. That seems to be the perspective Ingrid Walker brings to High: Drugs, Desire and a Nation of Users (University of Washington Press, October), which challenges the endless and clearly unwinnable war on drugs. Likewise, Alexandra Cox’s Trapped in a Vice: The Consequences of Confinement for Young People (Rutgers University Press, November) examines the contradictory and self-defeating aspects of “a juvenile justice system that is aimed at promoting change in the lives of young people, yet ultimately relies upon tools and strategies that enmesh them in a system that they struggle to move beyond.”

With a steady stream of death-penalty convictions overturned by improved forensic methods, capital punishment has come to exemplify “how inept lawyering, overzealous prosecution, race discrimination, wrongful convictions and excessive punishments undermine the pursuit of justice,” as Brandon L. Garrett puts it in End of Its Rope: How Killing the Death Penalty Can Revive Criminal Justice (Harvard, September). In many places, capital punishment has been replaced with “what amounts to a virtual death sentence -- life without possibility of parole.” But arguably worse than that is the confinement Terry Allen Kupers documents in Solitary: The Inside Story of Supermax Isolation and How We Can Abolish It (University of California Press, September), based on “some of the thousand inmates he’s interviewed while investigating prison conditions over the last 40 years.” (I wrote about another book on this subject here last year.)

Pitched at the level of cultural critique more than of policy analysis, Jackie Wang’s Carceral Capitalism (MIT Press, October) collects the author’s essays on “contemporary incarceration techniques that have emerged since the 1990s.” The topics include “the biopolitics of juvenile delinquency, predatory policing, the political economy of fees and fines, cybernetic governance and algorithmic policing” -- in short, the surveillance society at this stage of its development.

Also forming their own constellation among the fall titles were a few books on mood and emotion, with a bias toward the painful. Giulia Sissa tries to rehabilitate Jealousy: A Forbidden Passion (Polity, December) by putting an emphasis on the noun in its subtitle. The feeling has somehow acquired a reputation as “a symptom of feeble self-esteem … a disease to be treated, a moral vice to be eradicated, an ugly, premodern, illiberal, proprietary emotion to be overcome.” Instead, a tour of ancient and modern thought unveils how jealousy, “far from being a ‘green-eyed’ fiend, reveals the intense and apprehensive nature of all erotic love, which is the desire to be desired.” It might be interesting to see why the author thinks these are mutually exclusive options, though the argument wouldn’t do Desdemona much good in any case.

Another kind of suffering interests Peter N. Stearns in Shame: A Brief History (University of Illinois Press, September). An enormous amount of interdisciplinary work has been done on shame, especially in the past two or three decades, and if there’s anything like a consensus it would probably be that shame and sociality are tightly linked in human experience. And the author concurs. “Groups establish identity and enforce social behaviors through shame and shaming,” we read in the book’s description. “Yet historians often neglect shame’s power to complicate individual, international, cultural and political relationships.” In trying to remedy that situation, Stearns may have written a timely book, though I wonder if one on shamelessness might not be even more so.

Today the word connotes poignancy more than suffering, but Thomas Dodman’s What Nostalgia Was: War, Empire and the Time of a Deadly Emotion (Chicago, December) reminds us that nostalgia once referred to a severe and potentially fatal kind of melancholy. A less dangerous and more sociable variety emerged under French imperialism: “Frenchmen [who] worried about excessive creolization came to view a moderate homesickness as salutary.”

Finally, if ever the description of a book made clear how culturally specific (and to outsiders, puzzling) feelings of nostalgia can be, it’s the one for Richard Power Sayeed’s 1997: The Future that Never Happened (Zed, October). It was, it seems, the best of times, “now remembered by many as a time of optimism and vibrancy, quickly lost … when it seemed like Britain was becoming a more tolerant, cosmopolitan, freer and more equitable country.” By contrast, Sayeed “cuts through the nostalgia” and interprets 1997 as “a missed opportunity, a turning point when there was a chance to genuinely transform British culture and society that was sadly lost.” Does that really count as “cut[ting] through nostalgia”? To me it just sounds like melancholy in a slightly different key.


Interview with Ceri Dingle, director of "Every Cook Can Govern: The Life, Impact and Works of C.L.R. James"

The word went around a few years ago that someone in England was working on a documentary about the West Indian historian, revolutionary political theorist and pan-African eminence C. L. R. James (1901-1989). Like the long-promised dramatic film based on The Black Jacobins, James's book on the Haitian revolution, this seemed to me an excellent idea -- and, like it, probably doomed.

For one thing, James is a very large subject. He wrote fiction and plays, lectured on Shakespeare and Hegel, and argued with Leon Trotsky and Kwame Nkrumah. He was a magnetic lecturer, able to hold an audience for hours while speaking without notes. The intelligence agencies of a number of countries kept busy monitoring him. It was a life of ideas as much as of activism (he would have protested that distinction, I suspect), but putting ideas on screen is a challenge seldom taken and even more rarely met.

It was also hard not to be dubious about the funding for such a venture. It certainly proved elusive whenever anyone tried to film The Black Jacobins, which James adapted for the stage in the 1930s. Paul Robeson, the star of that production, later tried in vain to get a film made. Rumors that one was in the works seem to have revived in the 1970s. For some reason, Hollywood never warmed to the idea of a slave revolt led by anyone but Kirk Douglas. At last report, the project’s best chances were for the film to be produced in Venezuela, with funding by Hugo Chávez. The idea may not have died with him, but progress on it seems unlikely any time soon.

Meanwhile, however, and against all odds, we have Every Cook Can Govern: The Life, Impact and Works of C. L. R. James, now available for rent or purchase through Vimeo. The title comes from Lenin’s The State and Revolution, via an essay celebrating ancient Athenian democracy that James wrote in 1956. Here is the trailer, in which I appear briefly, looking very much in need of a shave and haircut.

Released to much acclaim in London last year, Every Cook Can Govern is making the rounds of film festivals and has in recent months been screened in South Africa and the Caribbean. It was nominated for an award from the British Universities Film & Video Council earlier this year; while it didn’t win, the judges praised “the incredible range of content explored and its excellent use of archive material.” It includes rare footage of James speaking, as well as personal recollections by numerous friends and associates.

My participation was slight though sufficient to reveal how exhaustively the filmmakers prepared for their work. When Ceri Dingle, the director, first got in touch, I assumed she wanted a broad overview. My interest in James has focused mainly on his activity in the United States from 1938 until his expulsion in 1953, under McCarthyism -- but I anticipated making a few general remarks, not getting into the implications of one slice of his long and rich life. Suffice to say that my guess was wrong. The questions were numerous, specific and demanding. Peer-reviewed articles on James have been published on the basis of less research than Dingle and her colleagues put into preparing for that interview.

If the goal were to cover James’s whole long life in such detail, it was hard to see how the documentary could ever be released -- except perhaps as a miniseries. In its final form, Every Cook Can Govern comes in at about two hours. I’d kept my doubts about the project’s viability to myself while it was underway; even with it finished, the very idea still seemed daunting and left me curious how it came to fruition.

Turning the tables, I hoped to interview the director about Every Cook Can Govern around the time it was released last year. But she was racing to edit it down to manageable length almost up to the moment it premiered. For months afterward, she was swamped with fund-raising for WORLDwrite, the education charity in London that produced the documentary.

With Every Cook Can Govern being made available for streaming online, it seemed like a good time to try again. Now engaged in filming another documentary, Dingle made time to discuss the making of Every Cook Can Govern via email, quoted here with her permission.

By 2010, WORLDwrite’s “citizen television” project WORLDbytes had made a number of what Dingle calls “little videos about history makers,” including a documentary about British suffragist and revolutionary Sylvia Pankhurst released the following year.

“I asked our volunteers for ideas of further lesser known heroes and heroines we should cover,” Dingle says. “One lad suggested C. L. R. James, and having read and loved The Black Jacobins years ago, I thought it a brilliant idea. We agreed to reread the book and meet in a few weeks, expecting this to be a short six-week project.”

While making the Pankhurst documentary, Dingle and her coworkers “learned the hard way by picking up cameras too soon” in the process. Only after conducting a number of interviews did they discover that what they had filmed “didn’t represent her at all and [we] were just wrong, ghastly, in fact, willfully misrepresenting all the unacceptable stuff -- her setting up the first Communist party in the United Kingdom, for example.”

They had to start over. With C. L. R. James, due diligence meant a lot of reading -- far more than the documentary crew ever expected. They assembled and read 11 books by James plus “many papers and articles (we counted 834)” before turning to the secondary literature, which included another 34 books. “It was a massive learning curve for all our crew, as most of us were not academics,” Dingle says. “It became evident you could almost cover key moments in the history of the 20th century through the life and works of one man. That meant of course constantly reading up on context, too. It was a long, slow haul, but we felt worth it.”

Eventually they felt prepared to identify interview subjects for the documentary -- talking heads who could answer their questions. “Surprisingly,” Dingle says, “we also met a lot of self-proclaimed ‘experts’ who hadn’t even read as much of James as our volunteers -- which was quite shocking.” While admittedly tempted to compare lists, I instead pressed on to ask about how the effort was financed. Dingle mentions that some 200 volunteers contributed their labor in the course of six years, but there were expenses even so.

Fund-raising began “in dribs and drabs,” she says, through “secondhand book sales, cake sales, jumble sales, friends chipping in, appeals on Facebook, which does sound a bit naff for a production with professional aspirations.” A grant of 25,000 pounds (about $32,000) a year for three years from Britain’s Heritage Lottery Fund “was quite a breakthrough” for the project, though not enough even to cover rent for WORLDwrite’s Volunteer Centre, where young people are trained in filmmaking. “We ran about 15 camera courses and a lot of workshops,” says Dingle, since most of the crew had “never used a camera before, or presented or interviewed anyone. We had endless discussions on books and scripts and questions and themes, did endless copyright research, and wrote many begging letters for images, too.”

Begging worked only up to a point. “The biggest single cost was the archive footage,” Dingle says. “We felt it was worth [going into] debt to use the footage not publicly seen before.” Another possible menace to the budget resulted from the international scope of James’s life and work. “Raising the cash to take a crew to the USA or Trinidad -- although everyone wanted to go, of course -- would have made the film a 10-year project.”

But one of WORLDbytes’s former tutors, Ian Foster, is now a cinematographer working on this side of the Atlantic who volunteered to record interviews between assignments. (It was when Foster set up his professional-grade video equipment in my living room that I started really to regret not going to the barber.)

In the end, Dingle and her colleagues generated more than 350 hours of footage, which had to be whittled down to two very busy hours of documentary. They also gained access to scores of pages of British state surveillance records on James and have made them available to the public through the film’s website. The finished film is not exhaustive. It seems that an occasional viewer has felt the need to point this out.

“Everyone obsesses on the ‘gaps,’” Dingle says. “Maybe they want the box set … But we hope we’ve done enough in the film to raise an interest and help inspire people new to James. If so, it’s all been more than worthwhile, and we’ll have done James proud.” The mobilization of scores of people who’d never made a film before would surely have met with his approval. “Every cook can, of course, film, too,” the director says, “but it takes a hell of a lot of work to get there.”

Image caption: C. L. R. James

Second survey of university press titles for fall 2017-winter 2018 (essay)

My last dispatch, some weeks back, was a survey of various interesting books forthcoming from university presses (mostly) in the fall and winter months -- sorting them into topical or thematic clusters, with enough connective tissue to make it more than a list. Readers approved; some chimed in with titles they’d noticed. And a few long-suffering university-press staff seem to have appreciated the effort, incomplete and unsystematic as it was. I had to leave out scores of books under about a dozen headings -- and that’s without all the presses having issued their fall catalogs yet.

Many still haven’t. In any event, let’s continue with more of the next publishing season’s offerings. Quotations and publishing dates given here are taken from the presses’ descriptions of the books; the publishing dates may vary from what online book vendors indicate, but the presses’ own websites tend to be more reliable.

The conditions and prospects of the university itself are always up for discussion. Paul H. Mattingly takes a long view in American Academic Cultures: A History of Higher Education (University of Chicago Press, November) by treating “its current state [as] the product of different, varied generational cultures, each grounded in its own moment in time and driven by historically distinct values that generated specific problems and responses.” Former New York Times columnist Randall Stross uses his knowledge of Silicon Valley to make the case for A Practical Education: Why Liberal Arts Majors Make Great Employees (Redwood Press, distributed by Stanford University Press, September). Norm Friesen examines two enduring yet mutating pedagogical instruments in The Textbook and the Lecture: Education in the Age of New Media (Johns Hopkins University Press, December). Announcing itself as a departure from the monographic norm, Interacting With Print: Elements of Reading in the Era of Print Saturation (Chicago, October) is the work of the Multigraph Collective, “a team of 22 scholars at 16 universities in the U.S. and Canada,” who “have assembled an alphabetically arranged tour of key concepts for the study of print culture, from anthologies and binding to publicity and taste.”

The university is the institution where disability studies and disability policy necessarily come into closest proximity -- a convergence reflected in at least three new books. Two from the University of Michigan Press are Jay Timothy Dolmage’s Academic Ableism: Disability and Higher Education and the collection Negotiating Disability: Disclosure and Higher Education, edited by Stephanie L. Kerschbaum, Laura T. Eisenman and James M. Jones. Both volumes are due in December. Aimi Hamraie inspects the history and presuppositions of an important movement in architecture in Building Access: Universal Design and the Politics of Disability (University of Minnesota Press, November).

Next year marks the 50th anniversary of the biggest and most widespread wave of student protest in history, and the outpouring of memoirs and reflections will undoubtedly be both copious and international. Among the first will be A Time to Stir: Columbia ’68, edited by Paul Cronin (Columbia University Press, January). Roderick A. Ferguson’s We Demand: The University and Student Protests (University of California Press, August) maintains that cuts to humanities and interdisciplinary programs at public universities are not just “a reactionary move against the social advances since the ’60s and ’70s, but part of the larger threat of anti-intellectualism in the United States.”

Faced with reduced grants and increased tuition fees, student demonstrators in Britain made 2010 a memorable year -- without, however, changing the tide. Matt Myers collects the perspectives of a range of participants -- “activists, students, university workers and politicians” -- in Student Revolt: Voices of the Austerity Generation (Pluto Press, distributed by Chicago, October) to record “both the deep divisions of the movement and the intense energy generated by its players.” More recent (and still ongoing) disputes inform Erwin Chemerinsky and Howard Gillman’s Free Speech on Campus (Yale University Press, September). The authors are both “constitutional scholars who teach a course in free speech to undergraduates.”

Speech and action of whatever kind are now always subject to at least the possibility of recording and retrieval. Randolph Lewis’s Under Surveillance: Being Watched in Modern America (University of Texas Press, November) charts “the ethical, aesthetic and emotional undercurrents that course through a high-tech surveillance society.” About the figures named in its subtitle, Geoffroy de Lagasnerie’s The Art of Revolt: Snowden, Assange, Manning (Stanford, September) proclaims that “they have inaugurated a new form of political action and a new identity for the political subject.” Considerably less celebratory of Snowden et al., I would imagine, are George Perkovich and Ariel E. Levite, the editors of Understanding Cyber Conflict: 14 Analogies (Georgetown University Press, November), whose intended audience includes “policy makers, scholars and students” who need the lessons of “past military-technological problems” to get a handle on the more recent variety.

Nachman Ben-Yehuda and Amalya Oliver-Lumerman apply criminological concepts to what we might call white-lab-coat crime in Fraud and Misconduct in Research: Detection, Investigation and Organizational Response (Michigan, November). Culling “insights from diverse fields, including philosophy, computer science and biology,” Geoff Mulgan’s Big Mind: How Collective Intelligence Can Change Our World (Princeton University Press, November) reflects on the emergence of big-data, multicollaborator research. Combine Fraud and Misconduct with Big Mind, and you’d get most of the raw material for a cyberpunk novel.

As happened last time, this selection of topics and books ranges from the sober to the grim. Likewise, then, I’ll round the column off with a few volumes that might break the mood.

Having spent thousands of hours listening to Patti Smith’s music over the years, I’m pretty clearly one of the people destined to read the artist’s “detailed account of her own creative process, inspirations and unexpected connections,” which is how Yale University Press describes Smith’s Devotion (September). Smith, in turn, seems a likely reader of Nicholas Frankel’s Oscar Wilde: The Unrepentant Years (Harvard University Press, October), covering the period following its subject’s release from Reading Gaol. Wilde treated personal publicity as one of the fine arts. By contrast, Pamela Bannos’s illustrated biography Vivian Maier: A Photographer’s Life and Afterlife (Chicago, October) is the portrait of an artist “extremely conscientious about how her work was developed, printed and cropped, even though she also made a clear choice never to display it.” It took decades for scholars to challenge the posthumous myths and editorial decisions surrounding Emily Dickinson. Things move faster now. Barely 10 years since Maier’s work was discovered in an abandoned storage space in Chicago, we already have a revisionist account of “how the photographs have been misconstrued or misidentified.” It’s a book to look forward to, certainly.


Author discusses his new history of American higher education

Author explains his history of American higher education, told through specific periods in the development of various sectors.

What Orwell says to us about America today (essay)

It’s not every day that an almost 70-year-old book catapults up the best-seller charts. George Orwell’s 1984 has topped various Amazon best-seller lists several times since mid-January, on the heels of the U.S. presidential inauguration. It’s also been featured at brick-and-mortar stores. For instance, in my neighborhood in Pittsburgh, the owner of an independent store has an Orwell display in his window and reports selling stacks of copies, as if it were a new Harry Potter.

Who knew that Donald Trump would be good for the book trade?

Assigned in most American high schools, 1984 has sold continuously since its publication in 1949, but now, at a time when one of the president’s press officers declares that there are “alternative facts,” it has struck a renewed chord. It seems as if we have gone through the looking glass and entered a world where, in the words of 1984, “War is peace” and history is rewritten each day.

Still, the analogy can be a bit too easy. How does 1984 fit our world, and how not?

No doubt 1984 captures some sense of living in the modern era, with extensive government, military, technology and media. But in Orwell’s imagined Oceania, the state is monolithic, overseeing all activity with total control. It provides all goods and supervises all work. It sees what you do, tells you what to do, monitors what you think and punishes any variance.

A chilling vision, but that misses perhaps the most distinctive sense of our contemporary world: consumer capitalism provided by a phalanx of corporate sponsors. Conservatives might complain that government extends too far, but if one looks around one’s home, one can immediately see the reach of Apple, Google, Starbucks, Verizon, Amazon, General Foods, Exxon, Citibank and on and on. There are no corporations in Orwell’s world, and very few goods. There is only state-distributed watery coffee and foul-tasting gin -- a far cry from the soy-foam, half-decaf macchiato and the artisanal cocktail.

Orwell’s state exists for the sake of its own power, grinding down its citizens in a kind of sadomasochistic relationship in order to perpetuate itself. In our society, it is easy to denigrate government because it provides a single symbol for the control we experience, but our government is more like a referee that keeps the market and its juggernaut of enterprises functioning.

Thus, a more apt vision for our day might foreground those businesses, extending across national borders and delivering pleasure, entertainment and ever newer goods. Aldous Huxley’s 1932 Brave New World captures that better, with mood-improving drugs and sex at the touch of a screen. (In a small-world coincidence, Huxley was one of Orwell’s schoolteachers.) Or William Gibson’s 1984 Neuromancer -- a book that, though a bit clunky in its sci-fi narrative, seems spot-on in depicting an internet that permeates our lives, as well as the companies that control it and deliver our products.

Orwell wrote in a time when totalitarian governments controlled a good part of Europe, notably Germany, Italy and Spain. And even in Great Britain and the United States, society had united in a concerted war effort. It was a time of total government, so in many ways 1984 reflects that moment. By contrast, we now seem to live in a time of the total market, when major political figures aim to use business as a model for government.

Perhaps the chief thing that Orwell divined, before the advent of television, is media running through our lives. If you’ll recall from 1984, video screens are in every room at home and at work, and they are on all the time. They wake you up, tell you when to exercise and give you news about the state.

Still, there is only one channel, and it is entirely a state apparatus. In our time, so my Xfinity bill keeps telling me, we have hundreds of channels to choose from. My TV is not controlled by Big Brother; it’s spurred by the cornucopia of advertisers and products.

Orwell’s view of media followed World War II, a time of active propaganda, and Orwell knew the workings of propaganda firsthand. He worked in the Eastern Service of the BBC during the war, relaying British news to India. But more so than propaganda, we live in a time of ads -- accumulating thousands of hours of them by the time one is 10 years old.

One of the creepier details in 1984 is that the screens can also watch the inhabitants. The social theorist Michel Foucault held that a central feature of modern society is the soft control of surveillance. It informs our sensibility, disciplining us without overt force and compelling us to adhere to normative behavior. Now, with the National Security Agency perusing our phones (hi!), Google combing through our searches, and our high-tech TVs able to watch us, Orwell was all too prescient.

Still, the surveillance predominantly aims to capture us for a market. If you are reading this on a screen, then you are probably ignoring the ads in a sidebar. How did they know that you are a single 40-something? Or a woman who wants running shoes? Or a man who might wear Brooks Brothers?

In imagining a society of political lockstep, Orwell’s satiric target is usually assumed to be communism. Indeed, Orwell is a hero of the right for being an anti-Communist, as well as of the liberal left. That is why 1984 became an iconic book in the 1950s and ’60s, offering a confirmation of the ills of the Soviets.

However, it is a mistake to see it as a confirmation of the politics of the United States. From the mid-1930s onward, Orwell was an avowed anti-capitalist and anti-imperialist. If you read Animal Farm (1945), his literary effort immediately preceding 1984, in junior high, you will recall that the story parodies the U.S.S.R. under Stalin, as the main pig, Comrade Napoleon, takes control, rewrites history and finally declares that some pigs “are more equal than others.”

But remember that the farmers expelled at the beginning of the book were capitalists who had grossly exploited and abused the animals. They are not the good guys, and the revolution is justified. The problem with the Communists is not Communism; it is that they become corrupted. During a brief moment after the takeover of the farm, things are good, led by a Lenin figure, with a fairer distribution of work and more plentiful food than under the capitalists.

Rather than Communism per se, Orwell’s general target is what he saw as the rise of “managerial society.” That is a term that James Burnham, a prominent social commentator in midcentury, promoted -- seeing it as a sign of progress toward a more rational society. (In some ways, he was the Thomas Friedman of his day.) Although he declared himself a socialist after 1937, Orwell was not a party man and bristled at bureaucracy.

Orwell reviewed several of Burnham’s books and blanched at Burnham’s vision. While attuned to the politics of his time, Orwell retained nostalgia for the bucolic pleasures of the countryside, of the fields, fishing ponds and village pubs before the mechanistic effects of modern society. In 1984, one of the few pleasant moments is when the protagonist and his lover take a day trip outside London.

My bet is that Orwell would detest our day of big box stores and truly mass media. At one time he set up a small shop in a village north of London. It turned out that he was a much better writer than shopkeeper -- he shut it down after a fairly short period -- but on one of his travel visas, he identified himself as a grocer.

One aspect of 1984 that is rarely commented on is its appreciation for work. In his essay “Why I Write,” Orwell declared that he focused on politics from the late 1930s on, but he might be at his most instructive when describing work.

The grind of work is usually glossed over in fiction or film. If a protagonist has a job, their tasks are in the background or summarized in a quick scene. Given that work takes up nearly half of most people’s waking hours, a truly realistic narrative might be expected to describe more of it; instead, narratives usually focus on a protagonist’s relationships, out-of-the-ordinary events or personal turmoil.

Unlike the majority of writers of his generation, such as the poets Stephen Spender or W. H. Auden, who traveled a fairly direct path from Cambridge or Oxford to London and higher cultural circles, Orwell had held a number of hardscrabble jobs as a British imperial police officer, dishwasher, schoolteacher and bookstore clerk. All of them found their way into his writing, particularly his early novels.

In 1984, the protagonist Winston works in a cubicle, handling memos and other paperwork in the Ministry of Information. However bleak his circumstances otherwise, he finds some satisfaction in doing his daily tasks. Animal Farm also spends a good bit of time recounting the speedup of work on the farm after the Stalin stand-in takes over, with the most honorable character, a horse, finally dying of overwork. Work is a good thing; the problem is not a day of work but overwork, or the exploitation of work.

One of the more poignant facts of Orwell’s life is that, after working relentlessly through the 1930s and early ’40s with little money and poor health, he gained financial comfort only in the late 1940s, after the publication of Animal Farm. It was his fifth novel and 10th book in a dozen years, and for the first time in his career, he had the luxury of writing without taking on other jobs. It afforded him time to draft 1984, but he was ill, troubled with the lung problems that would soon take him.

He had also lost his first wife, Eileen O’Shaughnessy, with whom he had gone to fight in Spain and who helped run the grocery, to a presumably safe surgery gone wrong. (The anesthesia caused heart failure.) One could see 1984 as a response to his personal despair as well as the state of the world, after a decade of full-blown fascism and massive destruction, followed by the rubble and squalor of the immediate postwar years.

Our time has a much different character, one of overflowing plenty, ubiquitous images on screen and shopping 24-7. Rather than the gray, pinched air of 1984, we live in an era of cultural ADD, and rather than suppression, we have the rampant personal expression of Facebook, Twitter and Snapchat. In this moment, President Trump is a much more fitting figure than Big Brother, more a distinctly American promoter like P. T. Barnum than a Grand Inquisitor. Big Brother, after all, stays focused and runs things with an implacable force, whereas Barnum promises to give people what they want, even if appealing to their less cerebral instincts. It’s gonna be amazing.

Jeffrey J. Williams’s most recent book is How to Be an Intellectual: Essays on Criticism, Culture and the University. He is a professor of English at Carnegie Mellon University and co-editor of the Norton Anthology of Theory and Criticism (third edition, 2018).


