Among the passengers disembarking from a ship that reached Philadelphia in the final days of December 1941 was one Mark Zborowski -- a Ukrainian-born intellectual who grew up in Poland. He had lived in Paris for most of the previous decade, studying at the Sorbonne. He was detained by the authorities for a while (the U.S. had declared war on the Axis powers just three weeks earlier, so his visa must have been triple-checked) and then released.
Zborowski's fluency in several languages was a definite asset. By 1944 he was working for the U.S. Army on a Russian-English dictionary; after that he joined the staff of the Institute for Jewish Research in New York, serving as a librarian. And from there the émigré’s career took off on an impressive if not meteoric course.
He joined the Research in Contemporary Culture Project at Columbia University, launched just after World War II by the prominent anthropologists Ruth Benedict and Margaret Mead with support from the Office of Naval Research. Zborowski oversaw an ethnographic study of Central and Eastern European Jewish culture, based on interviews with refugees. It yielded Life Is With People: The Culture of the Shtetl, a book he co-authored in 1952. Drawing on Zborowski’s childhood memories more than he acknowledged and written in a popularizing style, it sold well and remained in print for decades.
The volume’s reputation has taken some hits over the years -- one scholar dubs it “the book that Jewish historians of the region loathe more than any other” -- but Zborowski enjoyed the unusual distinction of influencing a Broadway musical: the song “If I Were a Rich Man” in Fiddler on the Roof was inspired, in part, by a passage in Life Is With People. He later turned to research on cultural differences in how pain is experienced and expressed, culminating in his book People in Pain (1969). Once again his published work got mixed reviews in the professional journals, while the author himself enjoyed a kind of influence that citation statistics do not measure: a generation of medical anthropologists studied with him at the Pain Institute of Mt. Zion Hospital in San Francisco. He died in 1990.
If the details just given represented an honest account of Mark Zborowski’s life, he would now be remembered by scarcely anyone except specialists working in his fields of interest. The narrative above is all factually correct, to the best of my knowledge. But it omits an abundance of secrets. Some were revealed during his lifetime, but even they come wrapped in the mystery of his motives.
The fullest account now available is “Mark ‘Etienne’ Zborowski: Portrait of Deception” by Susan Weissman, a two-part study appearing in the journal Critique. Weissman, a professor of politics at Saint Mary’s College in Moraga, Calif., published the first half in 2011 and expected the second to follow shortly, though in fact it will appear in print only later this year. (Both can be downloaded in PDF from her page at Academia.edu.)
Etienne was the name Zborowski used while infiltrating anti-Stalinist radical circles in France for the GPU and the NKVD (forerunners of the KGB) during the 1930s, and he continued surveillance on opponents of the Soviet Union during his first few years in the United States.
“He is remembered by his students and colleagues as warm, generous and erudite,” writes Weissman. “Personally he neither stole documents nor directly assassinated people, but he informed Stalin’s teams of thugs where to find the documents or the people they sought. Zborowski infiltrated small leftist circles, made friends with its cadres and then reported on them. He always ratted on his ‘supposed’ friends. He saw [one woman] daily for nearly five years, and she helped him in countless ways. What did he give her in return? Only her survival, something not afforded to other Zborowski ‘friends.’ Once his orders switched and he no longer needed to report on her activities (or that of her husband), Zborowski simply stopped calling this constant friend, who defended him, gave him money and helped him with that precious commodity denied to so many, the visa to the United States.”
Weissman chronicles Etienne’s destructive role among the anti-Stalinist revolutionaries in Europe while also showing that his precise degree of culpability in some operations remains difficult to assess. Important missions were sometimes “nearly sabotaged by conflicting aims and lack of coordination between Soviet espionage teams.” And spy craft is not immune to a kind of office politics: reports to “the center” (intelligence headquarters) were not always accurate so much as aspirational or prudent.
Overviews of Zborowski’s covert life have been available for some time -- among them, his own testimony to a Senate subcommittee on internal security, which was not especially candid. Weissman’s study draws on earlier treatments but handles them critically, and in the light of a wider range of sources than have previously been brought to bear on his case.
Besides material from Stalin-era archives (consulted when she was in Russia during the 1990s) and the decoded Venona intercepts of Soviet cable communications from the 1940s, Weissman obtained court transcripts from Zborowski’s trials for perjuring himself before Congress. (He received a retrial after appealing his first conviction, but lost and served four years in prison.)
She also used the Freedom of Information Act to request the pertinent files from the Federal Bureau of Investigation. There were surveillance reports, of course, and interviews conducted by FBI agents -- with some pages all but entirely blacked out -- but also a piece of evidence about Zborowski that has been hiding in plain sight for 50 years.
The Feb. 28, 1965, issue of the Sunday magazine of The New York Times contained an article called “The Prison ‘Culture’ -- From the Inside.” The author identified himself as an anthropologist (“and as far as I know the first member of my profession to study a prison culture from the inside”) and used the pseudonym “M. Arc.” These seem like pretty clear hints to his identity, but no one appears to have made the connection until Weissman opened the dossier.
“The article is a scholarly, well-written account of life inside,” she notes, “with a critical look at the criminal justice system … and [it] has been widely cited and reprinted in prison sociology texts.”
Part of his hidden curriculum vitae, then. “True to form,” Weissman writes, “Zborowski put the focus entirely on the subject at hand, revealing virtually nothing of himself.”
And that really is the mystery within the mystery here. It’s difficult to square Professor Zborowski (amiable, conscientious, a little bland, perhaps) with the sinister career of Etienne, a man who made himself the closest friend of Trotsky’s son Leon Sedov and quite possibly set him up for murder. (Afterward he tried to wrangle an invitation to the Russian revolutionary’s compound in Mexico, but another assassin got there first.)
In a conversation with Weissman by phone, I mentioned being both fascinated by her research (mention Trotsky in something and I’ll probably read it) and left puzzled by the figure she portrayed. And puzzled in a troubling way, with no sense of his intentions -- of how he had understood his own actions, whether while carrying them out or across the long years he had to reflect on them.
“While in prison,” she told me, “he kept insisting to the FBI that he was a good citizen. He never expressed remorse. There’s nothing in his papers about his politics, nothing about his own beliefs.” The reader perplexed by Weissman's “portrait of deception” is in the same position as the scholar who investigated him: “He’s a puzzle I couldn’t solve.”
Submitted by Anonymous on August 17, 2015 - 3:00am
I teach at a member institution of the Council of Christian Colleges and Universities. I also happen to be gay. A friend’s early morning text alerted me to announcements from Eastern Mennonite University and Goshen College, both CCCU and Mennonite colleges, that they will add sexual orientation and gender identity to their nondiscrimination hiring statements.
EMU’s nondiscrimination policy will now state: “Eastern Mennonite University does not discriminate on the basis of race, color, national or ethnic origin, sex, disability, age, sexual orientation, gender identity, or any legally protected status. As a religious institution, EMU expressly reserves its rights, its understandings of and its commitments to the historic Anabaptist identity and the teachings of Mennonite Church USA, and reserves the legal right to hire and employ individuals who support the values of the university.” The announcement adds that faculty members who are married to same-sex spouses will be hired. A similar announcement was issued by Goshen College.
The announcements surprised me. I had been aware of the vote at the recent Mennonite Church USA conference not to sanction same-sex marriages, so I had anticipated that Mennonite schools would keep the status quo. I was stunned to read about the changes.
Two days before the Supreme Court announced its decision, a group gathered at a Washington restaurant for dinner. Some of us at the dinner currently teach or have taught at CCCU institutions, and one was an administrator. Gay alumni of religiously affiliated institutions also attended. We are members of different Christian denominations, and some of us were active in the evangelical organizations Young Life and InterVarsity Christian Fellowship in former lives. Some have migrated out of conservative Christian churches into faith communities that welcome and affirm LGBTQ persons. We all have different stories but were united that evening in our hope for a good outcome from the court, and, in fact, toasted the court. Imagine ….
The contrast among CCCU institutions regarding human sexuality issues comes at a time when some Christian institutions are mounting a rearguard action regarding the teaching of evolution. At Northwest Nazarene University a professor lost his job because he affirmed that the Christian faith and evolution are compatible. Bryan College “‘clarified’ its statement of faith in ways many faculty members said made the historicity of Adam and Eve so narrow that they could no longer agree with it.” At Bethel College (Ind.) a statement was adopted that states that “Adam was created by an immediate act of God and not by process of evolution.” Faculty may teach other viewpoints, but “are not to advocate for, nor hold leadership positions” in professional organizations that have a different view.
Conservative Christian higher education views on evolution and human sexuality are not unrelated; they are of a piece because these views turn on a literal hermeneutic to interpret the Bible. Christian ethicist David Gushee, in his book Changing Our Mind, has pointed out that fashioning a Christian position on same-sex relations is a “faith/science integration issue.” New evidence emerged about the earth’s origins; new evidence is now emerging about human sexuality that must be taken into consideration alongside biblical texts.
Christian higher education has accepted Copernicus and Galileo; however, Darwin remains iffy. Fortunately, institutions don’t burn people at the stake anymore, but they do fire them if they do not interpret Genesis 1 and 2 in a literal way. It is perplexing that some Christian colleges that implicitly accept evolution in their STEM programs deploy a different hermeneutic when it comes to interpreting the Bible regarding sexual ethics. Whereas Genesis 1 and 2 are interpreted as a metaphorical account of how the world came into being, these same biblical texts are interpreted literally regarding human sexuality. As Gushee suggests, the creation accounts should not be taken as “scientific self-descriptions.”
Old Testament scholar Peter Enns, in his book The Evolution of Adam: What the Bible Does and Doesn’t Say About Human Origins, writes, “The most faithful, Christian reading of sacred Scripture is one that recognizes Scripture as a product of the times in which it was written and/or the events that took place -- not merely so, but unalterably so …. Unless one simply rejects scientific evidence (as some continue to do), adjustments to the biblical story are always necessary. The only question is what sorts of adjustments best account for the data.” It is not, as some insist, a matter of biblical authority; it is a matter of the interpretive principle one uses -- a literal/historical one or a metaphorical/symbolic one.
The literal interpretation of Scripture and the lack of attention to new evidence about human sexuality have led some Christian universities and colleges to tie themselves into knots when it comes to forming policies on LGBTQ issues. The casuistry is stunning. Take, for example, Hope College, not a CCCU institution, but a college affiliated with the Reformed Church in America (RCA). Shortly after the Supreme Court decision, Hope announced that it would extend benefits to same-sex couples. Many, including myself, rejoiced; however, Hope soon clarified (or made things murkier, depending on one’s point of view) -- no same-sex couple can be married in the Hope chapel because the RCA position is that marriage is to be between a man and a woman (Genesis again). Also, a 2011 Hope statement both affirms that RCA position and states that there will not be a student club that “promote[s] homosexuality.” It is not clear that Hope would hire an openly gay, married person. If that is the case, then benefits will never have to be offered.
One’s eyes begin to cross when trying to make sense of the situations at Baylor and Pepperdine, both affiliates of the CCCU. At Baylor, the phrase “homosexual acts” has been taken out of a student sexual misconduct statement, and the new policy states that “physical sexual intimacy is to be expressed in the context of marital fidelity,” but to know what “marital fidelity” means, one is referred to a 1963 Baptist position paper that defines marriage as between a man and a woman.
Pepperdine’s law and business schools have officially recognized LGBTQ student groups, which are limited to discussion of LGBTQ issues, networking and professional opportunities. But in 2011, Pepperdine denied official recognition for a LGBTQ undergraduate group that was perceived as an “identity group” rather than a professional networking group. The former did not fit with the Christian mission; professional networking does. I leave it to the reader to decipher the reasons why.
To navigate the tortured terrain of LGBTQ policies at Christian colleges, one must know the difference between sanctioned and unsanctioned student clubs, the difference between support and advocacy (when does a support group for LGBTQ students morph into an unacceptable advocacy group?), and whether a student handbook rule is referenced in a faculty handbook, therefore making the student rule applicable to faculty. What is crystal clear is that some CCCU institutions accept the tuition dollars of LGBTQ students, tell them that they are loved, provide small groups and support groups for them, train RAs to be more sensitive to LGBTQ issues, but will not hire them should they want to work at their alma mater. On commencement day, LGBTQ students are celebrated; the day after they will not be hired because they are openly gay and/or want to have a life partner. No longer at Eastern Mennonite and Goshen.
Christian colleges face a foreboding future. The most obvious challenge is one shared by any private institution -- namely, cost. Gordon College, a CCCU member institution, for example, is facing a $3.8 million budget deficit due to low enrollment. But if Christian higher education is perceived as dyspeptic and anachronistic, then younger millennials, fewer of whom are identifying as religious, will go elsewhere. If conservative boards of trustees, parents, donors and presidents are more concerned about the “brand,” “the optics,” then perhaps lines in the sand will be drawn and some Christian colleges will survive only because they become fortresses against the world.
At that point they will cease to be institutions of free inquiry, no longer universities. The changes at Eastern Mennonite and Goshen give me hope that more Christian colleges will be courageous, grapple with new evidence, hold on to a hermeneutic that is life giving and not life denying, and be prophetic in positions they take. I was moved to tears when I read that the student government at one Christian college passed a resolution asking that sexual orientation be included in the university’s nondiscrimination hiring policy.
Nancy Heisey, professor of biblical studies at Eastern Mennonite, stated of her university's willingness to hire gay and lesbian people in same-sex marriages, “We have a strong commitment to Christian principles, including that justice is central to the Scripture's teaching.” I am reminded of soul great Sam Cooke’s song “A Change Is Gonna Come.” May other CCCU institutions recognize that to be Christ centered is to be justice centered and decide to be more inclusive and change, as Eastern Mennonite and Goshen have.
The author asked to be anonymous to avoid endangering employment at the college where the author teaches.
Franz Kafka left explicit directions concerning the journals, letters and manuscripts that would be found following his death: they were to be burned -- all of them -- unread. Whether he expected Max Brod, the executor of his estate, to follow through with his instructions is a matter of some debate. In any case, Brod refused, and the first volume of Kafka’s posthumous works came out in 1925, the year after the author’s death.
The disregard for his wishes can be explained, if not justified, on a couple of grounds. For one thing, Kafka was a lawyer, and he must have known that expressing his intentions in a couple of notes wouldn’t be binding -- it takes a will to set forth a mandate in ironclad terms. And, too, Brod was both Kafka’s closest friend and the one person who recognized him as a writer of importance, even of genius. Expecting Brod not to preserve the manuscripts -- much less to leave them unread! -- hardly seems realistic.
On the other hand, Kafka himself destroyed most of his own manuscripts and did so in the same way he told Brod to do it, by setting them on fire. It is reasonable to suppose he meant what he said. If so, world literature has been enriched by an act of blatant disloyalty.
“Don’t pull the Max Brod trick on me,” Michel Foucault is said to have admonished friends. The philosopher and historian did Kafka one better by including a blunt, categorical line in his will: “No posthumous publications.”
Be that as it may, in late spring the University of Minnesota Press issued Language, Madness, and Desire: On Literature, a volume of short texts by Foucault originally published in France two years ago and translated by Robert Bononno. The same press and translator also turned the surviving pages of an autobiographical interview from 1968 into a little book with big margins called Speech Begins After Death. The title is kind of meta, since Foucault, like Kafka, seems to be having an unusually wordy afterlife.
Foucault died in June 1984, the very month that the second and third volumes of The History of Sexuality appeared. He left a fourth volume in manuscript, but given the circumstances, it was destined only for the archives. And so things stood for about a decade. There was the occasional lecture or transcript of an interview he had given permission to publish, accompanied by claims that it was the “final” or “last” Foucault. After a while this started to get kind of silly, and it only made the thinker’s absence more palpable. Daniel Defert, the administrator of his estate, had also been Foucault’s lover for many years, and he seems to have taken the ban on posthumous works to heart in a way that Max Brod never did.
But by 1994, Defert relented enough to allow a four-volume collection of Foucault’s essays and interviews to be published in France. (A few years later, the New Press brought out an abridged translation as the three-volume Essential Works of Michel Foucault.) By the 20th anniversary of the thinker’s death in 2004, the situation had changed dramatically. Six of Foucault’s 13 courses of lectures at the Collège de France had been published and the rest were on the way. In September, Palgrave Macmillan is bringing out On the Punitive Society, at which point the whole series will be available in English. That adds another shelf’s worth of stout, dense and rich volumes to the corpus of Foucault’s work -- overlapping in various ways with the books he published (e.g., the Punitive Society lectures were given as he was working on Discipline and Punish) but developing his ideas along different trajectories and in front of an audience, sometimes in response to its questions.
In a paper published last year, John Forrester, a professor of history and philosophy at the University of Cambridge, expresses a mingled appreciation and dismay at how what he calls Foucault’s “pithy and ultra-clear command, ‘Pas de publication posthume,’” has been breached in the case of the Collège de France courses. The paper appears in Foucault Now: Current Perspectives in Foucault (Polity).
“Because these were public lectures,” writes Forrester, “they had already been placed in the public domain ‘dans son vivant,’ as the French language says, in his lifetime. Their transcription and editing therefore is not the production of posthumous texts, but the translation from one already published medium -- for instance, the tape recorder -- to another, the book.” While grateful that Brod and Defert “found a way to publish what Kafka and Foucault forbade them to publish,” he says, “that doesn’t mean to say I think they were right. They did right by me and many, very many, others. But I can’t see how they obeyed the legal injunction placed on them.”
Language, Madness, and Desire consists of six items it was not difficult to squeeze through that dans son vivant loophole, since they were delivered to audiences as radio broadcasts or lectures between 1963 and 1970. Speech Begins After Death is another matter entirely. It consists of the opening exchanges from a series of interviews Foucault gave to Claude Bonnefoy, a literary critic, in 1968. The plan had been to produce a book. It never came together for some reason (1968 was a big year for getting distracted), none of it was published and most of the transcript has been lost.
In short, there’s no real wiggle room for rationalizing Speech Begins After Death as permissible under the terms of Foucault’s will. And this is where things get interesting. To be blunt about it, Language, Madness, and Desire is not going to come as much of a revelation to anyone who has read, say, the literary essays in Language, Counter-Memory, Practice (the Cornell University Press anthology of Foucault’s work from the 1960s and early 1970s that’s still one of the best things out there). It would not be surprising if it turns out there are dozens of other such pieces which could slip past Foucault’s ban without adding much to the body of work he saw through the press.
By contrast, Speech Begins After Death is (1) a clear violation of the author’s wishes and (2) a pretty good example of why violating them might be a good idea. In later years Foucault was used to giving interviews, but in 1968 he was uncomfortable with the whole process. Being treated as an author or a literary figure (rather than an academic) only made him more nervous. As sometimes happens, the performance anxiety, once he got it under control, inspired him to think out loud in a way that seemed to surprise him.
One passage almost jumps off the page:
“As long as we haven’t started writing, it seems to be the most gratuitous, the most improbable thing, almost the most impossible, and one to which, in any case, we’ll never feel bound. Then, at some point -- is it the first page, the thousandth, the middle of the first book, or later? I have no idea -- we realize that we’re absolutely obligated to write. This obligation is revealed to you, indicated in various ways. For example, by the fact that we experience so much anxiety, so much tension if we haven’t finished that little page of writing, as we do each day. By writing that page, you give yourself, you give to your existence, a form of absolution. That absolution is essential for the day’s happiness.”
Like Kafka's demand for a book that “must be the ax for the frozen sea within us,” these lines are worth whatever guilt was incurred by whoever rescued them for us.
OneLogin’s recent recruitment campaign showing diverse engineers on billboards in the San Francisco Bay Area inspired a viral hashtag: #ILookLikeAnEngineer.
Frustrated by the microaggressions we experience as “nontraditional” faculty, we started a new hashtag: #ILookLikeAProfessor. The flurry of photos, retweets and horror stories since last Thursday suggests that we are not alone in experiencing entrenched stereotypes and bias -- both subtle and explicit.
The female professor mistaken for an undergraduate. She was grading homework, not doing it.
Male teaching assistants assumed to be the professor.
Faculty members of color assumed to be the custodian.
Asian professors assumed to be Chinese food delivery drivers.
We are not making this up.
These are real posts from real people -- real professors in diverse fields across the United States -- who do not fit the stereotype of a 60-something, white male professor, usually in tie and tweed. Extra credit if glasses and a beard came to mind.
With the start of the new academic year just around the corner, it’s worth remembering how much the professoriate has changed over the past half century. The civil rights movement, feminism, gay rights, the Americans With Disabilities Act and more transformed many aspects of society, including the academy. It’s time for our assumptions about faculty to catch up with reality.
So, who are we?
We are economists and art historians, musicians and engineers, chemists and sociologists, poets and mathematicians.
We are black, brown and white -- and every shade in between.
We come in all shapes, sizes and proportions.
We are feminine, masculine and androgynous -- and sometimes we look different from one day to the next.
We are queer, straight and questioning.
We speak many languages, and some of us have accents.
We have voices high and low, loud and soft.
We wear suits and jeans, hiking boots and high heels.
We have dreads and dyed hair -- and yes, some of us do have beards.
We wear glasses and contacts, ties and scarves, kipot and hijabs.
We have earrings, tattoos and piercings -- only some of which you can see.
We are partnered and single, parents and child-free, caregivers and neighbors.
We are Christian and atheist, Muslim and Jewish, Hindu and Buddhist, pagan and agnostic.
We are athletes and bookworms, hikers and artists, musicians and chefs, gardeners and dog walkers.
In other words, we look just like you.
We look like professors because we are professors. It’s long past time that we ditch the stereotype.
In 2011, Paul Clemens, a writer from Detroit, published an up-close and personal look at deindustrialization called Punching Out: One Year in a Closing Auto Plant. It recorded the year he spent on a work team hired to dismantle and gut one of the city’s remaining factories. I wrote about the book when it came out, and won’t recycle anything here, but recall a few paragraphs expressing a particular kind of frustration that a non-Detroiter can only sympathize with, not really share.
The cause was a tendency by outsiders -- or a subset of them, at any rate -- to treat the city’s decline as a perverse kind of tourist attraction or raw material for pontification. Clemens had lost all patience with arty photographs of abandoned buildings and pundits’ blatherskite about the “creative destruction” intrinsic to dynamic capitalism. He also complained about the other side of the coin, the spirit of “we’re turning the corner!” boosterism. “No Parisian is as impatient with American mispronunciation,” he wrote, “no New Yorker as disdainful of tourists needing directions, as is a born-and-bred Detroiter with the optimism of recent arrivals and their various schemes for the city’s improvement.”
Dora Apel, a professor of art history and visual culture at Wayne State University, has, in effect, gathered everything that dismays and offends Clemens between the covers of Beautiful Terrible Ruins: Detroit and the Anxiety of Decline (Rutgers University Press).
She calls Detroit “the quintessential urban nightmare in a world where the majority of people live in cities.” But nightmare imagery can be seductive, making Detroit “a thriving destination for artists, urban explorers, academics, and other curious travelers and researchers who want to experience for themselves industrial, civic, and residential abandonment on a massive scale.”
That lure is felt most keenly by people who, after the “experience,” enjoy the luxury of going home to someplace stable, orderly, and altogether more pleasant. It’s evident Apel finds something ghoulish about taking pleasure from a scene of disaster, “feeding off the city’s misery while understanding little about its problems, histories, or dreams.” But the aesthetic appeal of ruins -- the celebration of old buildings crumbling picturesquely, of columns broken but partly standing, of statuary fractured and eroded by time -- goes back at least to the 18th century, and it can’t be reduced to mere gloating. The author makes a brief but effective survey of “ruin lust,” a taste defined by “the beautiful and melancholic struggle between nature and culture,” as well as the feelings of contrast between ancient and modern life that ruins could evoke in the viewer, in pleasurable ways. She also points out how, in previous eras, this taste often involved feelings of national superiority, as with well-off travelers enjoying the sight of another country’s half-demolished architecture. (At least a tinge of gloating, in that case.)
It’s not difficult to recognize classical elements of the ruins aesthetic in "The Tree" by James D. Griffioen, one of a number of images reproduced in the book. Griffioen took the photograph in the Detroit Public Schools Book Depository, where a sapling has taken root in the mulch created by a layer of decomposing textbooks -- an almost schematic case of “the beautiful and melancholic struggle between nature and culture.” But Apel underscores the differences between the 21st-century mode of “ruination” and the taste cultivated in earlier periods. For one thing, “modernist architecture refuses the return of culture to nature in the manner of ancient ruins in large part because the building materials of concrete, steel, and glass do not delicately crumble in the picturesque way that stone does.”
More importantly, though, the fragments aren’t poking up from some barely imaginable gulf in time and culture: Detroit was, in effect, the world capital of industrial society within living memory. In his autobiography published in the early 1990s, then-Mayor Coleman Young wrote: “In the evolutionary urban order, Detroit today has always been your town tomorrow.” The implications of that thought are considerably more gloomy than they were even 20 years ago. One implication of imagery such as "The Tree" is that it offers not a reminder of the recent past so much as a glimpse of the ruins of the not-too-distant future.
Photography is not the only cultural register in which the fascination with contemporary ruins makes itself evident. There is “urban exploration,” for example: a subculture consisting (it seems) mainly of young guys who trespass on ruined property to take in the ambience while also enjoying the dangers and challenges of moving around in collapsing architecture. Apel also writes about artists in Detroit who have colonized depopulated areas both to reclaim them as living space and to incorporate the ruins into their creative work.
The effects are not strictly local: “the borders between art, media, advertising, and popular culture have become increasingly permeable,” Apel writes, “as visual imagery easily ranges across these formats and as people produce their own imagery on websites and social media.” And the aestheticized ruination of Detroit feeds into a more widespread (even global) “anxiety of decline” expressed in post-apocalyptic videogame scenarios, survivalist television programs, zombie movies, and so on. Not that Detroit is the inspiration in each case, but it provides the most concrete, real-world example of dystopia.
“As the largest deteriorating former urban manufacturing center,” Apel writes, contemporary Detroit is a product of an understanding of society in which “rights are dependent on what people can offer to the state’s economic well-being, rather than vice-versa,” and “the lost protection of the state means vastly inadequate living conditions and the most menial and unprotected forms of labor in cities that are divested of many of their social services and left to their own devices.”
Much of the imagery analyzed in Beautiful Terrible Ruins seems to play right along with that social vision. The nicely composed photographs of crumbling buildings are usually empty of any human presence, while horror movies fill their urban landscapes with the hungry undead -- the shape of dreaded things to come.
I have spent the last several years participating in the collective hand-wringing that has occupied humanists and liberal arts educators everywhere. There is no point in rehashing the indignities that academe has suffered at the hands of legislators, administrators, corporations, and student-consumers. You know the lament all too well.
There seems to be some sense among us that what we are experiencing is an unprecedented problem; that somehow profit incentives, patriarchal administrators, corporate values – in short, “the Man” – have only recently taken over American education. We like to believe that once upon a time higher education had a golden age that was due, not simply to the nation being flush with cash or to growing populations, not to bull markets or boards full of generous millionaires, but to high-minded, honorable prevailing philosophies about democracy and justice that have since fallen by the wayside.
But, as philosopher Stan Goff points out, the idea of education-for-all didn’t enter American culture until well after the Civil War (even then it remained heavily segregated), and this was for somewhat suspect reasons. Progressives at the turn of the century “were concerned about the feminization of men, the recent influx of non-English speaking immigrants, the temptations to vice of urban life for boys, and a general lack of discipline among the young. The compulsory public school … was a ready-made solution. Progressives equated ‘good citizenship’ with respect for authority.” Widespread education was designed to produce manly men, and obedient women and workers, who would answer their nation’s call in peace and wartime. Football, Boy Scouts, and the National Rifle Association were parallel projects of this era. A flourishing of land grant universities and private institutions – supplementing the already-existing elite institutions – began producing a steady supply of human capital so that America could enhance its economic and military dominance.
In other words, American education has always owed its primary existence to the Man and has never really challenged his dominance. Not everyone is equally invested; there have been student uprisings here and there, and certainly particular persons on the margins have called for radical change. But by and large higher education has never demanded a fundamental re-thinking of the American project.
For example, on the whole, the educational sector doesn’t call for the return of the continent to Native Americans (my house!), payment of reparations to descendants of slaves (my taxes!), the end of industrial economies (my iPhone!), or the radical revision of state or national borders (my scary neighbors!). On the whole, we don’t question the concepts of nation-states, economic and social progress, the primacy of individual choice, or the use of state force – instead quibbling over their limits. Such concepts are the water we swim in and the air we breathe; except for an extremely small number of us who truly live off the proverbial grid, we hardly notice these assumptions, much less interrogate them.
And even those of us who are radical enough to challenge governmental or corporate sectors are almost certain to rebel against any wholesale revision of higher education. We may call for tweaks – more diversity, more tenure-track lines, fewer administrators, better family leave, better need-based financial aid. But the end goal of democracy (not to mention getting/keeping my job) stays the same. It’s not just Arne Duncan who sees educators as “nation builders.” Many times have I heard colleagues bring up “citizenship” when pressed to defend the work that we do.
While we may hope good citizens will speak truth to those in power, we must also admit that most of our students will end up – like us – not as revolutionaries but as more or less comfortable (and eminently replaceable) cogs in the global economic machine. Even in flagship institutions of liberal arts, a mainly white Western canon prevails that is designed to shape students who will foster some variation of American-style democracy, at home and abroad.
This is not the mythology we live by, of course. I, for one, make a point of including readings in my classes that will anger nice white liberals and Fox News devotees alike. And I like to think of myself as counter-cultural in my educational ideals of “learning to think” or “awakening human beings,” which often involve a soft-focus image of toga-clad ancient Greeks or medieval monks, mingled with brochure-worthy photographs of diverse and smiling young people doing good works. Such images are what motivated many of us to work in education, and are among the reasons (along with summers, health benefits, and retirement funds) that many of us stay on even after we’ve become disillusioned.
But in the end, I’m pretty dedicated to colleges and universities continuing to exist mostly as they are; the liberal arts education that has shaped me is, in very real ways, my religion. I’m unlikely to renounce it as such. Thus, all the stories I’ve told myself about changing the world are probably indicative of my wishes and best intentions rather than my reality.
What if revolution, not mere reform, is called for? What if we – yes, even those on the margins – have been so indoctrinated into the putative value of education-for-freedom that we can no longer see the ways in which educators – as educators – are part of the problem? If, as Audre Lorde says, the master’s tools will never dismantle the master’s house, what makes us think we can somehow make the institutions of American higher education work for something other than the master? Is it at least possible that, just maybe, the American educational system is so corrupt at its roots that we should welcome its passing?
Don’t get me wrong. If the Ivy Leagues and other billionaires are all that’s left when the rest of us crumble, I will be furious. But perhaps, if we take the long view, we could rejoice in the opportunity that this crisis presents – if not for us as individuals, then at least for future generations on the earth. What if our demise will make room for, be the mulch that nourishes, something even better? Perhaps instead of institutions imprisoned by endowments, academic calendars, boards, legislators, tuition discounts, or profit margins, there will be “flying universities,” “artisanal” colleges, online-residential hybrids, or various kinds of micro or macro institutions actually run by the people and for the people, not yet invented or even imagined.
As someone once said, “Everything. Everyone. Everywhere. Ends.” Why not us?
Kate Blanchard is associate professor of religious studies at Alma College.
As we approach the anniversary of Steven Salaita’s “unhiring” by the University of Illinois at Urbana-Champaign, it is worth reflecting on what has and has not happened over the past year. We know much more than we did one year ago about the decision-making process that led to the Palestinian-American scholar losing his job after tweeting during the Israeli assault on Gaza in the summer of 2014. We also know more about the balance of powers within the university, but many questions remain unresolved -- as do the crises precipitated by the decision.
What looked at first like an ill-considered, quickly made decision taken without due consultation looks today like a conscious choice to cast aside the usual processes of deliberation and the customary deference to scholarly expertise. The relevance of donor and political pressure to the decision remains one of the uncertainties in the case. What we do know now is that the university had already hired lawyers before sending the fateful letter to Salaita (and that it has since spent hundreds of thousands of dollars on those lawyers, with no clear end in sight).
We also know that the chancellor and provost did consult very selectively with faculty members; they simply did not consult with those who had any standing in the hiring or tenure process, nor with those who had any expertise in the various areas that the Salaita controversy — or for that matter Salaita’s scholarship — concerned. In contrast to the first moments of the controversy, today it is hard to see the unhiring as a simple blunder; it appears rather as the result of a calculation — or, better, miscalculation — about the relative “costs” to the university of hiring or not hiring Salaita.
Regardless of how one frames that original decision, it remains the case that we have yet to see redress for the many injustices precipitated by the August 1, 2014 letter to Steven Salaita from Chancellor Phyllis Wise and Vice President for Academic Affairs Christophe Pierre. Among those injustices are the ones done to Salaita’s career and well-being as well as his freedom of expression; to colleagues in American Indian studies, who had their search overturned and their program irreparably damaged; and to those of us on the greater Urbana-Champaign campus, who have suffered the violation of shared governance and the erosion of our ability to maintain an engaged and open intellectual community that many faculty members have spent years (and even decades) building.
Just as the lawsuits emerging from the case remain open, so does the attempt to think through its implications. In a previous essay, I called the Salaita case “overdetermined” in order to capture the many intersecting forces that came together in the unhiring.
I believe this remains an important optic. Like the medium of Twitter itself, the case involves a ricochet of colliding messages and mixed contexts. Ultimately at stake in the case, I argue here, is the interpretive power to decode those messages and to frame the political stakes of those contexts: the conflict-laden contexts of Israel-Palestine, indigeneity, and the university itself.
The tweets that were the pretext for the university’s withdrawal of Salaita’s job offer were written in a moment of pronounced state violence. Salaita was responding to the latest of Israel’s assaults on Gaza — an assault that followed many others on the blockaded territory in previous years and that eventually cost the lives of more than 2,000 Gazans and injured many thousands more. Sixty-five percent of the Palestinian dead, including over 500 children, were civilians, according to the United Nations. Several dozen Israeli soldiers and several Israeli civilians were also killed by Hamas. To identify Israeli state violence as the first context of the Salaita affair is not to deny or downplay war crimes committed by Hamas, but it is to situate the actions of Hamas in the context of ongoing occupation, blockade, and invasion.
While it is tempting to draw a straight line from the assault on Gaza to the unhiring in Urbana-Champaign, the causality is more complicated. Neither simply a “local” affair nor an abstractly global one, the Salaita controversy condenses diverse sites of conflict as well as various streams of social transformation. Depending on one’s perspective, one can easily see the contemporary politics of anti-Semitism, anti-Arab racism, or settler colonialism at the center of the controversy. It certainly illustrates the transnational dimensions of the Israeli-Palestinian conflict, which includes diasporic contestations of various sorts by Jews and Palestinians in the U.S. and elsewhere.
The conflict over Salaita also grows out of local, national, and transnational features of indigenous history, from Illinois’s ugly “Chief” mascot history to attempts to construct trans-indigenous solidarities that include Palestine. The particularities of those contests then play out in one instantiation of a widely shared neoliberal program to remake the university through top-down, anti-faculty forms of governance. Finally, all of those currents have converged on a stage shaped by the ongoing transformation of public discussion by new media platforms such as Twitter and Facebook.
Why do these various, overdetermined contexts intersect in the Salaita case and what is at stake in that intersection? While Palestinian and indigenous struggles concern, above all, claims to sovereignty and territory, those are not the immediate stakes of this controversy. I want to propose instead that the kernel holding together the multidimensional event of the unhiring is a contestation over interpretive power — a form of contestation that has an indirect though still critical relation to the struggle for sovereignty.
In the most obvious sense, the case involves a contest over the meaning of Salaita’s tweets: are they anti-Semitic or ironic? Within or beyond the bounds of acceptable speech? Relevant or not to an academic appointment? But the real struggle over interpretation lies not in assessment of the content of Salaita’s controversial statements, but rather in the institutional framing of the act of interpretation. What is really at stake is not what these statements mean but who gets to decide on the meaning of scholarly and public discourses and under what conditions. Should non-specialist administrators and politically appointed trustees have the authority to override the carefully vetted decisions of faculty? Should outside pressure — whether from donors or politically-motivated bloggers — be allowed to insinuate itself into academic considerations?
In the struggle over the institutional framing of the Salaita case, resistance to the neoliberal transformation of the university comes to occupy a social location provisionally analogous to Palestinian and indigenous resistance to colonialism and state power. The content of those three struggles is not identical by any means; the histories, scales, and stakes vary decisively. What links them — contingently but powerfully — is the fact that in all three cases, activists and scholar-activists confront powerful hegemonies of interpretive power.
In the United States, at the least, neither the Palestinian cause nor movements of American Indians and indigenous people more generally confront a neutral public sphere. The dominant “common sense” is aligned against the claims of these groups. Similarly, the struggle for control of the university confronts a market logic that has increasingly saturated the idea of the university in recent decades. Within that logic, critical thought of the kind practiced by Salaita and his defenders can only be considered beyond the boundaries of the acceptable: it is, in the shorthand evoked by Chancellor Wise and the university’s Board of Trustees, “uncivil.” Indeed, as Joan Scott and others have argued in relation to the Salaita case, the discourse of civility provides the “positive” vision that guides the hegemonic interpretive framing of the controversy. In this case it sutures together the three very different contexts I have highlighted.
In principle, the struggle over interpretive power could link together any number of radical and progressive causes; that is precisely the point of stressing the contingent, overdetermined nature of this case. But that point alone is far too general to be helpful. A return to the specific histories activated by the affair can help elucidate the particular configurations of power at stake.
It is not “accidental” that American Indian studies should be at the center of this controversy given the University of Illinois’s history of anti-indigenous stereotyping and hostility. It is not “accidental” that Israel’s far-away occupation and blockade of Palestine should have ignited the controversy given the central role that the US plays in propping up Israeli policies and the importance that Jewish-American opinion (as divided as it is) plays in maintaining U.S. support for Israel. Finally, it is not “accidental” that these two fields of conflict should intersect with struggles over the balance of power between faculty and administrators in the university.
In a context in which the university’s mission has been undermined by privatization, and fund-raising has replaced public funding, administrators increasingly feel the need to protect the “brand” by enforcing stricter limits on acceptable speech by faculty and students. Certain radical claims to sovereignty by Palestinian and indigenous activists exceed the bounds of acceptable liberal political discourse in the US and seem to threaten the university’s ability to raise money from wealthy private donors and to make its case for public funding to skeptical state legislatures. “Incivility” must be kept well hidden in order for the public university to function in such a precarious environment.
Yet it is important to insist that these hegemonies are not total. Indeed, there would be no Salaita controversy — not to mention no boycott of the University of Illinois — if capitulation to common sense were total. And this is another reason that Twitter and the university have served as sites for this mediated struggle over Israel-Palestine and indigeneity. Those are sites that remain partially open, that remain spaces of possibility despite the pressures of liberal consensus politics and neoliberal normalization. They are spaces from which alternative interpretations and counter-narratives can emerge.
Although the boycott by which some scholars have responded to the Salaita unhiring — staying away from Illinois — has had a significant negative impact on faculty and students in the humanities and social sciences there (with numerous lectures and conferences being canceled), it has also served as the occasion for creating alternative venues and institutional structures. A number of speakers, including Katherine Franke, Bruce Robbins, and Todd Presner, have paid their own way and engaged with the Urbana-Champaign community in non-university spaces. The Salaita case thus illustrates how, as the university is opened up to market forces, scholars cannot simply retreat within the walls of the Ivory Tower: new solidarities that can serve as platforms for the struggle over interpretation need to be created that cut across the boundaries of the university.
There is enough at stake in the Salaita case by itself, but as a point of condensation for multiple conflicts it is also a kind of mirror that reveals a larger political landscape. For those of us who do not share the consensus views on Israel, indigeneity, or the privatization of the university, the case has been an opportunity to engage in a struggle over interpretation.
Like everything else that matters, interpretation is saturated with power. But as scholars trained in the arts of critical analysis — some of whom have far greater job security than many Americans — we possess the tools to engage the uneven field of interpretive power. We have the means to offer counter-narratives, to show how Urbana and Gaza are linked, but also to suggest how complicated those links are.
Michael Rothberg is professor and head of the Department of English at the University of Illinois at Urbana-Champaign, where he also directs the Initiative in Holocaust, Genocide, and Memory Studies. His most recent book is Multidirectional Memory: Remembering the Holocaust in the Age of Decolonization.