Temple was the first institution to offer a doctorate in African-American studies and has seen heated debates over the discipline's direction. The rejection of the department's choice as chair has set off a new controversy.
U. of Texas wanted to honor a late scholar whose career had focused on Middle Eastern studies. But when Arab contributors found out that two Israelis would be published in the same work, a tribute fell apart.
Perhaps you’ve heard of Rule 34. It expresses one of the core imperatives of 21st-century culture: “If something exists, there is porn about it. If no porn is found at the moment, it will be made. There are no exceptions.”
Consider, for example, the subculture devoted to eroticizing the My Little Pony cartoon characters. More people are into this than you might imagine. They have conventions. It seems likely that even more specialized niches exist -- catering to tastes that “vanilla” My Little Pony fetishists regard as kinky -- although I refuse to investigate the matter.
Consider it a topic for future issues of Porn Studies, a new journal published by Routledge. “Just as there are specialist journals, conferences, book series, and collections enabling consideration of other areas of media and cultural production,” says the introductory note for the inaugural double issue, “so pornography needs a dedicated space for research and debate.” (Last year, many people disagreed: news of the journal inspired much protest, as Inside Higher Ed reported.)
The most interesting thing about that sentence from the journal's editors is that “pornography” functions in it as an active subject. Porn is figured almost as an institution or a conscious entity — one capable of desiring, even demanding, scholarly recognition. The satirical Rule 34 comes very near to claiming agency for porn. With Porn Studies, there is no such ambiguity about the sheer world-making power of pornography.
It’s not just that the journal acknowledges the porn industry as an extremely profitable and expanding sector of the economy, or as a cultural force with an influence spreading all over the map. That much is a commonplace, and Porn Studies takes it as a given. But something more happens in the pages of Porn Studies: academic discourse about porn turns into one more manifestation of its power.
One recent call for papers refers to “the emerging field of porn studies” — a piece of academic-entrepreneurial boilerplate that proves significant without actually being true.
It’s now a solid 10 years since Duke University Press published a volume of some 500 pages, also called Porn Studies, edited by Linda Williams, whose Hard Core: Power, Pleasure, and the “Frenzy of the Visible” (University of California Press, 1989) is by far the most-cited book in the new journal.
She wrote it amid the drawn-out and exhausting battles of the 1980s, when an uneasy alliance formed between radical feminists, rallying under the slogan “pornography is the theory, rape is the practice,” and the religious right, which wanted to enforce the sexual “Thou shalt nots” by law. On the other side of the barricades were the “sex-positive” feminists and civil libertarians, who were not necessarily pro-porn so much as anti-censorship.
Hard Core went beyond the polemics, or around them. Williams approached the X-rated films of the 1970s and ‘80s with as much critical sophistication and command of the history of film as other scholars more typically brought to the cinematography of Eisenstein or Hitchcock. She didn’t deny the misogyny that appeared on screen but saw other forces at work as well -- including scenarios in which women were sexually exploratory or assertive in ways that the old phallic order couldn’t always predict or satisfy.
To anti-porn activists, whether feminist or fundamentalist, it went without saying that the market for pornography consisted of heterosexual men. Likewise, the heterosexual-male nature of the “gaze” in cinema was virtually an axiom of feminist film theory. Williams challenged both suppositions. Women became an ever more substantial share of the audience, especially after videotape made it possible to watch at home.
The status of Williams’s work as foundational suggests that porn studies began “emerging” at least a quarter century ago. Recent volumes of papers such as C'Lick Me: A Netporn Studies Reader (Institute of Network Cultures, 2007) and Hard to Swallow: Hard Core Pornography on Screen (Wallflower, distributed by Columbia University Press, 2012) take the field as growing but established.
For that matter, porn-studies scholars would have every right to claim ancestors working long before the Motion Picture Association of America invented the X rating. In The Horn Book: Studies in Erotic Folklore and Bibliography (1963), Gershon Legman surveys about two centuries’ worth of secondary literature, in several languages. The contributors launching the new journal do not cite Legman, much less any of the figures he discusses, even once. Nor does a single paper discuss any form of pornography that existed prior to the advent of video and digital forms of distribution.
The body of commentary and analysis predating Hard Core includes psycho- and sociological research, legal debate, and humanistic work in a variety of fields. It is seldom mentioned, except when dismissed as simplistic, under-theorized, or hopelessly in thrall to moralistic or ideological assumptions rendering its questions, let alone its arguments, highly suspect.
“Porn studies,” in other words, is not synonymous with scholarship about pornography, as such. It is its own demarcated zone of discussion, one that is present-minded and digital media-oriented to an extreme. (All of the double issue is available to the public here.)
An exemplary case is the paper called "Gonzo, trannys, and teens – current trends in U.S. adult content production, distribution, and consumption” by Chauntelle Anne Tibbals, who is identified as an independent scholar from Los Angeles. It is perhaps more sociological in perspective than an article from a porn-industry trade journal, and like other papers in Porn Studies it shies away from generalization even when inching in that direction:
"Some performers, producers, and others call for ‘feminist porn’ – a self-identified genre and social movement with no one articulated definition. At the same time, many producers and performers reject the attribution while creating content that seems decidedly feminist. At the center of every one of these debates are porn performers themselves, each of whom are impacted by individual choice, market concerns, and representation. ... Even if one was to focus only on the images contained in ‘pornographic’ representations, with no consideration of production processes or variations in reception, we would still be left with a vast and diverse body of work that is constantly shifting. Consequently, there is no way to say ‘pornography is this’ or ‘pornography is that’ – as I have done in this essay, all one can really do is attempt to describe and contextualize existing patterns as they currently resonate (in this case, with me).”
You cannot step into the same porno river twice. Even so, Lynn Comella’s “Studying Porn Cultures” calls for researchers to spend more time studying the performers, marketers, and fans in their native element, such as the Adult Entertainment Expo. (Other “data-rich field sites” range “from erotic film festivals to feminist porn sets to adult video stores.”)
Comella, an assistant professor of women’s studies at the University of Nevada at Las Vegas, writes that she has attended the Expo “every year since 2008, as a researcher, a credentialed member of the media, and an invited participant in the Expo’s popular seminar series,” twice serving as moderator for the session devoted to women and the adult entertainment market.
The practice of “porn-studies-in-action” she advocates is “accountable to cultural plurality, specificity, and nuance,” she writes, and “rejects sweeping generalizations and foregone conclusions that rely on preconceived notions about pornography’s inherent ’truths’ and effects."
The problem is that nobody ever intentionally accepts “sweeping generalizations and foregone conclusions that rely on preconceived notions.” If only it were that easy. One era's critical perspective can become the next's tacit presupposition. In Hard Core, Linda Williams challenged the assumption that the pornographic film was one big homogenous set of images and messages created to stroke the egos and stoke the libidos of straight white guys. By contrast, the papers in Porn Studies all take Williams’s interpretive stance as a given: the audience, product, meanings, and the effects of porn are intrinsically heterogeneous and in flux. Any general statement beyond “More research is needed” thus becomes highly problematic.
A BBC documentary from a few years ago included a segment on one of the better-known subgenres of recent times. In it, the porn director (who is also the star) has a young performer, of legal age, dress up as if she were a schoolgirl. He then brutalizes her at length with slapping, gagging, abusive penetration, and a running stream of verbal humiliation, after which he and other men urinate on her.
The documentary crew follows an actress to the set, with the camera focusing in very closely when the male performer begins chatting up the actress as the scenario begins. The expression on his face is chilling. Ted Bundy must have gotten that look in his eyes once the victim was handcuffed.
Given that his video product, too, is part of the diverse and polymorphous carnival that is the adult entertainment industry — and not the least profitable part, by any means — I would have liked to see a paper in Porn Studies that asked about damage. So many of the contributors celebrate the feminist and LGBT-positive aspect of the industry that a naive reader would think nothing else sold, and that it exists solely to increase the sum of happiness in the world. This may be doubted; indeed, it must be. At times, the journal seems not just to analyze the world of porn but to be part of it. Not in the way the performers are, by any means, but perhaps as a sort of conceptual catering service.
“Having had to cut the book nearly in half for the final proof,” writes Benjamin Kline Hunnicutt in the introduction to Free Time: The Forgotten American Dream (Temple University Press), “I am keenly aware of things omitted, still on my computer’s hard drive awaiting publication.” This is offered as an apology, though none is needed. Excessive leanness must be the least common fault in scholarly prose – and Free Time deserves laurels for not imposing too much on the scarce resource in question.
The author teaches at the University of Iowa, where he holds the enviable post of professor of leisure studies. He has devoted the better part of 40 years – including two previous books – to investigating the sources and implications of something both obvious and overlooked about the American work week.
Throughout the 19th and into the early 20th centuries, working people fought for, and won, more free time -- despite the dire mutterings by the pundits, who announced that economic collapse and social turmoil were sure to follow if employees worked for only, say, 10 hours a day, 6 days a week. By the 1930s, the combination of increased industrial productivity and collective bargaining made the trend for an ever-shorter work week seem irreversible. The demands of war production didn’t erase the expectation that the 40-hour week would shrink to 30, after peace came.
It did, in some places. For example, Hunnicutt and his students have interviewed retired factory workers from Akron, Ohio, and Battle Creek, Michigan, who won the six-hour day once the war was over. And the social forecasts and magazine think-pieces from the 1960s and ‘70s made it sound like the great challenge of the future would be coping with all the free time created by automation and computerization.
Like hover-car collisions and overbooked hotels on the moon, the extremely shortened work week turns out not to be a major 21st-century social issue, after all. “Since the mid-1970s,” Hunnicutt says, “we have been working longer and longer each year, about a half a percentage point more from year to year….” It adds up. Americans now log an average 199 hours -- almost five 40-hour work weeks -- more per year than they did in 1973 – putting in “longer hours than those of other modern industrial nations, with the exception of South Korea,” according to the findings of the Bureau of Labor Statistics in 2009.
The point here is not that extrapolation is unreliable -- or even that seriously regressive trends can begin to seem normal after a generation or two. Hunnicutt begins his broad narrative of how things got like this in the 18th century, with a comment by Benjamin Franklin: “If every man and woman would work for four hours each day on something useful, that labor would produce sufficient to procure all the necessaries and comforts of life, want and misery would be banished out of the world, and the rest of the 24 hours might be leisure and happiness.”
Tracing this sentence back to its original context, I found it appeared in a letter expressing Franklin’s criticism of how much labor and effort went into producing luxury goods for conspicuous consumption. Millions, he wrote, were “employed in doing nothing, or in something that amounts to nothing, when the necessaries and conveniences of life are in question.” It is a good thing the man is dead; five minutes in an American shopping center would kill him.
In Hunnicutt’s reading, the passage is a particularly blunt expression of a perspective or set of values he calls the Higher Progress. The goal of economic development was not just to produce “necessaries and comforts of life” in abundance and affordably -- that, too, of course -- but to give people the free time to enjoy what they’d made, as well as one another’s company, and to secure the general welfare through lifelong education and civic involvement. The same vision is expressed by Walt Whitman, Frank Lloyd Wright, “factory girls” writing to newspapers in the 1840s, and Robert Maynard Hutchins’s proposals for university reform. In a book published when he was president of the University of Chicago, Hutchins described progress as having three stages:
“We want our private and individual good, our economic well-being… Second, we want the common good: peace, order, and justice. But most of all we want a third order of good, our personal or human good. We want, that is, to achieve the limit of our moral, intellectual, and spiritual powers.”
That “we” is not aristocratic. The examples of Higher Progress thinking that Free Time cites are profoundly democratic in temper. For every patrician worrying that more leisure would just lead to drunkenness and loutish habits, Hunnicutt seems to quote five plebeians saying they wanted the time for “the ‘wants’ that were being repressed by long hours: reading newspapers and books, visiting and entertaining at home, writing letters, voting, cultivating flowers, walking with the family, taking baths, going to meetings, and enjoying works of art.”
The Higher Progress was hardly inevitable. Following the Civil War, Walt Whitman, whose poetry often seems the outpouring of a blithe spirit with a caffeine buzz, sounded almost desperate at the scene before him. Despite the country’s technological progress and material abundance, “our New World democracy … is, so far, an almost complete failure in its social aspects, and in really grand religious, moral, and literary results.”
As was his wont, Whitman seems to speak directly to the reader, across the decades, in warning about the danger of fetishizing all our stuff and gizmos: “a secret silent loathing and despair.” A steadily growing GNP would not necessarily prevent the Higher Progress, but consumerism (in the form Franklin criticized as “luxury”) was all too likely to substitute itself for leisure, in the richest possible sense of the word.
So where did things go off track? Why does one of the arguments for the sheer practicality of a shorter work week – that it would reduce joblessness – seem never to be made, despite recent unemployment figures? And how did the men and women who won the 30-hour week in the 1940s respond to the free time?
Free Time addresses all of these questions, or at least points in directions where the answers might be found. But in honor of the author’s own sacrifice – and in hopes of encouraging you to read the book – I am going to make this column half as long as it might well be. It deserves wide attention, and would provoke a more meaningful conversation about the past, present, and future than we’re likely to have otherwise.
Among the mountains of literature dedicated to "best practices" in pedagogy, a consensus has emerged that engagement is key, and that we teachers can no longer – as we did throughout history – drag students by the ear into our own umwelt and call it learning. Rather, we need to create an active halfway space between world-bubbles, allowing learning to happen more organically, through mutual reorientation.
This is precisely what I tried to do in a recent course exploring the topic of reality TV. Here I was either brave or foolish enough to structure the class like an actual reality TV competition. And while I admit the initial thrill of conception involved the perverse prospect of voting students "off the island," I could not have anticipated the pedagogical benefits of such a novel format until I tried it out. The first half of the course was quite traditional, with scholarly readings about the history of the genre, and related themes such as narcissism, exhibitionism, attention economies, surveillance, and the new employment option of simply being watched. (There is an excellent book on this topic by Mark Andrejevic, which served as the main textbook.) It is truly remarkable how much more conscientious students suddenly become when they are informed that an A on the dreaded midterm paper will earn them "immunity" from the first challenge.
The competition section was loosely based on "Project Runway," which emerged from my own institution, the New School, in New York City (specifically the design school, Parsons). Students would be given a challenge a week – some individual, some in groups – and then face a revolving group of expert "judges" to see how well their response connected to the critical aspects of the readings. (I tried to juggle the dual roles of Tim Gunn and Heidi Klum in this scenario, dispensing equal parts encouragement and fear with each alternate comment.) Examples of challenges include, "pitch your own (progressive) reality TV show," "create your own (self-reflexive) reality TV persona," and "report back from your own Thanksgiving holiday as if it were a reality TV show.”
After each challenge the “contestants” would reflect on the competition via "confession cams" recorded on their own laptops or phones, and posted to the blog (a meta-meta exercise in self-reflection, given that reality TV is already a meta-phenomenon). Instead of running around a fabric store, trying to buy enough satin or leather to make an edgy, fashionable dress in less than an hour, my students were running around the library, trying to find appropriate readings to supplement the syllabus. (Those who were voted off switched to the "production" side of the competition: some helping with filming, sound, editing, etc. Others worked on publicity around the college and online, as well as making their own commentaries on the unfolding events. It was therefore possible to be voted off early, but still get an A.)
One of the most striking differences between the students’ umwelt and my own became clear from the very beginning, when I initially took great pains to reassure the class that while we would be filming sections of the competition for archival purposes – and to heighten the sense of being on TV – these would not be made public in any way. To my surprise, all the students were disappointed, going so far as to say, "Well, what’s the point in filming it then?!" This emphatic question – and the new Facebook-saturated Zeitgeist that it distills – then became a touchstone for the whole semester, concerning naive assumptions about identity, action, performance, and modes of witnessing. Why is it that the millennial generation does not think anything is worth doing or experiencing unless it is immediately "shared" and "liked" online? How might this backfire when it comes to friends or future employers? And who benefits most from this automatic compulsion?
So what began as a "so-crazy-it-might-work" idea soon revealed itself to be a new way for students to critically reconstruct their own relationship to the media – and thus to themselves – while also shaking up all my cherished notions about traditional modes of teaching the humanities. Whereas the host of "Project Runway" encourages the contestants to "make it work," I exhorted the students to "think it through" (indeed, I was tempted to call the course "So You Think You Can Think?"). And in one of those perfect moments of synchronicity, I could even offer the perfect prize to the winner: a paid internship to work on a film about reality TV by one of my former students, Valerie Veatch (whose first film, "Me at the Zoo," on viral celebrity and its discontents, recently premiered at Sundance).
What’s more, I am almost grateful that the National Security Agency global spying scandal did not erupt during the first run of this course, though it would have spectacularly underscored the social and political tendencies which the class was designed to question. Even if we loathe reality TV, and claim to never watch it, that doesn’t mean we haven’t all been engulfed in its logic, mannerisms, motifs, conventions, and conceits. One reason I designed the course was to test my theory that even young people who feel themselves to be far above televisual trash are still exposed to, and shaped by, the emotional currents it creates in the world. Reality TV threatens to eclipse reality itself, even in those rare moments when the cameras aren’t running.
Quite simply, identity is now influenced by things like the confession cam, the idea of immunity, and the asymmetrical power dynamics of "the judges," even as our most significant political figures threaten to become little more than grotesque characters in the latest installment of "The Real Housewives of Congress" or "The Vatican’s Next Top Pontiff." So while the challenge of education is to almost literally burst each other’s bubbles, the bigger challenge is to figure out – across the generations – how to stop our collective umwelt from being shaped by this omnipresent model of thought and behavior.
Dominic Pettman is professor of culture and media at Eugene Lang College and New School for Social Research, where he recently won the University Distinguished Teaching Award. His most recent book is Look at the Bunny: Totem, Taboo, Technology.
Tamerlan Tsarnaev was in my College Writing I class at Bunker Hill Community College in the spring of 2007. My pinhole view of his life, including a couple of e-mails about why he missed some classes, adds nothing to either the pathological or the geopolitical debates about the bombs Tamerlan and his brother are accused of setting off two weeks ago at the 2013 Boston Marathon.
What I can tell you is that I’ve felt like crying most of the time since Bloody Friday, the Friday after the Marathon Monday bombings that killed three and wounded 264, when police shut down Boston and Cambridge. Disclaimer 1: Of course the dead and the injured and their families are the only focus of our love and prayers. I have no words. This is a column about education reform -- or the lack of it.
Everyone I know of every profession in Boston reported feeling about the same. I now know that these feelings have a name: Secondary Trauma. You don’t have to be one of the injured to feel numb or want to cry.
How to treat myself for secondary trauma? I had no idea that was a skill I'd learn and need at a community college.
Hydrate – lots of water. Fresh air. No caffeine. Breathe. Have a good cry. J.S. Bach, always. Keep in mind that the national policy debate about the central issue for community colleges, completion, makes no mention I’ve heard of secondary trauma expertise as necessary professional development. Here’s my bookmarked reference web site, Trauma Stewardship.
Here’s my list of student primary traumas I’ve been second to, in a few short years: murder, rape, shootings; sudden and prolonged homelessness; memories of wars in Somalia, Eritrea, El Salvador, the Congo; a father killed in the civil war in Mali; a student for whom I was buying a sandwich at 5 p.m. saying, “I guess you could tell I haven’t eaten since yesterday.” Domestic violence. Stories from veterans of the wars in Iraq and Afghanistan. All but a few arise from teaching, remember, College Writing I. To this list, I can now add a terrorist attack. Perhaps ribbons for each trauma, as in the military, would cause the completion critics to consider trauma a factor.
Let me be perfectly clear. Withering completion accountability is fine by me. The solutions just need a load factor for the days that community college teachers need a good cry.
Disclaimer 2: The worst days of my own silver-, no, platinum-spooned life are miles from the everyday trauma of the millions of students in community colleges, and the secondary traumas of their professors. I do not teach full-time. With occasional slippage, I am a generally happy and optimistic person. I have family, friends, health and, more, health insurance, food, Dana-Farber Cancer Institute and the love of Friends Meeting Cambridge for three years of cancer that my wife survived. (Thank you STEM disciplines.) My trauma requires no help.
My point for this column is that at the nation’s 1,200 community colleges, thousands of instructors have a traditional workload, unopposed by any of our unions, of four and five classes a semester, with classes of 20, 30, and more students, all subject to the primary traumas I’ve described.
I have no words for how these colleagues survive. I have plenty of words, for another day, for the policy makers, legislators, trade associations, and union chiefs who won’t admit to these traumas while whining about low community college completion rates.
The 1 a.m. Friday bomb explosion and shootout that killed Tamerlan was about a mile from my home. My wife heard the bomb and the gunfire that I slept through. By morning, Cambridge was shut down, and we were ordered to stay at home. After a day with helicopters chopping overhead and Doppler-effecting sirens in all directions, my wife and daughter heard the shooting Friday evening when police arrested Tamerlan’s brother, again about a mile from our home. I didn’t hear the gunfire.
I’ve discovered I am learning, too, about relative secondary trauma rankings on my Emotional/Trauma Richter Scale (patent pending). What I can tell you is that my urge to cry last week, and even now, is higher by a bit on my E/T Richter scale reading than when Cedirick Steele, a student in that same class that spring of 2007, was shot seven times and killed. I learned Cedirick’s death was called a premeditated random murder. The shooters planned to kill someone, it didn’t matter who. Perhaps tertiary trauma is when we discover a new term for something too terrible to be true. (Click here for my report on Cedirick’s killing.)
Here’s what I don’t understand in my rankings. I knew Cedirick very well. I wouldn’t have recognized Tamerlan on the street. He missed most classes and didn’t complete the course. Why do I feel sadder after Bloody Friday than I did right after Cedirick’s death?
I didn’t make the Tamerlan connection until late Friday morning. I hadn’t known the suspects’ names when I went to bed Thursday. The cat woke me up Friday morning about 5:30 a.m. with a left paw “Breakfast!” to the nose.
I let the dog out in the yard and looked out the front door. No newspaper. Odd but ok. I fed the cats, made coffee, changed the laundry, put out breakfast for my wife. Still no newspaper. Not ok. Another 15 minutes, and I would call in the missed delivery. I had another cup of coffee and read a book. My wife was asleep. I hadn’t turned on the radio. Still no paper.
Then, the day began. A text message from someone at work. “The MBTA is closed. How can I get to work? Do you know what’s going on?” I had no idea. Another text message. Bunker Hill Community College closed for citywide emergency. I turned on the radio and learned why no newspaper delivery that morning. My neighborhood was the news. Police were looking for the suspects right here. And the news said that one of the suspects had gone to Bunker Hill Community College.
In the next hour, friends e-mailed. Did I know this student? “No,” I said. After the third e-mail, something stirred. I put “Tamerlan” in the search box of my computer. There he was on a class list from 2007, along with two innocuous e-mails about missing class. As a comedy and to raise money for students like mine, two years ago, I ran -- well, completed -- the Boston Marathon. (My report.) Oh, can I see the blocks to the finish line where the bombs went off. I guess all this factors into my E/T Richter Scale, terrorist bombing versus premeditated random murder.
Now, the Iraq tank-driving student in that same class graduated from Dartmouth last spring, and he is on his plan, teaching at-risk high school students.
Of course that cheers us up on a bad day. We, the people, have to chuck the way we mistake such stories for success. Along with head-in-the-sand union chiefs, policy makers and too many education trade associations, do we let ourselves believe that these feel-good, albeit individually triumphant, community college to Ivy League stories are progress? I did, for years.
Back to my secondary trauma professional development. Our refusal as a nation to face down the truth about the lives of so many students and their traumas every day in so many of our schools and colleges? The trauma professionals would call our refusal denial and avoidance. An unhealthy strategy.
On the E/T Richter scale, though, my urge to cry was lower this week than it was back in 2011, when I was called to testify at the third trial of Cedirick’s murderers. (Click here for my report on the trial.) On the morning of my testimony, the Suffolk County Victim/Witness Advocate sat me down and asked how I felt. Did she really want to know? She did. I said I’d felt like crying about Cedirick every day since she’d called three weeks before, to ask me to testify. Normal, she said. My education on secondary trauma began. After the trial, she made me go see a trauma counselor.
After the trial, four years after Cedirick’s random, premeditated murder, at last, I had a good cry. Today, I’ll help any student I can. And I’ll say a prayer again, and again, for the three dead and the 264 injured at the Boston Marathon Massacre.
Wick Sloane writes the Devil's Workshop column for Inside Higher Ed. Follow him on Twitter at @WickSloane.