Ethnic / cultural / gender studies

American Studies Association backs boycott of Israeli universities

American Studies Association members vote by a two-to-one margin to endorse the boycott of Israeli universities.

Temple University faces scrutiny over rejection of African-American studies department's choice as chair

Temple was the first institution to offer a doctorate in African-American studies and has seen heated debates over the discipline's direction. The rejection of the department's choice as chair has set off a new controversy.

UT-Austin scrutinizes ethics of controversial same-sex parenting study

UT-Austin launches administrative inquiry into integrity of controversial study about children of same-sex couples.

Boycott kills U. of Texas project on women writers in the Middle East

U. of Texas wanted to honor a late scholar whose career had focused on Middle Eastern studies. But when Arab contributors found out that two Israelis would be published in the same work, a tribute fell apart.

How to do good research at a teaching-intensive institution (essay)

It's hard -- but possible -- to find time and inspiration for good scholarship even when you teach four courses a term, writes Hollis Phelps.

Review of Benjamin Kline Hunnicutt, 'Free Time: The Forgotten American Dream'

“Having had to cut the book nearly in half for the final proof,” writes Benjamin Kline Hunnicutt in the introduction to Free Time: The Forgotten American Dream (Temple University Press), “I am keenly aware of things omitted, still on my computer’s hard drive awaiting publication.” This is offered as an apology, though none is needed. Excessive leanness must be the least common fault in scholarly prose – and Free Time deserves laurels for not imposing too much on the scarce resource in question.

The author teaches at the University of Iowa, where he holds the enviable post of professor of leisure studies. He has devoted the better part of 40 years – including two previous books – to investigating the sources and implications of something both obvious and overlooked about the American work week.

Throughout the 19th and into the early 20th centuries, working people fought for, and won, more free time -- despite dire mutterings from pundits, who announced that economic collapse and social turmoil were sure to follow if employees worked only, say, 10 hours a day, six days a week. By the 1930s, the combination of increased industrial productivity and collective bargaining made the trend toward an ever-shorter work week seem irreversible. The demands of war production didn’t erase the expectation that the 40-hour week would shrink to 30 after peace came.

It did, in some places. For example, Hunnicutt and his students have interviewed retired factory workers from Akron, Ohio, and Battle Creek, Michigan, who won the six-hour day once the war was over. And the social forecasts and magazine think-pieces from the 1960s and ’70s made it sound like the great challenge of the future would be coping with all the free time created by automation and computerization.

Like hover-car collisions and overbooked hotels on the moon, the extremely shortened work week turns out not to be a major 21st-century social issue, after all. “Since the mid-1970s,” Hunnicutt says, “we have been working longer and longer each year, about a half a percentage point more from year to year….” It adds up. Americans now log an average of 199 hours -- almost five 40-hour work weeks -- more per year than they did in 1973, putting in “longer hours than those of other modern industrial nations, with the exception of South Korea,” according to the findings of the Bureau of Labor Statistics in 2009.

The point here is not just that extrapolation is unreliable -- or even that seriously regressive trends can begin to seem normal after a generation or two. Hunnicutt begins his broad narrative of how things got this way in the 18th century, with a comment by Benjamin Franklin: “If every man and woman would work for four hours each day on something useful, that labor would produce sufficient to procure all the necessaries and comforts of life, want and misery would be banished out of the world, and the rest of the 24 hours might be leisure and happiness.”

Tracing this sentence back to its original context, I found it appeared in a letter expressing Franklin’s criticism of how much labor and effort went into producing luxury goods for conspicuous consumption. Millions, he wrote, were “employed in doing nothing, or in something that amounts to nothing, when the necessaries and conveniences of life are in question.” It is a good thing the man is dead; five minutes in an American shopping center would kill him.

In Hunnicutt’s reading, the passage is a particularly blunt expression of a perspective or set of values he calls the Higher Progress. The goal of economic development was not just to produce “necessaries and comforts of life” in abundance and affordably -- that, too, of course -- but to give people the free time to enjoy what they’d made, as well as one another’s company, and to secure the general welfare through lifelong education and civic involvement. The same vision is expressed by Walt Whitman, Frank Lloyd Wright, “factory girls” writing to newspapers in the 1840s, and Robert Maynard Hutchins’s proposals for university reform. In a book published when he was president of the University of Chicago, Hutchins described progress as having three stages:

“We want our private and individual good, our economic well-being… Second, we want the common good: peace, order, and justice. But most of all we want a third order of good, our personal or human good. We want, that is, to achieve the limit of our moral, intellectual, and spiritual powers.”

That “we” is not aristocratic. The examples of Higher Progress thinking that Free Time cites are profoundly democratic in temper. For every patrician worrying that more leisure would just lead to drunkenness and loutish habits, Hunnicutt seems to quote five plebeians saying they wanted the time for “the ‘wants’ that were being repressed by long hours: reading newspapers and books, visiting and entertaining at home, writing letters, voting, cultivating flowers, walking with the family, taking baths, going to meetings, and enjoying works of art.”

The Higher Progress was hardly inevitable. Following the Civil War, Walt Whitman, whose poetry often seems the outpouring of a blithe spirit with a caffeine buzz, sounded almost desperate at the scene before him. Despite the country’s technological progress and material abundance, “our New World democracy … is, so far, an almost complete failure in its social aspects, and in really grand religious, moral, and literary results.”

As was his wont, Whitman seems to speak directly to the reader, across the decades, in warning about the danger of fetishizing all our stuff and gizmos: “a secret silent loathing and despair.” A steadily growing GNP would not necessarily prevent the Higher Progress, but consumerism (in the form Franklin criticized as “luxury”) was all too likely to substitute itself for leisure, in the richest possible sense of the word.

So where did things go off track? Why does one of the arguments once made for the sheer practicality of a shorter work week – that it would reduce joblessness – seem never to be heard now, given recent unemployment figures? How did the men and women who won the 30-hour week in the 1940s respond to the free time?

Free Time addresses all of these questions, or at least points in directions where the answers might be found. But in honor of the author’s own sacrifice – and in hopes of encouraging you to read the book – I am going to make this column half as long as it might well be. It deserves wide attention, and would provoke a more meaningful conversation about the past, present, and future than we’re likely to have otherwise.

Essay on teaching about reality TV by turning course into reality TV

Among the mountains of literature dedicated to "best practices" in pedagogy, the consensus has emerged that engagement is key, and that we teachers can no longer – as we did throughout history – willfully try to drag students violently by the ear into our own umwelt and call it learning. Rather, we need to create an active halfway space between world-bubbles, thus allowing learning to happen more organically, through a mutual reorientation.

This is precisely what I tried to do in a recent course exploring the topic of reality TV. Here I was either brave or foolish enough to structure the class like an actual reality TV competition. And while I admit the initial thrill of conception involved the perverse prospect of voting students "off the island," I could not have anticipated the pedagogical benefits of such a novel format until I tried it out. The first half of the course was quite traditional, with scholarly readings about the history of the genre, and related themes such as narcissism, exhibitionism, attention economies, surveillance, and the new employment option of simply being watched. (There is an excellent book on this topic by Mark Andrejevic, which served as the main textbook.) It is truly remarkable how much more conscientious students suddenly become when they are informed that an A on the dreaded midterm paper will earn them "immunity" from the first challenge.

The competition section was loosely based on "Project Runway," which emerged from my own institution, the New School, in New York City (specifically the design school, Parsons). Students would be given a challenge a week – some individual, some in groups – and then face a revolving group of expert "judges" to see how well their response connected to the critical aspects of the readings. (I tried to juggle the dual roles of Tim Gunn and Heidi Klum in this scenario, dispensing equal parts encouragement and fear with each alternate comment.) Examples of challenges include "pitch your own (progressive) reality TV show," "create your own (self-reflexive) reality TV persona," and "report back from your own Thanksgiving holiday as if it were a reality TV show."

After each challenge the “contestants” would reflect on the competition via "confession cams" recorded on their own laptops or phones, and posted to the blog (a meta-meta exercise in self-reflection, given that reality TV is already a meta-phenomenon). Instead of running around a fabric store, trying to buy enough satin or leather to make an edgy, fashionable dress in less than an hour, my students were running around the library, trying to find appropriate readings to supplement the syllabus. (Those who were voted off switched to the "production" side of the competition: some helping with filming, sound, editing, etc. Others worked on publicity around the college and online, as well as making their own commentaries on the unfolding events. It was therefore possible to be voted off early, but still get an A.)

One of the most striking differences between the students’ umwelt and my own became clear from the very beginning, when I initially took great pains to reassure the class that while we would be filming sections of the competition for archival purposes – and to heighten the sense of being on TV – these would not be made public in any way. To my surprise, all the students were disappointed, going so far as to say, "Well, what’s the point in filming it then?!" This emphatic question – and the new Facebook-saturated Zeitgeist that it distills – then became a touchstone for the whole semester, concerning naive assumptions about identity, action, performance, and modes of witnessing. Why is it that the millennial generation does not think anything is worth doing or experiencing unless it is immediately "shared" and "liked" online? How might this backfire when it comes to friends or future employers? And who benefits most from this automatic compulsion?

So what began as a "so-crazy-it-might-work" idea soon revealed itself to be a new way for students to critically reconstruct their own relationship to the media – and thus to themselves – while also shaking up all my cherished notions about traditional modes of teaching the humanities. Whereas the host of "Project Runway" encourages the contestants to "make it work," I exhorted the students to "think it through" (indeed, I was tempted to call the course "So You Think You Can Think?"). And in one of those perfect moments of synchronicity, I could even offer the perfect prize to the winner: a paid internship to work on a film about reality TV by one of my former students, Valerie Veatch (whose first film, "Me at the Zoo," on viral celebrity and its discontents, recently premiered at Sundance).

What’s more, I am almost grateful that the National Security Agency global spying scandal did not erupt during the first run of this course, even though it would have spectacularly underscored the social and political tendencies which the class was designed to question. Even if we loathe reality TV, and claim never to watch it, that doesn’t mean we haven’t all been engulfed in its logic, mannerisms, motifs, conventions, and conceits. One reason I designed the course was to test my theory that even young people who feel themselves to be far above televisual trash are still exposed to, and shaped by, the emotional currents it creates in the world. Reality TV threatens to eclipse reality itself, even in those rare moments when the cameras aren’t running.

Quite simply, identity is now influenced by things like the confession cam, the idea of immunity, and the asymmetrical power dynamics of "the judges," even as our most significant political figures threaten to become little more than grotesque characters in the latest installment of "The Real Housewives of Congress" or "The Vatican’s Next Top Pontiff." So while the challenge of education is to almost literally burst each other’s bubbles, the bigger challenge is to figure out – across the generations – how to stop our collective umwelt from being shaped by this omnipresent model of thought and behavior.

Dominic Pettman is professor of culture and media at Eugene Lang College and the New School for Social Research, where he recently won the University Distinguished Teaching Award. His most recent book is Look at the Bunny: Totem, Taboo, Technology.

The Boston bombing suspect was my community college student (essay)

Tamerlan Tsarnaev was in my College Writing I class at Bunker Hill Community College in the spring of 2007. My pinhole view of his life, including a couple of e-mails about why he missed some classes, adds nothing to either the pathological or the geopolitical debates about the bombs Tamerlan and his brother are accused of setting off two weeks ago at the 2013 Boston Marathon.

What I can tell you is that I’ve felt like crying most of the time since Bloody Friday, the Friday after the Marathon Monday bombings that killed three and wounded 264, when police shut down Boston and Cambridge. Disclaimer 1: Of course the dead and the injured and their families are the only focus of our love and prayers. I have no words. This is a column about education reform -- or the lack of it.

Everyone I know in Boston, in every profession, reported feeling about the same. I now know that these feelings have a name: secondary trauma. You don’t have to be one of the injured to feel numb or want to cry.

How to treat myself for secondary trauma? I had no idea that was a skill I'd learn and need at a community college.

Hydrate – lots of water. Fresh air. No caffeine. Breathe. Have a good cry. J.S. Bach, always. Keep in mind that the national policy debate about the central issue for community colleges, completion, makes no mention I’ve heard of secondary trauma expertise as necessary professional development. Here’s my bookmarked reference web site, Trauma Stewardship.

Here’s my list of student primary traumas I’ve been second to, in a few short years: murder, rape, shootings; sudden and prolonged homelessness; memories of wars in Somalia, Eritrea, El Salvador, the Congo; a father killed in the civil war in Mali; a student for whom I was buying a sandwich at 5 p.m. saying, “I guess you could tell I haven’t eaten since yesterday.” Domestic violence. Stories from veterans of the wars in Iraq and Afghanistan. All but a few arise from teaching, remember, College Writing I. To this list, I can now add a terrorist attack. Perhaps ribbons for each trauma, as in the military, would cause the completion critics to consider trauma a factor.

Let me be perfectly clear. Withering completion accountability is fine by me. The solutions just need a load factor for the days that community college teachers need a good cry.

Disclaimer 2: The worst days of my own silver-, no, platinum-spooned life are miles from the everyday trauma of the millions of students in community colleges, and the secondary traumas of their professors. I do not teach full-time. With occasional slippage, I am a generally happy and optimistic person. I have family, friends, health and, more, health insurance, food, the Dana-Farber Cancer Institute, and the love of Friends Meeting Cambridge through three years of cancer that my wife survived. (Thank you, STEM disciplines.) My trauma requires no help.

My point for this column is that at the nation’s 1,200 community colleges, thousands of instructors have a traditional workload, unopposed by any of our unions, of four and five classes a semester, with classes of 20, 30, and more students, all subject to the primary traumas I’ve described.

I have no words for how these colleagues survive. I have plenty of words, for another day, for the policy makers, legislators, trade associations, and union chiefs who won’t admit to these traumas while whining about low community college completion rates. 

The 1 a.m. Friday bomb explosion and shootout that killed Tamerlan was about a mile from my home.  My wife heard the bomb and the gunfire that I slept through.  By morning, Cambridge was shut down, and we were ordered to stay at home.  After a day with helicopters chopping overhead and Doppler-effecting sirens in all directions, my wife and daughter heard the shooting Friday evening when police arrested Tamerlan’s brother, again about a mile from our home. I didn’t hear the gunfire. 

I’ve discovered I am learning, too, about relative secondary trauma rankings on my Emotional/Trauma Richter Scale (patent pending). What I can tell you is that my urge to cry last week, and even now, is higher by a bit on my E/T Richter scale reading than when Cedirick Steele, a student in that same class that spring of 2007, was shot seven times and killed. I learned Cedirick’s death was called a premeditated random murder. The shooters planned to kill someone; it didn’t matter who. Perhaps tertiary trauma is when we discover a new term for something too terrible to be true. (Click here for my report on Cedirick’s killing.)

Here’s what I don’t understand in my rankings. I knew Cedirick very well. I wouldn’t have recognized Tamerlan on the street. He missed most classes and didn’t complete the course. Why do I feel sadder after Bloody Friday than I did right after Cedirick’s death?

I didn’t make the Tamerlan connection until late Friday morning.  I hadn’t known the suspects’ names when I went to bed Thursday.  The cat woke me up Friday morning about 5:30 a.m. with a left paw “Breakfast!” to the nose. 

I let the dog out in the yard and looked out the front door. No newspaper. Odd but ok. I fed the cats, made coffee, changed the laundry, put out breakfast for my wife. Still no newspaper. Not ok. Another 15 minutes, and I would call in the missed delivery. I had another cup of coffee and read a book.  My wife was asleep.  I hadn’t turned on the radio.  Still no paper. 

Then, the day began.  A text message from someone at work.  “The MBTA is closed.  How can I get to work?  Do you know what’s going on?”  I had no idea.  Another text message.  Bunker Hill Community College closed for citywide emergency. I turned on the radio and learned why no newspaper delivery that morning.  My neighborhood was the news.  Police were looking for the suspects right here.  And the news said that one of the suspects had gone to Bunker Hill Community College.  

In the next hour, friends e-mailed. Did I know this student? “No,” I said. After the third e-mail, something stirred. I put “Tamerlan” in the search box of my computer. There he was on a class list from 2007, along with two innocuous e-mails about missing class. As a comedy and to raise money for students like mine, two years ago, I ran -- well, completed -- the Boston Marathon. (My report.) Oh, can I see the blocks to the finish line where the bombs went off. I guess all this factors into my E/T Richter Scale, terrorist bombing versus premeditated random murder.

Now, the Iraq tank-driving student in that same class graduated from Dartmouth last spring, and he is following his plan, teaching at-risk high school students.

Of course that cheers us up on a bad day. We, the people, have to chuck the way we mistake such stories for success. Along with head-in-the-sand union chiefs, policy makers and too many education trade associations, do we let ourselves believe that these feel-good, albeit individually triumphant, community college to Ivy League stories are progress?  I did, for years. 

Back to my secondary trauma professional development. Our refusal as a nation to face down the truth about the lives of so many students and their traumas every day in so many of our schools and colleges? The trauma professionals would call our refusal denial and avoidance. An unhealthy strategy. 

On the E/T Richter scale, though, my urge to cry was lower this week than it was back in 2011, when I was called to testify at the third trial of Cedirick’s murderers. (Click here for my report on the trial.) On the morning of my testimony, the Suffolk County Victim/Witness Advocate sat me down and asked how I felt. Did she really want to know? She did. I said I’d felt like crying about Cedirick every day since she’d called three weeks before, to ask me to testify. Normal, she said. My education on secondary trauma began. After the trial, she made me go see a trauma counselor.

After the trial, four years after Cedirick’s random, premeditated murder, at last, I had a good cry. Today, I’ll help any student I can. And I’ll say a prayer again, and again, for the three dead and the 264 injured at the Boston Marathon Massacre.

Wick Sloane writes the Devil's Workshop column for Inside Higher Ed. Follow him on Twitter at @WickSloane.

Review of 'Mad Men, Mad World: Sex, Politics, Style & the 1960s'

"Mad Men" returns to cable television this coming Sunday, continuing its saga of mutable identities and creative branding at a New York advertising firm during the 1960s. Or at least one assumes it will still be set in the ‘60s. How much narrative time lapses between seasons varies unpredictably. Like everything else about the show, it remains the network’s closely guarded secret. Critics given an early look at the program must agree to an embargo on anything they publish about it. This makes perfect sense in the context of the social world of "Mad Men" itself: the network is, after all, selling the audience’s curiosity to advertisers.

A different economy of attention operates in Mad Men, Mad World: Sex, Politics, Style & the 1960s, a collection of 18 essays on the program just published by Duke University Press. It’s not just a matter of the editors and contributors all being academics, hence presumably a different sort of cultural consumer from that of the average viewer. On the contrary, I think that is exactly wrong. Serialized narrative has to generate in its audience the desire for an answer to a single, crucial question: “And then what happens?” (Think of all the readers gathered at the docks in New York to get the latest installment of a Dickens novel coming from London.)

Of course, the contributors to Mad Men, Mad World write with a host of more complex questions in mind, but I don’t doubt for a second that many of the papers were initially inspired by weekend-long diegetic binge sessions, fueled by the same desire driving other viewers. At the same time, there’s every reason to think that the wider public is just as interested in the complex questions raised by the show as any of the professors writing about it. For they are questions about race, class, gender, sexuality, politics, money, happiness, misery, and lifestyle – and about how much any configuration of these things can change, or fail to change, over time.

Many of the essays serve as replies to a backlash against "Mad Men" that began in the third or fourth season, circa 2009, as it was beginning to draw a much larger audience than it had until that point. The complaint was that the show, despite its fanatical attention to the style, dress, and décor of the period, was simple-mindedly 21st century in its attitude toward the characters. It showed a world in which blunt expressions of racism, misogyny, and homophobia were normal, and sexual harassment in the workplace was an executive perk. Men wore hats and women stayed home.  Everyone smoked like a chimney and drank like a fish, often at the same time. Child abuse was casual. So was littering.

And because all of it was presented in tones by turns ironic and horrified, viewers were implicitly invited to congratulate themselves on how enlightened they were now. Another criticism held that "Mad Men" only seemed to criticize the oppressive arrangements it portrayed, while in reality allowing the viewer to enjoy them vicariously. These complaints sound contradictory: the show either moralistically condemns its characters or inspires the audience to wallow in political incorrectness. But they aren’t mutually exclusive by any means. What E.P. Thompson called “the enormous condescension of posterity” tends to be a default setting with Americans, alternating with periods of maudlin nostalgia. There’s no reason the audience couldn’t feel both about the "Mad Men" vision of the past.

See also a comment by the late Christopher Lasch, some 20 years ago: “Nostalgia is superficially loving in its re-creation of the past, but it invokes the past only to bury it alive. It shares with the belief in progress, to which it is only superficially opposed, an eagerness to proclaim the death of the past and to deny history’s hold on the present.”

At the risk of conflating too many arguments under too narrow a heading, I’d say that the contributors to Mad Men, Mad World agree with Lasch’s assessment of progress and nostalgia while also demonstrating how little it applies to the program as a whole.

Caroline Levine’s essay “The Shock of the Banal: Mad Men's Progressive Realism” provides an especially apt description of how the show works to create a distinct relationship between past and present that’s neither simply nostalgic nor a celebration of how far we’ve come. The dynamic of "Mad Men" is, in her terms, “the play of familiarity in strangeness” that comes from seeing “our everyday assumptions just far enough removed from us to feel distant.” (Levine is a professor of English at the University of Wisconsin at Madison.)

The infamous Draper family picnic in season two is a case in point. After a pleasant afternoon with the kids in a bucolic setting, the parents pack up their gear, shake all the garbage off their picnic blanket, and drive off. The scene is funny, in the way appalling behavior can sometimes be, but it’s also disturbing. The actions are so natural and careless – so thoughtless, all across the board – that you recognize them immediately as habit. Today’s viewers might congratulate themselves for at least feeling guilty when they litter. But that’s not the only possible response, because the scene creates an uneasy awareness that once-familiar, “normal” ideas and actions came to be completely unacceptable – within, in fact, a relatively short time. The famous “Keep America Beautiful” ad from about 1970 -- the one with the crying Indian -- eventually became the butt of jokes, but it probably had a lot to do with that shift. (Such is the power of advertising.)

The show's handling of race and gender can be intriguing and frustrating. All the powerful people in it are straight white guys in ties, sublimely oblivious to even the possibility that their word might not be law. "Mad Space" by Dianne Harris, a professor of architecture and art history at the University of Illinois at Urbana-Champaign, offers a useful cognitive map of the show's world -- highlighting how the advertising firm's offices are organized to demonstrate and reinforce the power of the executives over access to the female employees' labor (and, often enough, bodies), while the staid home that Don Draper and his family occupy in the suburbs is tightly linked to the upper-middle-class WASP identity he is trying to create for himself by concealing and obliterating his rural, "white trash" origins. A handful of African-American characters appear on the margins of various storylines -- and one, the Drapers' housekeeper Carla, occupies the especially complex and fraught position best summed up in the phrase "almost part of the family." But we never see the private lives of any nonwhite character.

In "Representing the Mad Margins of the Early 1960s: Northern Civil Rights and the Blues Idiom," Clarence Lang, an associate professor of African and African-American studies at the University of Kansas, writes that "Mad Men" "indulges in a selective forgetfulness" by "presuming a black Northern quietude that did not exist" (in contrast to the show's occasional references to the civil rights movement below the Mason-Dixon line). Lang's judgment here is valid -- up to a point. As it happens, all of the essays in the collection were written before the start of the fifth season, in which black activists demonstrate outside the firm's building to protest the lack of job opportunities. Sterling Cooper Draper Pryce hires its first African-American employee, a secretary named Dawn. I think a compelling reading of "Mad Men" would recognize that the pace and extent of the appearance of nonwhite characters on screen is a matter not of the creators' refusal to portray them, but of their slow arrival on the scene of an incredibly exclusionary social world being transformed (gradually and never thoroughly) by the times in which "Mad Men" is set.   

There is much else in the book that I found interesting and useful in thinking about "Mad Men," and I think it will be stimulating to readers outside the ranks of aca fandom. I’ll return to it in a few weeks, with an eye to connecting some of the essays to new developments at Sterling Cooper Draper Pryce. (Presumably the firm will have changed its name in the new season, given the tragic aftermath of Lane Pryce’s venture in creative bookkeeping.)

When things left off, it was the summer of 1967. I have no better idea than anyone else when or how the narrative will pick up, but I really hope that Don Draper creates the ad campaign for Richard Nixon.
