This essay is as timely as it is unlikely. Timely, because many studies have correlated economic crises, such as the one corroding the academic job market as well as so many other career prospects, with a rise in domestic abuse. Unlikely, because I am far from the type of person whom one would expect to chronicle personal experience in this area.
None of the stereotypes apply. I am a professor at a respected university with what many people (not just my mother) would describe as an international reputation in my field. The product of a white, upper-middle-class professional household, I seldom heard my father raise his voice to my mother -- his raising his hand would have been inconceivable. Their marriage was perhaps not one made in heaven, but neither was it an instance of cruelty by any stretch of the imagination. And I did not and do not have a pattern of involvement with abusive partners; indeed, for the past 22 years I have enjoyed a very happy and stable relationship with a compassionate and supportive man.
I had thought I had every reason to anticipate a happy and stable relationship in my erstwhile marriage as well. My ex-husband and I shared many cultural interests and were anticipating careers in the same field within the humanities, with similar pedigrees and similarly strong academic records. By chance, however, my career started more smoothly than his, despite his impressive credentials and abilities, indeed gifts. I finished graduate school a year before he did in the ‘70s -- shortly after the precipitous decline in the job market -- and obtained a tenure-track appointment while he was completing his dissertation. We then moved for compelling personal reasons, and I was fortunate enough to find an academic position again, but he did not do so.
My ex-husband had slapped me once early in our marriage when, because I had not understood and hence had not followed his instructions during a household repair, a small amount of water fell on him. I was shocked, but I viewed the episode as an aberration. It was not.
That event suggests that the recurrence of such abuse cannot be wholly blamed on his not having a job. And after all, many unemployed people do not descend into such behavior, while many who are guilty of it hold stable jobs. Nonetheless, the timing persuades me that my ex-husband’s not obtaining the sort of position he had hoped for contributed significantly to the recurrence of wife-beating. For shortly after we had moved and I, but not he, held an academic appointment, physical abuse started again. He pinched, shoved, and hit me with some regularity over a period of about a year. Not by any means the most violent wife-beating, but quite enough, thank you, to leave significant black-and-blue marks on one occasion and less visible scars on the others. The physical abuse was accompanied by persistent belittling remarks. Throughout all this, my ex-husband continued to appear in public as a charming and highly educated gentleman and a courteous husband. I later learned that this Jekyll-Hyde scenario is a common symptom of pathologies like his.
Why did I put up with it? Barely able to believe that this was happening between people like us, I made excuses for him, justifying his behavior as a regrettable but understandable response to his unemployment, which was clearly all the more difficult for him because I had an attractive job in the same field. The contrast between his public and private behavior made it harder to confront the events squarely, as did the ways the situation sapped my own self-confidence. Like many victims of domestic abuse, I began to blame myself, not realizing that although I had made real mistakes, such as occasional tactless remarks, they neither explained nor justified this physical and emotional maltreatment.
Moreover, like many wife-beaters, he repeatedly seemed to repent. On the several occasions when I finally resolved to leave, he admitted that situations for which he had blamed only me were in fact in large measure his responsibility, and he promised to get therapy. These apparent reversals were, I was to discover, as much a pattern as the violence itself, and the therapy never materialized.
His career not only got back on track but flourished after that year of unemployment — a good though temporary job one year, a tenure-track job the next, the publication of a well-received book by a leading press, and so on. The physical abuse stopped shortly after he gained those academic positions, though the emotional analogues to it did not, and for that and many other reasons I finally, belatedly, got a divorce.
What I learned is relevant to anyone, man or woman, suffering domestic abuse.
Realizing that stressful circumstances outside the home -- and one's own behavior -- may have contributed to tension is a very different matter from excusing the behavior or shouldering all the responsibility oneself. Distinguish compassion from submission: it's healthy to understand the financial pressures that might bring out this type of violence in some individuals, but no one should accept its continuation. Be alert to connections between the physical and verbal, recognizing that physical abuse often merely goes into remission or resurfaces as verbal wife-beating. Apologies and promises need to be backed up with concrete and reliable evidence for believing that change will occur.
But one step must precede and accompany all of these: Avoid the temptation to excuse or deny the abuse by saying, "This isn't really occurring, and it will stop any minute because things like this don't happen to a professional couple like us." They can. They do. And, sadly, in this academic job market, they will.
The author of this piece, who asked to remain anonymous, is a tenured professor.
Money has always given people better options, but for humanities Ph.D. students, money’s now necessary just to get acceptable ones. Just now becoming noticeable, this “re-gilded ivory tower” looms over a landscape that everyone should consider.
As one fellow graduate student recently observed, "You have to have a spouse nowadays; that’s how more and more people seem to be doing it." As is well-known, the economic crash hastened the decline of tenure-track jobs and increased competition for them. Once standard, these stable jobs with adequate salary and benefits have become rarer, displaced by short-term, one- to two-year positions at best, and by piecemeal adjuncting at worst. In turn, entry-level qualifications also rose at some institutions to include a secondary research specialization, at least one article, and attention to pedagogy resulting in the creation of one or more substantive classes, ideally taught at outside institutions.
Thus, some form of outside support has become essential for wading through longer Ph.D. programs, and very often an indefinite period of unstable and unremunerative post-graduation employment while waiting for a good job that may never come. Spousal income, a parent-owned condo, a trust fund – no matter which, these necessities increasingly make a humanities Ph.D. less of a career path and more of a leisure pursuit for those with financial stability from elsewhere, even for students at top institutions.
Recent cohorts at my home institution of the University of Chicago show how money has effectively formed two tracks of Ph.D. students. One student, a self-supporting single person, graduated several years ago and entered a one-year position with a heavy teaching load because he "had to." He’s been able to renew his position – but he also hasn’t published, and was passed over for a tenure-track job where he teaches because his teaching load made it impossible to write articles.
Another, a married person who leans on her non-academic spouse for income and benefits, adjuncts one or two classes per semester and uses the rest of her time for research as she awaits and creates better possibilities. "There’s no way in hell I’m doing a one-year," she confided. "But then again, I can afford to do that."
As if this anecdotal evidence weren't enough, panelists at a recent academic careers conference at the same university openly averred that money is necessary to achieve the recommended level of professionalization – or at least as much of it as a student can get.
Since many institutions don’t track job placement for doctoral students, let alone gather comprehensive student financial profiles, experiences like these give the first glimpses into an academic world where finances determine fate. Given the steady loss of good jobs and devaluation of the humanities in favor of fields like science and engineering, class stratification in academia is set to grow and raises several crucial issues:
Who will become our professors? With rare exceptions, our humanities professors will come from wealthier backgrounds. To the extent that the academy can draw from wealthier members of different racial and national demographics, however, overall diversity may suffer less than one might think. Nevertheless, the academy will recede as a symbol of general social mobility.
What will our intellectual life be? As poorer students fall by the wayside, students with money – but not necessarily as much merit – will take their place in Ph.D. programs and professorships. Thus, scholarly standards and intellectual vibrancy are likely to drop somewhat. Gone too will be questions stemming from underrepresented socioeconomic backgrounds. Accordingly, the social utility of university research may decline – at least in disciplines where these questions are more common. Will the effects be the same in literature as in history or sociology, for example?
How to conceptualize the humanities? Students from poorer backgrounds will still encounter the humanities in general education requirements – but how do professors convey their enriching potential in a way that makes sense, when deep and sustained engagement is the province of the privileged? Descriptions of the humanities as a common cultural inheritance will need revision, if not outright replacement.
How to balance student and institutional well-being? Self-supporting students are already at a disadvantage for professionalization and survival in the humanities. Since student exploration into other careers almost unavoidably involves volunteering and then facing off against candidates with more appropriate degrees and job histories, the most humane advice may be to warn poorer prospective students away from the risky bet of a Ph.D. Some professors do this, but institutions depend on students’ loan money and teaching. In the best-case scenario, poorer students self-select out. When they don’t, however, they foist a complicated set of ethical decisions upon faculty and administrators, among whom institutional inertia and pressures often hold sway.
Overall, a re-gilded ivory tower currently seems inevitable. Yet, how much will change? At the end of the day, professors will teach, students will study, and academic conversations will continue. For those who stop to think, however, a simple but ugly truth will taint everything: money, not mind, makes a colleague. Perhaps, then, the single most pressing task of all for those in the humanities is our current national challenge: how to cultivate sensitivity across class lines.
David Mihalyfy is a seventh-year Ph.D. candidate in the history of Christianity program at the University of Chicago Divinity School.
At the recent dedication of the $500 million George W. Bush Presidential Center at Southern Methodist University, President Clinton called it "the latest, grandest example of the eternal struggle of former presidents to rewrite history." In 2004, the Clinton Center and Foundation stunned with its more than $200 million price tag, and less than a decade later Bush has doubled that when the endowment for the Bush Institute is counted. When the Barack Obama center opens around 2020, perhaps on the campus of the University of Chicago, could it be the first billion-dollar presidential center? Possibly. A total of $1.4 billion was raised for Obama’s two successful presidential campaigns, and so for a center dedicated to his final campaign for a better place in history it’s at least likely that he’ll surpass previous records.
Although the final decision on the location of the Obama center is probably a couple of years away, professors and administrators at the University of Chicago (where he once taught) and the University of Hawaii (where his mother studied and his sister taught) are thinking about what it might mean if it lands on their campus. Chicago State University also wants to be considered. For universities, presidential centers present both opportunities and significant costs and challenges. Academics should consider carefully before getting into a bidding war over a presidential library, and weigh how much these centers promote spin in addition to scholarship.
Prime campus real estate is sometimes sacrificed for these presidential temples, which, although they house valuable historical records impartially managed by the National Archives, also have museums that high school students who have passed the Advanced Placement U.S. History test would likely find biased, as well as foundations or institutes that have agendas that the host university does not control.
Clinton was right in saying that these centers are attempts by former presidents to write their own history and polish their reputations. And to a significant degree they work. President Carter’s reputation was tarnished when he left office in 1981, but as The New York Times put it in a nearly prescient headline in 1986: "Reshaped Carter Image Tied to Library Opening" — and today, Carter is one of the more respected former presidents.
But Clinton exaggerated when he said that the struggle by former presidents to remake their images stretches back to the beginning of American history. Until the 20th century, former presidents rarely even wrote memoirs, and the first president to have a presidential library run by the federal government was Franklin D. Roosevelt. The Roosevelt Library, which opened on his estate at Hyde Park, New York, in 1941, was modest compared with succeeding presidential libraries. Its initial cost was about $7 million in today’s dollars, but critics still accused FDR of building a "Yankee pyramid." There was more than a grain of truth in the charge. When FDR first saw Egypt’s pyramids, he said, "man’s desire to be remembered is colossal." Although what Roosevelt said may not be true for everyone, it certainly was true for FDR and his successors.
Most succeeding presidential libraries dwarf FDR’s: The Harry S. Truman Library in Independence, Missouri, evokes Queen Hatshepsut’s Temple in Egypt and was the first to feature a full-scale Oval Office replica (something copied by most of the others), and the Dwight D. Eisenhower Library in Abilene, Kansas, is a complex of buildings with a park that takes up an entire city block.
The first president to affiliate his library with a university was President Kennedy. JFK envisioned his library on the grounds of his alma mater, Harvard University. After Kennedy’s death some at Harvard decided they didn’t like the idea of common tourists on their campus (99 percent of the visitors to presidential libraries are tourists, and only 1 percent are researchers), and architecture critic Ada Louise Huxtable humorously lampooned their fear of "Goths overwhelming the intelligentsia." Harvard did establish the Kennedy School of Government, but the Kennedy Library itself was located on a campus of the University of Massachusetts, on a spectacular site overlooking Boston harbor.
The Kennedy Library was also the first to have a "starchitect," when Jackie Kennedy chose I.M. Pei — who later designed the East Building of the National Gallery of Art, as well as the expansion of the Louvre — to design her husband’s memorial. Originally, the Kennedy Library was going to be a large pyramid with the top cut off — representing JFK’s tragically truncated achievement — but eventually that plan was scrapped, and Pei reimagined that design as the glass pyramid at the Louvre. Pei’s final design for the Kennedy Library and Museum was a futuristic glass, steel, and concrete edifice that still looks like it could be used in a Star Trek movie.
President Lyndon Johnson, with Lady Bird Johnson’s help, also hired a star architect for his monument to himself. Gordon Bunshaft of the famous Skidmore, Owings, and Merrill firm had designed such modernist icons as Yale University’s beautiful Beinecke Library with its translucent marble walls. Bunshaft’s design for the Johnson Library on the campus of the University of Texas at Austin has, as Ada Louise Huxtable wrote, "a Pharaonic air of permanence" that "puts Mr. Johnson in the same class as some Popes and Kings who were equally receptive clients for architects with equally large ideas." The Johnson Library looks like a cross between an Egyptian pylon temple and a space-age bureaucracy.
We could talk about award-winning architect James Polshek’s design for the Clinton Center, or the renowned Robert A. M. Stern’s imposing design for the Bush Center at SMU, but you get the idea. All presidents since FDR have an edifice complex. Becoming a patron of a huge architectural project dedicated to yourself is one of the perks of being an Imperial Ex-President. Another perk is becoming a museum curator. Initially, the exhibits in presidential libraries are campaign commercials in museum form, designed with a lot of help from the former president. Eventually these exhibits become more balanced and complete, but it’s usually 30-50 years after a president leaves office before the National Archives installs decent exhibits. The former president and many of his supporters need to die before their power to spin subsides.
Supporters of presidential libraries hail their archives with their raw materials of history open to scholars, journalists, and even school kids. But these records would be available anyway because by law they are owned by the American people and must be impartially administered and released by the National Archives. If a president didn’t have a presidential library, the records would be housed in an equally accessible facility (probably in Washington); it just wouldn’t be so architecturally grandiose.
It was Jimmy Carter who first morphed the presidential library into a presidential center. The Carter Center, which is next to but administratively separate from the Carter Library and Museum in Atlanta, has been so effective at living up to its mantra of "Waging Peace. Fighting Disease. Building Hope" that President Carter won the Nobel Peace Prize in 2002. But Carter has also generated considerable controversy over the years because of his views on Israel. If the Carter Center had been located on the campus of nearby Emory University (with which it is loosely affiliated), that institution’s reputation might have been affected, but since the Carter Center is geographically separate from Emory, the university was largely shielded.
There is not as much shielding for SMU from former President Bush and his views on such issues as enhanced interrogation techniques. The Bush Institute was inspired in part by the Hoover Institution on the campus of Stanford University, which is considered one of the nation’s leading conservative think tanks. The Hoover Institution has long offered a platform for high-profile Republicans such as George Shultz, Condoleezza Rice, and Donald Rumsfeld.
The Hoover Institution is to a large degree administratively separate from Stanford, and so although it effectively leverages the prestige of its host university to expand its influence, Stanford does not have a corresponding control over it. It’s possible that President Obama will seek a similar arrangement with a host university for a future Obama Center, or whatever he might choose to call it.
And the bottom line here is the bottom line: Although the price tag for the actual building of the Bush Library, Museum, and Institute was a cool quarter of a billion dollars, an equal amount was raised to endow the Bush Institute. And Bush and his supporters will continue their aggressive fund-raising for the foreseeable future, making the ultimate price tag and influence of the Bush Center perhaps in the billion-dollar range sometime in the next decade or two.
When President Johnson helped found the LBJ School of Public Affairs at the University of Texas at Austin, he gleefully anticipated breaking what he called "this goddamned Harvard" hold on top government positions. But like the Kennedy School of Government at Harvard, the Johnson School is run by its university, not by a self-perpetuating board largely independent of the university that seeks, in part, to enhance the reputation of the president whose name is on the building. In other words, as presidential centers have evolved and grown they have become a better and better deal for former presidents, but it’s less certain that they are a good deal for the universities that might host them.
What would make a presidential center a better deal for a university and the public? For the sake of the 99 percent who will visit the future Obama museum as tourists, the host university should involve some of its history professors in creating exhibits with rigorous content. This content should be of a quality that would actually help future high school students pass the relevant portion of a future AP U.S. history test, rather than just being a museum of spin.
For a future Obama foundation or institute, it would be worthwhile for the university to have a significant number of faculty members from a variety of departments on the governing board. The university should have more than token input into a foundation that will be a big player on campus for many decades, perhaps even centuries. For, as some have noted, these presidential centers have become the American equivalent of the temples and tombs of the pharaohs. If professors, students, and the general public are to be more than bystanders or even would-be political worshippers, the host university needs to negotiate for the best interests of not just the university but the American public. Universities should not simply acquiesce to the desire that Clinton spoke of (only half-jokingly): the desire of former presidents to rewrite their own history in self-glorifying memorials.
And President Obama himself would need to be involved in the process of reforming the presidential center. He has to a degree already taken on this role, for on his first full day in office in 2009 he revoked President Bush’s infamous Executive Order 13233, which restricted access to presidential records for political reasons. Obama and the university he partners with should continue this work so that presidential centers cease to remind us of the lines of Percy Bysshe Shelley’s "Ozymandias": "My name is Ozymandias, King of Kings, Look on my works, ye Mighty, and despair!"
The department of English invites applications for a tenure-track assistant professor in ME Studies, starting Fall 2014. Applicants should demonstrate a sustained scholarly engagement with ME. Demonstrated expertise in one or more of the following areas is preferred: research I care about, topics I've been focusing on for years, theories I am familiar with, practices I approve of, and debates already settled by ME.
Successful applicants will be less successful than I am but not so unsuccessful that it reflects poorly on ME. The lucky chosen one will have the opportunity to work with ME. Candidates must have a Ph.D. from an institution I approve of and have recommendation letters from people I know and respect but am not threatened by. Please send just the names of people you know I know by October 15th.
My university is an Affirmative Action/Equal Employment Opportunity Employer and does not discriminate against any individual on the basis of age, color, disability, gender, national origin, religion, sexual orientation, veteran status or genetic information. However, applicants who cite ME are particularly encouraged to apply.
Mead Embry is a pseudonym for an English professor.
Standing in line at the drugstore a couple of weeks ago, I spied on the magazine rack nearby this month’s issue of National Geographic – conspicuous as one of the few titles without a celebrity on the cover. Instead it showed a photograph of an infant beneath a headline saying "This Baby Will Live to Be 120."
The editors must have expected disbelief, because there was a footnote to the headline insisting that the claim was not hype: "New science could lead to very long lives." When was the last time you saw a footnote in a popular periodical, on the cover, no less? It seemed worth a look, particularly after the septuagenarian in front of me had opened complex, in-depth negotiations with the pharmacist.
The headline, one learns from a comment on the table of contents, alludes to a traditional Jewish birthday wish or blessing: "May you live to be 120." This was the age that Moses was said to have reached when he died. The same figure appears -- not so coincidentally perhaps -- at an important moment in the book of Genesis. Before sending the Flood, Jehovah announces that man’s lifespan will henceforth peak at 120 years. (I take it there was a grandfather clause for Noah. When the waters recede, he lives another 350 years.)
The cap on longevity, like the deluge itself, is ultimately mankind’s own fault, given our tendency to impose too much on the Almighty’s patience and good humor. He declares in about so many words that there is a limit to how much He must endure from any single one of us. Various translations make the point more or less forcefully, but that’s the gist of it. Even 120 years proved too generous an offer – one quietly retracted later, it seems. Hence the Psalmist’s lament:
“The days of our years are threescore years and ten; and if by reason of strength they be fourscore years, yet is their strength labor and sorrow; for it is soon cut off, and we fly away.”
Nursing homes are full of people who passed the fourscore marker a while ago. If you visit such places very often, as I have lately, “May you live to be 120” probably sounds more like a curse than a blessing. Not even a funeral obliges more awareness of mortal frailty. There is more to life than staving off death. The prospect of being stranded somewhere in between for 30 or 40 years is enough to make an atheist believe in hell.
Meanwhile, in science…. The medical and biological research surveyed in that NatGeo article promises to do more than drag out the flesh’s “labor and sorrow” a lot longer. The baby on the magazine cover will live his or her allotted span of six score years with an alert mind, in a reasonably healthy body. Our genetic inheritance plays a huge but not absolutely determinative role in how long we live. In the wake of the mapping of the genome, it could be possible to tinker with the mechanisms that accelerate or delay the aging process. It may not be the elixir of youth, but close enough.
Besides treating the same research in greater depth, Ted Anton’s The Longevity Seekers: Science, Business, and the Fountain of Youth (University of Chicago Press) emphasizes how profound a change longevity research has already wrought. It means no longer taking for granted the status of aging as an inescapable, biologically hardwired, and fundamentally irreversible process of general decline. Challenging the stereotypes and prejudices about the elderly has been a difficult process, but longevity engineering would transform the whole terrain of what aging itself entails.
Anton, a professor of English at DePaul University, tells the story in two grand phases. The first bears some resemblance to James Watson’s memoir The Double Helix, which recounts the twists and turns of laboratory research in the struggle to determine the structure of DNA – work for which he and Francis Crick received a Nobel Prize in medicine in 1962. Watson’s book is particularly memorable for revealing science as an enterprise in which personalities and ambitions clash as much as theories ever do. (And with far more rancor, as Watson himself demonstrated in the book’s vicious and petty treatment of Rosalind Franklin, a crystallographer whose contribution he downplayed as much as possible.)
A practitioner of long-form journalism rather than a longevity researcher, Anton writes about conflicts in the field with some detachment, even while remaining aware that the discoveries may change life in ways we can’t yet picture. The initial phase of the research he describes consisted largely of experiments with yeast cells and microscopic worms conducted in the 1990s. Both are short-lived, meaning that the impact of biochemical adjustments to their genetic “thermostats” for longevity would register quickly.
During the second phase of Anton’s narrative, lab research involved more complex organisms. But that was not the most important development. The public began hearing news flashes that scientists had discovered that the key to a longer life was, say, restricted caloric intake, or a chemical called resveratrol found in red wine. Findings presented in scientific journals were reported on morning news programs, or endorsed on Oprah, within days or even hours of publication. Hypotheses became hype overnight.
This generated enthusiasm (more for drinking red wine than restricting calories, if memory serves) as well as additional confidence that biotechnological breakthroughs were on the way. Everybody in longevity research, or almost everybody, started a company and ran around looking for venture capital. Models, evidence, and ideas turned into proprietary information -- with the hurry to get one’s findings into professional journals looking more and more like the rush to issue a press release.
So far, no pharmaceutical has arrived on the market to boost our lifespans as dramatically as those of the worms and yeast cells in the laboratory. “The dustbin of medical breakthroughs,” Anton reminds us, “bears the label ‘It Worked in Mice.’ ” On the other hand, the research has been a boon to the cosmetics industry.
As it is, we’re nowhere near ready to deal with the cumulative effect of all the life-extending medical developments from the past few decades. The number of centenarians in the world “is expected to increase tenfold between 2010 and 2050,” the author notes, “and the number of older poor, the majority of them women,” is predicted “to go from 342 million today to 1.2 billion by that same year.”
But progress is ruthless about doing things on its own terms. Biotech is still in its infancy, and its future course -- much less its side effects -- is beyond imagining. The baby on the magazine cover might well live to see the first centenarian win an Olympic medal. I wish that prospect were more cheering than it is.
Undergraduate students should join professors in selecting the content of courses taught in the humanities.
This is the conclusion I came to after teaching Humanities on Demand: Narratives Gone Viral, a pilot course at Duke University that not only introduced students to some of the critical modes humanists employ to analyze new media artifacts, but also tested the viability of a new, interactive course design. One semester prior to the beginning of class, we asked 6,500 undergraduates -- in other words, Duke's entire undergraduate student body -- to go online and submit materials they believed warranted examination in the course.
Submissions could be made regardless of whether a student planned on enrolling in the course. In response, hundreds of students from a variety of academic disciplines, including engineering, political science, religion, foreign languages, anthropology, public policy and computer science, submitted content for the class.
This interactive approach, which I call Epic Course Design (ECD) after German playwright Bertolt Brecht’s theory of epic theater, represents a radical break with traditional course-building techniques. Generally, humanities instructors unilaterally choose the content of their syllabuses -- and rightly so. After all, we are the experts. But this solitary method of course construction does not reflect how humanists often actually teach.
Far from being viewed as passive receptacles of instructional data, humanities students are often engaged as active contributors. With this in mind, ECD offers a student-centered alternative to traditional course-building methods. Importantly, ECD does not allow students to dictate the content of a course; it invites them to contribute, with the instructor ultimately deciding which (if any) student-generated submissions merit inclusion on the syllabus.
Nevertheless, when a colleague of mine first heard about my plans to allow students to determine what was to be examined in Narratives Gone Viral, he was deeply skeptical: "But students don't know what they don't know," he objected. In my view, that is not a problem -- that is the point; or at least part of it. For crowdsourcing the curriculum not only invites students to submit material they are interested in, but also invites them to choose material they believe they already understand. Student-generated submissions for Narratives Gone Viral included popular YouTube videos like "He-Man sings 4 Non Blondes," "Inmates Perform Thriller" and "Miss Teen USA 2007- South Carolina answers a Question." While my students were already exceedingly familiar with these videos, they clearly didn't always see what was at stake in them.
All of these works are worthy of academic scrutiny: the "He-Man" piece is interesting because it confronts preconceived notions of masculinity; "Inmates Perform Thriller" prompts questions of accessibility to social media; "Miss Teen USA" is notable because it reveals how viral videos often appeal to a viewer’s desire to feel superior to others.
I am not proposing that all humanities courses should integrate this approach. What I am suggesting, however, is that ECD represents a viable alternative to more familiar course-building methodologies. This includes classes that do not focus on social media and/or popular culture. Importantly, whether students will be interested in suggesting texts for, say, a course on medieval German literature is not the crucial question; in my view, the crucial question is: Why should we refrain from offering motivated students the opportunity to do so, if they wish?
There was relatively little repetition in student submissions for Narratives Gone Viral, an indication that students were reviewing posts made by their peers, weighing their options, and responding with alternative suggestions.
To put a finer point on the matter, students were not merely submitting course content: they were discussing the content of a course that -- in every traditional sense -- had yet to even begin.
Michael P. Ryan is a visiting assistant professor of German studies and the American Council of Learned Societies new faculty fellow at Duke University.