
A comparison of 'quit lit' in the 1970s and today (opinion)

The academic job market is still collapsing, but academic “quit lit” is booming. The precipitous decline of tenure-track jobs in the humanities and social sciences over the past decade has meant that more early-career Ph.D.s than ever are leaving academe and writing about their experiences.

As this genre has developed, it has come to be distinguished by its reliance on personal narrative and its authors’ desire to publicly validate their private feelings at being shut out of a profession they’ve spent a significant portion of adulthood pursuing. Although the rise of quit lit feels unique to an era shaped by an epochal financial crisis and unprecedented access to publishing venues, this kind of writing by ex-academics is not quite as novel as it appears.

In fact, a form of “quit lit” (we might call it proto-quit lit) actually emerged in the 1970s, the first decade in which the number of new Ph.D.s outpaced the number of available academic jobs. By 1980, headlines like “The Plight of the Ph.D.s,” “Ph.D. Equals ‘Prospects Highly Depressing,’” “Jobless Ph.D.s Rack Brains to Live” and “Untenured Teachers Face Uncertain Futures” had become a familiar sight, and stock types like the itinerant scholar were often invoked alongside anecdotes of Ph.D.s having to take second jobs as taxicab drivers, bartenders or bricklayers.

For all this public hand-wringing, the generation of Ph.D.s who lived through this first wave of academic underemployment did not write essays with anywhere near the frequency or intensity of those of our own moment. Nevertheless, a few young academics were charting the course for the genre in newspapers, professional bulletins and academic journals.

One of the earliest examples is John T. Harwood’s 1974 article for AAUP Bulletin, “Nonacademic Job Hunting.” More gently satiric than incisively critical of the academic profession, the essay peppers Harwood’s eminently practical advice about how Ph.D.s ought to go about securing nonacademic employment with personal stories and a few winking literary references.

A more forceful and personal denunciation of the academic job market appears in “The New Academic Hustle: Marketing a Ph.D.,” a 1978 article by two sociology Ph.D.s detailing the perils and grotesqueries of the academic hiring process. Despite their mostly measured tone, the authors manage to smuggle in a damning indictment of the academic job market in the form of an autobiographical note cleverly placed where an abstract would typically appear. “When we received our Ph.D.s in 1976,” it reads, “we knew we would have to look hard for the kinds of jobs we wanted. But we did not anticipate the often discourteous, unfeeling and degrading reception we were to encounter as job applicants.”

Perhaps the most instructive piece in this vein is a brief article written by a freshly minted sociology Ph.D. in 1975 for The Berkshire Eagle and headlined, “Is a Ph.D. Worth It?” The author’s list of complaints is all too familiar: the constriction of the labor market, the apathy of undergraduates, the consumer-centric focus of the modern university and the pressures to publish.

Yet his conclusion gestures toward wells of feeling yet untapped. “I have no regrets about my decision,” he writes, but “this does not mean I am not bitter; trying to find work during these depressed times is no less anguishing for me than it is for most people.” He hastens to add, however, “whatever anger I feel is not directed towards my seven years in graduate school.” Unlike the quit lit of our current moment, this author explicitly diverts his anger away from his home institution in order to absolve it of any responsibility for his feelings.

Notable in all these bygone essays is how reluctant their authors are to give voice to their pain. When grief and anger surface, they are swiftly curtailed with pragmatic prescriptions for institutional change. These essays display a uniform resistance to soul baring that stands in stark contrast to the recent wave of quit lit.

Seen in light of the proto-quit lit of the 1970s, the 2010s genre’s reliance on personal narratives that marshal intense feelings -- rage, grief and everything in between -- against a broken hiring system indicates how intractable the problem of academic underemployment has become. For so many of these authors, the very notion that institutional reforms will resolve this crisis seems absurd given its scope and duration.

There is no better or timelier example of this transformation than Erin Bartram’s blog post, which led to a broad discussion in academe.

Whereas a quit lit essay like Rebecca Schuman’s 2013 barn burner for Slate, “Thesis Hatement,” derives its rhetorical power from the righteous anger of its author, Bartram’s piece appeals to collective grief and sympathy. In this way, it marks the genre’s turn toward sentimentalism, by which I do not mean mere mawkishness but rather a philosophical and aesthetic paradigm -- with a complicated legacy in American social history -- that privileges feeling and leverages it to enact change.

In her essay, Bartram eloquently and movingly mobilizes her personal story to validate her private grief at having to leave the profession. “My feelings are, thankfully, not subject to peer review,” she writes, snatching the legitimacy of her interior experience away from the profession’s preferred methods of scrutiny and authorization.

The piece is far more than the story of one person’s loss, however, as Bartram attempts to extend her grief to those “left behind” -- the ones who have successfully secured jobs -- urging them to recognize “not only the magnitude of the loss [of potential colleagues] but also that it was a loss at all.”

Although the essay is more ethically than politically oriented, Bartram concludes it with a question that gestures in the direction of the latter. “What would happen,” she asks, “if we actually grieved for those losses?” In this, she hints at the political possibilities that would inhere in the creation of a community of mourners who grieve for those individuals chewed up and spit out by the academic job market.

There is much to admire in this vision, yet we should be cautious of relying on grief alone as a catalyst for change.

Ralph Waldo Emerson famously wrote, “I grieve that grief can teach me nothing, nor carry me one step into real nature.” According to Georgetown University professor Dana Luciano, this sentence negates the “compelling cultural fantasy that grief can make one better.” Powerful though it may be, grief has as much potential to reinforce the status quo as to dismantle it. That some have already begun to repurpose Bartram’s grief for political projects along precisely these lines is proof of this.

A 1975 profile of an out-of-work professor concludes with her reflecting on a group of faculty recently ousted from a small, rural college where she previously taught. She implores her interviewer to “imagine how they feel.” This imperative is a prescient anticipation of the guiding principle behind the most resonant academic quit lit of the 2010s. As we look to the genre’s future, we ought to consider whether simply getting others to feel right is enough.

Grant Shreve is a writer and former academic living in Baltimore.


Historians need to measure what their students learn (opinion)

“What are you going to do with that -- teach?” Uttered with disdain, it’s a question history majors have been asked many times. Clio’s defenders have a response. The head of the American Historical Association says that the study of history creates critical thinkers who can “sift through substantial amounts of information, organize it, and make sense of it.” A university president asserts that the liberal arts endow students with the “features of the enlightened citizen” who possesses “informed convictions … and the capacity for courageous debate on the real issues.” Historians pride themselves on the evidence for their claims.

So, what’s the evidence?

Not much, actually. Historians aren’t great at tracking what students learn. Sometimes they even resent being asked. Recently, however, a winner of the Bancroft Prize, one of history’s most distinguished awards, washed the profession’s dirty laundry in public in an article titled “Five Reasons History Professors Suck at Assessment.”

Anne Hyde described what happened when accreditors asked her colleagues to document what students learned. They paid little heed to the requests -- that is, until Colorado College’s history department flunked its review. Committed teachers all, her colleagues “had never conducted assessment in any conscious way beyond reporting departmental enrollment numbers and student grade point averages.”

In many college history departments, this is routine. To address the issue of assessment, the American Historical Association in 2011 set out on a multiyear initiative to define what students should “be able to do at the end of the major.” Eight years, dozens of meetings and hundreds of disposable cups later, the Tuning Project produced a set of ambitious targets for student learning. But when it came to assessing those goals, the project left a big question mark.

That gap is one of the reasons we were convinced of the need to create new assessments. With support from the Library of Congress, we came up with short tasks in which history students interpreted sources from the library’s collection and wrote a few sentences justifying their responses. For example, one assessment, “The First Thanksgiving,” presented students with a painting from the beginning of the 20th century and asked if the image of lace-aproned Pilgrim women serving turkey to bare-chested Indians would help historians reconstruct what may have transpired in 1621 at the supposed feast between the Wampanoag and English settlers.

In the March issue of the Journal of American History, we describe what happened when we gave our assessments to students at two large state universities. On one campus, we quizzed mostly first-year students satisfying a distribution requirement. All but two of 57 ignored the 300-year time gap between the Thanksgiving painting and the event it depicts. Instead, they judged the painting on whether it matched their preconceptions, or simply took its contents at face value -- an answer we dubbed the “picture’s worth a thousand words” response.

We weren’t terribly surprised. When we tested high school students on these tasks, they struggled, too, and many of these college students were in high school only months earlier. But what would happen, we wondered, if we gave our tasks to college juniors and seniors, the majority of whom were history majors and all of whom had taken five or more history courses? Would seasoned college students breeze through tasks originally designed for high school?

What we found shocked us. Only two in 49 juniors and seniors explained why it might be a problem to use a 20th-century painting to understand an event from the 17th century. Another one of our assessments presented students with excerpts from a soldier’s testimony before the 1902 Senate Committee investigating the war in the Philippines. We asked how the source provided evidence that “many Americans objected to the war.” Rather than considering what might prompt a congressional hearing, students mostly focused on the document’s content at the expense of its context. Rare were responses -- only 7 percent -- that tied the testimony to the circumstances of its delivery. As one student explained, “If there hadn’t been such a huge opposition by Americans to this war, I don’t believe that the investigation would have occurred.”

We suffer no illusions that our short exercises exhaust the range of critical thinking in history. What they do is provide a check on stirring pronouncements about the promised benefits of historical study. In an age of declining enrollments in history classes, soaring college debt and increased questions about what’s actually learned in college, feel-good bromides about critical thinking and enlightened citizenship won’t cut it. Historians offer evidence when they make claims about the past. Why should it be different when they make claims about what’s learned in their classrooms?

Sam Wineburg is the Margaret Jacks Professor of Education and of history (by courtesy) at Stanford University. Joel Breakstone is the executive director and Mark Smith is director of assessment at the Stanford History Education Group.

Image: "The First Thanksgiving" by Jean Leon Gerome Ferris (Library of Congress)

Colleges award tenure

Longwood University

  • Sarai Blincoe, psychology
  • Angela Bubash, art
  • Karla Collins, education
  • Ann Cralidis, communication sciences and disorders
  • Kenneth Fortino, biology
  • Patricia Horne Hastings, education
  • Pamela McDermott, music
  • Adam Paulek, art
  • Shannon Salley, communication sciences and disorders
  • Wendy Snow, education
  • Wade Znosko, biology

University of Maine

How to use your website and LinkedIn to further your career (opinion)

Pallavi Eswara provides advice on how to make sure your online presence accurately reflects what you do professionally and lets you control what others see about you.


Study finds the lecture remains dominant form of teaching in STEM

New study of undergraduate STEM courses finds that lectures remain dominant -- despite finding after finding questioning their effectiveness.

Unauthorized searches of professors' email create rift at Rochester

University of Rochester professors found out their emails were reviewed and shared, raising questions about how much privacy faculty members should expect.

Holy Cross defends professor under attack for his writings on Jesus and sexuality

Scholar's suggestion that Jesus be read as a "drag king" leads to calls for his resignation and to local bishop criticizing response of college, which cited academic freedom.

Why the University of Wisconsin Stevens Point plans to eliminate certain traditional liberal arts majors (opinion)

How can you be a university without a major in history?

We field this kind of question frequently at the University of Wisconsin Stevens Point. In March, we released curricular recommendations designed to repair our budget and stabilize enrollment. The proposal, Point Forward, called for the elimination of numerous majors in the traditional liberal arts, a greater emphasis on career-focused programs and the reimagining of our core liberal arts curriculum.

Since then, we have received a flood of messages from students, faculty members, alumni and scholarly organizations across the country. Some of them ask about process, wondering why we made recommendations suddenly and without stakeholder input. In fact, we discussed these issues for years. Others suggest there must be alternatives to eliminating underenrolled majors. I wish there were; we have tried nearly everything else. A small but growing number of people express sympathy with our dilemma, placing responsibility on the decades-long erosion of public investment in higher education. They are correct.

Set aside such issues for the moment. More interesting are the numerous messages wondering how we can be a university without majors in the traditional liberal arts. Are we not becoming a trade school, abandoning enrichment of the mind in favor of training in workplace skills?

These perceptions result from misunderstanding. Far from eliminating liberal arts disciplines, our proposal aims only at full majors. In fact, we are fighting to preserve as much as 80 percent of our faculty and curriculum in these areas -- and not just through general education but in refocused majors and minors with upper-level courses offering genuine opportunities for deep engagement in the liberal arts. Equally important, our baccalaureate degrees in natural resources, health, business, education and the performing arts -- the majority of degrees we offer -- are hardly narrow or technical.

But set this aside, too. Implied in the claim that “abandoning the liberal arts” means we “cannot be a university” are assumptions worth examining. Most students at UW Stevens Point do not choose to major in the traditional liberal arts disciplines. In fact, many universities already do not offer some of these programs. Within the University of Wisconsin System alone, four universities lack full majors in philosophy, three in sociology and four in Spanish. One of the founding tenets of the UW System was the idea that each campus would have a distinct program array defined by its select mission. If our current proposal takes this concept seriously, do we really cease to be a university?

Reading these comments as an environmental historian, I’m reminded of American attitudes toward wilderness. Few people choose to live in the wild, and most visit only rarely. True wilderness is uncomfortable and the Wi-Fi is terrible. Instead, most Americans seem content just knowing that wilderness is there, a notion that celebrates a romanticized frontier that never existed. Similarly, the implicit message from some of our critics is: it’s OK if your students major in finance, health science, and resource management; we just need to know that a philosophy major is there. This feeling, too, derives from false nostalgia -- specifically, for the idea that regional public universities can be smaller versions of research institutions. During the 1950s and 1960s, an aberrational moment in higher education when students and funding were plentiful, institutions like UW Stevens Point could afford to launch majors in the traditional liberal arts. Today, everything is different.

The inability to acknowledge this reality is deeply rooted in academic culture. Take the numerous condemnations we received from scholarly organizations asserting that without majors in their respective liberal arts disciplines, our university will lack something fundamental. “Elimination of the history program,” reads the letter from the Organization of American Historians, “means the elimination of a university’s capacity to teach … critical life skills.” Really? No one would dispute that every graduate should have meaningful courses in history. As the letter noted, “History is the discipline dedicated to studying the past … [and is] essential to navigating rapid economic transformation, international crises, epidemic disease, political gridlock and myriad other modern challenges.” I agree. I want every student in our College of Natural Resources to have a course in environmental history. But to conflate this with needing to preserve a history major seems disingenuous.

Here we encounter the elephant in the room in our current dialogue about the liberal arts and the meaning of a university. If the majority of students in universities today encounter these traditional disciplines, not as majors but only through general-education programs, should we not direct our attention there? The need for new approaches to general education has long been evident, and many institutions have sought meaningful change. Yet many such efforts, including ours, have run aground due to structural impediments rooted in our conflation of the narrow role of liberal arts majors with the much broader and more vital role of these same disciplines, a dynamic that stifles curricular innovation.

Too many general education programs rely on courses that are introductions to liberal arts majors, even as they enroll primarily nonmajors. This double duty leaves the majority of students wondering why they must take such classes and hoping only to “get them out of the way.” Too many general education programs spark battles over department “turf,” elevating the protection of student credit hours, budgets and faculty positions above thoughtful consideration of student needs in shaping curricula. As a result, too many general education programs have little purposeful cohesion and little relevance to the majority of students. Given that most universities assign one-third of the courses required to complete a baccalaureate degree to these core curricula -- and given the tuition we charge -- is it any wonder that students resent the cost of higher education?

Our aim at UW Stevens Point is to fix this problem, to look beyond a set of majors that serve roughly 6 percent of our students and ask how the disciplines of the liberal arts can better educate everyone. Do we lose something in this equation? Absolutely. The loss is real and should be debated in the context of urging greater public investment in higher education. Will we cease to be a university? Of course not. In fact, if we succeed in making the liberal arts more relevant and available to the majority of students who never major in these disciplines, we will be a stronger university indeed.

Greg Summers is provost and vice chancellor for academic affairs at the University of Wisconsin Stevens Point.


The importance of being a scholar-activist (opinion)

While traditional or mainstream scholars refuse to fully recognize our research-action efforts, writes Alvaro Huerta, activists criticize us for operating in the so-called ivory tower.


American and British researchers less likely than others to share data, study finds

American, British and Canadian researchers are less likely than others to share data behind their research projects, study finds.
