Humanities

Essay on interviews at teaching-oriented colleges

It's time to set your research aside and to think about the undergraduate classroom, writes John Fea.

U. of Rhode Island president issues new statement about controversial professor

U. of Rhode Island president, criticized for his first statement about a professor's controversial tweet about an NRA leader, issues another statement.

Essay on questions one might be asked in MLA interviews

Katherine Ellison and Cheryl Ball share questions you can expect to hear at an MLA interview.

Problems and potential solutions in humanities doctoral education (essay)

The recent conversations on the future of the humanities degree -- most prominently at the Annual Convention of the Modern Language Association by its then-president, Russell Berman -- are encouraging steps in addressing the challenges. The position paper that Berman helped write outlines some meaningful first steps to address the time-to-degree issue, for example, that will need to be a driver for change. The recent article “The 5-Year Humanities Ph.D.” on Inside Higher Ed reiterates Stanford’s desire to continue fostering the debate with an emphasis on shortening time to degree for humanities Ph.D.s.

The current contribution seeks to expand the conversation and offer some concrete ideas for desirable changes beyond the time-to-degree issue. In particular, some funding changes -- coupled with restructuring programs so that the summers are utilized better and students have an expectation of impactful year-round engagement -- need to take place. In addition, in order to open more avenues for employment, we may have to provide a co-curriculum similar to the one we offer at the undergraduate level, one that produces T-shaped Ph.D.s aware and confident not only of their disciplinary depth, but also of their broader transferable skill set.

Given the public’s preoccupation with STEM disciplines and the less-than-stellar reputation of the humanities with the larger public, coupled with changes in student loan deferment during graduate school and the challenging job market, these conversations are urgent.

What Is the Proper Size of Arts and Humanities Graduate Programs?

Although some universities, including ours, have been addressing the issue of proper program size for a decade or so, the choice seems fairly clear when seen in the context of the lingering overproduction of Ph.D.s. In this national context, graduate programs in the humanities need to come to terms with the often painful lesson that bigger is not always better. Administrators and faculty need to have realistic views on what a “right-sized cohort” is for the given discipline, the institutional profile, and, in some instances, the geographic location.

What Is Meant by Right-Sized?

By right-sized, I mean a frame of reference based on quantitative and qualitative factors like the following: 

  • the demand in the field
  • the placement rate of the unit
  • the number of applications to the program
  • the program’s “fit” with the institutional mission
  • the level of active faculty engagement required by quality graduate education, including timely and targeted intervention when student progress is imperiled.

It is advantageous for graduate programs to focus on their distinctiveness within their larger institutional and national context -- and not strive to be everything to everybody.

With good planning and a lot of good will, the more technical aspects and issues surrounding graduate education can and should be addressed. The bigger and more contentious issue will be the disciplinary reframing that has to be part of this discussion:

  • How much coursework is enough?
  • What kind of coursework?
  • What should the comprehensive exam look like?
  • How do the coursework, the comprehensive exam, and the dissertation prospectus integrate most efficiently?
  • Will there be a distinctive niche for the program?

Faculty in every humanities department offering the Ph.D. should be discussing these questions.

Graduate Education in More Differentiated Higher Education Environments?

Greater distinctiveness among institutions could offer some welcome differentiation in the higher education environment. Creating a particular focus as a distinctive niche in each program, where more resources go into certain subfields, is a productive move to avoid duplication and to carve out an attractive competitive position. Examples from our own institution, Michigan State University, include: a focus on biomedical and environmental ethics in our philosophy program; additional training in how to deliver first-rate general education courses in addition to disciplinary courses in our English department; a focus on digital humanities and educational technology in the German Ph.D. program and several other humanities programs; a focus on “Writing in Digital Environments” and cultural rhetorics in our writing program.

In our case, MSU’s strong science and engineering programs and its highly developed tradition as a global university allow the College of Arts and Letters to also integrate a strong sense of global awareness, and a focus on educational technology, digital humanities and media, and writing in digital environments for our graduate students. At other institutions, it might be a leading medical school driving the campus climate that provides many unique opportunities for humanities Ph.D.s. Yet others may have a strong policy and diplomacy focus, or distinctive advanced institutes, that could provide a compelling niche or added-value dimension for humanities Ph.D.s.

Time to Degree

Then there are practical issues: how to foster a more deliberate and rapid move through the program and the composition of its various elements; how to avoid drift by shifting the culture of programs toward more hands-on mentoring; how to avoid “unproductive lines of inquiry” (as David Damrosch described them in this article); and how to provide more targeted support (including summer support), as suggested in the same article. All of these would be helpful measures to enhance most programs.

What is somewhat lacking in the national discussion is a level of specificity and concrete ideas, such as how to better integrate coursework, comprehensive exams and dissertation research to avoid unproductive breaks between these various pieces of graduate education. One looks forward to a discussion on this issue at the Modern Language Association meeting in January 2013. Our overall time to degree across the humanities at Michigan State is already around six years (and even shorter where this integration has already happened), nowhere near the nine-year Ph.D. assumed in the recent article coming out of Stanford with its call for proposals for a five-year degree. The five-year Ph.D. is certainly within reach with a few modifications and more targeted financial support.

Better Utilization of the Summer

The productive and funded use of the summers will be crucial to make significant progress in course work and dissertation writing. We should not kid ourselves and assume that this is a trivial task: if the level of funding remains the same, with no significant increases, the cohort might very well have to shrink. Even more significant is the fact that the way faculty work during the summers has to undergo significant changes. A number of courses will obviously need to be offered, and hands-on mentoring, possibly in research groups or cohorts, will have to be conducted, with possibly negative consequences for faculty research productivity. As an alternative, a robust and well-designed digital environment for student-student and student-faculty exchanges could be conceived to keep students on task, on track, and connected to faculty mentors. Faculty-led reading groups in preparation for the comprehensives could be part of summer offerings or of year-round workshops.

Beyond Shortening Time-to-Degree

To enhance preparation of our students for a variety of institutions, our programs at Michigan State University have added important features to assure realistic and defensible notions of quality graduate education. Most have integrated scholarship and pedagogy into the curriculum, and some provide job-shadowing opportunities at liberal arts colleges. We have added internships where possible and desirable. The philosophy program offers internships in a regional hospital for its bioethics students; language internships are available at MSU’s Community Language School (a language school for pre-K to middle school students from the greater Lansing area). Students from English and professional writing gain internship experience with journals and leadership experience working on co-curricular initiatives in project-based learning (leadership roles in our Creativity Exploratory, an interdisciplinary project-based space and concept to foster teamwork, design process thinking, and project management).

We consider advanced preparation in educational technology to be essential in today’s market regardless of field. In collaboration with our graduate school, we have created two distinct certificates that emphasize the pedagogy associated with humanities teaching and learning (one of a general nature, one with a focus on foreign language teaching). We are working on certificates in digital humanities and educational technology for graduate students to enhance their capacities as researchers and teachers. Furthermore, we encourage our graduate students to avail themselves of opportunities to learn what it takes to educate the whole student (informal shadowing in career services, study abroad, alumni relations, etc.) to further prepare themselves for a variety of institutions.

More Radical Solutions

The voices calling for career paths that would make students more suitable for the broader, nonacademic job market are becoming louder. This suggestion is often coupled with the time-to-degree issue. Making graduate education shorter and, thus, cheaper might lead to the possibility of a larger and more diverse cohort (Louis Menand, The Marketplace of Ideas, New York: W. W. Norton, 2010). With less time invested and less expense associated with the Ph.D., graduates might be more inclined to see the Ph.D. more like a professional degree and to pursue career paths outside of academia.

The elephant in the room is of course this “world outside of academia,” the “broader job market” that is alluded to in these kinds of discussions. It is generally less clear in the pertinent discussions what these “other career areas” are. Nor is it clear that a narrower disciplinary preparation coupled with a shorter, tighter time-to-degree timeline — although very laudable — is in itself ultimately successful in broadening job prospects beyond the academy.

Possible Solutions

As we know from our undergraduate placement in the humanities, humanities majors indeed find employment, but they have to be more proactive and more entrepreneurial in looking for and preparing for jobs. A lesson for the broader graduate market could be learned from that.

The other insight from undergraduate placements is the criticism by employers that undergraduate professional students don’t display the big-picture thinking, ethical maturity, global perspective, critical and analytical skills, written and verbal communication skills, and overall goal orientation that employers seek and many jobs demand. Given the increasing complexity of tasks in certain areas of the not-for-profit and the for-profit sectors, maybe it is not that undergraduates and professional majors are not as well-prepared as they should be, but that expectations are too high.

With mandates for social innovation, technology-enhanced work habits, global awareness, and a generally broad education as key assets, work environments such as academic administration, the editing and publishing industry, translation and international diplomacy opportunities, entrepreneurial contexts, cultural organizations, think tanks, private and public sectors, government and nongovernmental organizations, research foundations, and local and regional public policy centers might well be better served by hiring employees with advanced degrees, given their much stronger research, critical thinking, and communication skills.

Ph.D.-holders already display, by the nature of their work, some advanced transferable skills. They are able to

  • define a research question or a problem
  • research the topic
  • identify what is relevant, and distinguish it from what is not
  • synthesize the work of others
  • integrate information.

Further, Ph.D.s are able to

  • offer independent and critical analysis of data
  • self-manage an area of inquiry
  • bring a complex project to completion
  • display significant experience in writing with precision
  • offer creative reconstruction of information
  • formulate new approaches
  • deal with constantly changing fields.

While these skills were honed in field-specific contexts, they transfer well. They can be made more visible to both graduate students themselves and potential employers through additional leadership training in a series of linked activities. A significant informative public relations and advocacy effort will need to take place to get this message across.

T-Shaped Graduate Education?

Humanities Ph.D.s could be both narrowly and broadly trained in the ideal T-structure: their disciplinary field provides depth, while leadership skills, time and project management abilities, technology skills, an ability to analyze data, and the pedagogical understanding to convey information in appropriate ways and through the most useful medium provide breadth. This is a very attractive combination of skill sets for a variety of employment contexts. Furthermore, Ph.D.s embody the essence of innovation and creativity, as they are used to formulating original research questions. A research degree coupled with excellent technology skills, leadership skills, a solid grasp of data analysis, and self-efficacy seems to be a promising combination of transferable skills.

Becoming a T-shaped professional is not only desirable for undergraduates, but will make graduate students more competitive as well. Even if students seek academic jobs, these skills will be extremely useful for future faculty because they will be able to adjust to the ever-changing landscape of higher education and understand and appreciate the bigger picture. They will be more nimble in whatever context they enter. Humanities Ph.D.s could and should make highly attractive job candidates for a range of sectors.

After all, before the wicked problems of our present and future can be solved, historically informed complex analyses of the underlying issues and questions will need to be framed, the ethical dimensions considered, collaborative relationships formed, and effective forms of multimodal communication for the issue at hand created. Without understanding the respective cultural underpinnings of global competitiveness or conflict, technological solutions may miss the mark. 

Similarly, the ability to understand global forces and local diversity, ethical issues, and complex environments through interdisciplinary projects that combine creativity, research, critical analysis, and technology could make humanities Ph.D.s compelling employees.

These are but a few examples of how to add value to graduate education through more focus in the discipline while adding breadth to the experience beyond the discipline. If the conversation on the future of the humanities Ph.D. is to go beyond general statements of intent, it will be important to share best practices; to collect data and evidence; to work not only among humanities faculties but to involve graduate deans, deans and other relevant administrators; to engage national organizations like the MLA, the Humanities Centers, and the foundations that support humanities scholarship and education.

It is also clear that graduate students will need to receive additional training beyond their research focus, in a more thoughtful co-curriculum, and will need to be more creative and flexible in exploring options.

Additional Training and Who Should Provide It

The other lesson from undergraduate education is that, in general, university career services professionals tend to be more helpful and knowledgeable than faculty advisers in assisting students to think about what transferable skills they have. If we apply the analogy at the graduate level, we are likely to have similar issues, in that faculty for the most part are not well versed in the nonprofit, government, and business worlds. Entrepreneurship, another emergent area that many undergraduates and potentially graduate students are interested in, is also not necessarily on the radar of faculty advisers.

The notion that humanities faculty could directly train Ph.D.s for jobs outside the academy seems implausible as very few of them have extensive experience beyond higher education. Working with alumni who have made the successful transition into business or government, etc., is one important facet that can provide inspiration and contacts. However, it will not suffice to rely on this informal network.

In order to maximize impact, it will be important to offer internships with potential employers not only to learn whether the desired career path is suitable but to understand early on what kind of additional skills will be important. I see a similar move proposed by the BiblioTech concept at Stanford, which “includes trying to change the mindset of academics and nonacademics alike… and garnering the trust of industry leaders.”

Furthermore, it seems clear that a discipline-based humanities program will have to offer additional, targeted training in technology and in leadership skills to make inroads into business and the technology fields. One would also expect a need for internship developers, career services professionals and other support professionals, just as there are on the undergraduate level, to assist with planning and organizing these additional features of graduate education for positions outside of the academy. This career segment -- especially in the more supervisory functions -- could, incidentally, be a valuable career path in itself for Ph.D.s.

The recent initiatives to collect hard data on nonacademic placements conducted by the Scholarly Communications Institute, and a database titled “Who We Are” by Katina Rogers, are welcome news and long overdue. Efforts at further quantitative analysis will help us map the possibilities better than anecdotes can. Universities themselves need to keep fairly differentiated data on their graduate students to learn how, in their particular environment, their students move through their programs, what the hurdles and bottlenecks are, and how and where they place.

Given the complexity of issues in all sectors of our current environment, it seems that humanities Ph.D.s with additional training in technology, data analysis, and leadership skills are an underappreciated and underutilized resource. Some of our attention in graduate education needs to go into further serious exploration of the possibilities and whether or not they are attractive to employers and Ph.D.s.

I think there are exciting opportunities ahead. The big question is whether humanities Ph.D.s themselves will embrace these options as desirable, which, of course, depends on what motivated them to select the humanities Ph.D. path in the first place. Their voice is conspicuously absent from these conversations, and it is, after all, their future that is at stake. A more robust conversation with these most important stakeholders should be one of the first steps.

Early conversations with our graduate students indicate a mix of motivations: many are still very interested in academic positions, while others are open to a broader set of possibilities. The most ambitious students are quite interested in leadership skills such as effective communication, time management, resilience, self-efficacy, and conflict resolution, which they see as broadly applicable for effective career advancement in any field. As our graduate students accept, and even embrace, a world of wider vocational choices, I am confident that enough of our faculty change leaders will rise to the occasion to reshape graduate education in the humanities in the ways suggested above, in many possible ways not addressed here, and in some that are yet to be imagined in the current social, cultural, political and economic environment.

Given the mounting complexity and accelerating change around us, our Ph.D.s need a new mindset, fostered by additional skills, that allows them to respond with greater agility and creativity to changing environments. On the most fundamental level, Ph.D.s assemble and organize existing knowledge, create new knowledge, and are trained experts in how to convey knowledge in a variety of contexts.

Which sector could not use this kind of sophisticated expertise?

Karin A. Wurst is professor and dean of the College of Arts and Letters at Michigan State University.

Essay on the MLA job interview

Cheryl Ball and Katherine Ellison walk you through what to expect in the hotel room meeting with a search committee.

Essay questions idea of a humanities job crisis

Should students considering a Ph.D. in the humanities have their heads examined? It’s a reasonable question to ask, what with all the mockery they have to endure. Take the cover of The New Yorker on May 24, 2010. It shows a certain Tim, hanging up his Ph.D. diploma in the bedroom where he grew up. He’s no scientist, as other headlines make clear: "The crisis of the humanities officially arrives," reads one (from October 2010), which was occasioned by the closure of some underenrolled undergraduate programs in the humanities at the State University of New York at Albany.
 
To answer the question, one might ask some questions of the data. The numbers tell a different story.
 
To judge by the choices that undergraduates are making in selecting their majors, the humanities continue to have appeal. For the period between 1987 and 2009, there’s no sign of steep decline in interest; instead, it’s a story of a modest rise and an even more modest descent. Since data about majors fail to track total course enrollment, majors are an indirect proxy that may actually underestimate students’ interests and activities. If one looks at the behavioral and social sciences, one finds that they show a similar pattern. In part because students continue to choose humanities courses and majors at the undergraduate level, colleges and universities continue to hire for these departments. Again, the data tell the real story: there has been no significant decline in full- and part-time employment in the humanities between 1999 and 2006. As measured by advertised vacancies, employment prospects for humanities Ph.D.s trended upward between 2003-04 and 2007-08 and have begun to recover after a recession-related drop in 2008-09.
 
In fact, a reduction in enrollment in Ph.D. programs in the humanities, coupled with the evidence showing that undergraduate majors in the humanities have remained steady, can be taken to suggest that the longstanding oversupply of Ph.D.s is now being mitigated. The relative share of doctorates in education and the humanities has dropped considerably over the last decade, in part because the production of Ph.D.s in science and engineering, which accounted for 73 percent of all doctorates in 2010, has risen so steeply. According to results from the Survey of Earned Doctorates, the number of humanities Ph.D.s granted in the U.S. dropped from 5,404 in 2000 to 4,979 in 2010.
 
And what of the choices that graduate students in the humanities are currently making?
 
According to the most recent survey data that we have gathered at the Graduate Center of the City University of New York, 86 percent of humanities Ph.D. students are satisfied with their programs, and 78 percent would recommend them to prospective students. Our figures are slightly higher than the most recent national satisfaction data, available on the website of the National Association of Graduate and Professional Students. And these students are thinking not only about their Ph.D. departments, but also about their employment prospects.

The national data show that, when surveyed three years after finishing their degrees, about 94 percent of students with humanities Ph.D.s report being employed. Of these, 77 percent were employed in education, and 17 percent outside of it, in a wide variety of occupations — from artists and entertainers, to writers, public-relations specialists, broadcasters and administrators — and much more besides. In the past five years of our own alumni survey, between 89 and 100 percent of humanities students with full-time employment reported that their employment five years after graduation utilized their doctoral training. And employment outside of the academy is not necessarily or even mostly a fallback response to failure in the academic marketplace: when asked about their primary career goals, about 17 percent of first-year students in the humanities at our institution identify goals in activities other than research and teaching. Many of our students don’t end up with academic jobs because they are interested in pursuing other types of employment.
 
Now it almost goes without saying that a Ph.D. in the humanities, given opportunity costs and the long-term promise of modest salaries, hardly makes much sense for someone who wishes to maximize income. For one thing, the degree takes longer: the average Ph.D. recipient in the humanities spends almost nine and a half years enrolled in graduate school; the average student in the life and physical sciences under seven years. For another, securing a post-degree position takes more and more time. And, as is well-known, when they do secure their jobs, humanists are paid less than those in other fields. Like it or not, we live in a culture that rewards the production of applied knowledge far more than it does the preservation, analysis or critique of culture, the rare and exceptionally well-compensated philosopher or literary critic notwithstanding. Differential salaries, from this point of view, are merely the individuated results of market forces and sociopolitical values.
 
Perhaps we should mock students less and apply ourselves more to understanding the broader structural changes in the economy, including how these changes affect the academy. Numbers that show flat (or even slightly improving) job prospects for Ph.D.s in the humanities should not obscure a number of underlying patterns, the most important of which are increased "casualization" and job insecurity. One may justifiably lament that adjunct and full-time, non-tenure-track jobs now constitute about 70 percent of the academic labor force, and that the path to a tenure-track position increasingly takes a detour through short-term employment. But the problems are not unique to higher education, which is a microcosm of the globalizing workplace. The decline in tenure among faculty mirrors the loss of lifelong (or at least long-term) employment in other sectors of the labor force.
 
What makes higher education distinctive is not so much that labor practices are changing, much less that students have their heads in the sand. It’s that academic employees — the readers and writers who constitute a faculty — are such sharp-eyed observers of those practices and energetic advocates for their profession.

Chase F. Robinson is distinguished professor of history and provost of the Graduate Center of the City University of New York.

Poem about student writing

Since the beginning of time
Everyone knows in society today
Student writing hasn’t gotten any better
Nor is it really any worse than usual.
The sentences are still afraid of commas
And plurals and possessives share a closet.
I don’t expect much improvement
Without better nutrition and stronger threats.
Plus, there are far too many sentences
That begin with This or There followed
By big empty boxes of Is and Are.
(Perhaps this student should take a year off
And read books with real people in them.)
And I’m only talking about sentences
Not the paragraphs that struggle along
Between the left and right margins
But miraculously start and finish
At the top and bottom of each page.
Also, I was really hoping for an original title
And just once my name spelled right.

Laurence Musgrove is professor and chair of English and modern languages at Angelo State University.

Essay on the idea that non-philosophers should judge philosophers

One of the oldest questions of philosophy is, "Who guards the guardians?" When Plato posed this question -- if not quite this succinctly -- his concern was with how a community can keep its leaders focused on the good of the whole. Plato's answer was that guardians should govern themselves — philosophy would train their souls so that they would choose wisely rather than unjustly. Kings would become philosophers, and philosophers kings.

This is not how we do things today. In representative forms of government the people rule, at least intermittently, through processes such as voting, recalls, and referenda. Particularly within the American experiment everybody guards everyone else — through a system of "checks and balances." But there is at least one major institution that still follows Plato's lead, relying on self-governance and remaining proudly nondemocratic: the academy.

We academics have long argued that we have a special justification for self-rule. We claim that our activities — which consist of the production of knowledge, and its dissemination via presentations, publications, and teaching — are so specialized and so important that ordinary people cannot properly judge our work. Instead, we have devised a way to evaluate ourselves, through a process known as peer review.

Whether it is a matter of articles or books, grant applications, or tenure and promotion, review by one's academic peers has long been the standard. And who are one's peers? The academy gives a disciplinary answer to this question. Biologists are the ones competent to judge work in biology, and only chemists can judge the research of other chemists. Nonexperts — whether within or outside the academy — will only disrupt the process, leading to misguided or even disastrous results. Best to leave such questions to the experts.

But what of philosophy? Across the 20th century and now into the 21st, philosophers have been evaluated in the same way. Even while claiming that philosophy has a special relevance to everyday life, philosophers have mostly written for and been evaluated by their disciplinary peers. Philosophy became more and more professionalized in the 20th century, with nonexperts increasingly unable to comprehend, much less judge, the work of philosophers. A philosopher today is not considered successful unless he or she contributes to professional, peer-reviewed publications in the field.

But should philosophy really act like the other disciplines in this regard? Should philosophy be considered a "discipline" at all? And if not, what are the consequences for the governance of philosophy?

One of the oddities of present-day philosophy is how rarely this question is asked. Go to a philosophy department with a graduate program, and sign up for a course in ancient philosophy: the professor will be expected to know ancient Greek, and to be well-read in the scholarly literature in the area. The irony is that there was no secondary literature for the Greeks — no scholarship at all, in fact, in the sense that we mean it today. Philosophers were thinkers, not scholars. Socrates would never get tenure: what did he write?

This situation was partly a matter of technology; paper was expensive and reproduction of a manuscript laborious. But it is still odd to assume that Plato and Aristotle would have been good scholars if only they’d had access to the Philosopher's Index and an Internet connection. Nor were the Greeks good disciplinarians. Socrates was notorious for speaking with people from all walks of life; and when he came to be evaluated it was by a jury of his peers consisting of 500 Athenians. He may not have liked the verdict, but he did not dispute the jury's right to pass judgment.

Across the long sweep of Western history we find the point repeated: Bacon, Machiavelli, Descartes, Leibniz, Locke, Marx and Nietzsche all wrote for and sought the judgment of peers across society. One wonders what they would think of what counts as philosophy across the 20th century — a highly technical, inward-looking field that values intellectual rigor over other values such as relevance or timeliness.

Questions about who should count as a philosopher's peers are timely today, for our standard notions of academic peer review are now under assault. Publicly funded science is being held more socially accountable. At the National Science Foundation, grant proposals are now judged by both disciplinary and transdisciplinary criteria — what are called, respectively, "intellectual merit" and "broader impacts." Universities are also being held responsible for outcomes, with state funding increasingly being tied to graduation rates and other achievement measures. Philosophers, too, have begun to feel the pinch of accountability, especially in Britain, where the so-called "impact agenda" has advanced more rapidly than in the United States.

We view this situation as more of an opportunity than as a problem. Philosophers should get creative and treat the question of who counts as our peers as itself a philosophic question. There are a variety of ethical, epistemological, and political issues surrounding peer review worthy of philosophic reflection. But perhaps the most pressing is the question of whether we should extend the notion of peer beyond disciplinary bounds.

This could occur in a number of different ways. Not only could we draw nonphilosophers or nonacademics into the peer review process; we could also consider a variety of other criteria, such as the number of publications in popular magazines or newspaper articles; the number of hits on philosophic blogs; the number of quotes in the media; or the number of grants awarded by public agencies to conduct dedisciplined philosophic work.

Now, some will claim that extending the idea of our philosophical peers to include nonphilosophers will expose philosophy to the corruptions of the demos. Is philosophizing to become a sheer popularity contest, where philosophers are promoted based on their Klout score, or the number of Facebook likes their blog posts garner? Aren’t we proposing that the Quineans be replaced by the Bieberians?

Such objections stem, in part at least, from what we could call a Cartesian ethos — the idea that philosophers should strive above all to avoid error. We should withhold our assent to any claim that we do not clearly and distinctly perceive to be true. This Cartesian ethos dominates philosophy today, and nowhere is this clearer than in regard to peer review. Our peers are our fellow philosophers, experts whose rigor stands in for Descartes' clear and distinct ideas.

For a counterethos we could call upon William James's "The Will to Believe." James argues that the pursuit of truth, even under conditions where we cannot be certain of our conclusions, is more important than the strict avoidance of error. Those who object that this will open philosophy up to all sorts of errors that would otherwise have been caught by expert peer review are exhibiting excessive Cartesianism. In fact, those who insist on the value of expertise in philosophy are reversing the Socratic approach. Whereas Socrates always asked others to contribute their opinions in pursuit of truth, Descartes trusted no one not to lead him into error. A Jamesian approach to peer review, on the other hand, would be generous in its definition of who ought to count as a peer, since avoiding error at all costs is not the main goal of philosophy. On a Jamesian approach, we would make use of peers in much the way that Socrates did — in an effort to pursue wisdom.

It is true that when philosophers broaden their peer group, they lose some control over the measures used to define philosophic excellence. This raises another risk — that philosophy will be merely an instrument for an exterior set of ends. The fear here is not that abandoning disciplinary peer review will lead us into error. Instead, it is that the only alternative to value as judged by disciplinary peers is a crass utilitarianism, where philosophic value is judged by how well it advances a paymaster’s outcome. One philosopher may be labeled a success for helping a racist political candidate hone his message, while another may be labeled a failure for not sufficiently fattening a corporation's bottom line. Isn’t a dedisciplined philosophy actually a return to sophistry rather than to Socrates? Won’t it sell its services to whoever is buying, adjusting its message to satisfy another’s agenda and criteria for success? In order to survive until the turn of the 22nd century, must we sell the soul of philosophy at the beginning of the 21st?

We have two replies to such concerns. First, philosophy existed long before the 20th-century model of academic disciplinarity came to define its soul. The struggle between philosophy and sophistry is a perennial one, and one does not necessarily sell out by writing for a larger audience — or remain pure by staying within disciplinary boundaries.

Second, disciplinary and dedisciplinary approaches to philosophy should be seen as complementary rather than antagonistic to one another. Rigor should be seen as pluralistic: the rigor of disciplinary work is different from, but neither better nor worse than, the philosophic rigor required to adjust one’s thinking to real-world exigencies. This is a point that bioethicists have long understood. In his seminal 1973 "Bioethics as a Discipline," Daniel Callahan already saw that doing philosophical thinking with physicians, scientists, and other stakeholders demands "rigor … of a different sort than that normally required for the traditional philosophical or scientific disciplines." Bioethics exists in disciplinary and in nondisciplinary forms — in ways that synergize. It shows that we need not be forced, as a matter of general principle, to choose one set of peers over another.

Practically speaking, examining the question of who should count as a peer means that philosophers will need to revisit some of the core elements of our field. For one, our criteria for tenure and promotion would need to be reviewed. The current strict hierarchy surrounding where we publish — say, in "The Stone" (the New York Times philosophy blog) or in Mind — would need to be re-evaluated. And really, what is the argument for claiming that the latter sort of publication is of higher value? If you reply that the latter is peer-reviewed, excuse us for pointing out that your answer begs the question.

And what about the question of multiple authorship? Should this article count for less because three of us wrote it? How much less? Why? Co-authoring is actually just as challenging as producing single-authored works, as we can attest, so the justification cannot be that it is less work. Should we value independent scholarship over collaboration? Why? This is the Cartesian ethos coming back to haunt philosophy: I think; I exist; I write; I am a scholar. We doubt it.

As universities face growing demands for academic accountability, philosophers ought to take the lead in exploring what accountability means. Otherwise we may be stuck with Dickens’s Mr. Gradgrind. ("Now, what I want is Facts. Teach these boys and girls nothing but Facts. Facts alone are wanted in life.") But a philosophical account of accountability will also require redefining the boundaries of what counts as philosophy. We ought to engage those making accountability demands from the outside in just the way that Socrates engaged Euthyphro on piety. If there are indeed Bieberians at the gate, we say let them in — as long as they are willing to engage in dialogue, we philosophers should do all right. Unless it is we philosophers who refuse to engage.


Robert Frodeman is professor of philosophy and director of the Center for the Study of Interdisciplinarity at the University of North Texas. He was editor in chief of the Oxford Handbook of Interdisciplinarity.

J. Britt Holbrook is research assistant professor of philosophy and assistant director of the Center for the Study of Interdisciplinarity at the University of North Texas. He is editor in chief of Ethics, Science, Technology, and Engineering: An International Resource, forthcoming from Gale-Cengage Learning.

Adam Briggle is assistant professor of philosophy at the University of North Texas. He is author, with Carl Mitcham, of Ethics and Science: An Introduction from Cambridge University Press, 2012.

Essay on landing an academic job when not expecting to

When Eliza Woolf gave up on finding a good academic job, she landed one.

Essay critiques the ideas of Clay Shirky and others advocating higher ed disruption

Clay Shirky is a big thinker, and I read him because he’s consistently worth reading. But he’s not always right – and his thinking (and the flaws in it) is typical of the unquestioning enthusiasm of many thinkers today about technology and higher education. In his recent piece on "Napster, Udacity, and the Academy," for example, Shirky is not only guardedly optimistic about the ways that MOOCs and online education will transform higher education, but he takes for granted that they will, that there is no alternative. Just as inevitably as digital sharing turned the music industry on its head, he pronounces, so it is and will be with digital teaching. And as predictably as rain, he anticipates that "we" in academe will stick our heads in the sand, will deny the inevitable -- as the music industry did with Napster -- and will "screw this up as badly as the music people did." His views are shared by many in the "disruption" school of thought about higher education.

I suspect that if you agree with Clay Shirky that teaching is analogous to music, then you are likely to be persuaded by his assertion that Udacity -- a lavishly capitalized educational startup company -- is analogous to Napster. If you are not impressed with this analogy, however, you will not be impressed by his argument. And just to put my cards on the table, I am not very impressed with his argument. I think teaching is very different from music; that it is so different as to make the comparison obscure a lot more than it reveals.

But the bigger problem is that this kind of argument is weighted against academics, virtually constructed so as to make it impossible for an academic to reply. If you observe that "institutions will try to preserve the problem to which they are the solution," after all -- what has been called "The Shirky Rule" -- it can be easy to add the words "all" and "always" to a sentence in which they do not belong. This is not a principle or a rule; it’s just a thing that often happens, and "often" is not "always." But if you make the mistake of thinking that it is, you can become uniformly prejudiced against "institutions," since you literally know in advance what they will do and why. Because you understand them better than they understand themselves -- because they don’t or can’t realize that they are simply "institutions" -- you can explain things about them that they can neither see nor argue against. "Why are you so defensive?" you ask, innocently, and everything they say testifies against them.

If someone like me -- a graduate student for many years, currently trying to find an academic job -- looks at MOOCs and online education and sees the downsides very clearly, it’s also true that no one has a more strongly vested interest in arguing the benefits of radically transforming academe than Clay Shirky and a number of others who talk about the inevitability of radical change. As Chuck Klosterman unkindly put it once, "Clay Shirky must argue that the Internet is having a positive effect – it’s the only reason he’s publicly essential." Which is not to say that Shirky is wrong, simply that he must prove, not presume, that he is right.

I have to go through this excessively long wind-up because of the ways that Shirky has stacked the rhetorical deck in his favor. He uses the word "we" throughout his piece, and in this powerful final paragraph, he hammers us over the head with it, so precisely that we might mistake it for a caress:

"In the academy, we lecture other people every day about learning from history. Now it's our turn, and the risk is that we’ll be the last to know that the world has changed, because we can’t imagine — really cannot imagine — that story we tell ourselves about ourselves could start to fail. Even when it’s true. Especially when it’s true."

But what do you mean "we," Mr. Distinguished Writer in Residence? I asked Shirky on Twitter if he considered himself primarily an academic, and though he didn’t respond, it’s important that he frames his entire post as if he’s an insider. But while it’s certainly true that I am biased in favor of academic labor continuing to exist in something like its present form, he is no less biased by having nothing to lose and everything to gain if academe is flipped on its head. And yet the cumulative rhetorical effect of his framing is to remind us that no one within the institution can speak knowledgeably about their institution, precisely because of their location within it; when Shirky speaks of "we" academics, he does so only to emphasize that "we" can’t imagine that the story we tell ourselves is wrong.

It's because he is willing to burn the village to save it that Shirky can speak for and of academe. Shirky never has to show evidence that online education will ever be any good: he notes an academic’s assessment of a Udacity course as "amazingly, shockingly awful" and is then, apparently, satisfied when Udacity admitted that its courses "can be improved in more than one way." A defensive blog post written by Udacity’s founder is enough to demonstrate that change for the better is happening. And when the academic who criticized the Udacity course mentions a colleague whose course showed some of the same problems -- but does not name the colleague -- Shirky is triumphant. The academic in question "could observe every aspect of Udacity’s Statistics 101 (as can you) and discuss them in public," Shirky observes, "but when criticizing his own institution, he pulled his punches."

This is Clay Shirky’s domain, and also the domain of so many others who point to one or another failing of traditional higher ed to suggest that radical change is needed: the anecdote that illustrates something larger. In this case, the fact that academe is a "closed" institution means it cannot grow, change, or improve. By contrast, "[o]pen systems are open" seems to be the end of the discussion; when he contemplates the openness of a MOOC, the same definitional necessity applies. "It becomes clear," he writes, "that open courses, even in their nascent state, will be able to raise quality and improve certification faster than traditional institutions can lower cost or increase enrollment." It becomes clear because it is clear, because "open" is better, because it is open.

But how "open" is Udacity, really? Udacity’s primary obligation is to its investors. That reality will always push it to squeeze as much profit out of its activities as it can. This may make Udacity better at educating, but it also may not; the job of a for-profit entity is not to educate, but to profit, and it will. There’s nothing necessarily wrong with for-profit education -- and most abuses can be traced back to government deregulation, not tax status -- but the idea that "openness," as such, will magically transform how a business does business is a massively begged question. A bit of bad press can get Sebastian Thrun to write a blog post promising change, but actually investing the resources necessary to follow through on that is a very different question. The fact that someone like Shirky takes him at face value -- not only gives him the benefit of the doubt, but seems to have no doubt at all -- speaks volumes to me.

Meanwhile, did the academic that Shirky criticizes really "pull his punches"? Did he refrain from naming his colleague because of the way academics instinctively shield each other from criticism? It’s far from clear; if you read the original blog post, in fact, it’s not even apparent that the academic knew who this "colleague" actually was. All we really know is that a student referred to something her "last teacher" did. But suppose he did know who this student’s last teacher was; suppose the student mentioned the teacher by name. Would it have been appropriate to post someone’s name on the Internet just because a secondhand source told you something bad about them? Does that count as openness?

Open vs. closed is a useful conceptual distinction, but when it comes down to specific cases, these kinds of grand narratives can mislead us. For one thing, far from the kind of siege mentality that characterized an industry watching its business model go up in smoke -- an industry that was not interested in giving away its product for free -- academics are delighted to give away their products for free, if they can figure out a way to do it. Just about every single public and nonprofit university in the country is working to develop digital platforms for education, or thinking hard about how they can. This doesn’t mean they are doing it successfully, or well; time will tell, and the proof will be in the pudding. But to imagine that Silicon Valley venture capitalists are the only people who see the potential of these technologies requires you to ignore the tremendous work that academics are currently doing to develop new ways of doing what they do. The most important predecessors to MOOCs, after all, were things like Massachusetts Institute of Technology's OpenCourseWare, designed entirely in the spirit of openness and not in search of profit.

The key difference between academics and venture capitalists, in fact, is not closed versus open but evidence versus speculation. The thing about academics is that they require evidence of success before declaring victory, while venture capitalists can afford to gamble on the odds. While Shirky can see the future revolutionizing in front of us, he is thinking like a venture capitalist when he does, betting on optimism because he can afford to lose. He doesn’t know that he’s right; he just knows that he might not be wrong. And so, like all such educational futurologists, Shirky’s case for MOOCs is all essentially defensive: he argues against the arguments against MOOCs, taking shelter in the possibility of what isn’t, yet, but which may someday be.

For example, instead of arguing that MOOCs really can provide "education of the very best sort," Shirky explicitly argues that we should not hold them to this standard. Instead of thinking in terms of quality, we should talk about access: from his perspective, the argument against MOOCs is too narrowly focused on the "18-year-old who can set aside $250k and four years" and so it neglects to address students who are not well-endowed with money and time. "Outside the elite institutions," Shirky notes, "the other 75 percent of students — over 13 million of them — are enrolled in the four thousand institutions you haven’t heard of." And while elite students will continue to attend elite institutions, "a good chunk of the four thousand institutions you haven’t heard of provide an expensive but mediocre education."

This is a very common argument from MOOC boosters, because access is a real problem. But while a "good chunk" of 13 million students are poorly served by the present arrangement, it is quite telling that his example of "expensive but mediocre education" is Kaplan and the University of Phoenix, for-profit institutions that are beloved by the same kinds of venture capitalists who are funding Udacity. He is right: For-profit education has amassed a terrible track record of failure. If you are getting a degree at a for-profit institution, you probably are paying too much for too little. But would it be any less mediocre if it were free?

Udacity’s courses are free to consumers (though not, significantly, to universities), at least for now. And Shirky is not wrong that "demand for knowledge is so enormous that good, free online materials can attract extraordinary numbers of people from all over the world." But Shirky doesn’t mean "demand" in the economic sense: demand for a free commodity is just desire until it starts to pay for the thing it wants. Since there is a lot of unmet desire for education out there, and since that desire is glad to have the thing it wants when it finds it for free, it seems all to the good that students can find courses for free. But while we should ask questions about why venture capitalists are investing so heavily in educational philanthropy, we also need to think more carefully about why there is so much unmet desire in the first place, and why so many people want education without, apparently, being able to pay for it. Why hasn’t that desire already found a way to become demand, such that it must wait until Silicon Valley venture capitalists show up, benevolently bearing the future in their arms?

The giveaway is when Shirky uses the phrase "non-elite institutions": for Shirky, there are elite institutions for elite students and there are non-elites for everyone else. The elite institutions will remain the same. No one will ever choose Udacity over Harvard or U.Va., and while elite institutions like MIT, Stanford, Princeton, and my own University of California are leaping into the online education world head first, anyone who thinks these online brands will ever compete with "the real thing" will be exactly the kind of sucker who would fork over full price for a watered-down product.

MOOCs are only better than nothing, and speculation that this will someday change is worth pursuing, but for now it remains just that: speculation. It should be no surprise that venture capital is interested in speculation. And it should be no surprise that when academics look at the actual track record, when we try to evaluate the evidence rather than the hope, we discover a great deal to be pessimistic about.

Why have we stopped aspiring to provide the real thing for everyone? That’s the interesting question, I think, but if we begin from the distinction between "elite" and "non-elite" institutions, it becomes easy to take for granted that "non-elite students" receiving cheap education is something other than giving up. It is important to note that when online education boosters talk about "access," they explicitly do not mean access to "education of the best sort"; they mean that because an institution like Udacity provides teaching for free, you can’t complain about its mediocrity. It’s not an elite institution, and it’s not for elite students. It just needs to be cheap.

Talking in terms of "access" (instead of access to what?) allows people like Shirky to overlook the elephant in the room, which is the way this country used to provide inexpensive and high-quality education to all sorts of people who couldn’t afford to go to Yale -- people like me and my parents. While state after state is defunding its public colleges and universities (and so tuition is rising while quality is declining), the vast majority of American college students are still educated in public colleges and universities, institutions that have traditionally provided very high-quality mass higher education, and which did it nearly for free barely a generation ago.

"Access" wouldn’t even be a problem if we didn’t expect mass higher education to still be available: Americans only have the kind of reverence for education that we have because the 20th century made it possible for the rising middle class to have what had previously been a mark of elite status, a college education. But the result of letting these public institutions rot on the vine is that a host of essentially parasitic institutions -- like Udacity -- are sprouting like mushrooms on the desire for education that was created by the existence of the world’s biggest and best public mass higher education system.

Shirky talks dismissively about his own education, at Yale, and recalls paying a lot of money to go to crowded lectures and then to discussion sections with underpaid graduate students. Let me counter his anecdote with my own. When I was a high school student, in Appalachian Ohio, I told my guidance counselor that I wanted to go to Harvard, and he made me understand that people from Fairland High School do not really go to Harvard. I was a dumb high school student, so I listened to him. But although both of my parents worked in West Virginia, they had moved to Ohio when I was young so that I could go to Ohio schools, and this meant that although my grades were only moderately good -- and I had never had access to Advanced Placement classes -- I was able to apply to Ohio State University, get in, afford it, and get an education that was probably better than the one that Shirky got at Yale, and certainly a heck of a lot cheaper. My parents paid my rent, but I paid my tuition myself -- with part-time jobs and $20,000 in loans -- and I didn’t have a single class in my major with more than 30 students. I had one-on-one access to all of my professors, and I took advantage of it.

It's a lot harder to do this now, of course; tuition at Ohio State is more than double what it was when I started in 1997. More important, you not only pay a lot more if you go to a school like Ohio State, you’re also a lot less likely to get in; the country’s college-age population has continued to grow, but the number of acceptance letters that public universities like OSU send out has not increased. As Mike Konczal and I have argued, this shortfall in quality higher education creates what economists call "fake supply." If you don’t get in to a college specializing in education "of the best sort" (or if your guidance counselor tells you not to apply), where do you go, if you go? You go to an online university, to Kaplan, or maybe now you try a MOOC or a public college relying on MOOCs to provide general education, as Texas now envisions. Such things are better than nothing. But "nothing" only seems like the relevant point of comparison if we pretend that public higher education doesn’t exist. And if we ignore the fact that we are actively choosing to let it cease to exist.

Beware anyone who tries to give you a link to WebMD as a replacement for seeing a real doctor.

Aaron Bady is a doctoral candidate in English literature at the University of California at Berkeley, and he writes and tweets for The New Inquiry as @zunguzungu.
