Essay questions idea of a humanities job crisis

Should students considering a Ph.D. in the humanities have their heads examined? It’s a reasonable question to ask, what with all the mockery they have to endure. Take the cover of The New Yorker of May 24, 2010, which shows a certain Tim hanging his Ph.D. diploma in the bedroom where he grew up; he’s no scientist, that much is clear. Or take the headlines: "The crisis of the humanities officially arrives," read one from October 2010, occasioned by the closure of some underenrolled undergraduate programs in the humanities at the State University of New York at Albany.
 
To answer the question, one might put some questions to the data. The numbers tell a different story.
 
To judge by the choices that undergraduates are making in selecting their majors, the humanities continue to have appeal. For the period between 1987 and 2009, there’s no sign of steep decline in interest; instead, it’s a story of a modest rise and an even more modest descent. Since data about majors fail to track total course enrollment, majors are an indirect proxy that may actually underestimate students’ interests and activities. If one looks at the behavioral and social sciences, one finds that they show a similar pattern. In part because students continue to choose humanities courses and majors at the undergraduate level, colleges and universities continue to hire for these departments. Again, the data tell the real story: there has been no significant decline in full- and part-time employment in the humanities between 1999 and 2006. As measured by advertised vacancies, employment prospects for humanities Ph.D.s trended upward between 2003-04 and 2007-08 and have begun to recover after a recession-related drop in 2008-09.
 
In fact, a reduction in enrollment in Ph.D. programs in the humanities, coupled with the evidence showing that undergraduate majors in the humanities have remained steady, can be taken to suggest that the longstanding oversupply of Ph.D.s is now being mitigated. The relative share of doctorates in education and the humanities has dropped considerably over the last decade, in part because the production of Ph.D.s in science and engineering, which accounted for 73 percent of all doctorates in 2010, has risen so steeply. According to results from the Survey of Earned Doctorates, the number of humanities Ph.D.s granted in the U.S. dropped from 5,404 in 2000 to 4,979 in 2010.
 
And what of the choices that graduate students in the humanities are currently making?
 
According to the most recent survey data that we have gathered at the Graduate Center of the City University of New York, 86 percent of humanities Ph.D. students are satisfied with their programs, and 78 percent would recommend them to prospective students. Our figures are slightly higher than the most recent national satisfaction data, available on the website of the National Association of Graduate and Professional Students. And these students are thinking not only about their Ph.D. departments, but also about their employment prospects.

The national data show that, when surveyed three years after finishing their degrees, about 94 percent of students with humanities Ph.D.s report being employed: 77 percent in education and 17 percent outside of it, in a wide variety of occupations, from artists and entertainers to writers, public-relations specialists, broadcasters and administrators. In the past five years of our own alumni survey, between 89 and 100 percent of humanities students with full-time employment reported that their jobs five years after graduation made use of their doctoral training. And employment outside the academy is not necessarily, or even mostly, a fallback response to failure in the academic marketplace: when asked about their primary career goals, about 17 percent of first-year students in the humanities at our institution identify goals other than research and teaching. Many of our students forgo academic jobs because they are interested in pursuing other types of employment.
 
Now it almost goes without saying that a Ph.D. in the humanities, given opportunity costs and the long-term promise of modest salaries, hardly makes sense for someone who wishes to maximize income. For one thing, the degree takes longer: the average Ph.D. recipient in the humanities spends almost nine and a half years enrolled in graduate school; the average student in the life and physical sciences, under seven. For another, securing a post-degree position takes more and more time. And, as is well known, when they do secure their jobs, humanists are paid less than those in other fields. Like it or not, we live in a culture that rewards the production of applied knowledge far more than it does the preservation, analysis or critique of culture, the rare and exceptionally well-compensated philosopher or literary critic notwithstanding. Differential salaries, from this point of view, are merely the individuated results of market forces and sociopolitical values.
 
Perhaps we should mock students less and apply ourselves more to understanding the broader structural changes in the economy, including how these changes affect the academy. Numbers showing flat (or even slightly improving) job prospects for Ph.D.s in the humanities should not obscure the underlying patterns, the most important of which are increased "casualization" and job insecurity. One may justifiably lament that adjunct and full-time, non-tenure-track jobs now constitute about 70 percent of the academic labor force, and that the path to a tenure-track position increasingly takes a detour through short-term employment. But the problems are not unique to higher education, which is a microcosm of the globalizing workplace. The decline in tenure among faculty mirrors the loss of lifelong (or at least long-term) employment in other sectors of the labor force.
 
What makes higher education distinctive is not so much that labor practices are changing, much less that students have their heads in the sand. It’s that academic employees — the readers and writers who constitute a faculty — are such sharp-eyed observers of those practices and energetic advocates for their profession.

Chase F. Robinson is distinguished professor of history and provost of the Graduate Center of the City University of New York.


Controversial Education Dean Quits at Iowa

Margaret Crocco announced Monday that she is resigning as dean of the College of Education at the University of Iowa, where her tenure has been controversial, The Des Moines Register reported. Professors have questioned her leadership, and last week all of the members of a faculty advisory committee for the college quit amid reports that administrators ordered some faculty leaders to destroy the results of a survey about Crocco's performance.

 


Poem about student writing

Since the beginning of time
Everyone knows in society today
Student writing hasn’t gotten any better
Nor is it really any worse than usual.
The sentences are still afraid of commas
And plurals and possessives share a closet.
I don’t expect much improvement
Without better nutrition and stronger threats.
Plus, there are far too many sentences
That begin with This or There followed
By big empty boxes of Is and Are.
(Perhaps this student should take a year off
And read books with real people in them.)
And I’m only talking about sentences
Not the paragraphs that struggle along
Between the left and right margins
But miraculously start and finish
At the top and bottom of each page.
Also, I was really hoping for an original title
And just once my name spelled right.

 

Laurence Musgrove is professor and chair of English and modern languages at Angelo State University.


Essay on the idea that non-philosophers should judge philosophers

One of the oldest questions of philosophy is, "Who guards the guardians?" When Plato posed this question -- if not quite this succinctly -- his concern was with how a community can keep its leaders focused on the good of the whole. Plato's answer was that guardians should govern themselves — philosophy would train their souls so that they would choose wisely rather than unjustly. Kings would become philosophers, and philosophers kings.

This is not how we do things today. In representative forms of government the people rule, at least intermittently, through processes such as voting, recalls, and referenda. Particularly within the American experiment everybody guards everyone else — through a system of "checks and balances." But there is at least one major institution that still follows Plato's lead, relying on self-governance and remaining proudly nondemocratic: the academy.

We academics have long argued that we have a special justification for self-rule. We claim that our activities — which consist of the production of knowledge, and its dissemination via presentations, publications, and teaching — are so specialized and so important that ordinary people cannot properly judge our work. Instead, we have devised a way to evaluate ourselves, through a process known as peer review.

Whether it is a matter of articles or books, grant applications, or tenure and promotion, review by one's academic peers has long been the standard. And who are one's peers? The academy gives a disciplinary answer to this question. Biologists are the ones competent to judge work in biology, and only chemists can judge the research of other chemists. Nonexperts — whether within or outside the academy — will only disrupt the process, leading to misguided or even disastrous results. Best to leave such questions to the experts.

But what of philosophy? Across the 20th century and now into the 21st, philosophers have been evaluated in the same way. Even while claiming that philosophy has a special relevance to everyday life, philosophers have mostly written for and been evaluated by their disciplinary peers. Philosophy became more and more professionalized in the 20th century, with nonexperts increasingly unable to comprehend, much less judge, the work of philosophers. A philosopher today is not considered successful unless he or she contributes to professional, peer-reviewed publications in the field.

But should philosophy really act like the other disciplines in this regard? Should philosophy be considered a "discipline" at all? And if not, what are the consequences for the governance of philosophy?

One of the oddities of present-day philosophy is how rarely this question is asked. Go to a philosophy department with a graduate program, and sign up for a course in ancient philosophy: the professor will be expected to know ancient Greek, and to be well-read in the scholarly literature in the area. The irony is that there was no secondary literature for the Greeks — no scholarship at all, in fact, in the sense that we mean it today. Philosophers were thinkers, not scholars. Socrates would never get tenure: what did he write?

This situation was partly a matter of technology; paper was expensive and reproduction of a manuscript laborious. But it is still odd to assume that Plato and Aristotle would have been good scholars if only they’d had access to the Philosopher's Index and an Internet connection. Nor were the Greeks good disciplinarians. Socrates was notorious for speaking with people from all walks of life; and when he came to be evaluated it was by a jury of his peers consisting of 500 Athenians. He may not have liked the verdict, but he did not dispute the jury's right to pass judgment.

Across the long sweep of Western history we find the point repeated: Bacon, Machiavelli, Descartes, Leibniz, Locke, Marx and Nietzsche all wrote for and sought the judgment of peers across society. One wonders what they would think of what counts as philosophy across the 20th century — a highly technical, inward-looking field that values intellectual rigor over other values such as relevance or timeliness.

Questions about who should count as a philosopher's peers are timely today, for our standard notions of academic peer review are now under assault. Publicly funded science is being held more socially accountable. At the National Science Foundation, grant proposals are now judged by both disciplinary and transdisciplinary criteria — what are called, respectively, "intellectual merit" and "broader impacts." Universities are also being held responsible for outcomes, with state funding increasingly being tied to graduation rates and other achievement measures. Philosophers, too, have begun to feel the pinch of accountability, especially in Britain, where the so-called "impact agenda" has advanced more rapidly than in the United States.

We view this situation as more of an opportunity than a problem. Philosophers should get creative and treat the question of who counts as our peers as itself a philosophic question. There are a variety of ethical, epistemological, and political issues surrounding peer review worthy of philosophic reflection. But perhaps the most pressing is the question of whether we should extend the notion of peer beyond disciplinary bounds.

This could occur in a number of different ways. Not only could we draw nonphilosophers or nonacademics into the peer review process. We could also consider a variety of other criteria, such as the number of publications in popular magazines or newspapers; the number of hits on philosophic blogs; the number of quotes in the media; or the number of grants awarded by public agencies to conduct dedisciplined philosophic work.

Now, some will claim that extending the idea of our philosophical peers to include nonphilosophers will expose philosophy to the corruptions of the demos. Is philosophizing to become a sheer popularity contest, where philosophers are promoted based on their Klout score, or the number of Facebook likes their blog posts garner? Aren’t we proposing that the Quineans be replaced by the Bieberians?

Such objections stem, in part at least, from what we could call a Cartesian ethos — the idea that philosophers should strive above all to avoid error. We should withhold our assent to any claim that we do not clearly and distinctly perceive to be true. This Cartesian ethos dominates philosophy today, and nowhere is this clearer than in regard to peer review. Our peers are our fellow philosophers, experts whose rigor stands in for Descartes' clear and distinct ideas.

For a counterethos we could call upon William James's "The Will to Believe." James argues that the pursuit of truth, even under conditions where we cannot be certain of our conclusions, is more important than the strict avoidance of error. Those who object that this will open philosophy up to all sorts of errors that would otherwise have been caught by expert peer review are exhibiting excessive Cartesianism. In fact, those who insist on the value of expertise in philosophy are reversing the Socratic approach. Whereas Socrates always asked others to contribute their opinions in pursuit of truth, Descartes trusted no one not to lead him into error. A Jamesian approach to peer review, on the other hand, would be generous in its definition of who ought to count as a peer, since avoiding error at all costs is not the main goal of philosophy. On a Jamesian approach, we would make use of peers in much the way that Socrates did — in an effort to pursue wisdom.

It is true that when philosophers broaden their peer group, they lose some control over the measures used to define philosophic excellence. This raises another risk — that philosophy will be merely an instrument for an exterior set of ends. The fear here is not that abandoning disciplinary peer review will lead us into error. Instead, it is that the only alternative to value as judged by disciplinary peers is a crass utilitarianism, where philosophic value is judged by how well it advances a paymaster’s outcome. One philosopher may be labeled a success for helping a racist political candidate hone his message, while another may be labeled a failure for not sufficiently fattening a corporation's bottom line. Isn’t a dedisciplined philosophy actually a return to sophistry rather than to Socrates? Won’t it sell its services to whoever is buying, adjusting its message to satisfy another’s agenda and criteria for success? In order to survive until the turn of the 22nd century, must we sell the soul of philosophy at the beginning of the 21st?

We have two replies to such concerns. First, philosophy existed long before the 20th-century model of academic disciplinarity came to define its soul. The struggle between philosophy and sophistry is a perennial one, and one does not necessarily sell out by writing for a larger audience — or remain pure by staying within disciplinary boundaries.

Second, disciplinary and dedisciplined approaches to philosophy should be seen as complementary rather than antagonistic to one another. Rigor should be seen as pluralistic: the rigor of disciplinary work is different from, but neither better nor worse than, the philosophic rigor required to adjust one’s thinking to real-world exigencies. This is a point that bioethicists have long understood. In his seminal 1973 essay "Bioethics as a Discipline," Daniel Callahan already saw that doing philosophical thinking with physicians, scientists, and other stakeholders demands "rigor … of a different sort than that normally required for the traditional philosophical or scientific disciplines." Bioethics exists in disciplinary and in nondisciplinary forms — in ways that synergize. It shows that we need not be forced, as a matter of general principle, to choose one set of peers over another.

Practically speaking, examining the question of who should count as a peer means that philosophers will need to revisit some of the core elements of our field. For one, our criteria for tenure and promotion would need to be reviewed. The current strict hierarchy surrounding where we publish — say, in "The Stone" (the New York Times philosophy blog) or in Mind — would need to be re-evaluated. And really, what is the argument for claiming that the latter sort of publication is of higher value? If you reply that the latter is peer-reviewed, excuse us for pointing out that your answer begs the question.

And what about the question of multiple authorship? Should this article count for less because three of us wrote it? How much less? Why? Co-authoring is actually just as challenging as producing single-authored works, as we can attest, so the justification cannot be that it is less work. Should we value independent scholarship over collaboration? Why? This is the Cartesian ethos coming back to haunt philosophy: I think; I exist; I write; I am a scholar. We doubt it.

As universities face growing demands for academic accountability, philosophers ought to take the lead in exploring what accountability means. Otherwise we may be stuck with Dickens’s Mr. Gradgrind. ("Now, what I want is Facts. Teach these boys and girls nothing but Facts. Facts alone are wanted in life.") But a philosophical account of accountability will also require redefining the boundaries of what counts as philosophy. We ought to engage those making accountability demands from the outside in just the way that Socrates engaged Euthyphro on piety. If there are indeed Bieberians at the gate, we say let them in — as long as they are willing to engage in dialogue, we philosophers should do all right. Unless it is we philosophers who refuse to engage.

 

Robert Frodeman, J. Britt Holbrook and Adam Briggle

Robert Frodeman is professor of philosophy and director of the Center for the Study of Interdisciplinarity at the University of North Texas. He was editor in chief of the Oxford Handbook of Interdisciplinarity.

J. Britt Holbrook is research assistant professor of philosophy and assistant director of the Center for the Study of Interdisciplinarity at the University of North Texas. He is editor in chief of Ethics, Science, Technology, and Engineering: An International Resource, forthcoming from Gale-Cengage Learning.

Adam Briggle is assistant professor of philosophy at the University of North Texas. He is author, with Carl Mitcham, of Ethics and Science: An Introduction (Cambridge University Press, 2012).


Furor Over Education Dean at U. of Iowa

All the members of the Faculty Advisory Committee for the College of Education at the University of Iowa have resigned amid a growing disagreement between professors and administrators over the college's direction, the Associated Press reported. The chair of the committee said he was ordered by Provost P. Barry Butler to turn over and destroy comments made in a survey about the performance of Margaret Crocco, the education dean. The comments apparently were quite negative. Faculty leaders say that the dean isn't listening to them, and that they are unsure how to move forward when administrators seem uninterested in their grievances. Butler said that the comments haven't been destroyed and will be considered. Crocco said that she was working to improve communication with the college's faculty members.


NJ Medical University Pays $4.65M to Settle Bias Suit

The University of Medicine and Dentistry of New Jersey has agreed to pay $4.65 million to settle a class action charging the institution with bias against female faculty members, NewJersey.com reported. The university declined to comment on the agreement. The suit charged that sex discrimination was behind a $20,000 gender gap in the mean salaries of full professors who had been at the university for at least 10 years -- even though the women in the sample brought in more research grants and had more teaching responsibilities than their male counterparts did.

 


Essay on landing an academic job when not expecting to

On the Fence

When Eliza Woolf gave up on finding a good academic job, she landed one.


Academic Freedom and Harkin Institute at Iowa State

Senator Tom Harkin, an Iowa Democrat, is calling on Iowa State University to end restrictions on agriculture-related research at a new public policy institute there that bears his name, The Des Moines Register reported. Harkin said that the restrictions violate academic freedom, and that he might ask to have his name removed from the center if they aren't lifted. The university has said that the new institute must coordinate all agriculture-related research with Iowa State's Center for Agricultural and Rural Development. Faculty members involved in the new institute say that this requirement could limit their work, and some see it as a way to assure that research agendas are consistent with those of the state's major agriculture industries, which support the Center for Agricultural and Rural Development.


MLA president says reforming graduate education in the humanities requires hard decisions


Michael Bérubé tells graduate school deans that the issues are complicated and interconnected.

Essay critiques the ideas of Clay Shirky and others advocating higher ed disruption

Clay Shirky is a big thinker, and I read him because he’s consistently worth reading. But he’s not always right – and his thinking (and the flaws in it) is typical of the unquestioning enthusiasm of many thinkers today about technology and higher education. In his recent piece on "Napster, Udacity, and the Academy," for example, Shirky is not only guardedly optimistic about the ways that MOOCs and online education will transform higher education, but he takes for granted that they will, that there is no alternative. Just as inevitably as digital sharing turned the music industry on its head, he pronounces, so it is and will be with digital teaching. And as predictably as rain, he anticipates that "we" in academe will stick our heads in the sand, will deny the inevitable -- as the music industry did with Napster -- and will "screw this up as badly as the music people did." His views are shared by many in the "disruption" school of thought about higher education.

I suspect that if you agree with Clay Shirky that teaching is analogous to music, then you are likely to be persuaded by his assertion that Udacity -- a lavishly capitalized educational startup company -- is analogous to Napster. If you are not impressed with this analogy, however, you will not be impressed by his argument. And just to put my cards on the table, I am not very impressed with his argument. I think teaching is very different from music -- so different that the comparison obscures a lot more than it reveals.

But the bigger problem is that this kind of argument is weighted against academics, virtually constructed so as to make it impossible for an academic to reply. If you observe that "institutions will try to preserve the problem to which they are the solution," after all -- what has been called "The Shirky Rule" -- it can be easy to add the words "all" and "always" to a sentence in which they do not belong. This is not a principle or a rule; it’s just a thing that often happens, and often is not always. But if you make the mistake of thinking that it is, you can become uniformly prejudiced against "institutions," since you literally know in advance what they will do and why. Because you understand them better than they understand themselves -- because they don’t or can’t realize that they are simply "institutions" -- you can explain things about them that they can neither see nor argue against. "Why are you so defensive?" you ask, innocently, and everything they say testifies against them.

If someone like me -- a graduate student for many years, currently trying to find an academic job -- looks at MOOCs and online education and sees the downsides very clearly, it’s also true that no one has a more strongly vested interest in arguing the benefits of radically transforming academe than Clay Shirky and the others who talk about the inevitability of radical change. As Chuck Klosterman once unkindly put it, "Clay Shirky must argue that the Internet is having a positive effect – it’s the only reason he’s publicly essential." Which is not to say that Shirky is wrong, simply that he must prove, not presume, that he is right.

I have to go through this excessively long wind-up because of the ways that Shirky has stacked the rhetorical deck in his favor. He uses the word "we" throughout his piece, and in this powerful final paragraph, he hammers us over the head with it, so precisely that we might mistake it for a caress:

"In the academy, we lecture other people every day about learning from history. Now it's our turn, and the risk is that we’ll be the last to know that the world has changed, because we can’t imagine — really cannot imagine — that story we tell ourselves about ourselves could start to fail. Even when it’s true. Especially when it’s true."

But what do you mean "we," Mr. Distinguished Writer in Residence? I asked Shirky on Twitter if he considered himself primarily an academic, and though he didn’t respond, it’s important that he frames his entire post as if he’s an insider. But while it’s certainly true that I am biased in favor of academic labor continuing to exist in something like its present form, he is no less biased by having nothing to lose and everything to gain if academe is flipped on its head. And yet the cumulative rhetorical effect of his framing is to remind us that no one within the institution can speak knowledgeably about their institution, precisely because of their location within it; when Shirky speaks of "we" academics, he does so only to emphasize that "we" can’t imagine that the story we tell ourselves is wrong.

It's because he is willing to burn the village to save it that Shirky can speak for and of academe. Shirky never has to show evidence that online education will ever be any good: he notes an academic’s assessment of a Udacity course as "amazingly, shockingly awful" and is then, apparently, satisfied when Udacity admits that its courses "can be improved in more than one way." A defensive blog post written by Udacity’s founder is enough to demonstrate that change for the better is happening. And when the academic who criticized the Udacity course mentions a colleague whose course showed some of the same problems -- but does not name the colleague -- Shirky is triumphant. The academic in question "could observe every aspect of Udacity’s Statistics 101 (as can you) and discuss them in public," Shirky observes, "but when criticizing his own institution, he pulled his punches."

This is Clay Shirky’s domain, and also the domain of so many others who point to one or another failing of traditional higher ed to suggest that radical change is needed: the anecdote that illustrates something larger. In this case, the fact that academe is a "closed" institution means it cannot grow, change, or improve. By contrast, "[o]pen systems are open" seems to be the end of the discussion; when he contemplates the openness of a MOOC, the same definitional necessity applies. "It becomes clear," he writes, "that open courses, even in their nascent state, will be able to raise quality and improve certification faster than traditional institutions can lower cost or increase enrollment." It becomes clear because it is clear, because "open" is better, because it is open.

But how "open" is Udacity, really? Udacity’s primary obligation is to its investors. That reality will always push it to squeeze as much profit out of its activities as it can. This may make Udacity better at educating, but it also may not; the job of a for-profit entity is not to educate, but to profit, and it will. There’s nothing necessarily wrong with for-profit education -- and most abuses can be traced back to government deregulation, not tax status -- but the idea that "openness," as such, will magically transform how a business does business is a massively begged question. A bit of bad press can get Sebastian Thrun to write a blog post promising change, but actually investing the resources necessary to follow through on that is actually a very different question. The fact that someone like Shirky takes him at face value -- not only gives him the benefit of the doubt, but seems to have no doubt at all -- speaks volumes to me.

Meanwhile, did the academic that Shirky criticizes really "pull his punches"? Did he refrain from naming his colleague because of the way academics instinctively shield each other from criticism? It’s far from clear; if you read the original blog post, in fact, it’s not even apparent that the academic knew who this "colleague" actually was. All we really know is that a student referred to something her "last teacher" did. But suppose he did know who this student’s last teacher was; suppose the student mentioned the teacher by name. Would it have been appropriate to post someone’s name on the Internet just because a secondhand source told you something bad about them? Does that count as openness?

Open vs. closed is a useful conceptual distinction, but when it comes down to specific cases, these kinds of grand narratives can mislead us. For one thing, far from the kind of siege mentality that characterized an industry watching its business model go up in smoke -- an industry that was not interested in giving away its product for free -- academics are delighted to give away their products for free, if they can figure out a way to do it. Just about every single public and nonprofit university in the country is working to develop digital platforms for education, or thinking hard about how they can. This doesn’t mean they are doing it successfully, or well; time will tell, and the proof will be in the pudding. But to imagine that Silicon Valley venture capitalists are the only people who see the potential of these technologies requires you to ignore the tremendous work that academics are currently doing to develop new ways of doing what they do. The most important predecessors to MOOCs, after all, were things like Massachusetts Institute of Technology's OpenCourseWare, designed entirely in the spirit of openness and not in search of profit.

The key difference between academics and venture capitalists, in fact, is not closed versus open but evidence versus speculation. Academics require evidence of success before declaring victory, while venture capitalists can afford to gamble on the odds. When Shirky sees the future revolutionizing in front of us, he is thinking like a venture capitalist, betting on optimism because he can afford to lose. He doesn’t know that he’s right; he just knows that he might not be wrong. And so, like all such educational futurologists, Shirky makes an essentially defensive case for MOOCs: he argues against the arguments against MOOCs, taking shelter in the possibility of what isn’t, yet, but may someday be.

For example, instead of arguing that MOOCs really can provide "education of the very best sort," Shirky explicitly argues that we should not hold them to this standard. Instead of thinking in terms of quality, we should talk about access: from his perspective, the argument against MOOCs is too narrowly focused on the "18-year-old who can set aside $250k and four years" and so it neglects to address students who are not well-endowed with money and time. "Outside the elite institutions," Shirky notes, "the other 75 percent of students — over 13 million of them — are enrolled in the four thousand institutions you haven’t heard of." And while elite students will continue to attend elite institutions, "a good chunk of the four thousand institutions you haven’t heard of provide an expensive but mediocre education."

This is a very common argument from MOOC boosters, because access is a real problem. But while a "good chunk" of 13 million students are poorly served by the present arrangement, it is quite telling that his examples of "expensive but mediocre education" are Kaplan and the University of Phoenix, for-profit institutions that are beloved by the same kinds of venture capitalists who are funding Udacity. He is right: for-profit education has amassed a terrible track record. If you are getting a degree at a for-profit institution, you probably are paying too much for too little. But would it be any less mediocre if it were free?

Udacity’s courses are free to consumers (though not, significantly, to universities), at least for now. And Shirky is not wrong that "demand for knowledge is so enormous that good, free online materials can attract extraordinary numbers of people from all over the world." But Shirky doesn’t mean "demand" in the economic sense: demand for a free commodity is just desire until it starts to pay for the thing it wants. Since there is a lot of unmet desire for education out there, and since that desire is glad to have the thing it wants when it finds it for free, it seems all to the good that students can find courses for free. But while we should ask questions about why venture capitalists are investing so heavily in educational philanthropy, we also need to think more carefully about why there is so much unmet desire in the first place, and why so many people want education without, apparently, being able to pay for it. Why hasn’t that desire already found a way to become demand, such that it must wait until Silicon Valley venture capitalists show up, benevolently bearing the future in their arms?

The giveaway is when Shirky uses the phrase "non-elite institutions": for Shirky, there are elite institutions for elite students and there are non-elites for everyone else. The elite institutions will remain the same. No one will ever choose Udacity over Harvard or U.Va., and while elite institutions like MIT, Stanford, Princeton, and my own University of California are leaping into the online education world head first, anyone who thinks these online brands will ever compete with "the real thing" will be exactly the kind of sucker who would fork over full price for a watered-down product.

MOOCs are only better than nothing, and speculation that this will someday change remains, for now, just that: speculation. It should be no surprise that venture capital is interested in speculation. And it should be no surprise that when academics look at the actual track record, when we try to evaluate the evidence rather than the hope, we discover a great deal to be pessimistic about.

Why have we stopped aspiring to provide the real thing for everyone? That’s the interesting question, I think, but if we begin from the distinction between "elite" and "non-elite" institutions, it becomes easy to take for granted that "non-elite students" receiving cheap education is something other than giving up. It is important to note that when online education boosters talk about "access," they explicitly do not mean access to "education of the best sort"; they mean that because an institution like Udacity provides teaching for free, you can’t complain about its mediocrity. It’s not an elite institution, and it’s not for elite students. It just needs to be cheap.

Talking in terms of "access" (instead of access to what?) allows people like Shirky to overlook the elephant in the room, which is the way this country used to provide inexpensive and high-quality education to all sorts of people who couldn’t afford to go to Yale -- people like me and my parents. While state after state is defunding its public colleges and universities (and so tuition is rising while quality is declining), the vast majority of American college students are still educated in public colleges and universities, institutions that have traditionally provided very high-quality mass higher education, and which did it nearly for free barely a generation ago.

"Access" wouldn’t even be a problem if we didn’t expect mass higher education to still be available: Americans only have the kind of reverence for education that we have because the 20th century made it possible for the rising middle class to have what had previously been a mark of elite status, a college education. But the result of letting these public institutions rot on the vine is that a host of essentially parasitic institutions -- like Udacity -- are sprouting like mushrooms on the desire for education that was created by the existence of the world’s biggest and best public mass higher education system.

Shirky talks dismissively about his own education, at Yale, and recalls paying a lot of money to go to crowded lectures and then to discussion sections with underpaid graduate students. Let me counter his anecdote with my own. When I was a high school student in Appalachian Ohio, I told my guidance counselor that I wanted to go to Harvard, and he made me understand that people from Fairland High School do not really go to Harvard. I was a dumb high school student, so I listened to him. But although both of my parents worked in West Virginia, they had moved to Ohio when I was young so that I could go to Ohio schools, and this meant that although my grades were only moderately good -- and I had never had access to Advanced Placement classes -- I was able to apply to Ohio State University, get in, afford it, and get an education that was probably better than the one Shirky got at Yale, and certainly a heck of a lot cheaper. My parents paid my rent, but I paid my tuition myself -- with part-time jobs and $20,000 in loans -- and I didn’t have a single class in my major with more than 30 students. I had one-on-one access to all of my professors, and I took advantage of it.

It's a lot harder to do this now, of course; tuition at Ohio State is more than double what it was when I started in 1997. More important, you not only pay a lot more if you go to a school like Ohio State, you’re also a lot less likely to get in; the country’s college-age population has continued to grow, but the number of acceptance letters that public universities like OSU send out has not increased. As Mike Konczal and I have argued, this shortfall in quality higher education creates what economists call "fake supply." If you don’t get in to a college specializing in education "of the best sort" (or if your guidance counselor tells you not to apply), where do you go, if you go? You go to an online university, to Kaplan, or maybe now you try a MOOC or a public college relying on MOOCs to provide general education, as Texas now envisions. Such things are better than nothing. But "nothing" only seems like the relevant point of comparison if we pretend that public higher education doesn’t exist. And if we ignore the fact that we are actively choosing to let it cease to exist.

Beware anyone who tries to give you a link to WebMD as a replacement for seeing a real doctor.

 

Aaron Bady is a doctoral candidate in English literature at the University of California at Berkeley, and he writes and tweets for The New Inquiry as @zunguzungu.
