Submitted by Aaron Bady on December 6, 2012 - 3:00am
Clay Shirky is a big thinker, and I read him because he’s consistently worth reading. But he’s not always right – and his thinking (and the flaws in it) is typical of the unquestioning enthusiasm of many thinkers today about technology and higher education. In his recent piece on "Napster, Udacity, and the Academy," for example, Shirky is not only guardedly optimistic about the ways that MOOCs and online education will transform higher education, but he takes for granted that they will, that there is no alternative. Just as inevitably as digital sharing turned the music industry on its head, he pronounces, so it is and will be with digital teaching. And as predictably as rain, he anticipates that "we" in academe will stick our heads in the sand, will deny the inevitable -- as the music industry did with Napster -- and will "screw this up as badly as the music people did." His views are shared by many in the "disruption" school of thought about higher education.
I suspect that if you agree with Clay Shirky that teaching is analogous to music, then you are likely to be persuaded by his assertion that Udacity -- a lavishly capitalized educational startup company -- is analogous to Napster. If you are not impressed with this analogy, however, you will not be impressed by his argument. And just to put my cards on the table, I am not very impressed with his argument. I think teaching is very different from music -- so different that the comparison obscures a lot more than it reveals.
But the bigger problem is that this kind of argument is weighted against academics, virtually constructed so as to make it impossible for an academic to reply. If you observe that "institutions will try to preserve the problem to which they are the solution," after all -- what has been called "The Shirky Rule" -- it can be easy to add the words "all" and "always" to a sentence in which they do not belong. This is not a principle or a rule; it's just a thing that often happens, and "often" is not "always." But if you make the mistake of thinking that it is, you can become uniformly prejudiced against "institutions," since you literally know in advance what they will do and why. Because you understand them better than they understand themselves -- because they don't or can't realize that they are simply "institutions" -- you can explain things about them that they can neither see, nor argue against. "Why are you so defensive?" you ask, innocently, and everything they say testifies against them.
If someone like me -- a graduate student for many years, currently trying to find an academic job -- looks at MOOCs and online education, and sees the downsides very clearly, it's also true that no one has a more strongly vested interest in arguing the benefits of radically transforming academe than Clay Shirky and a number of others who talk about the inevitability of radical change. As Chuck Klosterman once unkindly put it, "Clay Shirky must argue that the Internet is having a positive effect -- it's the only reason he's publicly essential." Which is not to say that Shirky is wrong, simply that he must prove, not presume, that he is right.
I have to go through this excessively long wind-up because of the ways that Shirky has stacked the rhetorical deck in his favor. He uses the word "we" throughout his piece, and in this powerful final paragraph, he hammers us over the head with it, so precisely that we might mistake it for a caress:
"In the academy, we lecture other people every day about learning from history. Now it's our turn, and the risk is that we’ll be the last to know that the world has changed, because we can’t imagine — really cannot imagine — that story we tell ourselves about ourselves could start to fail. Even when it’s true. Especially when it’s true."
But what do you mean "we," Mr. Distinguished Writer in Residence? I asked Shirky on Twitter if he considered himself primarily an academic, and though he didn’t respond, it’s important that he frames his entire post as if he’s an insider. But while it’s certainly true that I am biased in favor of academic labor continuing to exist in something like its present form, he is no less biased by having nothing to lose and everything to gain if academe is flipped on its head. And yet the cumulative rhetorical effect of his framing is to remind us that no one within the institution can speak knowledgeably about their institution, precisely because of their location within it; when Shirky speaks of "we" academics, he does so only to emphasize that "we" can’t imagine that the story we tell ourselves is wrong.
It's because he is willing to burn the village to save it that Shirky can speak for and of academe. Shirky never has to show evidence that online education will ever be any good; he notes an academic's assessment of a Udacity course as "amazingly, shockingly awful" and is then, apparently, satisfied when Udacity admits that its courses "can be improved in more than one way." A defensive blog post written by Udacity's founder is enough to demonstrate that change for the better is happening. And when the academic who criticized the Udacity course mentions a colleague whose course showed some of the same problems -- but does not name the colleague -- Shirky is triumphant. The academic in question "could observe every aspect of Udacity's Statistics 101 (as can you) and discuss them in public," Shirky observes, "but when criticizing his own institution, he pulled his punches."
This is Clay Shirky's domain, and also the domain of so many others who point to one or another failing of traditional higher ed to suggest that radical change is needed: the anecdote that illustrates something larger. In this case, the fact that academe is a "closed" institution means it cannot grow, change, or improve. By contrast, "[o]pen systems are open" seems to be the end of the discussion; when he contemplates the openness of a MOOC, the same definitional necessity applies. "It becomes clear," he writes, "that open courses, even in their nascent state, will be able to raise quality and improve certification faster than traditional institutions can lower cost or increase enrollment." It becomes clear because it is clear, because "open" is better, because it is open.
But how "open" is Udacity, really? Udacity's primary obligation is to its investors. That reality will always push it to squeeze as much profit out of its activities as it can. This may make Udacity better at educating, but it also may not; the job of a for-profit entity is not to educate, but to profit, and it will. There's nothing necessarily wrong with for-profit education -- and most abuses can be traced back to government deregulation, not tax status -- but the idea that "openness," as such, will magically transform how a business does business is a massively begged question. A bit of bad press can get Sebastian Thrun to write a blog post promising change, but actually investing the resources necessary to follow through on that promise is a very different question. The fact that someone like Shirky takes him at face value -- not only gives him the benefit of the doubt, but seems to have no doubt at all -- speaks volumes to me.
Meanwhile, did the academic that Shirky criticizes really "pull his punches"? Did he refrain from naming his colleague because of the way academics instinctively shield each other from criticism? It’s far from clear; if you read the original blog post, in fact, it’s not even apparent that the academic knew who this "colleague" actually was. All we really know is that a student referred to something her "last teacher" did. But suppose he did know who this student’s last teacher was; suppose the student mentioned the teacher by name. Would it have been appropriate to post someone’s name on the Internet just because a secondhand source told you something bad about them? Does that count as openness?
Open vs. closed is a useful conceptual distinction, but when it comes down to specific cases, these kinds of grand narratives can mislead us. For one thing, far from the kind of siege mentality that characterized an industry watching its business model go up in smoke -- an industry that was not interested in giving away its product for free -- academics are delighted to give away their products for free, if they can figure out a way to do it. Just about every single public and nonprofit university in the country is working to develop digital platforms for education, or thinking hard about how they can. This doesn’t mean they are doing it successfully, or well; time will tell, and the proof will be in the pudding. But to imagine that Silicon Valley venture capitalists are the only people who see the potential of these technologies requires you to ignore the tremendous work that academics are currently doing to develop new ways of doing what they do. The most important predecessors to MOOCs, after all, were things like Massachusetts Institute of Technology's OpenCourseWare, designed entirely in the spirit of openness and not in search of profit.
The key difference between academics and venture capitalists, in fact, is not closed versus open but evidence versus speculation. The thing about academics is that they require evidence of success before declaring victory, while venture capitalists can afford to gamble on the odds. When Shirky sees the future taking shape in front of us, he is thinking like a venture capitalist, betting on optimism because he can afford to lose. He doesn't know that he's right; he just knows that he might not be wrong. And so, like that of all such educational futurologists, Shirky's case for MOOCs is essentially defensive: he argues against the arguments against MOOCs, taking shelter in the possibility of what isn't, yet, but which may someday be.
For example, instead of arguing that MOOCs really can provide "education of the very best sort," Shirky explicitly argues that we should not hold them to this standard. Instead of thinking in terms of quality, we should talk about access: from his perspective, the argument against MOOCs is too narrowly focused on the "18-year-old who can set aside $250k and four years" and so it neglects to address students who are not well-endowed with money and time. "Outside the elite institutions," Shirky notes, "the other 75 percent of students — over 13 million of them — are enrolled in the four thousand institutions you haven’t heard of." And while elite students will continue to attend elite institutions, "a good chunk of the four thousand institutions you haven’t heard of provide an expensive but mediocre education."
This is a very common argument from MOOC boosters, because access is a real problem. But while a "good chunk" of 13 million students are poorly served by the present arrangement, it is quite telling that his examples of "expensive but mediocre education" are Kaplan and the University of Phoenix, for-profit institutions that are beloved by the same kinds of venture capitalists who are funding Udacity. He is right: For-profit education has amassed a terrible track record of failure. If you are getting a degree at a for-profit institution, you probably are paying too much for too little. But would it be any less mediocre if it were free?
Udacity's courses are free to consumers (though not, significantly, to universities), at least for now. And Shirky is not wrong that "demand for knowledge is so enormous that good, free online materials can attract extraordinary numbers of people from all over the world." But Shirky doesn't mean "demand" in the economic sense: demand for a free commodity is just desire until it starts to pay for the thing it wants. Since there is a lot of unmet desire for education out there, and since that desire is glad to have the thing it wants when it finds it for free, it seems all to the good that students can find courses for free. But while we should ask questions about why venture capitalists are investing so heavily in educational philanthropy, we also need to think more carefully about why there is so much unmet desire in the first place, and why so many people want education without, apparently, being able to pay for it. Why hasn't that desire already found a way to become demand, such that it must wait until Silicon Valley venture capitalists show up, benevolently bearing the future in their arms?
The giveaway is when Shirky uses the phrase "non-elite institutions": for Shirky, there are elite institutions for elite students and there are non-elites for everyone else. The elite institutions will remain the same. No one will ever choose Udacity over Harvard or U.Va., and while elite institutions like MIT, Stanford, Princeton, and my own University of California are leaping into the online education world head first, anyone who thinks these online brands will ever compete with "the real thing" will be exactly the kind of sucker who would fork over full price for a watered-down product.
MOOCs are, for now, only better than nothing; speculation that this will someday change may be worth pursuing, but it remains just that: speculation. It should be no surprise that venture capital is interested in speculation. And it should be no surprise that when academics look at the actual track record, when we try to evaluate the evidence rather than the hope, we discover a great deal to be pessimistic about.
Why have we stopped aspiring to provide the real thing for everyone? That’s the interesting question, I think, but if we begin from the distinction between "elite" and "non-elite" institutions, it becomes easy to take for granted that "non-elite students" receiving cheap education is something other than giving up. It is important to note that when online education boosters talk about "access," they explicitly do not mean access to "education of the best sort"; they mean that because an institution like Udacity provides teaching for free, you can’t complain about its mediocrity. It’s not an elite institution, and it’s not for elite students. It just needs to be cheap.
Talking in terms of "access" (instead of access to what?) allows people like Shirky to overlook the elephant in the room, which is the way this country used to provide inexpensive and high-quality education to all sorts of people who couldn’t afford to go to Yale -- people like me and my parents. While state after state is defunding its public colleges and universities (and so tuition is rising while quality is declining), the vast majority of American college students are still educated in public colleges and universities, institutions that have traditionally provided very high-quality mass higher education, and which did it nearly for free barely a generation ago.
"Access" wouldn’t even be a problem if we didn’t expect mass higher education to still be available: Americans only have the kind of reverence for education that we have because the 20th century made it possible for the rising middle class to have what had previously been a mark of elite status, a college education. But the result of letting these public institutions rot on the vine is that a host of essentially parasitic institutions -- like Udacity -- are sprouting like mushrooms on the desire for education that was created by the existence of the world’s biggest and best public mass higher education system.
Shirky talks dismissively about his own education, at Yale, and recalls paying a lot of money to go to crowded lectures and then to discussion sections with underpaid graduate students. Let me counter his anecdote with my own. When I was a high school student, in Appalachian Ohio, I told my guidance counselor that I wanted to go to Harvard, and he made me understand that people from Fairland High School do not really go to Harvard. I was a dumb high school student, so I listened to him. But although both of my parents worked in West Virginia, they had moved to Ohio when I was young so that I could go to Ohio schools, and this meant that although my grades were only moderately good -- and I had never had access to Advanced Placement classes -- I was able to apply to Ohio State University, get in, afford it, and get an education that was probably better than the one that Shirky got at Yale, and certainly a heck of a lot cheaper. My parents paid my rent, but I paid my tuition myself -- with part-time jobs and $20,000 in loans -- and I didn't have a single class in my major with more than 30 students. I had one-on-one access to all of my professors, and I took advantage of it.
It's a lot harder to do this now, of course; tuition at Ohio State is more than double what it was when I started in 1997. More important, you not only pay a lot more if you go to a school like Ohio State, you're also a lot less likely to get in; the country's college-age population has continued to grow, but the number of acceptance letters that public universities like OSU send out has not increased. As Mike Konczal and I have argued, this shortfall in quality higher education creates what economists call "fake supply." If you don't get into a college specializing in education "of the best sort" (or if your guidance counselor tells you not to apply), where do you go, if you go? You go to an online university, to Kaplan, or maybe now you try a MOOC or a public college relying on MOOCs to provide general education, as Texas now envisions. Such things are better than nothing. But "nothing" only seems like the relevant point of comparison if we pretend that public higher education doesn't exist, and if we ignore the fact that we are actively choosing to let it cease to exist.
Beware anyone who tries to give you a link to WebMD as a replacement for seeing a real doctor.
Aaron Bady is a doctoral candidate in English literature at the University of California at Berkeley, and he writes and tweets for The New Inquiry as @zunguzungu.
Preschool teachers are the Rodney Dangerfields of the teaching profession, the "we don’t get no respect" gang. They’re often dismissed, even by their K-12 colleagues, as babysitters and not "real" teachers, but nothing could be further from the truth. The time I’ve recently spent crouching in classrooms, watching how 3- and 4-year-olds explore their universe with the aid of an inspiring guide, convinces me that these teachers are the best in the business. They're changing the arc of children’s lives — and they have a lot to teach the rest of us.
The job of a prekindergarten teacher is unbelievably demanding — if you doubt it, just spend a morning in a classroom filled with 3- and 4-year-olds. Because of the rapidity with which their brains are developing, those kids learn far more rapidly than even our smartest students — think of them as little Lewises and Clarks on their own journeys of discovery. Every teacher relishes the teachable moments, the occasions when you can almost see the lightbulbs of dawning comprehension, because for many students after their early years they’re so rare and special. Each day in a preschool classroom brings a meteor shower of these moments.
College professors usually know what needs to be taught. But for many academics, that knowledge of our own field is the only thing we bring to the classroom. We spend almost no time thinking about how to teach. Though new instructional strategies have proliferated, professors aren’t taught how to teach. They must pick up these new tools on their own, and many don’t bother.
There’s abundant evidence, for instance, that lectures rarely engage students' minds: college students pay attention to the lecturer just 40 percent of the time and retain even less of what’s being said. Still, the "sage on the stage" remains the norm, and in big universities classes of 100 and more are common. Lectures offer a way of saving money and professors’ time, dressed up in the rationale that students are empty vessels into which knowledge can be poured. To the question, "How did your class go?" an all-too-common response is "I gave a good lecture." But this isn’t how learning usually occurs.
Good prekindergarten teachers not only know what to teach; they also know how they can have the biggest impact. They've learned a host of ways to teach reading and math, art and science, gymnastics and music and much more. What's equally important, they've studied how children's minds and emotions develop. They understand that learning isn't a spectator sport.
To be sure, preschoolers spend part of the day in "circle time," huddled together with their eyes glued to the teacher; that's the pre-k equivalent of a lecture, though often considerably more enticing. But those lightbulbs really turn on when these 3- and 4-year-olds are trying out ideas, either on their own or with a few classmates, making mistakes and trying again, as the teacher scans the room, chipping in when kids get stumped.
In these classrooms a lot is occurring simultaneously — while the teacher may be writing down children's stories that will later be acted out by fellow students, some kids may be painting, others constructing bridges or performing experiments, still others staffing a doctor's office or ordering pizza. And some will be curled up with a picture book from the classroom library.
I became familiar with this world when I spent time crouching in classrooms in Union City, New Jersey, across the Hudson from Manhattan. Union City is the most crowded and one of the most impoverished municipalities in America, and students in such communities are often marked for failure. That's not the case here — these schools, which I write about in Improbable Scholars, are bringing poor immigrant Latino kids (school officials estimate that 30 percent are undocumented) into the educational mainstream. In 2011, the last year for which official figures are available, the high school graduation rate was 89 percent — that's about 15 percent higher than the national average — and 60 percent of the graduates enrolled in college. Ask the administrators how Union City manages this feat and they'll tell you that delivering good early education makes a critical difference.
The best way to appreciate what’s so remarkable about prekindergarten is by looking closely at what’s going on there. Walk into Suzy Rojas’s classroom and you’ll see art plastering the walls, plants hanging from the ceiling. In every niche there’s something to seize a child’s imagination. Three boys whom I’ll call Angel, Victor and Rodrigo are peering at insects through a microscope, and they’re happy to explain to me what they’re seeing. "Remember when we went to the museum and the butterfly landed on my arm?" Angel asks his friends.
Suzy has joined the conversation. "Are these all insects?" she wonders aloud. "How do you know?" "That one has eight legs," Victor responds, “and that means it’s not an insect.” Then Suzy brings over a prism. "What do you see when you look through it?" she asks, and Rodrigo looks up to say that he can’t tell them apart, that they look like leaves. "Why do you think so?" she inquires. The boys have already learned about lenses, and she tells them that the prism is a special kind of lens.
There’s still more to be gleaned from these creatures. "How about an insect salad — would you want to eat it?" Suzy inquires, and when the boys chorus "ugh," she bounces it back to them: "How come?" They stare once more at the insects. "How many parts does an insect body have? Do you remember what they’re called?" Neville knows the answer: "Three parts — the antenna, abdomen and legs."
"It's all about exposure to concepts — wide, narrow, long, short," Suzy tells me. " 'I have three brothers, three sisters and an uncle — let's graph that.' I bring in breads from different countries. 'Let's do a pie chart showing which one you liked the best.' " Stop for a moment to consider how we expect our students to absorb concepts — passively, for the most part. "I don't ask them to memorize 1, 2, 3 or A, B, C," Suzy adds. "I could teach a monkey to count." So much for making college students memorize facts and regurgitate them on the midterm, only to realize that in a couple of weeks most of that information has been forgotten.
Suzy Rojas’ students aren’t simply acquiring an understanding of cognitive concepts. They’re also coming to understand why you should wait your turn, how to share, how to manage your own feelings — the emotional skills that report cards once summarized as "works and plays well with others." (I’ve attended faculty meetings whose participants must have missed those lessons.) Back in the classroom, Suzy leaves Rodrigo and his friends, turning to several students who are solving a puzzle on a computer. But when she sees Victor and Rodrigo fighting over who gets the next look at the insects, she quickly returns. "Use your words," she says — familiar teacher-talk — but then she adds a twist. "What can we do?” “We,” not “you”: the boys think about it. "How about adding another container for insects," she suggests. “That way you can all take turns.”
Cognitive and noncognitive, thinking and feeling, Descartes’ mind-body dualism — in a good preschool classroom these distinctions vanish. The teacher is always on the lookout for both kinds of lessons, aiming to reach both head and heart. College students are more mature, of course — fights don’t break out in our classrooms — but if we ignore their emotional responses we risk irrelevance. Our students often react to what’s being said in class at an emotional as well as an intellectual level, paying attention to how the message is being delivered, not just its content. If a professor is so busy imparting knowledge that he misses the students’ body language — the arms folded in "show me" posture or the fingers busily tweeting — he’s lost the class.
Suzy Rojas's approach to teaching offers a reminder that professors can do better. We need to rely less on lectures, varying the classroom experience with give-and-take discussion and breakout groups, online learning, outside experts who can join the conversation, student-led classes and group research projects. And we should check in with the students — midcourse corrections can make a world of difference.
There are days when preschoolers come to school agog about what's happening in their world, a fierce snowstorm or a great movie they've seen over the weekend, and a talented pre-k teacher like Suzy Rojas knows how to incorporate their excitement into her lesson. That's another takeaway — finding ways to incite our students into thinking hard matters infinitely more than marching them through the syllabus.
David L. Kirp is the James D. Marver Professor of Public Policy at the University of California at Berkeley. He is the author of the forthcoming Improbable Scholars: The Rebirth of a Great American School District and a Strategy for America's Schools.
In my 14-year tenure as president I have often been asked to define and defend the notion of a "useful" liberal arts education. The general public has difficulty associating the liberal arts with anything useful. That obstacle prompts them to dismiss liberal arts colleges as repositories of graduates with majors such as philosophy, history, anthropology and American studies who cannot get jobs. The thought that these same colleges also have majors such as biology, chemistry, physics and economics is totally missed.
The public is not to blame. American higher education never really experienced the American Revolution. While we threw away the oppressive dictates of monarchy, we never threw off the privileged notion of an English upper-class liberal education, one literally defined as being only for those with sufficient wealth to do nothing professionally but dabble in learning. We remained enthralled by the notion of learning for learning's sake and, despite our emerging pragmatic nature, wanted our education to remain sublime and removed from the business of life.
There were prominent founders of the nation who argued for a new kind of liberal education for a new kind of nation. Thomas Jefferson urged a "practical education" for his University of Virginia. And Benjamin Rush, the founder of Dickinson College, decried the unwillingness of Americans to reform education after the Revolution:
It is equally a matter of regret, that no accommodation has been made in the system of education in our seminaries [colleges] to the new form of our government and the many national duties, and objects of knowledge, that have been imposed upon us by the American Revolution. Instead of instructing our sons in the Arts most essential to their existence, and in the means of acquiring that kind of knowledge which is connected to the time, the country, and the government in which they live, they are compelled to spend [time] learning two languages which no longer exist, and are rarely spoken, which have ceased to be the vehicles of Science and literature, and which contain no knowledge but what is to be met with in a more improved and perfect state in modern languages. We have rejected hereditary power in the governments of our country. But we continue the willing subjects of a system of education imposed upon us by our ancestors in the fourteenth and fifteenth centuries. Had agriculture, mechanics, astronomy, navigation and medicine been equally stationary, how different from the present would have been the condition of mankind!
But these singular calls for a more pragmatic education in America to match a new form of government went largely unheeded. Rush’s founding of Dickinson is particularly illustrative. In his 1785 "Plan of Education" he called for a "useful liberal education." The curriculum was to be absent instruction in the writing and speaking of Greek and Latin, but rich in instruction of German, French, Spanish and even Native American languages as those would be highly useful to Americans striving to establish a native economy that would grow as it interacted linguistically with trading nations throughout the world and in the United States. Democracy was to be established through commerce informed by useful liberal education. Liberal education, commerce and democracy were interdependent. The Dickinson course of study was also to include chemistry as Rush thought this subject held the greatest number of connections to emerging knowledge useful to the nation.
The first president of the college and Rush’s fellow trustees ignored his plan. They recommitted to what Rush once called "the monkish" course of study, unchanged for centuries.
Latin and Greek were taught and a chemistry professor was not hired. Additionally, the college refused to hire a German professor. Rush was so angry that he founded nearby what was called Franklin College (today Franklin and Marshall College). It wasn’t until 1999 that Rush’s notion of a "useful" liberal education was reintroduced and embraced explicitly as part of a revised mission statement some 216 years after it was introduced.
Unfortunately for those in America today who wish to argue the usefulness, and thus the worthiness, of a liberal arts education, the founding fathers were not explicit. We know that a liberal education was to yield informed citizens who could build and protect the new government. We know that certain courses were to be taken out and others inserted — those that related more to emerging and immediately explicable knowledge, expanded the appreciation of democracy and created new knowledge and wealth that would materially power the nation’s development. A useful liberal arts education was essentially entrepreneurial. But for all the novelty and potent force in this "disruptive technology" in American higher education introduced by the founding fathers, we know little about how a liberal arts education actually becomes useful — that is, how the study of the liberal arts converts to material effect in the wider world.
Much is at stake in defining explicitly and reasserting the usefulness of a distinctively American liberal arts education. The liberal arts are under assault by those who, under the mantle of affordability and efficiency, would reject them for the immediate, but often temporary, benefit of higher education defined as job training. My own experience offers a definition for the 21st century, and in fact for any century in which economic uncertainty prevails. I was a German and philosophy double major. At first glance, what could be more useless? And yet, my professional life has proven such a conclusion wrong.
I have been — sometimes simultaneously — a military officer, a pre-collegiate teacher, administrator and coach. I founded an athletic team, developed a major center at a prestigious research university, acted as a senior consultant to the U.S. Department of State with diplomatic status, served as a corporate officer at two publicly traded companies and now serve as president of Dickinson College. For none of these careers did I ever study formally or take a class.
I gained competency through independent reading, experience and observation. I came to appreciate that the breadth of knowledge and the depth of cognitive skill developed in my undergraduate courses in social science, political science, art and science prepared me for any field of professional pursuit. I was prepared for professional chance. I knew how to ask the right questions, how to gather information, how to make informed decisions, how to see connections among disparate areas of knowledge, how to see what others might miss, how to learn quickly the basics of a profession, how to discern pertinent information from that which is false or misleading, and how to distinguish good, helpful people from those who wish you ill. All of this I gathered from a useful liberal education — in and out of the classroom — and in an intense residential life where experimentation with citizenship and social responsibility were guiding principles.
There were no formal, discrete courses to learn these habits of mind and action — no courses devoted to brain exercises, critical-thinking skills, leadership and citizenship. Rather, professors and staff were united in all their interactions to impress upon students, day after day and year after year, a liberal arts learning environment that was intellectually rigorous and defining. This was contextual learning at its fullest deployment. We absorbed and gradually displayed ultimately useful knowledge and skill not in a studied manner, but discreetly and naturally. Time after time in my various careers, I applied these liberal arts skills to material effect in solving wider-world problems. And most important, except for my military service and my college presidency, none of my jobs existed before I assumed them. My useful education has enabled me to maximize opportunity within highly fluid and changing employment rhythms. As I now face another job transition in my life, I go forward with confidence that something appropriate will develop. I have no concrete plans, and I like it that way. I know I am prepared, on the basis of my liberal arts education, to maximize chance. Something will develop. Something that probably doesn’t yet exist.
I am not alone in my appreciation of the liberal arts. Historically, those of privilege have appreciated liberal education; it has contributed to their access to, and hold on, power and influence. Their sons and daughters, generation after generation, have attended liberal arts institutions without hesitation. There is no job training in their educational landscape. It would be tragic if the new and previously underserved populations now gaining access to higher education missed the opportunity for their turn at leadership and influence simply because of the outspoken — arguably purposeful — dismissal of the liberal arts as "useless," often by those who received a liberal arts education themselves and intend nothing less for their own children.
William G. Durden is president of Dickinson College.
"I know my time is short," G. tells me, "and I want to pack as much thinking as possible into what’s left."
It's the last night of class in the last course these students can take with me. A mix of nostalgia, excitement and exhaustion is in the air. We are saying goodbye with presentations and food, quick hugs and promises to keep in touch. Against all odds (some acknowledge with stunned expressions), this class has not been a mere deposit in the bank vault of education. We have changed each other.
G. is not dying, just graduating. But tonight feels like the death of ideas. All our fellow thinkers and talkers and dreamers are walking out the door. There’s no structure left to reel them back tomorrow, next week, next year. Our community has dispersed (something we’ve talked about this semester — the virtual nature of community) and the finale, as always, has a melancholy feel.
For the past several weeks, we have collaborated to create what Hemingway might call a "clean, well-lighted place" to question our own practices. Now, the lights are going out throughout the building and, in many ways, throughout the world. Slashed budgets, job cuts, strange politics, war, discrimination, willful misunderstanding, despair. And here we sit, asking, "How is identity formed? What is the nature of community? Who is the oft-cited 'they'?"
After 16 weeks of intellectual abandon, G. and I both know that the space to come and talk about these things is narrowing to a pinpoint of light.
And so he stays to talk after everyone has left, a habit we’ve fallen into these past few months, unusual tonight only because it’s the time most of us — students as well as teachers — are coiled tight and ready to bolt at the precise moment when break begins. It's the latest in a series of late-night concept pitches and strategy sessions about how he can articulate his thoughts without stifling them.
Much later, as I’m driving home, I will think of all the things, trite and otherwise, I should have said. This is not the end; it's a transition. You can never really lose a mind. The universe would not be so cruel as to limit thought to a mere 16 weeks. You are leaving the institutionalization of critical thought. Now, you will have to create your own clean, well-lighted place in the face of what can seem like a very dark world. From here on, you have to make it happen.
But for now, we talk as if G.'s interpretation is truly our plight, the only reasonable conclusion given our experience. We discuss biology and culture and personal choice, wrong-headed policies, the future of education, his envisioned place in the corporate world. We make cross-generational references to popular films. We finish each other’s sentences.
"The really exciting thing about J.’s work is—"
"—everything we’ve been talking about is only 5 percent of the potentiality—"
"—even if the theory is ultimately proven false—"
"—it opens up so much—"
Which, we agree, is both terrifying and exhilarating.
G. thinks at warp speed, a far greater velocity than the everyday world requires or supports. A simple assignment turns into a 50-page thesis. Every sentence that comes out of his mouth or pen has several disclaimers, qualifiers, and alternate interpretations lurking behind it. If he tries to follow our mandates to "focus" and "frame," his work becomes a strangely truncated outline with key connections missing. When everything seems important, editing is an arbitrary act. What to cut? How to choose? In a world full of meaning, which vital thing will you omit?
He's been medicated, counseled, mentored, and rewarded for this. But he remains the passionate explorer. Once an idea grabs him, he can’t seem to edit out intersecting issues. He experiences everything at once. Nothing is backdrop; it’s all center stage. He wants to explain totality. Anything less is a cheat.
"You’ve got to go to grad school," I tell him. We laugh.
We are suddenly aware of a peculiar silence. The building has taken on that hushed waiting that all public spaces get after hours. We can hear little pings and creaks in the walls and air ducts all around us, no longer masked by the rush of humanity through the rooms and halls. It’s long after 10:00 p.m. The security guard rattles the main doors, checks the side entrance. We are about to be "secured," and we decide that we don’t want to be the ones to discover whether exiting after lockdown sets off the alarms.
Backpacks and briefcases gathered, keys jangling as I shut down the computer and enter the security code, we walk, still talking, through the halls and out into the deserted parking lot. My cheap, reliable car sits not far away, in a little pool of streetlight, and we head toward it. As I unlock my door, I glance around the empty lot.
"Where’d you park?" I say, expecting to see his car lurking in the shadows nearby.
He flings one hand toward the deep-dark at the far end of the lot. "Back over there," he says. "I just didn’t want you walking out here alone."
I pause, keys in hand. It’s a courtly gesture, an everyday kindness. But tonight, it feels a lot like hope. I stand here, five thoughts warring at once in my head, each jamming the others so that not a one gets spoken. Because it strikes me just then that we create these clean, well-lighted places for each other. Hope flows both ways. We conjure these temporary, malleable, and, most importantly, collaborative spaces for, and with, each other. We build them, not as escapes from a world gone unaccountably off track, but as paths through it. And from here on, we’ll have to make that happen. The scaffold is falling away.
"You have my e-mail," I say finally. "Use it." G. gives me a quick smile and saunters off, leaving me in a pool of light.