Humanities

Poem about student writing

Since the beginning of time
Everyone knows in society today
Student writing hasn’t gotten any better
Nor is it really any worse than usual.
The sentences are still afraid of commas
And plurals and possessives share a closet.
I don’t expect much improvement
Without better nutrition and stronger threats.
Plus, there are far too many sentences
That begin with This or There followed
By big empty boxes of Is and Are.
(Perhaps this student should take a year off
And read books with real people in them.)
And I’m only talking about sentences
Not the paragraphs that struggle along
Between the left and right margins
But miraculously start and finish
At the top and bottom of each page.
Also, I was really hoping for an original title
And just once my name spelled right.


Laurence Musgrove is professor and chair of English and modern languages at Angelo State University.


Essay on the idea that non-philosophers should judge philosophers

One of the oldest questions of philosophy is, "Who guards the guardians?" When Plato posed this question -- if not quite this succinctly -- his concern was with how a community can keep its leaders focused on the good of the whole. Plato's answer was that guardians should govern themselves — philosophy would train their souls so that they would choose wisely rather than unjustly. Kings would become philosophers, and philosophers kings.

This is not how we do things today. In representative forms of government the people rule, at least intermittently, through processes such as voting, recalls, and referenda. Particularly within the American experiment everybody guards everyone else — through a system of "checks and balances." But there is at least one major institution that still follows Plato's lead, relying on self-governance and remaining proudly nondemocratic: the academy.

We academics have long argued that we have a special justification for self-rule. We claim that our activities — which consist of the production of knowledge, and its dissemination via presentations, publications, and teaching — are so specialized and so important that ordinary people cannot properly judge our work. Instead, we have devised a way to evaluate ourselves, through a process known as peer review.

Whether it is a matter of articles or books, grant applications, or tenure and promotion, review by one's academic peers has long been the standard. And who are one's peers? The academy gives a disciplinary answer to this question. Biologists are the ones competent to judge work in biology, and only chemists can judge the research of other chemists. Nonexperts — whether within or outside the academy — will only disrupt the process, leading to misguided or even disastrous results. Best to leave such questions to the experts.

But what of philosophy? Across the 20th century and now into the 21st, philosophers have been evaluated in the same way. Even while claiming that philosophy has a special relevance to everyday life, philosophers have mostly written for and been evaluated by their disciplinary peers. Philosophy became more and more professionalized in the 20th century, with nonexperts increasingly unable to comprehend, much less judge, the work of philosophers. A philosopher today is not considered successful unless he or she contributes to professional, peer-reviewed publications in the field.

But should philosophy really act like the other disciplines in this regard? Should philosophy be considered a "discipline" at all? And if not, what are the consequences for the governance of philosophy?

One of the oddities of present-day philosophy is how rarely this question is asked. Go to a philosophy department with a graduate program, and sign up for a course in ancient philosophy: the professor will be expected to know ancient Greek, and to be well-read in the scholarly literature in the area. The irony is that there was no secondary literature for the Greeks — no scholarship at all, in fact, in the sense that we mean it today. Philosophers were thinkers, not scholars. Socrates would never get tenure: what did he write?

This situation was partly a matter of technology; paper was expensive and reproduction of a manuscript laborious. But it is still odd to assume that Plato and Aristotle would have been good scholars if only they’d had access to the Philosopher's Index and an Internet connection. Nor were the Greeks good disciplinarians. Socrates was notorious for speaking with people from all walks of life; and when he came to be evaluated it was by a jury of his peers consisting of 500 Athenians. He may not have liked the verdict, but he did not dispute the jury's right to pass judgment.

Across the long sweep of Western history we find the point repeated: Bacon, Machiavelli, Descartes, Leibniz, Locke, Marx and Nietzsche all wrote for and sought the judgment of peers across society. One wonders what they would think of what counts as philosophy across the 20th century — a highly technical, inward-looking field that values intellectual rigor over other values such as relevance or timeliness.

Questions about who should count as a philosopher's peers are timely today, for our standard notions of academic peer review are now under assault. Publicly funded science is being held more socially accountable. At the National Science Foundation, grant proposals are now judged by both disciplinary and transdisciplinary criteria — what are called, respectively, "intellectual merit" and "broader impacts." Universities are also being held responsible for outcomes, with state funding increasingly being tied to graduation rates and other achievement measures. Philosophers, too, have begun to feel the pinch of accountability, especially in Britain, where the so-called "impact agenda" has advanced more rapidly than in the United States.

We view this situation as more of an opportunity than as a problem. Philosophers should get creative and treat the question of who counts as our peers as itself a philosophic question. There are a variety of ethical, epistemological, and political issues surrounding peer review worthy of philosophic reflection. But perhaps the most pressing is the question of whether we should extend the notion of peer beyond disciplinary bounds.

This could occur in a number of different ways. Not only could we draw nonphilosophers or nonacademics into the peer review process; we could also consider a variety of other criteria, such as the number of publications in popular magazines or newspapers; the number of hits on philosophic blogs; the number of quotations in the media; or the number of grants awarded by public agencies to conduct dedisciplined philosophic work.

Now, some will claim that extending the idea of our philosophical peers to include nonphilosophers will expose philosophy to the corruptions of the demos. Is philosophizing to become a sheer popularity contest, where philosophers are promoted based on their Klout score, or the number of Facebook likes their blog posts garner? Aren’t we proposing that the Quineans be replaced by the Bieberians?

Such objections stem, in part at least, from what we could call a Cartesian ethos — the idea that philosophers should strive above all to avoid error. We should withhold our assent to any claim that we do not clearly and distinctly perceive to be true. This Cartesian ethos dominates philosophy today, and nowhere is this clearer than in regard to peer review. Our peers are our fellow philosophers, experts whose rigor stands in for Descartes' clear and distinct ideas.

For a counterethos we could call upon William James's "The Will to Believe." James argues that the pursuit of truth, even under conditions where we cannot be certain of our conclusions, is more important than the strict avoidance of error. Those who object that this will open philosophy up to all sorts of errors that would otherwise have been caught by expert peer review are exhibiting excessive Cartesianism. In fact, those who insist on the value of expertise in philosophy are reversing the Socratic approach. Whereas Socrates always asked others to contribute their opinions in pursuit of truth, Descartes trusted no one not to lead him into error. A Jamesian approach to peer review, on the other hand, would be generous in its definition of who ought to count as a peer, since avoiding error at all costs is not the main goal of philosophy. On a Jamesian approach, we would make use of peers in much the way that Socrates did — in an effort to pursue wisdom.

It is true that when philosophers broaden their peer group, they lose some control over the measures used to define philosophic excellence. This raises another risk — that philosophy will be merely an instrument for an exterior set of ends. The fear here is not that abandoning disciplinary peer review will lead us into error. Instead, it is that the only alternative to value as judged by disciplinary peers is a crass utilitarianism, where philosophic value is judged by how well it advances a paymaster’s outcome. One philosopher may be labeled a success for helping a racist political candidate hone his message, while another may be labeled a failure for not sufficiently fattening a corporation's bottom line. Isn’t a dedisciplined philosophy actually a return to sophistry rather than to Socrates? Won’t it sell its services to whoever is buying, adjusting its message to satisfy another’s agenda and criteria for success? In order to survive until the turn of the 22nd century, must we sell the soul of philosophy at the beginning of the 21st?

We have two replies to such concerns. First, philosophy existed long before the 20th-century model of academic disciplinarity came to define its soul. The struggle between philosophy and sophistry is a perennial one, and one does not necessarily sell out by writing for a larger audience — or remain pure by staying within disciplinary boundaries.

Second, disciplinary and dedisciplinary approaches to philosophy should be seen as complementary rather than antagonistic to one another. Rigor should be seen as pluralistic: the rigor of disciplinary work is different from, but neither better nor worse than, the philosophic rigor required to adjust one’s thinking to real-world exigencies. This is a point that bioethicists have long understood. In his seminal 1973 "Bioethics as a Discipline," Daniel Callahan already saw that doing philosophical thinking with physicians, scientists, and other stakeholders demands "rigor … of a different sort than that normally required for the traditional philosophical or scientific disciplines." Bioethics exists in disciplinary and in nondisciplinary forms — in ways that synergize. It shows that we need not be forced, as a matter of general principle, to choose one set of peers over another.

Practically speaking, examining the question of who should count as a peer means that philosophers will need to revisit some of the core elements of our field. For one, our criteria for tenure and promotion would need to be reviewed. The current strict hierarchy surrounding where we publish — say, in "The Stone" (the New York Times philosophy blog) or in Mind — would need to be re-evaluated. And really, what is the argument for claiming that the latter sort of publication is of higher value? If you reply that the latter is peer-reviewed, excuse us for pointing out that your answer begs the question.

And what about the question of multiple authorship? Should this article count for less because three of us wrote it? How much less? Why? Co-authoring is actually just as challenging as producing single-authored works, as we can attest, so the justification cannot be that it is less work. Should we value independent scholarship over collaboration? Why? This is the Cartesian ethos coming back to haunt philosophy: I think; I exist; I write; I am a scholar. We doubt it.

As universities face growing demands for academic accountability, philosophers ought to take the lead in exploring what accountability means. Otherwise we may be stuck with Dickens’s Mr. Gradgrind. ("Now, what I want is Facts. Teach these boys and girls nothing but Facts. Facts alone are wanted in life.") But a philosophical account of accountability will also require redefining the boundaries of what counts as philosophy. We ought to engage those making accountability demands from the outside in just the way that Socrates engaged Euthyphro on piety. If there are indeed Bieberians at the gate, we say let them in — as long as they are willing to engage in dialogue, we philosophers should do all right. Unless it is we philosophers who refuse to engage.


Robert Frodeman, J. Britt Holbrook and Adam Briggle

Robert Frodeman is professor of philosophy and director of the Center for the Study of Interdisciplinarity at the University of North Texas. He was editor in chief of the Oxford Handbook of Interdisciplinarity.

J. Britt Holbrook is research assistant professor of philosophy and assistant director of the Center for the Study of Interdisciplinarity at the University of North Texas. He is editor in chief of Ethics, Science, Technology, and Engineering: An International Resource, forthcoming from Gale-Cengage Learning.

Adam Briggle is assistant professor of philosophy at the University of North Texas. He is author, with Carl Mitcham, of Ethics and Science: An Introduction (Cambridge University Press, 2012).


Essay on landing an academic job when not expecting to

Category: On the Fence

When Eliza Woolf gave up on finding a good academic job, she landed one.


Essay critiques the ideas of Clay Shirky and others advocating higher ed disruption

Clay Shirky is a big thinker, and I read him because he’s consistently worth reading. But he’s not always right – and his thinking (and the flaws in it) is typical of the unquestioning enthusiasm of many thinkers today about technology and higher education. In his recent piece on "Napster, Udacity, and the Academy," for example, Shirky is not only guardedly optimistic about the ways that MOOCs and online education will transform higher education, but he takes for granted that they will, that there is no alternative. Just as inevitably as digital sharing turned the music industry on its head, he pronounces, so it is and will be with digital teaching. And as predictably as rain, he anticipates that "we" in academe will stick our heads in the sand, will deny the inevitable -- as the music industry did with Napster -- and will "screw this up as badly as the music people did." His views are shared by many in the "disruption" school of thought about higher education.

I suspect that if you agree with Clay Shirky that teaching is analogous to music, then you are likely to be persuaded by his assertion that Udacity -- a lavishly capitalized educational startup company -- is analogous to Napster. If you are not impressed with this analogy, however, you will not be impressed by his argument. And just to put my cards on the table, I am not very impressed with his argument. I think teaching is very different from music, so different that the comparison obscures a lot more than it reveals.

But the bigger problem is that this kind of argument is weighted against academics, virtually constructed so as to make it impossible for an academic to reply. If you observe that "institutions will try to preserve the problem to which they are the solution," after all -- what has been called "The Shirky Rule" -- it can be easy to add the words "all" and "always" to a sentence in which they do not belong. This is not a principle or a rule; it’s just a thing that often happens, and often is not always. But if you make the mistake of thinking that it is, you can become uniformly prejudiced against "institutions," since you literally know in advance what they will do and why. Because you understand them better than they understand themselves -- because they don’t or can’t realize that they are simply "institutions" -- you can explain things about them that they can neither see nor argue against. "Why are you so defensive?" you ask, innocently, and everything they say testifies against them.

If someone like me -- a graduate student for many years, currently trying to find an academic job -- looks at MOOCs and online education, and sees the downsides very clearly, it’s also true that no one has a more strongly vested interest in arguing the benefits of radically transforming academe than Clay Shirky and a number of others who talk about the inevitability of radical change. As Chuck Klosterman once unkindly put it, "Clay Shirky must argue that the Internet is having a positive effect – it’s the only reason he’s publicly essential." Which is not to say that Shirky is wrong, simply that he must prove, not presume, that he is right.

I have to go through this excessively long wind-up because of the ways that Shirky has stacked the rhetorical deck in his favor. He uses the word "we" throughout his piece, and in this powerful final paragraph, he hammers us over the head with it, so precisely that we might mistake it for a caress:

"In the academy, we lecture other people every day about learning from history. Now it's our turn, and the risk is that we’ll be the last to know that the world has changed, because we can’t imagine — really cannot imagine — that story we tell ourselves about ourselves could start to fail. Even when it’s true. Especially when it’s true."

But what do you mean "we," Mr. Distinguished Writer in Residence? I asked Shirky on Twitter if he considered himself primarily an academic, and though he didn’t respond, it’s important that he frames his entire post as if he’s an insider. But while it’s certainly true that I am biased in favor of academic labor continuing to exist in something like its present form, he is no less biased by having nothing to lose and everything to gain if academe is flipped on its head. And yet the cumulative rhetorical effect of his framing is to remind us that no one within the institution can speak knowledgeably about their institution, precisely because of their location within it; when Shirky speaks of "we" academics, he does so only to emphasize that "we" can’t imagine that the story we tell ourselves is wrong.

It's because he is willing to burn the village to save it that Shirky can speak for and of academe. Shirky never has to show evidence that online education will ever be any good: he notes an academic’s assessment of a Udacity course as "amazingly, shockingly awful" and is then, apparently, satisfied when Udacity admits that its courses "can be improved in more than one way." A defensive blog post written by Udacity’s founder is enough to demonstrate that change for the better is happening. And when the academic who criticized the Udacity course mentions a colleague whose course showed some of the same problems -- but does not name the colleague -- Shirky is triumphant. The academic in question "could observe every aspect of Udacity’s Statistics 101 (as can you) and discuss them in public," Shirky observes, "but when criticizing his own institution, he pulled his punches."

This is Clay Shirky’s domain, and also the domain of so many others who point to one or another failing of traditional higher ed to suggest that radical change is needed: the anecdote that illustrates something larger. In this case, the fact that academe is a "closed" institution means it cannot grow, change, or improve. By contrast, "[o]pen systems are open" seems to be the end of the discussion; when he contemplates the openness of a MOOC, the same definitional necessity applies. "It becomes clear," he writes, "that open courses, even in their nascent state, will be able to raise quality and improve certification faster than traditional institutions can lower cost or increase enrollment." It becomes clear because it is clear, because "open" is better, because it is open.

But how "open" is Udacity, really? Udacity’s primary obligation is to its investors. That reality will always push it to squeeze as much profit out of its activities as it can. This may make Udacity better at educating, but it also may not; the job of a for-profit entity is not to educate, but to profit, and it will. There’s nothing necessarily wrong with for-profit education -- and most abuses can be traced back to government deregulation, not tax status -- but the idea that "openness," as such, will magically transform how a business does business is a massively begged question. A bit of bad press can get Sebastian Thrun to write a blog post promising change, but actually investing the resources necessary to follow through on that promise is a very different question. The fact that someone like Shirky takes him at face value -- not only gives him the benefit of the doubt, but seems to have no doubt at all -- speaks volumes to me.

Meanwhile, did the academic that Shirky criticizes really "pull his punches"? Did he refrain from naming his colleague because of the way academics instinctively shield each other from criticism? It’s far from clear; if you read the original blog post, in fact, it’s not even apparent that the academic knew who this "colleague" actually was. All we really know is that a student referred to something her "last teacher" did. But suppose he did know who this student’s last teacher was; suppose the student mentioned the teacher by name. Would it have been appropriate to post someone’s name on the Internet just because a secondhand source told you something bad about them? Does that count as openness?

Open vs. closed is a useful conceptual distinction, but when it comes down to specific cases, these kinds of grand narratives can mislead us. For one thing, far from the kind of siege mentality that characterized an industry watching its business model go up in smoke -- an industry that was not interested in giving away its product for free -- academics are delighted to give away their products for free, if they can figure out a way to do it. Just about every single public and nonprofit university in the country is working to develop digital platforms for education, or thinking hard about how they can. This doesn’t mean they are doing it successfully, or well; time will tell, and the proof will be in the pudding. But to imagine that Silicon Valley venture capitalists are the only people who see the potential of these technologies requires you to ignore the tremendous work that academics are currently doing to develop new ways of doing what they do. The most important predecessors to MOOCs, after all, were things like Massachusetts Institute of Technology's OpenCourseWare, designed entirely in the spirit of openness and not in search of profit.

The key difference between academics and venture capitalists, in fact, is not closed versus open but evidence versus speculation. The thing about academics is that they require evidence of success before declaring victory, while venture capitalists can afford to gamble on the odds. While Shirky can see the future revolutionizing in front of us, he is thinking like a venture capitalist when he does, betting on optimism because he can afford to lose. He doesn’t know that he’s right; he just knows that he might not be wrong. And so, like all such educational futurologists, Shirky’s case for MOOCs is essentially defensive: he argues against the arguments against MOOCs, taking shelter in the possibility of what isn’t, yet, but which may someday be.

For example, instead of arguing that MOOCs really can provide "education of the very best sort," Shirky explicitly argues that we should not hold them to this standard. Instead of thinking in terms of quality, we should talk about access: from his perspective, the argument against MOOCs is too narrowly focused on the "18-year-old who can set aside $250k and four years" and so it neglects to address students who are not well-endowed with money and time. "Outside the elite institutions," Shirky notes, "the other 75 percent of students — over 13 million of them — are enrolled in the four thousand institutions you haven’t heard of." And while elite students will continue to attend elite institutions, "a good chunk of the four thousand institutions you haven’t heard of provide an expensive but mediocre education."

This is a very common argument from MOOC boosters, because access is a real problem. But while a "good chunk" of 13 million students are poorly served by the present arrangement, it is quite telling that his examples of "expensive but mediocre education" are Kaplan and the University of Phoenix, for-profit institutions that are beloved by the same kinds of venture capitalists who are funding Udacity. He is right: For-profit education has amassed a terrible track record of failure. If you are getting a degree at a for-profit institution, you probably are paying too much for too little. But would it be any less mediocre if it were free?

Udacity’s courses are free to consumers (though not, significantly, to universities), at least for now. And Shirky is not wrong that "demand for knowledge is so enormous that good, free online materials can attract extraordinary numbers of people from all over the world." But Shirky doesn’t mean "demand" in the economic sense: demand for a free commodity is just desire until it starts to pay for the thing it wants. Since there is a lot of unmet desire for education out there, and since that desire is glad to have the thing it wants when it finds it for free, it seems all to the good that students can find courses for free. But while we should ask questions about why venture capitalists are investing so heavily in educational philanthropy, we also need to think more carefully about why there is so much unmet desire in the first place, and why so many people want education without, apparently, being able to pay for it. Why hasn’t that desire already found a way to become demand, such that it must wait until Silicon Valley venture capitalists show up, benevolently bearing the future in their arms?

The giveaway is when Shirky uses the phrase "non-elite institutions": for Shirky, there are elite institutions for elite students and there are non-elites for everyone else. The elite institutions will remain the same. No one will ever choose Udacity over Harvard or U.Va., and while elite institutions like MIT, Stanford, Princeton, and my own University of California are leaping into the online education world head first, anyone who thinks these online brands will ever compete with "the real thing" will be exactly the kind of sucker who would fork over full price for a watered-down product.

MOOCs are only better than nothing, and speculation that this will someday change is worth pursuing; but for now, it remains just that: speculation. It should be no surprise that venture capital is interested in speculation. And it should be no surprise that when academics look at the actual track record, when we try to evaluate the evidence rather than the hope, we discover a great deal to be pessimistic about.

Why have we stopped aspiring to provide the real thing for everyone? That’s the interesting question, I think, but if we begin from the distinction between "elite" and "non-elite" institutions, it becomes easy to take for granted that "non-elite students" receiving cheap education is something other than giving up. It is important to note that when online education boosters talk about "access," they explicitly do not mean access to "education of the best sort"; they mean that because an institution like Udacity provides teaching for free, you can’t complain about its mediocrity. It’s not an elite institution, and it’s not for elite students. It just needs to be cheap.

Talking in terms of "access" (instead of access to what?) allows people like Shirky to overlook the elephant in the room, which is the way this country used to provide inexpensive and high-quality education to all sorts of people who couldn’t afford to go to Yale -- people like me and my parents. While state after state is defunding its public colleges and universities (and so tuition is rising while quality is declining), the vast majority of American college students are still educated in public colleges and universities, institutions that have traditionally provided very high-quality mass higher education, and which did it nearly for free barely a generation ago.

"Access" wouldn’t even be a problem if we didn’t expect mass higher education to still be available: Americans only have the kind of reverence for education that we have because the 20th century made it possible for the rising middle class to have what had previously been a mark of elite status, a college education. But the result of letting these public institutions rot on the vine is that a host of essentially parasitic institutions -- like Udacity -- are sprouting like mushrooms on the desire for education that was created by the existence of the world’s biggest and best public mass higher education system.

Shirky talks dismissively about his own education, at Yale, and recalls paying a lot of money to go to crowded lectures and then to discussion sections with underpaid graduate students. Let me counter his anecdote with my own. When I was a high school student in Appalachian Ohio, I told my guidance counselor that I wanted to go to Harvard, and he made me understand that people from Fairland High School do not really go to Harvard. I was a dumb high school student, so I listened to him. But although both of my parents worked in West Virginia, they had moved to Ohio when I was young so that I could go to Ohio schools, and this meant that although my grades were only moderately good -- and I had never had access to Advanced Placement classes -- I was able to apply to Ohio State University, get in, afford it, and get an education that was probably better than the one that Shirky got at Yale, and certainly a heck of a lot cheaper. My parents paid my rent, but I paid my tuition myself -- with part-time jobs and $20,000 in loans -- and I didn’t have a single class in my major with more than 30 students. I had one-on-one access to all of my professors, and I took advantage of it.

It's a lot harder to do this now, of course; tuition at Ohio State is more than double what it was when I started in 1997. More important, you not only pay a lot more if you go to a school like Ohio State, you’re also a lot less likely to get in; the country’s college-age population has continued to grow, but the number of acceptance letters that public universities like OSU send out has not increased. As Mike Konczal and I have argued, this shortfall in quality higher education creates what economists call "fake supply." If you don’t get in to a college specializing in education "of the best sort" (or if your guidance counselor tells you not to apply), where do you go, if you go? You go to an online university, to Kaplan, or maybe now you try a MOOC or a public college relying on MOOCs to provide general education, as Texas now envisions. Such things are better than nothing. But "nothing" only seems like the relevant point of comparison if we pretend that public higher education doesn’t exist. And if we ignore the fact that we are actively choosing to let it cease to exist.

Beware anyone who tries to give you a link to WebMD as a replacement for seeing a real doctor.


Aaron Bady is a doctoral candidate in English literature at the University of California at Berkeley, and he writes and tweets for The New Inquiry as @zunguzungu.


Essay on confronting academic perfectionism in yourself

Overcoming Academic Perfectionism

Concluding a series, Kerry Ann Rockquemore suggests three ways to move forward.


Essay on what professors can learn from preschool teachers

Preschool teachers are the Rodney Dangerfields of the teaching profession, the "we don’t get no respect" gang. They’re often dismissed, even by their K-12 colleagues, as babysitters and not "real" teachers, but nothing could be further from the truth. The time I’ve recently spent crouching in classrooms, watching how 3- and 4-year-olds explore their universe with the aid of an inspiring guide, convinces me that these teachers are the best in the business. They're changing the arc of children’s lives — and they have a lot to teach the rest of us.

The job of a prekindergarten teacher is unbelievably demanding — if you doubt it, just spend a morning in a classroom filled with 3- and 4-year-olds. Because of the rapidity with which their brains are developing, those kids learn far more rapidly than even our smartest students — think of them as little Lewises and Clarks on their own journeys of discovery. Every teacher relishes the teachable moments, the occasions when you can almost see the lightbulbs of dawning comprehension, because for many students after their early years they’re so rare and special. Each day in a preschool classroom brings a meteor shower of these moments.

College professors usually know what needs to be taught. But for many academics, that knowledge of our own field is the only thing we bring to the classroom. We spend almost no time thinking about how to teach. Though new instructional strategies have proliferated, professors aren’t taught how to teach. They must pick up these new tools on their own, and many don’t bother.

There’s abundant evidence, for instance, that lectures rarely engage students' minds: college students pay attention to the lecturer just 40 percent of the time and retain even less of what’s being said. Still, the "sage on the stage" remains the norm, and in big universities classes of 100 and more are common. Lectures offer a way of saving money and professors’ time, dressed up in the rationale that students are empty vessels into which knowledge can be poured. To the question, "How did your class go?" an all-too-common response is "I gave a good lecture." But this isn’t how learning usually occurs.

Good prekindergarten teachers not only know what to teach; they also know how they can have the biggest impact. They’ve learned a host of varied ways to teach reading and math, art and science, gymnastics and music and much more. What’s equally important, they’ve studied how children’s minds and emotions develop. They understand that learning isn’t a spectator sport.

To be sure, preschoolers spend part of the day in "circle time," huddled together with their eyes glued to the teacher; that’s the pre-k equivalent of a lecture, though often considerably more enticing. But those lightbulbs really turn on when these 3- and 4-year-olds are trying out ideas, either on their own or with a few classmates, making mistakes and trying again, as the teacher scans the room, chipping in when kids get stumped.

In these classrooms a lot is occurring simultaneously — while the teacher may be writing down children’s stories that will later be acted out by fellow students, some kids may be painting, others constructing bridges, performing experiments, staffing a doctor’s office or ordering pizza. And some will be curled up with a picture book from the classroom library.

I became familiar with this world when I spent time crouching in classrooms in Union City, New Jersey, across the Hudson from Manhattan. Union City is the most crowded and one of the most impoverished municipalities in America, and students in such communities are often marked for failure. That’s not the case here — these schools, which I write about in Improbable Scholars, are bringing poor immigrant Latino kids (school officials estimate that 30 percent are undocumented) into the educational mainstream. In 2011, the last year for which official figures are available, the high school graduation rate was 89 percent — that’s about 15 percent higher than the national average — and 60 percent of the graduates enrolled in college. Ask the administrators how Union City manages this feat and they’ll tell you that delivering good early education makes a critical difference.

The best way to appreciate what’s so remarkable about prekindergarten is by looking closely at what’s going on there. Walk into Suzy Rojas’s classroom and you’ll see art plastering the walls, plants hanging from the ceiling. In every niche there’s something to seize a child’s imagination. Three boys whom I’ll call Angel, Victor and Rodrigo are peering at insects through a microscope, and they’re happy to explain to me what they’re seeing. "Remember when we went to the museum and the butterfly landed on my arm?" Angel asks his friends.

Suzy has joined the conversation. "Are these all insects?" she wonders aloud. "How do you know?" "That one has eight legs," Victor responds, "and that means it’s not an insect." Then Suzy brings over a prism. "What do you see when you look through it?" she asks, and Rodrigo looks up to say that he can’t tell them apart, that they look like leaves. "Why do you think so?" she inquires. The boys have already learned about lenses, and she tells them that the prism is a special kind of lens.

There’s still more to be gleaned from these creatures. "How about an insect salad — would you want to eat it?" Suzy inquires, and when the boys chorus "ugh," she bounces it back to them: "How come?" They stare once more at the insects. "How many parts does an insect body have? Do you remember what they’re called?" Neville knows the answer: "Three parts — the antenna, abdomen and legs."

"It’s all about exposure to concepts — wide, narrow, long, short,” Suzy tells me. “ 'I have three brothers, three sisters and an uncle — let’s graph that.’ I bring in breads from different countries. 'Let’s do a pie chart showing which one you liked the best.' " Stop for a moment to consider how we expect to absorb concepts — passively, for the most part. "I don’t ask them to memorize 1, 2, 3 or A, B, C," Suzy adds. "I could teach a monkey to count." So much for making college students memorize facts and regurgitate them on the midterm, only to see realize that in a couple of weeks most of that information has been forgotten.

Suzy Rojas’s students aren’t simply acquiring an understanding of cognitive concepts. They’re also coming to understand why you should wait your turn, how to share, how to manage your own feelings — the emotional skills that report cards once summarized as "works and plays well with others." (I’ve attended faculty meetings whose participants must have missed those lessons.) Back in the classroom, Suzy leaves Rodrigo and his friends, turning to several students who are solving a puzzle on a computer. But when she sees Victor and Rodrigo fighting over who gets the next look at the insects, she quickly returns. "Use your words," she says — familiar teacher-talk — but then she adds a twist. "What can we do?" "We," not "you": the boys think about it. "How about adding another container for insects," she suggests. "That way you can all take turns."

Cognitive and noncognitive, thinking and feeling, Descartes’ mind-body dualism — in a good preschool classroom these distinctions vanish. The teacher is always on the lookout for both kinds of lessons, aiming to reach both head and heart. College students are more mature, of course — fights don’t break out in our classrooms — but if we ignore their emotional responses we risk irrelevance. Our students often react to what’s being said in class at an emotional as well as an intellectual level, paying attention to how the message is being delivered, not just its content. If a professor is so busy imparting knowledge that he misses the students’ body language — the arms folded in "show me" posture or the fingers busily tweeting — he’s lost the class.

Suzy Rojas’s approach to teaching offers a reminder that professors can do better. We need to rely less on lectures, varying the classroom experience with give-and-take discussion and breakout groups, online learning, outside experts who can join the conversation, student-led classes and group research projects. And we should check in with the students — midcourse corrections can make a world of difference.

There are days when preschoolers come to school agog about what’s happening in their world, a fierce snowstorm or a great movie they’ve seen over the weekend, and a talented pre-k teacher like Suzy Rojas knows how to incorporate their excitement into her lesson. That’s another takeaway — finding ways to incite our students into thinking hard matters infinitely more than marching them through the syllabus.

David L. Kirp is the James D. Marver Professor of Public Policy at the University of California at Berkeley. He is the author of the forthcoming Improbable Scholars: The Rebirth of a Great American School System and a Strategy for America’s Schools.


Essay on how to list scholarship that hasn't been published yet

Category: Tyro Tracts

When your work is under review (but not published or maybe even accepted), can you include it on a C.V.? To do so, you must be honest and consistent, writes Nate Kreuter.


Essay on academics who do too many things

Overcoming Academic Perfectionism

The tasks you are doing well may be holding you back from excelling at the tasks you most need to do well, writes Kerry Ann Rockquemore.


Essay on the idea of a useful liberal arts

In my 14-year tenure as president I have often been asked to define and defend the notion of a "useful" liberal arts education. The general public has difficulty associating the liberal arts with anything useful. That difficulty prompts many to dismiss liberal arts colleges as repositories of graduates with majors such as philosophy, history, anthropology and American studies who cannot get jobs. The fact that these same colleges also offer majors such as biology, chemistry, physics and economics is missed entirely.

The public is not to blame. American higher education never really experienced the American Revolution. While we threw away the oppressive dictates of monarchy, we never threw off the privileged notion of an English upper-class liberal education, one literally defined as being only for those with sufficient wealth to do nothing professionally but dabble in learning. We remained enthralled by the notion of learning for learning’s sake and, despite our emerging pragmatic nature, wanted our education to remain sublime and removed from the business of life.

There were prominent founders of the nation who argued for a new kind of liberal education for a new kind of nation. Thomas Jefferson urged a "practical education" for his University of Virginia. And Benjamin Rush, the founder of Dickinson College, decried the unwillingness of Americans to reform education after the Revolution:

It is equally a matter of regret, that no accommodation has been made in the system of education in our seminaries [colleges] to the new form of our government and the many national duties, and objects of knowledge, that have been imposed upon us by the American Revolution. Instead of instructing our sons in the Arts most essential to their existence, and in the means of acquiring that kind of knowledge which is connected to the time, the country, and the government in which they live, they are compelled to spend [time] learning two languages which no longer exist, and are rarely spoken, which have ceased to be the vehicles of Science and literature, and which contain no knowledge but what is to be met with in a more improved and perfect state in modern languages. We have rejected hereditary power in the governments of our country. But we continue the willing subjects of a system of education imposed upon us by our ancestors in the fourteenth and fifteenth centuries. Had agriculture, mechanics, astronomy, navigation and medicine been equally stationary, how different from the present would have been the condition of mankind!

But these singular calls for a more pragmatic education in America to match a new form of government went largely unheeded. Rush’s founding of Dickinson is particularly illustrative. In his 1785 "Plan of Education" he called for a "useful liberal education." The curriculum was to omit instruction in the writing and speaking of Greek and Latin, but to be rich in instruction in German, French, Spanish and even Native American languages, as those would be highly useful to Americans striving to establish a native economy that would grow as it interacted linguistically with trading nations throughout the world and in the United States. Democracy was to be established through commerce informed by useful liberal education. Liberal education, commerce and democracy were interdependent. The Dickinson course of study was also to include chemistry, as Rush thought this subject held the greatest number of connections to emerging knowledge useful to the nation.

The first president of the college and Rush’s fellow trustees ignored his plan. They recommitted to what Rush once called the "monkish" course of study, unchanged for centuries: Latin and Greek were taught, and a chemistry professor was not hired. Additionally, the college refused to hire a German professor. Rush was so angry that he founded nearby what was called Franklin College (today Franklin and Marshall College). It wasn’t until 1999, some 216 years later, that Rush’s notion of a "useful" liberal education was revived and embraced explicitly as part of a revised mission statement.

Unfortunately for those in America today who wish to argue the usefulness, and thus the worthiness, of a liberal arts education, the founding fathers were not explicit. We know that a liberal education was to yield informed citizens who could build and protect the new government. We know that certain courses were to be taken out and others inserted — those that related more to emerging and immediately explicable knowledge, expanded the appreciation of democracy and created new knowledge and wealth that would materially power the nation’s development. A useful liberal arts education was essentially entrepreneurial. But for all the novelty and potent force in this "disruptive technology" in American higher education introduced by the founding fathers, we know little about how a liberal arts education actually becomes useful — that is, how the study of the liberal arts converts to material effect in the wider world.

Much is at stake in defining explicitly and reasserting the usefulness of a distinctively American liberal arts education. The liberal arts are under assault by those who, under the mantle of affordability and efficiency, would reject them for the immediate, but often temporary, benefit of higher education defined as job training. My own experience offers a definition for the 21st century, indeed for any century in which economic uncertainty prevails. I was a German and philosophy double major. At first glance, what could be more useless? And yet, my professional life has proven such a conclusion wrong.

I have been — sometimes simultaneously — a military officer, a pre-collegiate teacher, administrator and coach. I founded an athletic team, developed a major center at a prestigious research university, acted as a senior consultant to the U.S. Department of State with diplomatic status, served as a corporate officer at two publicly traded companies and now serve as president of Dickinson College. For none of these careers did I ever study formally or take a class.

I gained competency through independent reading, experience and observation. I appreciated that the breadth of knowledge and the depth of cognitive skill developed in my undergraduate courses in social science, political science, art and science prepared me for any field of professional pursuit. I was prepared for professional chance. I knew how to ask the right questions, how to gather information, how to make informed decisions, how to see connections among disparate areas of knowledge, how to see what others might miss, how to learn quickly the basics of a profession, how to discern pertinent information from that which is false or misleading, how to distinguish good, helpful people from those who wish you ill. All of this I gathered in a useful liberal education — in and out of the classroom — and in an intense residential life where experimentation with citizenship and social responsibility were guiding principles.

There were no formal, discrete courses to teach these habits of mind and action — no courses devoted to brain exercises, critical-thinking skills, leadership and citizenship. Rather, professors and staff were united in all interactions to impress upon students, day after day, year after year, a liberal arts learning environment that was intellectually rigorous and defining. This was contextual learning at its fullest deployment. We absorbed and gradually displayed ultimately useful knowledge and skill not in a studied manner, but discreetly and naturally. Time after time in my various careers, I applied these liberal arts skills to material effect on wider-world problems. And most important, except for my military service and my college presidency, none of my jobs existed before I assumed them. My useful education has enabled me to maximize opportunity within highly fluid and changing employment rhythms. As I now face another job transition in my life, I go forward with confidence that something appropriate will develop. I have no concrete plans and I like it that way. I know I am prepared on the basis of my liberal arts education to maximize chance. Something will develop. Something that probably doesn’t yet exist.

I am not alone in my appreciation of the liberal arts. Those of privilege have appreciated liberal education historically. It has contributed to their access and hold on power and influence. Their sons and daughters, generation after generation, have attended liberal arts institutions without hesitation. There is no job training in their educational landscape. It would be tragic if all the new and previously underserved populations now having access to higher education missed the opportunity for their turn at leadership and influence simply because of the outspoken — arguably purposeful — dismissal of the liberal arts as "useless," often by those who received a liberal arts education themselves and intend nothing less for their own children.

William G. Durden is president of Dickinson College.


Essay on similarities between dating and academic job-hunting

Lori A. Flores writes that both activities are bound to produce some heartbreak along the way.

