I am an Edu-Traitor. I am a college professor. What I am about to say may well be perceived as supporting attitudes thought to be against the interests and well-being of college professors. Here goes: I do not think going to university should be the be-all and end-all of K-12 education. Nor should the importance of going to college be the sole rationale by which we justify public support of higher education. Higher education is incredibly valuable, even precious, for many. But it is bad for individuals and society to be retrofitting learning all the way back to preschool, as if the only skills valuable, vital, necessary in the world are the ones that earn you a B.S., B.A., or a graduate or professional degree.
Do I think it is criminal that we are de-funding higher education now? Yes. Do I think it is appalling that we are charging larger and larger tuitions at state institutions (and private ones, but that is a different issue)? Of course. Is it shocking that such a rich country is not supporting free education? Absolutely. Do I believe there are benefits that accrue from a highly educated workforce, with an appreciation of an array of subjects (liberal arts to computer science) that are not strictly pre-professional training? Definitely. But here’s the Edu-Traitor part: Do I believe we need to justify the investment in higher education in terms of it being a necessity for the 21st century for everyone? Absolutely not.
We so often justify higher ed because many of the careers of the 21st century require it (a reformed version of it, definitely, but higher ed nonetheless). But many occupations do not. That is not my main concern, however. I argue that, right now, we are deforming the entire enterprise of education, from preschool onward, by insisting it be measured implicitly by the standard of, "Will this help you get into college?" The result is the devaluation of myriad important ways of learning that are not, strictly speaking, "college material."
The world of work -- the world we live in -- is so much more complex than the quite narrow scope of learning measured and tested by college entrance exams and in college courses. There are so many viable and important and skilled professions that cannot be outsourced to either an exploitative Third World sweatshop or a computer, that require face-to-face presence, and a bucketload of skills – but that do not require a college education: the full range of IT workers, web designers, body workers (such as deep tissue massage), yoga and Pilates instructors, fitness educators, hairdressers, retail workers, food industry professionals, entertainers and entertainment industry professionals, construction workers, dancers, artists, musicians, entrepreneurs, landscapers, nannies, elder-care professionals, nurses' aides, dog trainers, cosmetologists, athletes, sales people, fashion designers, novelists, poets, furniture makers, auto mechanics, and on and on.
All those jobs require specialized knowledge and intelligence, but most people who end up in those jobs have had to fight for the special form their intelligence takes because, throughout their lives, they have never seen their particular ability and skill set represented as a discipline, rewarded with grades, put into a textbook, or tested on an end-of-grade exam. They have had to fight for their identity and dignity, their self-worth and the importance of their particular genius in the world, against a highly structured system that makes knowledge into a hierarchy with creativity, imagination, and the array of so-called "manual skills" not just at the bottom but absent.
Everyone benefits from more education. No one benefits from an educational system that defines learning so narrowly that whole swaths of human intelligence, skill, talent, creativity, imagination, and accomplishment do not count.
I have been teaching in higher ed since I was 25. I am a passionate and dedicated college teacher, a researcher, and I’ve been privileged to teach at many kinds and types of institutions. And I think we have education all wrong. Since the end of the 19th century, with the birth of the modern research university and the beginning of professional schools of education and graduate schools for training teachers, the grail of all education, from preschool onward, has implicitly been higher education. All of the multiple ways that we learn in the world, all the multiple forms of knowing we require in order to succeed in a life of work, are boiled down to an essential hierarchical subject matter tested in a way to get one past the entrance requirements and into a college. Actually, I agree with Ken Robinson that, if we are going to be really candid, we have to admit that it’s even narrower than that: we’re really, implicitly training students to be college professors. That is our tacit criterion for "brilliance." For, once you obtain the grail of admission to higher ed, you are then disciplined (put into majors and minors) and graded as if the only end of your college work were to go on to graduate school where the end is to prepare you for a profession, with university teaching of the field at the pinnacle of that profession.
The abolition of art, music, physical education, and shop from schools means that the definition of excellence has shrunk more and more, right at the time when creativity, imagination, dexterity, adaptability to change, and all the rest require more, not less, diversity. The shrinking of "what counts" would be counterproductive and dehumanizing in any era, but in this world of constant, global change it is simply destructive. (For an excellent and inspiring and witty discussion of this topic, I highly recommend Ken Robinson’s TED talk.)
By funneling all the different ways we learn the world into a very few subjects that count and are tested – what I’ll call "pre-professorial training" – we make education hell for so many kids, we undermine their skills and their knowledge, we underscore their resentment, we emphasize class division and hierarchy, and we shortchange their future and ours. We underestimate talents that should be nourished, forcing kids to fight against the odds for their own integrity, self-worth, and value, when we should be giving them inspiration to flourish.
I’m appalled that we judge learning in terms as narrow as what is taught in college and what "gets you into" college. Decoupling the goal of "going to college" from the goal of “learning” is not actually detrimental to the importance of higher ed for society; it’s not even detrimental to college professors, those putatively in a position to be most privileged by the current system. The opposite is the case. For now, many kids who have the means are going to college because they are supposed to. That’s not good for anyone. Conversely, many brilliant kids who passionately want to go to college cannot afford to. Another travesty. And, finally, many brilliant, talented young people are dropping out of high school because they see high school as implicitly “college prep” and they cannot imagine anything more dreary than spending four more years bored in a classroom when they could be out actually experiencing and perfecting their skills in the trades and the careers that inspire them.
Right now, they feel like failures. They are not. They are only "failures" if judged by the narrow hierarchy of values by which we currently construct educational success. As an educator, I want to change that hierarchy of values in order to support a more abundant form of education that honors the full range of intellectual possibility and potential for everyone, regardless of whether they are college material or not.
If the title rings no bells, that is hardly surprising. Like Freaks and Geeks, the previous show by Undeclared's creators, it was the victim of remarkably clueless network executives who never quite knew what to do with it. Neither program seemed to air more than two weeks in a row. And when a loyal following emerged anyway (with television critics lauding the shows' humor and intelligence) it didn't make much difference. The ratings were too low.
Freaks and Geeks lasted from 1999 to 2000 -- just as the major networks were discovering that the audience really wanted to watch people eat bugs and marry strangers. The first episode of Undeclared aired on Fox two weeks after 9/11. Its picture of dorm life at the imaginary University of North Eastern California was neither cathartic nor particularly escapist. And the realities it portrayed (for example, "free money" being handed out to students, i.e., credit card companies signing them up) were probably too campus-specific.
The revival of Undeclared on DVD is in part a matter of its cult status. As with Freaks and Geeks, it seems to have developed a solid core of fans on college campuses, with videotapes circulating long after cancellation. (Last year, F&G was issued on DVD to generally rapturous acclaim.)
Adolescence is usually portrayed by pop culture in terms that are themselves pretty adolescent. Which is to say, either cloying in its sentimentality or histrionic in its cynicism -- arguably, two sides of the same coin. Teenage life is presented as a time of profound life lessons ("And that was when I understood that things would never be the same again..."). Either that, or as a spell of wrenching agony (same voiceover, but with a bitter snarl).
In either case, growing up is portrayed as the loss of innocence, or some moment of hideous realization that innocence was always a sham, rather than a process of gaining new powers and responsibilities. So adolescence becomes a privileged phase in life -- a period when you haven't yet succumbed to all of the compromises and disappointment that must follow. Whole sectors of the economy are devoted to reinforcing this belief, in however bizarrely distorted a form. The desirability of being able to recapture part of adolescence must be the subtext of half the SUV advertisements. And the themes and attitudes associated with that part of life (alienation, vulnerability, irony) are pretty much identical with the dominant tone of mass media now, as Andrew Calcutt shows in Arrested Development: Pop Culture and the Erosion of Adulthood (Cassell, 1998).
What made both Freaks and Geeks and Undeclared stand out is that each broke out of this pattern. They depicted adolescence in a recognizable way, but the sensibility was more adult than anything the characters themselves could have manifested.
That was especially true of Freaks and Geeks, set in a suburban high school in Michigan in 1980-1. The central character, Lindsay, was an academically talented middle-class kid who, in the wake of her grandmother's death, began to idealize the underachieving stoner kids. She turned her back on the Mathletes and started hanging out with the school's clique of rebels-without-a-cause. The pop-culture norm would be to celebrate Lindsay's metamorphosis from brainy "geek" to disengaged and sardonic "freak" (a bit of period slang that didn't last much longer than the subculture itself).
But the program was a lot more astute than that. It tracked her disillusionment with disillusionment. Gradually, Lindsay saw that the romantic vision of her friends as outsiders was totally inadequate. They were as prone to self-deception, inauthenticity, and inner numbness as anyone. The result was a story of real maturation, rather than of easy epiphanies.
As Judd Apatow, the producer for both programs, writes in the booklet accompanying Undeclared, he started the second series while mourning for the first. He hoped to gather some of the earlier cast together and recreate its chemistry.
While Undeclared certainly has its moments, I don't think that quite happened. For one thing, Paul Feig, who created F&G, didn't write any of the scripts for Undeclared, though he did direct a couple of episodes. His book Kick Me: Episodes in Adolescence (2002) is the quintessential account of nerd life -- a memoir that is excruciatingly funny, when not simply excruciating. Feig has described its recently published sequel, Superstud, or How I Became a 24-Year-Old Virgin, as the second part of his "trilogy of shame." (Both books are from Three Rivers Press.)
Minus Feig's scriptwriting, Undeclared doesn't have much of an introspective edge. But its picture of life in a coed dormitory (at a not-very-impressive university) is funny often enough to be memorable. In particular, it renders an utterly believable (and slightly queasy-making) account of a high-school romance that goes rancid when one person heads off to college. The line between affectionate cuteness and emotional breakdown can get pretty thin. Only small nuances of tone make it comic, rather than horrifying.
Watching the program again after three years, I'm also struck by how often it shows presumably full-grown adults regressing to an adolescent state, thereby making themselves ridiculous. The singer-songwriter Loudon Wainwright plays a father who starts hanging out at the dorm during his divorce, happy to be treated as a cool guy by his son's friends. A history professor (Fred Willard) is challenged by a student to liven up his lectures. So he reenacts the Kennedy administration using costumes and props -- his idea of what the kids want. Edutainment has seldom been so awkward.
And Will Ferrell makes an appearance as a familiar type: the guy living just off campus who will, for a fee, write term papers. (He's even equipped to take credit cards.) Clearly a brilliant student in his prime, he now sits around the house in a robe playing video games. For a student who needs a paper on Jackson Pollock or The Brothers Karamazov, he's ready to churn one out in a few hours.
How does he do it? "I read eight or nine books a week," he tells a customer. "I also take a lot of speed."
Apart from the usual (and wildly uneven) selection of deleted scenes and commentary tracks, the DVD set offers an episode called "God Visits" that never aired during the original run of the series.
It shows one dorm resident embracing pure nihilism (well, as pure as the sitcom format will allow) and another becoming a Bible-toting religious zealot. Naturally, each student returns, in due course, to a state of nonphilosophical normality. That is to be expected. But what's remarkable for a network television show of any kind, let alone a comedy, is that both worldviews get a bit of airtime -- and neither is held up for ridicule, as such.
In fact, Undeclared may be one of the first times that TV has shown the proselytizing of incoming students by evangelical Christians. The phenomenon (a fact of life on any reasonably large campus) was portrayed in at least a couple of episodes.
Evidently it made the in-house watchdogs at Fox nervous. Perhaps it wasn't in keeping with the official line that universities are reeducation camps for left-wing indoctrination?
Then again, it's possible that something else bothered the executives. Television networks are, after all, easily frightened. Anything involving brain activity would tend to do it. Watching Freaks and Geeks and Undeclared when they first appeared, you knew they were too smart to survive. But it's good to have both in durable form. That way, they'll last longer than the courage of any given broadcast executive.
When I heard that advocates of “Intelligent Design” were urging schools to "teach the controversy" between their view and Darwinian evolution, I was dismayed.
About 20 years ago, I coined the phrase “teach the controversy” when I argued that schools and colleges should respond to the then-emerging culture wars over education by bringing their disputes into academic courses themselves. Instead of assuming that we have to resolve debates over, say, whether Huckleberry Finn is a racist text or a stirring critique of racism, teachers should present students with the conflicting arguments and use the debate itself to arouse their interest in the life of the mind. I elaborated the argument in numerous essays and in a 1992 book, Beyond the Culture Wars, which is subtitled, How Teaching the Conflicts Can Revitalize American Education.
So I felt as if my pocket had been picked when the Intelligent Design crowd appropriated my slogan, and even more so when President Bush endorsed its proposal, saying that "both sides ought to be properly taught" so "people can understand what the debate is about." As a secular left-liberal, I felt that my ideas were being hijacked by the Christian Right as a thinly veiled pretext for imposing their religious dogma on the schools.
And yet, setting intellectual property questions aside, the more I ponder the matter and read the commentators on both sides, the more I tend to think that a case can be made for teaching the controversy between ID and Darwin.
Not that the sides in this debate are equal, as Bush’s comment suggests. If we judge the issues strictly on their scientific merits, the Intelligent Designers don’t seem to have much of a case. In a lengthy and detailed article in The New Republic (August 22 & 29), the evolutionary scientist Jerry Coyne persuasively shows that the supposed "flaws" in the theory of natural selection that IDers claim to point out simply don’t exist. H. Allen Orr had made a similarly persuasive refutation of ID in The New Yorker (May 30), and these arguments have been further reinforced in articles by Daniel C. Dennett in The New York Times (August 28) and by Coyne again and Richard Dawkins in The Guardian (September 1).
Taken together, these writers make an overwhelming case that Darwinian evolution, if not a total certainty, is as certain as any scientific hypothesis can be. As Coyne puts it, "it makes as little sense to doubt the factuality of evolution as to doubt the factuality of gravity." From a strictly scientific standpoint, there seems to be no real "controversy" here that's worth teaching, just a bogus one that the IDers have fabricated to paper over the absence of evidence in their critique of evolutionary science.
And this would indeed be the end of the story if the truth or validity of an idea were the sole thing to consider in deciding whether it is worth presenting to students. But when we measure the pedagogical merits of an idea, its usefulness in clarifying an issue or provoking students -- and teachers -- to think can be as important as its truth or validity. In some cases even false or dubious notions can have heuristic value.
This point has been grasped by several commentators unconnected with the Christian Right who defend the teaching of the controversy. In a column of June 2000, before ID had become prominent in the news, Richard Rothstein, then The New York Times education columnist, proposed that students be exposed to the debate between creationism and evolution. And in a piece on the controversy earlier this year in Slate, Christopher Hitchens asks, “Why not make schoolchildren study the history of the argument? It would show them how to weigh and balance evidence, and it would remind them of the scarcely believable idiocy of the ancestors of 'intelligent design.'"
Hitchens’ argument has been challenged by the editors of The New Republic, who caustically retort that getting kids to weigh and balance evidence is not exactly “what Bush -- or IDers -- want at all.” What they want "instead is to teach ID as a substantive scientific argument. If anything, what Bush is calling for is anti-historical, the exact opposite of what Hitchens praises." This is true, but so what? Hitchens doesn’t claim that his argument is one the IDers themselves would make, but only that students would learn something important about how to think from the kind of debate the IDers propose.
Secular liberals will object that Hitchens is overly confident that the good guys would win if the debate were aired in schools. In his scenario, the students would see the "idiocy" of ID’s ancestors and also presumably of its current advocates. What secular liberals fear, however, is that in many classrooms the scientific truth would be overwhelmed by dogma and prejudice.
Behind such fear -- and behind the liberal secularist objections to teaching the debate -- one senses the shellshock and impotence of the Blue-state Left in the wake of the 2004 election, and the worry that the Left will only lose again if it allows itself to be suckered into debating "values" with the religious Right on its own terms. This worry is deepened by the feeling that American public debate is not a level playing field, but an arena in which conservative money and Fox News control the agenda.
Though I share these fears, there seems to me a certain failure of nerve here on the part of the Left. After all, if evolution and intelligent design were debated in academic courses, the religious Right would have the same risk of losing as the liberal secularists -- maybe greater risk, if Hitchens is correct. In any case, it’s not clear that one wins a battle of beliefs by hunkering down, circling the wagons, and refusing to engage the other side. And if the Right has more money and media clout with which to shape such a debate, that may be all the more reason to enter the debate: if you don’t have money and media clout, arguments are your best bet.
Seen this way, the anti-evolution assaults of the Intelligent Designers and the creationist Right could be viewed less as a threat than an opportunity. This moral is suggested by a recent news story in The New York Times reporting that museum staffs being challenged by religious patrons to explain why they should believe in evolution “are brushing up on their Darwin and thinking on their feet” (September 20, 2005). One museum has developed training sessions for staff members “on ways to deal with visitors who reject settled precepts of science on religious grounds.”
What is most interesting in the article, and most germane to the recent debate, is the suggestion, reflected in quoted statements by museum people, that though this religious rejection of science may be misguided, it needs to be listened to and answered rather than ignored or dismissed, and that being forced to defend evolution can actually be a good thing. The implication is that it’s not unreasonable for patrons to press museum people to explain the grounds on which evolutionary science is more credible than ID or creationism. As one director of a paleontological research institution puts it, "Just telling" such patrons "they are wrong is not going to be effective." As another museum staffer advises docents, "it's your job not to slam the door in the face of a believer," and another says, "your job is ... to explain your point of view, but respect theirs."
Arguably, this is precisely the job of teachers as well, though admittedly museums serve different functions than educational institutions. If the goal of education is to get students to think, then just telling students their doubts about Darwin are wrong is not going to be effective. And teachers being forced to engage their religious critics and explain why they believe in evolution might be a healthy thing for those teachers just as it seems to be for museum workers. In fact, I would like to ask Coyne, Dennett, Orr, and others who have written so cogently in defense of evolution if they don’t feel just a tiny bit grateful to the IDers for pushing them to think harder about -- and explain to a wider audience -- how they know what they know about evolution.
Scientists like Coyne and Dawkins concede that debate should indeed be central to science instruction, but they hold that such debate should be between accredited hypotheses within science, not between scientists and creationist poseurs. That's hard to dispute, but, like Rothstein and Hitchens, I can at least imagine a classroom debate between creationism and evolution that might be just the thing to wake up the many students who now snooze through science courses. Such students might come away from such a debate with a sharper understanding of the grounds on which established science rests, something that even science majors and advanced graduate students now don’t often get from conventional science instruction.
How might such a debate be taught? Ideally in a way that would not become fixated on the clash of faith and science, which might quickly produce an unedifying stalemate, but would open out into broader matters such as the history of conflicts between science and religion and the question of how we determine when something qualifies as "science." At the broadest level, the discussion could address whether the ID-evolution debate is a smoke screen for the larger political and cultural conflict between Red and Blue states. Representing such a many-sided debate would demand the collaboration of the natural sciences, the social sciences, and the humanities, a collaboration that could make a now disconnected curriculum more coherent. Such a collaboration would also answer the scientists’ objection that there just isn’t time to debate these issues, given everything else they have to cover. Then, too, explaining how we know what we know against skeptical questioning is not an add-on, but an intrinsic part of teaching any subject.
In any case, science instructors may soon have no choice but to address the controversy posed by ID and creationism. If many American students now bring faith-based skepticism about evolution with them into classrooms, as it seems they do, then there’s a sense in which the controversy has already penetrated the classroom, just as it has penetrated museums, whether ID or creationism is formally represented in the syllabus or not. Schools and colleges may not be teaching the controversy between faith and science, but it’s there in the classroom anyway insofar as it’s on some students’ minds. Teachers can act as if their students’ doubts about evolution don’t exist, but pretending that your students share your beliefs when you know they don’t is a notorious prescription for bad teaching.
Gerald Graff is a professor of English and education at the University of Illinois at Chicago. He is the author of Beyond the Culture Wars: How Teaching the Conflicts Can Revitalize American Education (W. W. Norton, 1992) and Clueless in Academe: How Schooling Obscures the Life of the Mind (Yale University Press, 2003).
The annual college-admissions tournament is in full swing, encouraged by newspapers and magazines that have made a good business of promoting status anxiety among parents and students by touting the latest rankings and secrets of Ivy League admissions offices. But in reality, the universe of students with the luxury of being overwhelmed by long application forms, AP courses, extracurricular activities, and grueling SAT-prep classes is small: Only 11 percent of college-bound seniors enroll at institutions that reject a majority of their applicants. For most students, the hard part of college isn’t getting in -- it’s getting out.
The numbers are stark: Only 37 percent of college students graduate in four years, and fewer than two-thirds finish in six. For low-income and minority students, graduation rates are even worse. This is happening at the worst possible moment in history -- the market for unskilled labor has already gone global and higher-skill jobs aren’t far behind. We aren’t going to be bigger or cheaper than our Chinese and Indian competitors in the 21st century; our only option is to be smarter. Yet we’re squandering the aspirations and talent of hundreds of thousands of college students every year.
Clearly, major changes are needed.
We can start by restructuring high schools, which continue to act as if most students don’t go to college when in fact most of them do. Two-thirds of high school graduates enter postsecondary education soon after graduation, and more than 80 percent matriculate by their mid-20s. But many arrive unaware that their high school diploma doesn’t mean they’re ready for college work. Far from it. More than 25 percent of college freshmen have to take remedial courses in basic reading, writing, or math -- victims of high schools that systematically fail to enroll many of their college-bound students in college-prep classes.
It’s true that many students arrive in high school behind academically, but high schools need to buckle down and prepare them for college anyway because that’s where they’re going, ready or not. College-prep curricula should be the norm unless students and parents decide otherwise.
We also need to make college more affordable for first-generation college students at the greatest risk of dropping out. We’ve been losing ground here in recent years -- federal Pell Grants pay a far smaller portion of college costs than they once did, while states and institutions are shifting many of their student-aid dollars to so-called “merit” programs that mostly benefit middle- and upper-income families. Meanwhile, the ongoing erosion of state funding for public colleges and universities, combined with the unwillingness of those institutions to look hard at becoming more efficient, has produced huge increases in tuition.
As a result, low-income college students have an unpleasant choice: Take out massive student loans that greatly limit their options after graduation, or work full-time while they’re in school, and thereby greatly decrease their odds of graduating. In addition to a renewed federal commitment to college affordability, state lawmakers should resist the urge to pour vast amounts of money into need-blind merit aid programs. And institutions should think twice before taking the advice of for-profit "enrollment management" consultants who counsel reducing aid to the low-income students who need it most.
We need to get serious about creating universities that are actually designed to educate undergraduates successfully. Many institutions are far too concerned with status, research, athletics, fundraising -- almost everything except the quality of undergraduate education. Yet research has shown that those institutions that truly focus on high-quality instruction, combined with guidance and support in the critical freshman year, have much higher graduation rates than their peers. Our colleges need to be held more accountable for the things that matter most: teaching their students well and helping as many as possible earn a degree.
The education secretary's commission appears poised to put higher education accountability squarely on the national agenda. That's a good thing. But the panel's proposal shouldn't focus on a No Child Left Behind-style top-down system based exclusively on standardized tests, government-defined performance goals, and mandated interventions. Rather, the panel should pursue accountability through transparency, mandating a major expansion of the performance data universities are required to create and report to students, parents, and the public at large.
Finally, the media should look beyond their own lives and aspirations when they shape the public perception of higher education and the admissions process. Caught up in the same status competition they help perpetuate, many simply don’t realize how many college students arrive unprepared, struggle financially, and never finish a degree. For the vast majority of students, and for the nation as a whole, the stakes are far higher than who gets into which Ivy League institution.