My nephew always wanted to design things, and as a teenager he seemed to be on the fast track to a good engineering degree. Took a community college calculus course while in high school. Worked for a computer aided design (CAD) company part-time. Got admitted to a top-notch engineering program.
In the middle of his first year, however, he dropped out. Now he’s enrolled in a technical institute, learning CAD skills and only CAD skills. He’s very happy, apart from the fact that he’s had to move back home.
Where did I, his college-professor aunt, go wrong? Or did I?
What upsets me is that he doesn’t see any inherent value in a liberal education, in a college degree rather than a tech-school certificate. But I also wonder whether mine isn’t a narrow attitude -- after all, the kid wants to do CAD, so why should he have to get a degree? What doors would a bachelor’s degree open for him? He knows what kind of work he likes to do, and he can start earning money at it a lot faster with a certificate in “Drafting/CAD.”
I checked out the Web site for the school he’s chosen. The school’s URL is a .com, not a .edu. It boasts that "Our short term curriculums focus only on courses directly related to your field of study, without any fluff. These classes are taught by instructors that have professional experience in the industry."
Besides bringing out my usually-held-in-check pedantry (“the plural of curriculum is not curriculums, dammit” and “instructors WHO, not instructors that!”), the Web site rankles because it’s pitching itself directly against the idea of a liberal arts education or “fluff.”
I worry that I’m being a snob. Why shouldn’t he study a job skill instead of spending his time reading books he doesn’t care about? Why not go to class at a place that teaches auto body repair instead of philosophy? I’m sure they’ll get him a job when he graduates, which is more than I do for my students.
My nephew is a working-class kid. Neither of his parents went to college, and most of his friends won’t. Still, I had assumed that because he got good grades in high school and wanted to be an engineer he’d want to get a degree. The problem is, I think, no one ever told him what a degree was. No one ever talked to him about the difference between higher education and job training. No one ever said why he should want to read books he wouldn’t choose on his own or take a class in chemistry or go to a lecture by a political scientist. And now he probably will never do any of those things.
As a professor I have to mourn that choice. Yet as an aunt who wants to see her nephew happy, I have to agree with my brother that the tech school is probably an OK place for my nephew right now. I don’t mean to imply that he’s doomed. I just worry that he’s severely limited his future choices.
I wonder how it is that higher education is still, in the 21st century, failing to get its message across. Failing to explain the difference between a college degree and a tech-school certificate to the high school student whose parents never went to college. Failing to convince the engineering-school freshman that he should bother to stick around and learn the things that don’t seem immediately applicable to his future career. We have not convinced legislatures that the state has a stake in higher education, that a citizenry trained in critical thinking, writing, and research skills beyond the high-school level is citizenry better able to make informed decisions.
Both my nephew and I want him to be happy in his work. But he sees short-term, where I am trained to see long-term. He wants a job, very soon, in the computer work he has come to like in his part-time job. His model is his father, a union worker who will spend his whole adult life doing the same well-paid work with good union benefits. But such jobs are fast becoming extinct in the United States, and the jobs that are replacing them, especially information-industry jobs, are nowhere near as secure.
As the sister who didn’t start building up her retirement fund until 15 years after her younger brother (all those years of college and graduate school), I may have limited credibility here. Nevertheless, I believe that a degree would offer my nephew more options in the long run, more opportunities years down the line, in an age in which most people change careers multiple times in their working lives.
The problem is that I cannot be convincing in a larger culture that does not actively promote the value of higher education. Cushioned in a liberal arts college whose mostly middle- and upper-class students enroll (presumably) because they already understand that there’s value in an education that is not job training, I sometimes forget that a college degree can still be a tough sell. That’s why my nephew’s rejection of college came as such a jolt. But it’s reminded me that American anti-intellectualism can have personal consequences. It’s reminded me that I can’t assume the product I’m selling will advertise itself. If I believe that a better educated citizenry would make for a better state or country or world, then I’d better start writing letters and contacting legislators and talking to kids. Guess I should have started with my nephew.
Paula Krebs is professor of English at Wheaton College, in Massachusetts.
A well-known geology professor asks to see the manager of the Media Center at a university in northern California. He is angry because Media Services has dumped the 16mm film collection and inadvertently retired his favorite plate tectonics film from the 1970s. Initially sympathetic, the manager searches several collections and vendors to find that the film is out of print. When she tries to show him some newer titles on the topic, he storms out of the library, shaking his head.
At a large community college, students have been filing into the department chair’s office every semester for seven years complaining about a poetry class. It seems that the long-time professor of the course has been using the same typed-and-copied handouts for over 20 years. He not only refuses to use a computer to type up handouts, but doesn't see the rationale for updating the information "since the poets were dead." The complaints continue, and each successive chair finds a reason not to bother this in-house scholar.
An established speech professor can't understand why students can’t seem to get assignments in on time -- much less refer to sources properly. Although many students have complained about his WebCT supplemental site, he never makes time to update the information. The current syllabus, handouts, and course documents are all dated 2003. None of the links refer students to the proper information -- and frustrated students finally formed a study group to unravel the mess.
One math professor has copied handouts that were original purple dittos produced in 1974. Her syllabus is two pages long and does not reflect the current course. She assigns rote-memorization exercises from many-times-copied sheets dated 1976. Not only do students avoid her, but her colleagues have also accepted her refusal to participate in committees as a sign that she is no longer a viable member of the teaching team at this university.
When do instructors decide that teaching is "doing time"? When do research and other activities become so important that teaching becomes such a chore that any effort is considered wasted time? At times I have been shocked that colleagues with so much knowledge seem intent on making that information indecipherable to their students. What is the motivation here? Under pressure, a few colleagues have admitted that they simply don't care. A few others simply seem oblivious that good teaching requires effort. After watching a senior professor hold committee members hostage to pontificate and grouse, a colleague scratched the words "God-complex" on a tablet and flashed it to me. Could this also be the reason that our older colleagues' materials are decades old?
One colleague has confided that he thinks this refusal to update materials, assessment methods and teaching methods basically makes difficult graduate-level courses into "academic cod liver oil." I have to agree.
Yes, some topics are decades old -- does that mean there truly is nothing new on the horizon? Maybe. Maybe not. In graduate school, I took courses from a tenured professor who was well known as a Joycean scholar. True, Ulysses was finally published in 1922 -- yet he made this course exciting. Not only did he continue to publish, he also constantly read current interpretations of this writer’s work. Yes, he was an expert, but he was fresh. He squeezed our brains until we spit out something worthwhile -- and on occasion an original idea surfaced.
Excited that a tiny new thread was developing, he'd showcase it and develop it in the next class. True, he almost never gave handouts, but his quizzes, midterms and final were constantly changing to reflect developments in current theory. His courses were always overflowing with students fighting to get in -- a true reflection of his desire to learn and develop as an instructor after thirty years of teaching.
Why rework old lessons and develop new exercises and experiments to teach subjects that were being taught as early as the 19th century? Several reasons:
1. Students feel valued.
2. After identifying a student population (or even a particular class), targeting their area of need is simply effective teaching. It also gives rise to flexibility in teaching difficult subjects -- not that syllabi are discarded, but lessons may be developed to re-teach an area that students find difficult.
3. Textbooks are updated -- so why not professors’ lessons?
4. It's invigorating to find new ways to teach old materials. By researching and redoing assignments, instructors often feel "refreshed" and able to teach it more effectively to students. This also may encourage instructors to use less passive teaching methods.
5. New examples of old subjects can make an instructor seem reachable -- and interested in his or her students. This often opens the door for communication.
Yes, sometimes the information is as old as the hills; but the way that we teach it does not have to be antiquated. Reaching for new worksheets, developing new handouts and assessment tools is simply good teaching. At the least, the willingness to develop as an instructor can stave off the dreaded replacement when younger instructors are being interviewed at record pace.
I know one woman who, at 72, commands a lively classroom and is constantly being recommended by students. Her trick? She often digs up current information from the media (and at times from her own students) to develop topics. Newspaper clippings circulate the classroom -- and the use of less traditional teaching techniques keeps her students interested. No longer content to rely on lectures, she has incorporated short audio or video tape clips into readings, has students on debate teams for tricky subjects, and often has students reply either in small groups -- or through index cards -- which gives students the anonymity they need to really express themselves. After hearing her students roar with laughter one day, I sat in on her class and was excited by the energy she generated. Now I have borrowed some of her techniques, and often drop by her office to see what she is developing for her classes. I can only hope to be this interested in teaching at 72! The department chair has warned her that she will never be allowed to retire -- she jokingly says she will "die at the chalkboard." And when she does, it will be quite a loss.
I understand that in some cases, professors have their hands tied. A department-approved syllabus may give an instructor very little "wiggle" room. Those who haven't yet earned tenure, non-tenure track professors, or adjuncts may give in to pressure to simply adapt a syllabus from a template or existing format. A list of "acceptable" texts or only one particular text allowed in a course can be stifling. Still, I know many instructors who have worked with materials, topics, and experiments within the curriculum to pep up what could be a dull subject.
Once asked to teach business management at a small, private university, I was limited to one text and a standardized test bank. I not only worked to choose questions that reflected our in-class work, but also chose current (and often edgy) business news stories to represent the less interesting business practices outlined in our text. Students were much more engaged when they could see how the concepts in the book were taking place in the real business world. Instead of just memorizing facts for standardized tests, these students had the opportunity to apply what they were studying to case studies in several papers. Many went through several drafts to ensure that they were drawing the correct inferences. I was thrilled to see a variety of topics with each set of papers -- it was obvious that these were thinking adults with interests of their own.
I’m not sure if it's because I've only been teaching for six years, or because I've never had the security that comes with tenure, but I am constantly developing new handouts and materials for my courses. In fact, to secure the job I have now at a university in the Midwest, I created a sentence structure handout that included several local landmarks. Butt-kissing? Sure. But the students paid attention and were able to identify dependent and independent clauses by the last example.
In addition to this mini-lesson, I also developed a new lesson for them on logical fallacies. Since the students had done a somewhat dry section in the textbook requiring the memorization of approximately 20 forms of ineffective argument, I focused on review. Realizing that Jim Carrey's Liar Liar had a wonderful scene full of faulty arguments, I reviewed a short three-minute scene of the movie many times and typed up a transcript of the dialogue, word for word. I then created a handout with instructions, this dialogue, and a quick review of the faulty arguments from the textbook. In class, I put on the video and played the three-minute scene a few times. Not only were students thrilled to be "watching a movie" in class -- they quickly started to put into place what they knew and correctly identified line after line of faulty arguments. The discussion was lively.
While on staff at this university, I’ve developed a number of handouts based on in-class discussion and my students' assignments. Last week I asked students to write a well-structured paragraph on a current television show, "My Name is Earl." After following my instructions online, students read several short articles on the show and were instructed to write a solid topic sentence and supporting details -- including three short, direct quotes from the linked articles. They were instructed to e-mail the paragraph directly from their computers to my office e-mail account. In addition to immediately e-mailing them concrete suggestions with what worked and what needed improvement, I then developed a worksheet for use in class the next day.
I chose a paragraph that had a somewhat confused topic sentence, a few good supporting details, some off-topic writing, and wonderful sentence structure. I featured this paragraph at the top of the handout (anonymously, of course), and using MSWord, set up columns for my typed comments. In the columns, I typed in specific comments and questions about each sentence and used short lines with arrows to point to the sentence in question. Below I typed up four other examples of paragraphs -- to be used for group work in class. After going over the first sample paragraph, students felt much more confident in evaluating the remaining paragraphs and reporting back to the class. It was a rousing success, with students able to see their own writing evaluated and use those tools to evaluate classmates’ work (and later their own). I was rewarded with visible improvement in paragraph development in the next stack of essays from this class.
I realize that this development of assignments and handouts takes time. It's true that I don’t always have time and sometimes rely on standard tools that have been in use for years. But when I take the time to look at my students’ abilities and develop something just for that class, I am always greatly rewarded in my effort. And I suspect that the students feel important, too. Students really enjoy seeing their own work in print -- even if they're going to be constructively criticized in class. I am always careful to start out with the positive and gently guide the discussion as we go along.
On occasion, I "mix up" the examples from several sections of the same course so that students don't feel threatened. But my experience is that students love the attention and often brag to friends that they "made the handout." At a campus where I taught in California, my developmental writing students often turned in work with the comment, "Oh, please use mine." They really wanted to improve -- and the reward they got when the class praised them built confidence. Specific comments in the context of a lesson helped them more than a vague C at the top of a page. I’ll admit that at this college, a posse of undergraduates followed me from course to course -- and when they ran the gamut, they asked me to recommend "another cool instructor."
An online colleague accuses me of being an "academiwhore." She believes I am pandering to students rather than looking toward the loftier goal of asking students to learn about history while doing English grammar exercises. She may be right. I’ve never been the pompous type. But I have enjoyed teaching these last six years -- and deans seem to want to hire me. So I have no complaints. To me, recent texts, up-to-date syllabi and materials are part of the "teaching solution." Good education, a desire to help people, and the ability to be open-minded seem integral as well. Like many instructors, I rely on in-services where instructors share effective lesson plans, and on industry publications; I'm sure I need all of it to avoid becoming disdainful of my students -- and ultimately un-teachable myself.
Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.
Some months ago I started asking friends, colleagues from my teaching days, researchers in higher education, faculty members of various ages and ranks, deans, provosts and presidents, and focus groups of students: “What’s the status of the Big Questions on your campus?” Quite deliberately I avoided defining “Big Questions,” but I gave as examples such questions as “Who am I? Where do I come from? What am I going to do with my life? What are my values? Is there such a thing as evil? What does it mean to be human? How can I understand suffering and death? What obligations do I have to other people? What does it mean to be a citizen in a democracy? What makes work, or a life, meaningful and satisfying?” In other words, I wanted to know what was happening to questions of meaning and value that traditionally have been close to the heart of a liberal education.
Some of what I found puzzled me. People pointed out quite properly that some Big Questions were alive and well in academia today. These included some questions about the origin of the universe, the emergence of life, the nature of consciousness, and others that have been raised by the scientific breakthroughs of the past few decades.
In the humanities and related social sciences the situation was rather different. Some friends reminded me that not all big questions were in eclipse. Over the past generation faculty members have paid great attention to questions of racial, ethnic, gender, and sexual identity. Curricular structures, professional patterns, and the like continue to be transformed by this set of questions. Professors, as well as students, care about these questions, and as a result, write, teach and learn with passion about them.
But there was wide agreement that other big questions, the ones about meaning, value, moral and civic responsibility, were in eclipse. To be sure, some individual faculty members addressed them, and when they did, students responded powerfully. In fact, in a recent Teagle-sponsored meeting on a related topic, participants kept using words such as “hungry,” “thirsty,” and “parched” to describe students’ eagerness to find ways in the curriculum, or outside it, to address these questions. But the old curricular structures that put these questions front and center have over the years often faded or been dismantled, including core curricula, great books programs, surveys “from Plato to NATO,” and general education requirements of various sorts. Only rarely have new structures emerged to replace them.
I am puzzled why. To be sure, these Big Questions are hot potatoes. Sensitivities are high. And faculty members always have the excuse that they have other more pressing things to do. Over two years ago, in an article entitled “Aim Low,” Stanley Fish attacked some of the gurus of higher education (notably, Ernest Boyer) and their insistence that college education should “go beyond the developing of intellectual and technical skills and … mastery of a scholarly domain. It should include the competence to act in the world and the judgment to do so wisely” (The Chronicle of Higher Education, May 16, 2003). Fish hasn’t been the only one to point out that calls to “fashion” moral and civic-minded citizens, or to “go beyond” academic competency, assume that students now routinely achieve such mastery of intellectual and scholarly skills. We all know that’s far from the case.
Minimalist approaches -- ones that limit teaching to what another friend calls “sectoral knowledge” -- are alluring. But if you are committed to a liberal education, it’s hard just to aim low and leave it at that. The fact that American university students need to develop basic competencies provides an excuse, not a reason, for avoiding the Big Questions. Students also need to be challenged, provoked, and helped to explore the issues they will inevitably face as citizens and as individuals. Why have we been so reluctant to develop the structures, in the curriculum or beyond it, that provide students with the intellectual tools they need to grapple thoughtfully over the course of a lifetime with these questions?
I see four possible reasons:
1. Faculty members are scared away by the straw man Stanley Fish and others have set up. Despite accusations of liberal bias and “brainwashing,” no faculty member I know wants to “mold,” “fashion” or “proselytize” students. But that’s not what exploring the Big Questions is all about. Along with all the paraphernalia college students bring with them these days are Big Questions, often poorly formulated and approached with no clue that anyone in the history of humankind has ever had anything useful to say about any of them. There’s no need to answer those questions for students, or to try to fashion them into noble people or virtuous citizens for the republic. There is, however, every reason to help students develop the vocabularies, the metaphors, the exempla, the historical perspective, the patterns of analysis and argument that let them over time answer them for themselves.
2. A second possible reason is that faculty are put off by the feeling they are not “experts” in these matters. In a culture that quite properly values professional expertise, forays beyond one’s field of competence are understandably suspect. But one does not have to be a moral philosopher to raise the Big Questions and show some of the ways smart people in the past have struggled with them. I won’t pontificate about other fields, but in my own field -- classics and ancient history -- the Big Questions come bubbling up between the floor boards of any text I have ever taught. I don’t have to be a specialist in philosophy or political science to see that Thucydides has something to say about power and morality, or the Odyssey about being a father and a husband. A classicist’s job, as I see it, is to challenge students to think about what’s implicit in a text, help them make it explicit and use that understanding to think with.
3. Or is it that engaging with these “Big Questions” or anything resembling them is the third rail of a professional career? Senior colleagues don’t encourage it; professional journals don’t publish it; deans don’t reward it; and a half dozen disgruntled students might sink your tenure case with their teaching evaluations. You learn early on in an academic career not to touch the third rail. If this is right, do we need to rewire the whole reward system of academia?
4. Or is a former student of mine, now teaching at a fine women’s college, correct when she says that on her campus “It tends to be that … those who talk about morality and the big questions come from such an entrenched far right position … that the rest of us … run for cover”?
Some of the above? All of the above? None of the above? You tell me, but let’s not shrug our shoulders and walk away from the topic until we’ve dealt with one more issue: What happens if, for whatever reason, faculty members run for the hills when the Big Questions, including the ones about morality and civic responsibility, arise? Is this not to lose focus on what matters most in an education intended to last for a lifetime? In running away, do we not then leave the field to ideologues and others we cannot trust, and create a vacuum that may be filled by proselytizers, propagandists, or the unspoken but powerful manipulations of consumer culture? Does this not sever one of the roots that has over the centuries kept liberal education alive and flourishing? But, most serious of all, will we at each Commencement say farewell to another class of students knowing that for all they have learned, they are ill equipped to lead an examined life? And if we do, can we claim to be surprised and without responsibility if a few decades later these same graduates abuse the positions of power and trust in our corporate and civic life to which they have ascended?
W. Robert Connor
W. Robert Connor is president of the Teagle Foundation, which is dedicated to strengthening liberal education. More on the foundation's “Big Questions” project may be found on its Web site. This essay is based on remarks Connor recently made at a meeting of the Middle Atlantic Chapters of Phi Beta Kappa, at the University of Pennsylvania.
I just finished grading a hefty stack of final examinations for my introductory-level U.S. history survey course. The results were baleful.
On one section of the exam, for example, I supplied some identification terms of events and personages covered in class, asking students to supply a definition, date, and significance for each term. In response to “Scopes Monkey Trial,” one student supplied the following:
"The scopes monkey trial was a case in the supreme court that debated teaching evolution in the schools. It happened in 1925. Mr. Scope a teacher in a school wanted to teach about God and did not want to teach about evolution. The ACLU brought in lawyers to help with the case of Mr. Scopes. In the end Mr. Scopes side did not have the people's opinion. Evolution won. It is significant because now you have to teach evolution in school, you can't teach about God."
This answer might be considered a nearly perfect piece of evidence against intelligent design of the universe, since it gets just about everything (apart from the date) wrong: punctuation, spelling, grammar, and historical fact.
For those needing a refresher, Tennessee high school biology teacher John T. Scopes assigned a textbook informed by evolutionary theory, a subject prohibited by the state legislature. The court ruled against Scopes, who had, obviously, broken the law. But the defense won in the court of public opinion, especially after the ACLU’s lawyer, Clarence Darrow, tore apart William Jennings Bryan, the former Democratic presidential candidate, witness for the prosecution, and Biblical fundamentalist. The press dubbed it the "Scopes Monkey Trial" (inaccurately, since the theory of human evolution centered upon apes) and pilloried Bryan. As Will Rogers put it, "I see you can't say that man descended from the ape. At least that's the law in Tennessee. But do they have a law to keep a man from making a jackass of himself?"
An outside observer might ascribe my student’s mistakes to the political culture of this Midwestern city, where barely a day goes by without a letter to the editor in the local paper from some self-appointed foot soldier of the religious right.
That, however, wouldn’t explain another student who thought the 1898 war between the United States and Spain, fought heavily in Cuba, was about communism (not introduced into Cuba until after the 1959 revolution). Nor would it explain a third student who thought that the Scopes verdict condoned Jim Crow racial segregation.
A minority of students performed admirably, receiving grades in the range of A, while hewing, of course, to varied interpretations. Their success proved the exam was based upon reasonable expectations. However, the median exam grade was a C -- the lowest I’ve yet recorded, and fairly devastating for a generation of students who typically aspire to a B.
I was wondering what to make of this dispiriting but solitary data set when I read about the Education Department study released late last week that shows that the average literacy of college-educated Americans declined precipitously between 1992 and 2003. Just 25 percent of college graduates scored high enough on the tests to be deemed “proficient” in literacy.
By this measure, literacy does not denote the mere ability to read and write, but comprehension, analysis, assessment, and reflection. While “proficiency” in such attributes ranks above “basic” or “intermediate,” it hardly denotes rocket science. It simply measures such tasks as comparing the viewpoints in two contrasting editorials.
The error-ridden response I received about the Scopes Monkey Trial speaks less to the ideological clash of science and faith than to a rather more elemental matter. As students in the 1960s used to say, the issue is not the issue. The issue is the declining ability to learn. The problem we face, in all but the most privileged institutions, is a pronounced and increasing deficiency of student readiness, knowledge, and capacity.
Neither right nor left has yet come to terms with the crisis of literacy and its impact on higher education. The higher education program of liberals revolves around access and diversity, laudable aims that do not speak to intellectual standards. Conservatives, for their part, are prone to wild fantasies about totalitarian leftist domination of the campuses. They cannot imagine a failure even more troubling than indoctrination -- the inability of students to assimilate information at all, whether delivered from a perspective of the left or the right.
It would be facile to blame the universities for the literacy crisis, since it pervades our entire culture at every level. The Education Department’s statistics found a 10-year decline in the ability to read and analyze prose in high school, college, and graduate students alike.
However, the crisis affects the university profoundly, and not only at open-enrollment institutions like the regional campus on which I teach. Under economic pressure from declining government funding and faced with market competition from low-bar institutions, many universities have increasingly felt compelled to take on students whose preparation, despite their possession of a high-school degree, is wholly inadequate. This shores up tuition revenue, but the core project of the higher learning is increasingly threatened by the ubiquitousness of semi-literacy.
How can human thought, sustained for generations through the culture of the book, be preserved in the epoch of television and the computer? How can a university system dedicated to the public trust and now badly eroded by market forces carry out its civic and intellectual mission without compromising its integrity?
These questions cry out for answer if we are to stem a tide of semi-literacy that imports nothing less than the erosion of the American mind.
Christopher Phelps is associate professor of history at Ohio State University at Mansfield.
At the small liberal arts college where I teach, we have recently undertaken a wholesale revision of our core liberal arts curriculum. This is the set of requirements -- some specific courses, some chosen from a range of options -- that all students at the college must take before graduation. For professors in the natural sciences, this revision has required a good deal of thought about the content and nature of science courses offered to a non-major audience.
Conventional wisdom -- usually unquestioned -- has it that there are three basic elements that go into making up a good non-majors science course. First, the class should cover a relatively narrow range of topics. The classic "Physics for Poets" survey class, which attempts to cover an entire field in one semester, is almost always a disaster, satisfying neither the students taking it nor those teaching it. It's better to restrict the course to a subset of a given field, and spend more time covering a smaller range of topics.
Second, the topic chosen as the focus of the course should be something relatively modern. Students respond much more positively when they can immediately see the relevance of the material. Ideally, a good non-major science class should deal with either a "hot topic" in current research, or something connected to an ongoing public policy debate. It's much easier to engage the students in a subject if they're likely to read about it in The New York Times.
The third element is perhaps the most important: the course should involve the minimum possible amount of math. Many of the students who are the target audience for these classes are uncomfortable with mathematical reasoning, and react badly when asked to manipulate and interpret equations. This final characteristic is also the main reason why I am profoundly ambivalent about such classes.
Science for non-majors offers an important chance to reach out to students outside the sciences, and try to give them some appreciation for scientific inquiry. This is critically important, as we live in a time when science itself is under political assault from both the left and right. People with political agendas are constantly peddling distorted views of science, from conspiracy theories regarding pharmaceutical companies and drug development, to industry-backed attempts to challenge the scientific findings regarding global climate change, to the well-documented attempts to force religion into science curricula under the guise of "intelligent design." It's more important than ever for our students to be able to understand and critically evaluate competing claims about science.
I worry, however, that our approach to teaching science as a part of a liberal education is undermining the goals we have set for our classes. Despite the effort we put into providing classes that are both relevant and informative, I am troubled by the subtext of these classes. By their very existence, these classes send two damaging messages to students in other disciplines: first, that science is something alien and difficult, the exclusive province of nerds and geeks; and second, that we will happily accommodate their distaste for science and mathematics, by providing them with special classes that minimize the difficult aspects of the subject.
The first of these messages is sadly misguided. Science is more than just a collection of difficult facts to be learned. It's a way of looking at the universe, a systematic approach to studying the world around us, and understanding how things work. As such, it's as fundamental a part of human civilization as anything to be found in art or literature. The skills needed to do science are the same skills needed to excel in most other fields: careful observation, critical thinking, and an ability to support arguments with evidence.
The second subtext, however, is disturbingly accurate. We do make special accommodations for students who are uncomfortable with science, and particularly mathematics. We offer special classes that teach science with a minimum of math, and we offer math classes at a level below what ought to be expected of college students. Admissions officers and student tour guides go out of their way to reassure prospective students that they won't be expected to complete rigorous major-level science classes, but will be provided with options more to their liking.
It's difficult to imagine similar accommodations being made for students uncomfortable with other disciplines. The expectations for student ability in the humanities are much higher than in the sciences. If a student announced that he or she was not comfortable with reading and analyzing literary texts, we would question whether that student belonged in college at all (and rightly so). We take the existence of "Physics for Poets" for granted, but nobody would consider advocating a "Poetry for Physicists" class for science majors who are uncomfortable with reading and analyzing literature.
The disparity in expectations goes well beyond simple literacy. I was absolutely stunned to hear a colleague suggest, to many approving nods, that all first-year students should be required to read The Theory Toolbox. We would never consider asking all entering students to read H. M. Schey's Div, Grad, Curl, and All That: An Informal Text on Vector Calculus, even though the critical theory described in The Theory Toolbox is every bit as much a specialized tool for literary analysis as vector calculus is a specialized tool for scientific analysis. Yet faculty members in the humanities can seriously propose one as essential for all students in all disciplines, while recoiling from the other.
This distaste for and fear of mathematics extends beyond the student body, into the faculty, and our society as a whole. Richard Cohen, writing in The Washington Post in February, dismissed algebra as unimportant and proclaimed his own innumeracy.
"I confess to be one of those people who hate math. I can do my basic arithmetic all right (although not percentages) but I flunked algebra (once), barely passed it the second time -- the only proof I've ever seen of divine intervention -- somehow passed geometry and resolved, with a grateful exhale of breath, that I would never go near math again."
It's a sad commentary on the state of our society that a public intellectual (even a low-level one like Cohen) can write such a paragraph and be confident that it will be met with as many nods of agreement as howls of derision. If a scientist or mathematician were to say "I can handle simple declarative sentences all right (although not transitive verbs)," they could never expect to be taken seriously again. Illiteracy among the general public is viewed as a crisis, but innumeracy is largely ignored, because everybody knows that Math is Hard.
Fundamentally, this problem begins well below the college level, with the sorry state of science and math teaching in our middle schools and high schools. The ultimate solution will need to involve a large-scale reform of math and science teaching, from the early grades all the way through college. As college professors, though, we can begin the process by demanding a little more of our students, and not being quite so quick to accommodate gaps in their knowledge of math and science. We should recognize that mathematical and scientific literacy are every bit as important for an educated citizen as knowledge of history and literature, and insist that our students meet high standards in all areas of knowledge.
Of course, the science faculties are not without responsibilities in this situation. Forcing non-science majors to take the same courses as science majors seems like an unappealing prospect in large part because so many introductory science courses are unappealing. If we are to force non-science majors to take introductory science major courses, we will also need to commit to making those courses more acceptable to a broader range of students. One good start is the teaching initiative being promoted by Carl Wieman, a Nobel laureate in physics who is leaving the University of Colorado to pursue educational reforms at the University of British Columbia, but more effort is needed. If we improve the quality of introductory science teaching and push for greater rigor in the science classes offered to non-majors, we should see benefits well outside the sciences, extending to society as a whole.
As academics, we are constantly asked to look below the surface to the implications of our actions. We are told that we need to consider the hidden messages sent by who we hire, what we assign, how we speak to students, and even what we wear. Shouldn't we also consider the hidden message sent by the classes we offer, and what they say about our educational priorities?
Edward Morley is the pseudonym of an assistant professor of physics.
"Sat on his arse and had group presentations teach class the last 5 weeks of qtr." --a "Rate My Professors" comment
Quick now: when did you first hear the phrase, "collaborative learning" (and, if possible, where were you)? I seem to have heard it first only last year, when asked by two faculty members of a local community college about my "position" on collaborative learning during an informal chat about the possibility of teaching there. Of course I immediately replied I was all for collaborative learning, which in fact made me what I am today. Students need to learn from each other, blah, blah.
Later, I asked a man I chanced to know who taught at the same place about collaborative learning. "It's all bullshit," he replied. "Everything is 'student-centered' this and 'student-centered' that. You e-mail them if they're absent, you give them make-ups if they fail. And to teach, just get them in a circle and stay out of the way while they talk to each other about anything except what you ask them to talk about."
Time to do some research -- into my own experience as well as the professional literature. What could something now termed "collaborative learning" in fact have been called a decade ago? The jargon of that period used to have a former colleague and me joking about having to stage a "circle jerk." Was the regnant term instead "student-centered discussion"? Or "interactive competence"? Does it matter? Something, which I will term "collaborative learning" and hereafter abbreviate "C-L," was at that time, as now, the Next Big Thing, either already arrived or about to.
Of course it turns out that C-L has been with us for a longer period of time, under even more various guises. As a pedagogy, we can trace C-L back at least as far as "discovery learning" notions of the 1960s, designed to enable students to acquire knowledge through their own interaction both with various subjects (principally math and the sciences) and with their classmates. As a philosophical orientation, we can take C-L back at least as far as John Dewey. In practical terms though -- and in this context the terms are remorselessly practical -- C-L becomes part of a rich terminological stew, otherwise listed on the institutional menu as "cooperative learning," "collective learning," "peer teaching," "study circles," and so on.
Distinctions among these dishes can of course be made. Nonetheless, we can distinguish crucial common ingredients. These include most importantly a rearrangement of chairs in the classroom, whereby groups of students face each other rather than the professor. Whether or not this rearrangement is thereby deemed a "community," each group of students is expected to be primarily dependent upon itself in order to understand something, ranging from a question on a particular day to the whole sequence of the course throughout the semester.
Sometimes, it works; students among themselves actually discover solutions, ideas or directions that they never could have possessed either so securely or so wholly if they had instead been led by their professor, lecturing. But what doesn't prove to be effective in the classroom, at some times, in some cases? Indeed, the fact that anything can be made or seem so is almost a definition of education. I used to know a guy (in psychology) who regularly had his students fan out on the floor and lie concentrically head-to-head. "Works for me," he used to say.
The goals of C-L, however, are at once more aggressive and more ambitious. C-L purchases its authority against the bad old figure of the Lecturer, whose model of learning is top-down, rather than peer-to-peer (and face-to-face). In a most distinct sense, the purpose of C-L is to substitute a collaborative notion of education for a hierarchical one. That hierarchy is bad goes as unquestioned as the assumption that interaction is good, period.
Is hierarchy bad? Set aside the more literal question of what the professor is supposed to do while his or her students are merrily collaborating. (To join the circle seems a cop-out, and to set up elaborate rules or even individual roles in order to define student interaction appears to smuggle back an authority already left at the front door.) In C-L discourse, who exactly is this professor in the first place?
In one sense, this is easily answered. The professor is a "facilitator" or an "enabler." He or she falls into place as part of the larger cast of characters in the vocabulary of group dynamics, with its formidable list of favored terms, such as "group processing," "teamwork skills," and (my own favorite) "positive interdependence." In a classic account of lecturing available in his Forms of Talk, Erving Goffman states that in a lecture "the subject matter is meant to have its own enduring claims upon the listeners apart from the felicities or infelicities of the presentation." Not so in C-L.
There is no lecturer because there is no subject matter. We see this best perhaps in community colleges, such as the one above, where adjuncts are enjoined to participate in "professional development" sessions on C-L. Not about their respective disciplines. About C-L itself. Indeed, the "content free" nature of C-L as a pedagogy is revealed in these settings as its most compelling feature; not only is there no learning without "interactive competence" -- such competence constitutes all there is to learn, on the part of teachers as well as students.
If the professor were to lecture, he or she would lecture about that -- and this is precisely what he or she does in the regime of C-L, depending upon how the spirit moves to explain to groups, excuse me, communities, how intricately or carefully all has been designed and organized for them. Except that it seems wrong to characterize the speakers as "professors." Professors profess a subject. The subject has been learned through specialized study. Instructors (to choose a more neutral term) instruct a method. Anybody can learn it.
But, I think, we have still not completely answered all questions about C-L until we seek to account for its popularity at the present time. A wide-ranging answer would be to link C-L to the "ideology of excellence" Bill Readings examines in The University in Ruins, whereby the appeal to excellence marks the following fact: "All that the system requires is for activity to take place, and the empty notion of excellence refers to nothing other than the optimal limit-output ratio in matters of information."
"Collaboration," in other words, now functions as at once the definition of activity and the value term sponsoring it. A less wide-ranging answer, though, would simply argue that C-L attained its present popularity at approximately the same time as the widespread use of adjuncts in college teaching. When I e-mailed a friend of mine about his thoughts on C-L, he replied: "Another way to hire fewer teachers and have more students." Exactly. The best way to hire fewer teachers is to hire more adjuncts. The best way for them to teach (especially to students with poor preparation for college) is to have them teach C-L.
Not only do adjuncts not necessarily have to possess specialized -- not to say "terminal" -- knowledge in their respective disciplines. We don't have to worry about them aspiring to become "professors." Just as important, adjuncts by definition lack the job security to be able to resist C-L's claims not so much as a pedagogy as an ideology. We don't have to consider them wondering out loud about, say, whether you can really teach "interpersonal skills," much less whether the imperative to learn them in an ostensibly noncompetitive setting is itself not designed to promote what Readings at one point characterizes as "the condition of the political subject under contemporary capitalism."
No wonder the Rate My Professors student complains. (A political subject who can't complain wouldn't be a political subject.) In its ideological phase, a pedagogy now as powerful as C-L risks becoming available only in terms of its lowest common denominator -- the circle, in which students are left to their own devices. This is not fair to C-L.
But justice, alas, explains little about why things are as they are in higher education, or anywhere else. What explains more?
Let me suggest another word: ignorance. I am thinking of the great American literary critic, R.P. Blackmur. Once he uttered the following objection to the system of Basic English devised by I.A. Richards: "What, should we get rid of our ignorance, the very substance of our lives, merely in order to understand one another?"
The best thing about the bad old lecture method may be simply that it leaves us alone in our ignorance, whether we want to be or not. The worst thing about the bad new collaborative method is that, like cell phones or cable news, it never leaves us alone. Instead, C-L demands that we understand one another as a function of learning anything.
Terry Caesar's last column compared academic conferences -- and other kinds of conferences.
At my university, I chair a faculty committee charged with reviewing and revising our general education curriculum. Over the past two and a half years, we have examined programs at similar colleges and studied best practices nationwide. In response, we have begun to propose a new curriculum that responds to some of the weaknesses in our current program (few shared courses and little curricular oversight), and adds what we believe will be some new strengths (first-year seminars and a junior-level multidisciplinary seminar).
In addition, we are proposing that we dispense with our standard second course in research writing, revise our English 101 into an introduction to academic writing, and institute a writing-across-the-curriculum program. Our intention is to infuse the general education curriculum with additional writing practice and to prompt departments to take more responsibility for teaching the conventions of research and writing in their disciplines. As you might imagine, this change has fostered quite a bit of anxiety (and in some cases, outright outrage) on the part of a few colleagues who believe that if we drop a course in writing, we have dodged our duty to ensure that all students can write clearly and correctly. They claim that their students don’t know how to write as it is, and our proposal will only make matters worse.
I believe most faculty think that when they find an error in grammar or logic or format, it is because their students don’t know “how” to write. When I find significant errors in student writing, I chalk it up to one of three reasons: they don’t care, they don’t know, or they didn’t see it. And I believe that the first and last are the most frequent causes of error. In other words, when push comes to shove, I’ve found that most students really do know how to write -- that is, if we can help them learn to value and care about what they are writing and then help them manage the time they need to compose effectively.
Still, I sympathize with my colleagues who are frustrated with the quality of writing they encounter. I have been teaching first-year writing for many years, and I have directed rhetoric and composition programs at two universities. During this time, I have had many students who demonstrate passive-aggressive behavior when it comes to completing writing projects. The less they can get away with doing, or the later they can turn it in, the better. I have also had students with little interest in writing because they have had no personally satisfying experiences in writing in high school. Then there are those students who fail to give themselves enough time to handle the complex process of planning, drafting, revising, and editing their work.
But let’s not just blame the students. Most college professors would rather complain about poor writing than simply refuse to accept it. Therefore, students rarely experience any significant penalties for their bad behaviors in writing. They may get a low mark on an assignment, but it would be a rare event indeed if a student failed a course for an inadequate writing performance. Just imagine the line at the dean’s door!
This leads me to my modest proposal. First, let me draw a quick analogy between driving and writing. Most drivers are good drivers because the rules of the road are public and shared, they are consistently enforced, and the consequences of bad driving are clear. I believe most students would become better writers if the rules of writing were public and shared, they were consistently enforced, and the consequences of bad writing were made clear.
Therefore, I propose that all institutions of higher learning adopt the following policy. All faculty members are hereby authorized to challenge their students’ writing proficiency. Students who fail to demonstrate the generally accepted minimum standards of proficiency in writing may be issued a “writing ticket” by their instructors. Writing tickets become part of students’ institutional “writing records.” Students may have tickets removed from their writing records by completing requirements identified by their instructors. These requirements may include substantially revising the paper, attending a writing workshop, taking a writing proficiency examination, or registering for a developmental writing course. Students who fail to have tickets removed from their records will receive additional penalties, such as a failing grade for the course, academic probation, or the inability to register for classes.
What would the consequences of such a policy be? First of all, it would mean that we would have to take writing-across-the-curriculum more seriously than most of us do now. We would have to institute placement and assessment procedures to ensure that students receive effective introductory instruction and can demonstrate proficiency in writing at an appropriate level before moving forward.
Professors would also be required to get together, talk seriously and openly, and come to agreements about what they think are “generally accepted minimum standards of proficiency in writing” at various levels, in each discipline, and across the board. We would be required to develop more consistent ways of assigning, responding to, and evaluating writing. We would also have to join with our colleagues in academic support services to recruit, hire, and train effective tutors.
And we would have to issue tickets. Lots of them. But not so many after a while, once students learn the consequences of going too fast, too slow, or in the wrong direction, stopping in the wrong place or failing to stop altogether, forgetting to signal when making a turn, or just ending up in a wreck. Then there is that increasing problem of students who take someone else’s car for a joy ride.
Here’s your badge.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.
The other day, I received an e-mail from a colleague who teaches part-time at my university. She read an earlier piece I had written for Inside Higher Ed on why I thought students wrote poorly in college, and she wanted to talk to me about strategies for improving the quality of her students’ writing. She had just completed grading their final papers for the term, and she was frustrated with the number of grammar and citation errors.
During the week after grades were due, we met in my office, and she asked if I encountered the same kinds of mistakes. She also wondered what students were actually learning in our two-semester sequence of required writing courses. Were her expectations unreasonable? Should she assume students should be able to write correctly and cite secondary sources? As a member of the English and foreign languages department and past director of the writing program, I assured her that her expectations were not unreasonable and that students who had taken research writing at our school had received a general introduction to managing sources.
Then she shared with me her syllabus, which contained a one paragraph description of one of her writing assignments. My experience tells me that one of the main problems students have with successfully completing writing projects is the design of the assignments. I’ve found assignments left in the copier by colleagues, and I’ve cringed at the unnecessary complexity of the tasks described or the insufficient explanations of what must be accomplished by the student.
Many assignments, like the one contained in her one paragraph, jumble what we want students to do and what we want students to present. In other words, many assignments I’ve seen fail to clearly delineate between the kind of thinking students need to perform and the kind of communication students need to present. So instead of adequately providing students the information they need to succeed, faculty often distribute a sloppily designed task that is cognitively difficult, if not impossible, for students to sort out. Here’s an example of what I mean:
Describe your agreement or disagreement to the statement below. I would also expect you to include at least 3 references from the course readings. Your response should be in the form of a clearly written and logically organized paper of no fewer than 1500 words. No works cited page is necessary for this assignment, but use MLA format for citations. If you wish to show me an early draft, send it to me by e-mail no later than 2 days before paper due date. Also use no smaller than 12 point font and be sure to proof for grammar and spellcheck. As I explained in class, underline your thesis statement in your introductory paragraph, and try to come up with an original title for this paper as well.
Garbage In, Garbage Out. And then come the many complaints that students don’t know how to write.
I don’t mean to place all of the blame on faculty -- though some serious reflection on our culpability in these matters would certainly help. However, I did say to my colleague that students often fail to understand the complexity and time-consuming nature of writing, and that instead of just demanding writing projects and assuming students come to us primed and ready to fire away, we need to help them manage their writing projects by providing carefully constructed assignments and a few opportunities to practice writing as a process over the course of the term.
Helping students practice writing as a process has long been taught as a solution to poorly composed papers, yet I don’t think it’s promoted much across the disciplines. But I also told her that there are cultural dimensions to this problem as well. I believe most students equate writing with transcription because the texts they most often encounter are the perfectly polished written products found in books, newspapers, and magazines. Since the hard work of composing those texts is hidden from readers, they believe that good writers think up what they want to say and then copy down their fully-formed thoughts onto the page. Thus, many students think they can’t begin to write until they have decided what they want to say. This, of course, is no news to composition theorists and teachers of rhetoric. But an alternative approach is rarely presented to students.
I did pitch my colleague some strategies for designing assignments and for providing models of what she expected, and I wish her the very best as she rethinks how best to support her students’ writing. Still, we have a cultural battle to fight. So here is another pitch: a new reality TV series called “The American Writer.”
Since contest shows on television have always generated enormous fascination and appeal in our culture, I would like to pitch a basic cable series (A&E, are you listening? PBS? Bravo? Hey, Oprah!) that follows a select group of college students, faculty, and authors as they meet together for a month at a writers’ retreat. The students will have been selected by a jury of college professors and professional writers based upon three writing samples: a short poem, a personal narrative essay, and an opinion piece. The faculty members will be selected from a variety of academic disciplines, and the authors will be selected based upon their abilities to write in more than one genre. At the end of the program, students will be judged on the quality of three new pieces of writing composed at the retreat, and the winner will receive a very generous cash prize.
The series will provide background about each of the students, faculty members, and authors, emphasizing their writing histories, as well as their favorite kinds of reading. The series will also follow these participants as they come to the retreat, reflect upon their selection to participate in the contest, share meals, attend workshops and tutorials, and describe their perceptions of the other participants. But the primary focus of the program will be on the participants’ descriptions of how they go about the act of writing. We will see them planning, drafting, revising, and editing works in progress. And we will sit in on writing workshops and individual tutoring sessions.
This is the basic pitch. Interested agents and producers should contact me for a more developed treatment. (Then there are the spin-offs: “The American Artist” and “The American Actor.”) But more to the point, my proposal is intended to introduce into our most popular cultural medium powerful knowledge all college students should have: an inside view of what really happens when writers struggle with the inescapable difficulties of communicating their ideas and emotions and stories and values through words on the page.
Maybe professors will learn a thing or two along the way as well.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.
Whether or not your college or university offers a course in public speaking probably has escaped your notice. Nevertheless, it might be worthwhile to give the matter a minute or two of consideration. You might find that the availability or unavailability of this course says something about how diligently a college meets its students' needs, and also about how robust are its humanities offerings.
At first glance, public speaking is an unassuming course of study -- not apparently a canary in a coal mine. Taught in many places by grad students with teaching stipends, or by last-minute, part-time hires, public speaking is no glamour queen, and has less prestige than even college composition. Writing in 1970, in Language Is Sermonic, Richard Weaver noted that whereas once intellectual giants, men of subtle reasoning and wit, taught rhetoric, now it is taught by "beginners, part-time teachers, graduate students, faculty wives, and various fringe people...." Being a fringe sort of person myself -- a former administrator, adjunct, and perpetual faculty wife -- I can see his point. But it was not always so.
Up until the beginning of the 20th century, rhetoric was the most important course of study for young men who wanted to get ahead in the world. In Classical Greece, it was the only one. In the agora, if you found yourself a good sophist, you were a made man. So what if being rhetorically trained and well spoken disqualified you from becoming Plato's philosopher-king. Plato was telling a morally edifying fairy tale for a mundus imaginalis, while the sophists were teaching Athenians to communicate effectively with fellow citizens in the real world.
But at top universities, Plato's view of rhetoric has won out, and not simply as a result of a kind of puritanical suspicion of smooth talking. In accounting for rhetoric's fall from grace, Weaver points to the elevation of science as a mode of thought. Rhetoric, with its focus on probability, has been the victim of the irresistible charm and glamour of the scientific method. Weaver also argues that in our relations with other human beings, to appeal only to logic, as science would have us do, is to appeal to only part of a human being. Placing such a limit on intellectual inquiry and communication ignores important complexities. He points out that the rhetorician addresses "historical man," a person experiencing the stream of history, the political and moral exigencies history presents, and the choices those exigencies require.
Literature, of course, does the same thing, but in a more attenuated way. In teaching a person how to communicate with other persons, practical rhetoric inculcates along with appreciation of human complexity those devoutly worshiped "critical thinking skills." A discipline steeped in human complexity and teaching the skills to deal with convoluted layers of human experience would seem to fit very well within the traditional province of the humanities.
Unfortunately, as Weaver has pointed out, appreciating human complexity means exploring human emotion. And this kind of exploration has been a problem for an academy wed to science. So, along with its unusually modest goal of deliberative probability instead of scientific certainty, rhetoric's teaching of emotional appeals along with logical and ethical ones has seriously undermined academic confidence in the discipline. This rejection of emotion in persuasion by the academic top tier is probably priggish and short-sighted. Anyone who uses language to persuade knows that it is impossible to engage others fully in an argument without using emotion. To consider emotional appeals simply matters of superficial style rather than of argumentative substance is to fail to appreciate rhetoric as a fully humane discipline.
Given the humanity and practicality of rhetoric, it is interesting to observe how the discipline has fared vis-à-vis literary studies. The downward trajectory of rhetoric's academic standing is the exact opposite of the fate of its academic cousin -- "literary studies" -- which in rhetoric's heyday was, as Weaver points out, the domain of intellectual plebeians, those faculty wives and other marginal types. Now literature departments are so intellectually lofty that to offer completely non-instrumental instruction is a badge of honor, while to teach something for use in the marketplace, something not solely for the sake of pure, inapplicable knowledge, is to be intellectually despoiled.
It is no wonder, then, that Harvard does not teach public speaking. As Emily Nelson writing in The Harvard Crimson notes, "A quick browse through the Courses of Instruction will yield classes on topics as specific as medieval Welsh literature and the theory of the individual in Chinese literary culture. However, even a thorough search would not reveal the words 'Public Speaking' in any course title."
Granted, the Harvard College Committee on Curricular Review recommended in 2004 that the college's writing program be subjected to review and that those supervising instruction in college majors ensure that "instruction and feedback on written and oral communication [are] an integral part of the concentration program." If the committee reviewing instruction in oral and written communication finds a need for public speaking at Harvard, and if the college then does not ignore the recommendation, Harvard would be nearly alone in the Ivy League in offering a separate course in public speaking to liberal arts students.
Engineering programs, on the other hand, widely offer such courses and require that students demonstrate competence in oral argument. This is the case in the Ivy League and in the top tier of public universities. While such universities as Michigan, Virginia, and Berkeley do not offer courses in rhetoric to their liberal arts majors, their engineering and business students generally are required to take courses in rhetoric -- discrete offerings available only to engineering and business students.
Why are liberal arts students denied this resource? Provosts explain that liberal arts majors receive ample opportunity to hone their skills in oral reasoning through class discussion. However, given class sizes at public universities, and the fact that not every student speaks up in class, this rationale seems overly optimistic.
So, Harvard and Berkeley (which, oddly, has its own department of Rhetoric) do not teach liberal arts majors public speaking skills. On the other hand, rhetoric -- in its most common form, public speaking -- is taught all over the country. You would be hard-pressed to find a land-grant university or community college that did not offer public speaking to its students, who enroll in these courses in large numbers.
Top-tier rejection of rhetorical instruction, especially in the form of public speaking, points to fundamental failures of undergraduate education in general and of the humanities in particular. It is especially curious that, in the face of calls for accountability in regard to student learning, public universities have opted out of providing students with some very useful knowledge, while also failing to recognize the value of the discipline to humane studies.
The Association of American Universities may call for "reinvigorating the humanities," and the joint conference of the American Council of Learned Societies and the AAU may express the intention "to develop a shared agenda for raising the profile of the humanities inside and outside of academia," but criticism of the status quo is stifled by reassuring boilerplate about the "vigor" of the humanities in today's higher education. Case Western Reserve University's then-president, Edward Hundert, announced at the conference that the humanities are in great shape except "when it comes to funding, when it comes to new ways of harnessing information technology for new kinds of research and new collaborative paradigms for that research, and in communicating a more coherent message so that the humanities might gain more visibility, public support, prestige, and funding both within the university and society at large." Perhaps before issuing reports and convening conferences about the status of the humanities, someone should pick up a copy of Aristotle's Rhetoric.
Margaret Gutman Klosko formerly taught public speaking at the University of Virginia and at Piedmont Virginia Community College. She is a freelance writer based in Charlottesville.
One obstacle to reasonable public and scholarly dialogue on the alleged political biases of liberal or leftist professors has been the tendency of David Horowitz, the American Council of Trustees and Alumni, and many of their allies to fall into various versions of the ad populum fallacy, to the effect that there is something wrong with professors because they are out of step with the majority of the American people, who (at least in public institutions) pay their salary through taxes. Thus Larry Mumper, the Republican introducing Horowitz’s “Academic Bill of Rights” in the Ohio legislature, asked in an interview with The Columbus Dispatch, “Why should we, as fairly moderate to conservative legislators, continue to support universities that turn out students who rail against the very policies that their parents voted us in for?” The implication is that professors and their students should tailor their political views to follow the latest public opinion polls or election results.
Politicians like Mumper, along with many media blowhards and members of the public who revile professors, appear to have little more familiarity with the nature of humanistic scholarship than they do with that of brain surgery -- though they would not presume to tell brain surgeons how they should operate, even in a tax-supported hospital. Humanistic scholarship labors under the disadvantage that it addresses public issues on which everyone does and should have an opinion. There is a difference, however, between just any such opinions and those derived from standards of professional accreditation (upwards of 10 years' graduate study for a Ph.D. and seven more for tenure), systematic scholarship, and academic discourse. That discourse is based on the principles of reasoned argument, rules of evidence and research procedures, wide reading and experience, a historical perspective on current events, open-minded pursuit of complex, often-unpopular truths, and openness to diverse viewpoints. (For a fuller, excellent discussion of the differences between popular and academic discourse, see "From Ideology to Inquiry," by Anne Colby and Thomas Ehrlich.) This also means that academic discourse should remain independent of government pressure and public opinion, much as the ideal of a free, independent press demands. That is why taxpayers should be willing to support the autonomy of the academy, within reasonable limits, whether or not it agrees with their personal views.
I have spent 30-some years in conservative communities and state universities, teaching lower-division English argumentative writing and literary history courses that are general education requirements for students in business or technological majors, many of whom would not have chosen to take any such courses and resent them as increasingly costly obstacles to the most direct path to a high-paying job. Most such students are conservative, not in any intellectual sense, but in the sense (which they admit) of fearfully conforming to the political and economic status quo, to the attitudes that will be expected of them as compliant employees, and to the necessity of looking out for number one in the “Survivor” sweepstakes of the global economy. Such students are not likely to welcome the cognitive dissonance forced on them by humanities courses demanding Socratic self-questioning of their sociopolitical or religious dogmas, and they are wont to express their resentment, if not in complaints to Horowitz, in the course evaluations that have been debased into consumer-satisfaction surveys in which the top-ranked teachers provide the fewest demands and the highest grades.
Now, we might expect both liberal and conservative scholars and other intellectuals to agree, at the least, in opposition to all of these forces that are detrimental to humanistic education. Conservative disciples of Plato, Matthew Arnold, Leo Strauss, and Allan Bloom decry the contamination of both elite education and enlightened government by the ignorant masses and “philistine” (in Arnold’s term) commercial interests. Conservative intellectuals from the early formulators of neoconservatism like Irving Kristol and Nathan Glazer to recent figures like spokespersons for the National Association of Scholars, Lynne Cheney (when she ran the National Endowment for the Humanities), and even Horowitz have positioned themselves as champions of high academic standards, the humanistic traditions of Western Civilization, and Arnoldian disinterestedness -- against the alleged debasement of those principles by academic and cultural leftists. Shouldn’t they be equally outspoken against the debasement of higher education by turning it over to public opinion polls, partisan legislation, job training and other service to corporations or professions, and student-consumer popularity contests, as well as by ever-mounting tuition and declining financial aid restricting access to the wealthy and white (except for varsity athletes, of course)?
Contrary to the facile equation, by some conservative and left intellectuals alike, of "the Western humanistic tradition" with political conservatism, we liberal scholars have on our side the central role in that tradition of dissent and resistance to the authority of governments, churches, the wealthy, and majority opinion. We invoke Thomas Jefferson's Enlightenment skepticism in urging his nephew Peter Carr, "Question with boldness even the existence of a God; because, if there is one, He must more approve of the homage of reason than that of blindfolded fear." And we cite Jefferson's model of tax-funded, free, universal public education through the university level, which, if it had been adopted nationally, "would have raised the mass of people to the high ground of moral respectability necessary to their own safety, and to orderly government; and would have completed the great object of qualifying them to select the veritable aristoi, for the trusts of government, to the exclusion of the pseudalists." (That is, the aristocracy of merit over that of wealth and hereditary power.)
We also invoke Ralph Waldo Emerson’s exhortations for scholars and other intellectuals to “defer never to the popular cry,” to stand up against majority opinion, unjust governmental power (specifically on issues of his time like support for slavery and the Mexican-American War), and corporate plutocracy; in “The American Scholar” he speaks of “the disgust which the principles on which business is managed inspire.” We follow Emerson up with his disciple Henry David Thoreau’s “Life Without Principle” (“There is nothing, not even crime, more opposed to poetry, to philosophy, ay to life itself, than this incessant business”), and “Civil Disobedience”: “Why does [government] not cherish its wise minority?.... Why does it not encourage its citizens to be on the alert to point out its faults, and do better than it would have them? Why does it always crucify Christ, and excommunicate Copernicus and Luther, and pronounce Washington and Franklin rebels?”
This conception of liberal education as a minimal counter-force to the political and economic status quo, as well as to majority opinion, is fraught with difficulties and possible abuses, to be sure. Can we, or should we, avoid revealing our own moral or political sympathies in class? Should we, for example, teach Plato, Jefferson, Emerson, and Thoreau (or Frederick Douglass, Rosa Parks, and Martin Luther King) as inspirations for existential moral choices, or simply as subjects of neutral study, perhaps as representatives of a particular viewpoint or “bias,” always to be balanced against sources on “the other side,” including equal time for defenses of slavery and segregation? Moral judgments are of course less disputable in reference to such past conflicts than to present ones like the war in Iraq or affirmative action; neither conservative nor liberal polemicists have provided a clear road map for how teachers should deal with current moral disputes and public opinion about them.
In broader terms, both conservative and liberal educators have long lamented the political illiteracy of the American public in general and college students in particular. However, amid all the mutual recriminations about this and related issues in academic politics, there has been sadly little constructive discussion of the appropriate time, place, and manner for the fostering of civic literacy in either secondary or college education. My impression is that the exhortations of NAS, ACTA, and other conservative educators for core liberal arts curriculum and more requirements in history -- with which I happen to agree -- fall short of outlining a coherent curriculum and pedagogy for critical citizenship. (On the flip side, many liberal advocates of multiculturalism and diversity have failed to delineate what kind of studies American students of all ethnic, gender, and social-class groups need for minimal common knowledge as citizens.) In such a curriculum and pedagogy, students would not merely be indoctrinated into American chauvinism and simplistic “virtues,” as some on the right advocate, but would be encouraged to think critically about competing ideological or moral viewpoints (in party politics, journalistic and entertainment media, as well as scholarly sources) about American and world history, as well as about the present world.
The pedagogical approach that I personally have developed over the years applies Gerald Graff's principle of "teaching the conflicts." I present students up front with the current debates on such issues and disclose my own left-of-liberal viewpoint as exactly that -- one perhaps biased viewpoint among other possible ones, to be understood in relation to opposing ones and studied through the best conservative vs. liberal or leftist research sources that students can find. I leave it up to them to evaluate the opposing arguments, and I grade them on their skill in researching and analyzing sources. I do not claim that mine is a foolproof approach, but most of my students over the years have found it a fair one, and I have heard few alternatives, especially from conservative educators.
There are daunting problems here in persuading the public, politicians, and students to respect academic expertise, autonomy, and the role of higher education as a Socratic gadfly to the body politic. At the same time, scholars have a responsibility to show consideration and discretion toward public opinion, and toward students who dissent from our opinions. But cannot conservative and liberal scholars at least join in endorsing these general principles, while scrupulously addressing the difficulties in implementing them, through civil dialogue? And shouldn’t some of the foundations, professional organizations, or government agencies that have channeled their resources into partisan battles in the culture wars be willing to sponsor a bipartisan task force pursuing such a dialogue in quest of resolutions to these problems?
Donald Lazere is professor emeritus of English at California Polytechnic State University at San Luis Obispo and currently teaches at the University of Tennessee at Knoxville. He is the author of Reading and Writing for Civic Literacy: The Critical Citizen’s Guide to Argumentative Rhetoric (Paradigm Publishers).