At the small liberal arts college where I teach, we have recently undertaken a wholesale revision of our core liberal arts curriculum. This is the set of requirements -- some specific courses, some chosen from a range of options -- that all students at the college must take before graduation. For professors in the natural sciences, this revision has required a good deal of thought about the content and nature of science courses offered to a non-major audience.
Conventional wisdom -- usually unquestioned -- has it that there are three basic elements that go into making up a good non-majors science course. First, the class should cover a relatively narrow range of topics. The classic "Physics for Poets" survey class, which attempts to cover an entire field in one semester, is almost always a disaster, satisfying neither the students taking it nor those teaching it. It's better to restrict the course to a subset of a given field, and spend more time covering a smaller range of topics.
Second, the topic chosen as the focus of the course should be something relatively modern. Students respond much more positively when they can immediately see the relevance of the material. Ideally, a good non-major science class should deal with either a "hot topic" in current research, or something connected to an ongoing public policy debate. It's much easier to engage the students in a subject if they're likely to read about it in The New York Times.
The third element is perhaps the most important: the course should involve the minimum possible amount of math. Many of the students who are the target audience for these classes are uncomfortable with mathematical reasoning, and react badly when asked to manipulate and interpret equations. This final characteristic is also the main reason why I am profoundly ambivalent about such classes.
Science for non-majors offers an important chance to reach out to students outside the sciences, and try to give them some appreciation for scientific inquiry. This is critically important, as we live in a time when science itself is under political assault from both the left and right. People with political agendas are constantly peddling distorted views of science, from conspiracy theories regarding pharmaceutical companies and drug development, to industry-backed attempts to challenge the scientific findings regarding global climate change, to the well-documented attempts to force religion into science curricula under the guise of "intelligent design." It's more important than ever for our students to be able to understand and critically evaluate competing claims about science.
I worry, however, that our approach to teaching science as a part of a liberal education is undermining the goals we have set for our classes. Despite the effort we put into providing classes that are both relevant and informative, I am troubled by the subtext of these classes. By their very existence, these classes send two damaging messages to students in other disciplines: first, that science is something alien and difficult, the exclusive province of nerds and geeks; and second, that we will happily accommodate their distaste for science and mathematics, by providing them with special classes that minimize the difficult aspects of the subject.
The first of these messages is sadly misguided. Science is more than just a collection of difficult facts to be learned. It's a way of looking at the universe, a systematic approach to studying the world around us, and understanding how things work. As such, it's as fundamental a part of human civilization as anything to be found in art or literature. The skills needed to do science are the same skills needed to excel in most other fields: careful observation, critical thinking, and an ability to support arguments with evidence.
The second subtext, however, is disturbingly accurate. We do make special accommodations for students who are uncomfortable with science, and particularly mathematics. We offer special classes that teach science with a minimum of math, and we offer math classes at a level below what ought to be expected of college students. Admissions officers and student tour guides go out of their way to reassure prospective students that they won't be expected to complete rigorous major-level science classes, but will be provided with options more to their liking.
It's difficult to imagine similar accommodations being made for students uncomfortable with other disciplines. The expectations for student ability in the humanities are much higher than in the sciences. If a student announced that he or she was not comfortable with reading and analyzing literary texts, we would question whether that student belonged in college at all (and rightly so). We take the existence of "Physics for Poets" for granted, but nobody would consider advocating a "Poetry for Physicists" class for science majors who are uncomfortable with reading and analyzing literature.
The disparity in expectations goes well beyond simple literacy. I was absolutely stunned to hear a colleague suggest, to many approving nods, that all first-year students should be required to read The Theory Toolbox. We would never consider asking all entering students to read H. M. Schey's Div, Grad, Curl, and All That: An Informal Text on Vector Calculus, even though the critical theory described in The Theory Toolbox is every bit as much a specialized tool for literary analysis as vector calculus is a specialized tool for scientific analysis. Yet faculty members in the humanities can seriously propose one as essential for all students in all disciplines, while recoiling from the other.
This distaste for and fear of mathematics extends beyond the student body, into the faculty, and our society as a whole. Richard Cohen of The Washington Post wrote a column in February in which he dismissed algebra as unimportant, and proclaimed his own innumeracy.
"I confess to be one of those people who hate math. I can do my basic arithmetic all right (although not percentages) but I flunked algebra (once), barely passed it the second time -- the only proof I've ever seen of divine intervention -- somehow passed geometry and resolved, with a grateful exhale of breath, that I would never go near math again."
It's a sad commentary on the state of our society that a public intellectual (even a low-level one like Cohen) can write such a paragraph and be confident that it will be met with as many nods of agreement as howls of derision. If a scientist or mathematician were to say "I can handle simple declarative sentences all right (although not transitive verbs)," they could never expect to be taken seriously again. Illiteracy among the general public is viewed as a crisis, but innumeracy is largely ignored, because everybody knows that Math is Hard.
Fundamentally, this problem begins well below the college level, with the sorry state of science and math teaching in our middle schools and high schools. The ultimate solution will need to involve a large-scale reform of math and science teaching, from the early grades all the way through college. As college professors, though, we can begin the process by demanding a little more of our students, and not being quite so quick to accommodate gaps in their knowledge of math and science. We should recognize that mathematical and scientific literacy are every bit as important for an educated citizen as knowledge of history and literature, and insist that our students meet high standards in all areas of knowledge.
Of course, the science faculties are not without responsibilities in this situation. Forcing non-science majors to take the same courses as science majors seems like an unappealing prospect in large part because so many introductory science courses are unappealing. If we are to force non-science majors to take introductory science major courses, we will also need to commit to making those courses more acceptable to a broader range of students. One good start is the teaching initiative being promoted by Carl Wieman, a Nobel laureate in physics who is leaving the University of Colorado to pursue educational reforms at the University of British Columbia, but more effort is needed. If we improve the quality of introductory science teaching and push for greater rigor in the science classes offered to non-majors, we should see benefits well outside the sciences, extending to society as a whole.
As academics, we are constantly asked to look below the surface to the implications of our actions. We are told that we need to consider the hidden messages sent by who we hire, what we assign, how we speak to students, and even what we wear. Shouldn't we also consider the hidden message sent by the classes we offer, and what they say about our educational priorities?
Edward Morley is the pseudonym of an assistant professor of physics.
"Sat on his arse and had group presentations teach class the last 5 weeks of qtr." --a "Rate My Professors" comment
Quick now: when did you first hear the phrase, "collaborative learning" (and, if possible, where were you)? I seem to have heard it first only last year, when asked by two faculty members of a local community college about my "position" on collaborative learning during an informal chat about the possibility of teaching there. Of course I immediately replied I was all for collaborative learning, which in fact made me what I am today. Students need to learn from each other, blah, blah.
Later, I asked a man I chanced to know who taught at the same place about collaborative learning. "It's all bullshit," he replied. "Everything is 'student-centered' this and 'student-centered' that. You e-mail them if they're absent, you give them make-ups if they fail. And to teach, just get them in a circle and stay out of the way while they talk to each other about anything except what you ask them to talk about."
Time to do some research -- into my own experience as well as the professional literature. What could something now termed "collaborative learning" in fact have been called a decade ago? The jargon of that period used to have a former colleague and me joking about having to stage a "circle jerk." Was the regnant term instead "student-centered discussion"? Or "interactive competence"? Does it matter? Something, which I will term "collaborative learning" and hereafter abbreviate "C-L," was at that time, as now, the Next Big Thing, either already arrived or about to.
Of course it turns out that C-L has been with us for a longer period of time, under even more various guises. As a pedagogy, we can trace C-L back at least as far as "discovery learning" notions of the 1960s, designed to enable students to acquire knowledge through their own interaction both with various subjects (principally math and the sciences) and with their classmates. As a philosophical orientation, we can take C-L back at least as far as John Dewey. In practical terms though -- and in this context the terms are remorselessly practical -- C-L becomes part of a rich terminological stew, otherwise listed on the institutional menu as "cooperative learning," "collective learning," "peer teaching," "study circles," and so on.
Distinctions among these dishes can of course be made. Nonetheless, we can distinguish crucial common ingredients. These include most importantly a rearrangement of chairs in the classroom, whereby groups of students face each other rather than the professor. Whether or not this rearrangement is thereby deemed a "community," each group of students is expected to be primarily dependent upon itself in order to understand something, ranging from a question on a particular day to the whole sequence of the course throughout the semester.
Sometimes, it works; students among themselves actually discover solutions, ideas or directions that they never could have possessed either so securely or so wholly if they had instead been led by their professor, lecturing. But what doesn't prove to be effective in the classroom, at some times, in some cases? Indeed, the fact that anything can be made or seem so is almost a definition of education. I used to know a guy (in psychology) who regularly had his students fan out on the floor and lie concentrically head-to-head. "Works for me," he used to say.
The goals of C-L, however, are at once more aggressive and more ambitious. C-L purchases its authority against the bad old figure of the Lecturer, whose model of learning is top-down, rather than peer-to-peer (and face-to-face). In a most distinct sense, the purpose of C-L is to substitute a collaborative notion of education for a hierarchical one. That hierarchy is bad goes as unquestioned as the assumption that interaction is good, period.
Is hierarchy bad? Set aside the more literal question of what the professor is supposed to do while his or her students are merrily collaborating. (To join the circle seems a cop-out, and to set up elaborate rules or even individual roles in order to define student interaction appears to smuggle back an authority already left at the front door.) In C-L discourse, who exactly is this professor in the first place?
In one sense, this is easily answered. The professor is a "facilitator" or an "enabler." He or she falls into place as part of the larger cast of characters in the vocabulary of group dynamics, with its formidable list of favored terms, such as "group processing," "teamwork skills," and (my own favorite) "positive interdependence." In a classic account of lecturing available in his Forms of Talk, Erving Goffman states that in a lecture "the subject matter is meant to have its own enduring claims upon the listeners apart from the felicities or infelicities of the presentation." Not so in C-L.
There is no lecturer because there is no subject matter. We see this best perhaps in community colleges, such as the one above, where adjuncts are enjoined to participate in "professional development" sessions on C-L. Not about their respective disciplines. About C-L itself. Indeed, the "content free" nature of C-L as a pedagogy is revealed in these settings as its most compelling feature; not only is there no learning without "interactive competence" -- such competence constitutes all there is to learn, on the part of teachers as well as students.
If the professor were to lecture, he or she would lecture about that -- and this is precisely what he or she does in the regime of C-L, depending upon how the spirit moves to explain to groups, excuse me, communities, how intricately or carefully all has been designed and organized for them. Except that it seems wrong to characterize the speakers as "professors." Professors profess a subject. The subject has been learned through specialized study. Instructors (to choose a more neutral term) instruct a method. Anybody can learn it.
But, I think, we have still not completely answered all questions about C-L until we seek to account for its popularity at the present time. A wide-ranging answer would be to link C-L to the "ideology of excellence" Bill Readings examines in The University in Ruins, whereby the appeal to excellence marks the following fact: "All that the system requires is for activity to take place, and the empty notion of excellence refers to nothing other than the optimal limit-output ratio in matters of information."
"Collaboration," in other words, now functions as at once the definition of activity as well as the value term sponsoring it. A less wide-ranging answer, though, would simply argue that C-L attained its present popularity at approximately the same time as the widespread use of adjuncts in college teaching. When I e-mailed a friend of mine about his thoughts on C-L, he replied: "Another way to hire fewer teachers and have more students." Exactly. The best way to hire fewer teachers is to hire more adjuncts. The best way for them to teach (especially to students with poor preparation for college) is to have them teach C-L.
Not only do adjuncts not necessarily have to possess specialized -- not to say "terminal" -- knowledge in their respective disciplines. We don't have to worry about them aspiring to become "professors." Just as important, adjuncts by definition lack the job security to be able to resist C-L's claims not so much as a pedagogy as an ideology. We don't have to consider them wondering out loud about, say, whether you can really teach "interpersonal skills," much less whether the imperative to learn them in an ostensibly noncompetitive setting is itself not designed to promote what Readings at one point characterizes as "the condition of the political subject under contemporary capitalism."
No wonder the Rate My Professors student complains. (A political subject who can't complain wouldn't be a political subject.) In its ideological phase, a pedagogy now as powerful as C-L risks becoming available only in terms of its lowest common denominator -- the circle, in which students are left to their own devices. This is not fair to C-L.
But justice, alas, explains little about why things are as they are in higher education, or anywhere else. What explains more?
Let me suggest another word: ignorance. I am thinking of the great American literary critic, R.P. Blackmur. Once he uttered the following objection to the system of Basic English devised by I.A. Richards: "What, should we get rid of our ignorance, the very substance of our lives, merely in order to understand one another?"
The best thing about the bad old lecture method may be simply that it leaves us alone in our ignorance, whether we want to be or not. The worst thing about the bad new collaborative method is that, like cell phones or cable news, it never leaves us alone. Instead, C-L demands that we understand one another as a function of learning anything.
Terry Caesar's last column compared academic conferences -- and other kinds of conferences.
At my university, I chair a faculty committee charged with reviewing and revising our general education curriculum. Over the past two and a half years, we have examined programs at similar colleges and studied best practices nationwide. In response, we have begun to propose a new curriculum that responds to some of the weaknesses in our current program (few shared courses and little curricular oversight), and adds what we believe will be some new strengths (first-year seminars and a junior-level multidisciplinary seminar).
In addition, we are proposing that we dispense with our standard second course in research writing, revise our English 101 into an introduction to academic writing, and institute a writing-across-the-curriculum program. Our intention is to infuse the general education curriculum with additional writing practice and to prompt departments to take more responsibility for teaching the conventions of research and writing in their disciplines. As you might imagine, this change has fostered quite a bit of anxiety (and in some cases, outright outrage) on the part of a few colleagues who believe that if we drop a course in writing, we have dodged our duty to ensure that all students can write clearly and correctly. They claim that their students don’t know how to write as it is, and our proposal will only make matters worse.
I believe most faculty think that when they find an error in grammar or logic or format, it is because their students don’t know “how” to write. When I find significant errors in student writing, I chalk it up to one of three reasons: they don’t care, they don’t know, or they didn’t see it. And I believe that the first and last are the most frequent causes of error. In other words, when push comes to shove, I’ve found that most students really do know how to write -- that is, if we can help them learn to value and care about what they are writing and then help them manage the time they need to compose effectively.
Still, I sympathize with my colleagues who are frustrated with the quality of writing they encounter. I have been teaching first-year writing for many years, and I have directed rhetoric and composition programs at two universities. During this time, I have had many students who demonstrate passive-aggressive behavior when it comes to completing writing projects. The less they can get away with doing, or the later they can turn it in, the better. I have also had students with little interest in writing because they have had no personally satisfying experiences in writing in high school. Then there are those students who fail to give themselves enough time to handle the complex process of planning, drafting, revising, and editing their work.
But let’s not just blame the students. Most college professors would rather complain about poor writing than simply refuse to accept it. Therefore, students rarely experience any significant penalties for their bad behaviors in writing. They may get a low mark on an assignment, but it would be a rare event indeed if a student failed a course for an inadequate writing performance. Just imagine the line at the dean’s door!
This leads me to my modest proposal. First, let me draw a quick analogy between driving and writing. Most drivers are good drivers because the rules of the road are public and shared, they are consistently enforced, and the consequences of bad driving are clear. I believe most students would become better writers if the rules of writing were public and shared, they were consistently enforced, and the consequences of bad writing were made clear.
Therefore, I propose that all institutions of higher learning adopt the following policy. All faculty members are hereby authorized to challenge their students’ writing proficiency. Students who fail to demonstrate the generally accepted minimum standards of proficiency in writing may be issued a “writing ticket” by their instructors. Writing tickets become part of students’ institutional “writing records.” Students may have tickets removed from their writing records by completing requirements identified by their instructors. These requirements may include substantially revising the paper, attending a writing workshop, taking a writing proficiency examination, or registering for a developmental writing course. Students who fail to have tickets removed from their records will receive additional penalties, such as a failing grade for the course, academic probation, or the inability to register for classes.
What would the consequences of such a policy be? First of all, it would mean that we would have to take writing-across-the-curriculum more seriously than most of us do now. We would have to institute placement and assessment procedures to ensure that students receive effective introductory instruction and can demonstrate proficiency in writing at an appropriate level before moving forward.
Professors would also be required to get together, talk seriously and openly, and come to agreements about what they think are “generally accepted minimum standards of proficiency in writing” at various levels, in each discipline, and across the board. We would be required to develop more consistent ways of assigning, responding to, and evaluating writing. We would also have to join with our colleagues in academic support services to recruit, hire, and train effective tutors.
And we would have to issue tickets. Lots of them. But not so many after a while, once students learn the consequences of going too fast, too slow, or in the wrong direction, stopping in the wrong place or failing to stop altogether, forgetting to signal when making a turn, or just ending up in a wreck. Then there is that increasing problem of students who take someone else’s car for a joy ride.
Here’s your badge.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.
The other day, I received an e-mail from a colleague who teaches part-time at my university. She read an earlier piece I had written for Inside Higher Ed on why I thought students wrote poorly in college, and she wanted to talk to me about strategies for improving the quality of her students’ writing. She had just completed grading their final papers for the term, and she was frustrated with the number of grammar and citation errors.
During the week after grades were due, we met in my office, and she asked if I encountered the same kinds of mistakes. She also wondered what students were actually learning in our two-semester sequence of required writing courses. Were her expectations unreasonable? Should she assume students should be able to write correctly and cite secondary sources? As a member of the English and foreign languages department and past director of the writing program, I assured her that her expectations were not unreasonable and that students who had taken research writing at our school had received a general introduction to managing sources.
Then she shared with me her syllabus, which contained a one paragraph description of one of her writing assignments. My experience tells me that one of the main problems students have with successfully completing writing projects is the design of the assignments. I’ve found assignments left in the copier by colleagues, and I’ve cringed at the unnecessary complexity of the tasks described or the insufficient explanations of what must be accomplished by the student.
Many assignments, like the one contained in her one paragraph, jumble what we want students to do and what we want students to present. In other words, many assignments I’ve seen fail to clearly delineate between the kind of thinking students need to perform and the kind of communication students need to present. So instead of adequately providing students the information they need to succeed, faculty often distribute a sloppily designed task that is cognitively difficult, if not impossible, for students to sort out. Here’s an example of what I mean:
Describe your agreement or disagreement to the statement below. I would also expect you to include at least 3 references from the course readings. Your response should be in the form of a clearly written and logically organized paper of no fewer than 1500 words. No works cited page is necessary for this assignment, but use MLA format for citations. If you wish to show me an early draft, send it to me by e-mail no later than 2 days before paper due date. Also use no smaller than 12 point font and be sure to proof for grammar and spellcheck. As I explained in class, underline your thesis statement in your introductory paragraph, and try to come up with an original title for this paper as well.
Garbage In, Garbage Out. And then come the many complaints that students don’t know how to write.
I don’t mean to place all of the blame on faculty -- though some serious reflection on our culpability in these matters would certainly help. However, I did say to my colleague that students often fail to understand the complexity and time-consuming nature of writing. Instead of just demanding writing projects and assuming students come to us primed and ready to fire away, we need to help them manage their writing projects by providing carefully constructed assignments and a few opportunities to practice writing as a process over the course of the term.
Helping students practice writing as a process has long been taught as a solution to poorly composed papers, yet I don’t think it’s promoted much across the disciplines. But I also told her that there are cultural dimensions to this problem as well. I believe most students equate writing with transcription because the texts they most often encounter are the perfectly polished written products found in books, newspapers, and magazines. Since the hard work of composing those texts is hidden from readers, they believe that good writers think up what they want to say and then copy down their fully-formed thoughts onto the page. Thus, many students think they can’t begin to write until they have decided what they want to say. This, of course, is no news to composition theorists and teachers of rhetoric. But an alternative approach is rarely presented to students.
I did pitch my colleague some strategies for designing assignments and for providing models of what she expected, and I wish her the very best as she rethinks how to best support her students’ writing. Still, we have a cultural battle to fight. So here is another pitch: a new reality TV series called “The American Writer.”
Since contest shows on television have always generated enormous fascination and appeal in our culture, I would like to pitch a basic cable series (A&E, are you listening? PBS? Bravo? Hey, Oprah!) that follows a select group of college students, faculty, and authors as they meet together for a month at a writers’ retreat. The students will have been selected by a jury of college professors and professional writers based upon three writing samples: a short poem, a personal narrative essay, and an opinion piece. The faculty members will be selected from a variety of academic disciplines, and the authors will be selected based upon their abilities to write in more than one genre. At the end of the program, students will be judged on the quality of three new pieces of writing composed at the retreat, and the winner will receive a very generous cash prize.
The series will provide background about each of the students, faculty members, and authors, emphasizing their writing histories, as well as their favorite kinds of reading. The series will also follow these participants as they come to the retreat, reflect upon their selection to participate in the contest, share meals, attend workshops and tutorials, and describe their perceptions of the other participants. But the primary focus of the program will be on the participants’ descriptions of how they go about the act of writing. We will see them planning, drafting, revising, and editing works in progress. And we will sit in on writing workshops and individual tutoring sessions.
This is the basic pitch. Interested agents and producers should contact me for a more developed treatment. (Then there are the spin-offs: “The American Artist” and “The American Actor.”) But more to the point, my proposal is intended to introduce into our most popular cultural medium powerful knowledge all college students should have: an inside view of what really happens when writers struggle with the inescapable difficulties of communicating their ideas and emotions and stories and values through words on the page.
Maybe professors will learn a thing or two along the way as well.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.
Whether your college or university offers a course in public speaking has probably escaped your notice. Nevertheless, it might be worthwhile to give the matter a minute or two of consideration. You might find that the availability or unavailability of this course says something about how diligently a college meets its students' needs, and also about how robust its humanities offerings are.
At first glance, public speaking is an unassuming course of study -- not apparently a canary in a coal mine. Taught in many places by grad students with teaching stipends, or by last-minute, part-time hires, public speaking is no glamour queen, and has less prestige than even college composition. Writing in 1970, in Language Is Sermonic, Richard Weaver noted that whereas once intellectual giants, men of subtle reasoning and wit, taught rhetoric, now it is taught by "beginners, part-time teachers, graduate students, faculty wives, and various fringe people...." Being a fringe sort of person myself -- a former administrator, adjunct, and perpetual faculty wife -- I can see his point. But it was not always so.
Up until the beginning of the 20th century, rhetoric was the most important course of study for young men who wanted to get ahead in the world. In Classical Greece, it was the only one. In the agora, if you found yourself a good sophist, you were a made man. So what if being rhetorically trained and well spoken disqualified you from becoming Plato's philosopher-king. Plato was telling a morally edifying fairy tale for a mundus imaginalis, while the sophists were teaching Athenians to communicate effectively with fellow citizens in the real world.
But at top universities, Plato's view of rhetoric has won out, and not simply as a result of a kind of puritanical suspicion of smooth talking. In accounting for rhetoric's fall from grace, Weaver argues that the elevation of science as a mode of thought has been significant. It would seem that rhetoric, with its focus on probability, has been the victim of the irresistible charm and glamour of the scientific method. Weaver also argues that in our relations with other human beings, to appeal only to logic, as science would have us do, is to appeal to only part of a human being. Placing such a limit on intellectual inquiry and communication ignores important complexities. He points out that the rhetorician addresses "historical man," a person experiencing the stream of history, the political and moral exigencies history presents, and the choices these exigencies require.
Literature, of course, does the same thing, but in a more attenuated way. In teaching a person how to communicate with other persons, practical rhetoric inculcates along with appreciation of human complexity those devoutly worshiped "critical thinking skills." A discipline steeped in human complexity and teaching the skills to deal with convoluted layers of human experience would seem to fit very well within the traditional province of the humanities.
Unfortunately, as Weaver has pointed out, appreciating human complexity means exploring human emotion. And this kind of exploration has been a problem for an academy wed to science. So, along with its unusually modest goal of deliberative probability instead of scientific certainty, rhetoric's teaching of emotional appeals along with logical and ethical ones has seriously undermined academic confidence in the discipline. This rejection of emotion in persuasion by the academic top-tier is probably priggish and short-sighted. Anyone who uses language to persuade knows that it is impossible to fully engage others in an argument without using emotion. Considering emotional appeals to be simply matters of superficial style rather than of argumentative substance is to fail to appreciate rhetoric as a fully humane discipline.
Given the humanity and practicality of rhetoric, it is interesting to observe how the discipline has fared vis-à-vis literary studies. The downward trajectory of rhetoric's academic standing is the exact opposite of the fate of its academic cousin -- "literary studies" -- which in rhetoric's heyday was, as Weaver points out, the domain of intellectual plebeians, those faculty wives and other marginal types. Now literature departments are so intellectually lofty that to offer completely non-instrumental instruction is a badge of honor, while to teach something for use in the marketplace, something not solely for the sake of pure, inapplicable knowledge, is to be intellectually despoiled.
It is no wonder, then, that Harvard does not teach public speaking. As Emily Nelson writing in The Harvard Crimson notes, "A quick browse through the Courses of Instruction will yield classes on topics as specific as medieval Welsh literature and the theory of the individual in Chinese literary culture. However, even a thorough search would not reveal the words 'Public Speaking' in any course title."
Granted, the Harvard College Committee on Curricular Review recommended in 2004 that the college's writing program be subjected to review and that those supervising instruction in college majors ensure that "instruction and feedback on written and oral communication [are] an integral part of the concentration program." If the committee reviewing instruction in oral and written communication finds a need for public speaking at Harvard, and if the college then does not ignore the recommendation, Harvard would be lonely in the Ivy League in offering a separate course in public speaking to liberal arts students.
Engineering programs, on the other hand, widely offer such instruction and require that students demonstrate competence in oral argument. This is the case in the Ivy League and in the top tier of public universities. While such universities as Michigan, Virginia, and Berkeley do not offer courses in rhetoric to their liberal arts majors, their engineering and business students generally are required to take courses in rhetoric -- discrete offerings available only to engineering and business students.
Why are liberal arts students denied this resource? Provosts explain that liberal arts majors receive ample opportunity to hone skills in oral reasoning by means of class discussion. However, given class sizes at public universities, and the fact that not all students speak out in class, this rationale seems overly optimistic.
So, Harvard and Berkeley (which, oddly, has its own Department of Rhetoric) do not teach liberal arts majors public speaking skills. On the other hand, rhetoric -- in its most common form, public speaking -- is taught all over the country. You would be hard-pressed to find a land grant university or community college that did not offer public speaking to its students, who enroll in these courses in large numbers.
Top-tier rejection of rhetorical instruction, especially in the form of public speaking, seems to be about fundamental failures of undergraduate education in general and about failures of the humanities in particular. It is especially curious that, in the face of calls for accountability in regard to student learning, public universities have opted out of providing students with some very useful knowledge, while also failing to recognize the value of the discipline to humane studies.
The Association of American Universities may call for "reinvigorating the humanities," and the joint conference of the American Council of Learned Societies and the AAU may express the intention "to develop a shared agenda for raising the profile of the humanities inside and outside of academia," but criticism of the status quo is stifled by reassuring boilerplate about the "vigor" of the humanities in today's higher education. Case Western Reserve University's then-president, Edward Hundert, announced at the conference that the humanities are in great shape except "when it comes to funding, when it comes to new ways of harnessing information technology for new kinds of research and new collaborative paradigms for that research, and in communicating a more coherent message so that the humanities might gain more visibility, public support, prestige, and funding both within the university and society at large." Perhaps before issuing reports and convening conferences about the status of the humanities, someone should pick up a copy of Aristotle's Rhetoric.
Margaret Gutman Klosko formerly taught public speaking at the University of Virginia and at Piedmont Virginia Community College. She is a freelance writer based in Charlottesville.
One obstacle to reasonable public and scholarly dialogue on the alleged political biases of liberal or leftist professors has been the tendency of David Horowitz, the American Council of Trustees and Alumni, and many of their allies to fall into various versions of the ad populum fallacy, to the effect that there is something wrong with professors because they are out of step with the majority of the American people, who (at least in public institutions) pay their salary through taxes. Thus Larry Mumper, the Republican introducing Horowitz’s “Academic Bill of Rights” in the Ohio legislature, asked in an interview with The Columbus Dispatch, “Why should we, as fairly moderate to conservative legislators, continue to support universities that turn out students who rail against the very policies that their parents voted us in for?” The implication is that professors and their students should tailor their political views to follow the latest public opinion polls or election results.
Politicians like Mumper, along with many media blowhards and members of the public who revile professors, appear to have little more familiarity with the nature of humanistic scholarship than they do with that of brain surgery -- though they would not presume to tell brain surgeons how they should operate, even in a tax-supported hospital. The former field is at the disadvantage that it addresses public issues on which everyone does and should have an opinion. There is a difference, however, between just any such opinions and those derived from standards of professional accreditation (upwards of 10 years graduate study for a Ph.D. and 7 more for tenure), systematic scholarship, and academic discourse. That discourse is based on the principles of reasoned argument, rules of evidence and research procedures, wide reading and experience, an historical perspective on current events, open-minded pursuit of complex, often-unpopular truths, and openness to diverse viewpoints. (For a fuller, excellent discussion of the differences between popular and academic discourse, see “From Ideology to Inquiry,” by Anne Colby and Thomas Ehrlich). This also means that academic discourse should stand independent from government pressure and public opinion, in a similar manner to the ideal of a free, independent press. That is why taxpayers should be willing to support the autonomy of the academy, within reasonable limits, whether or not it agrees with their personal views.
I have spent 30-some years in conservative communities and state universities, teaching lower-division English argumentative writing and literary history courses that are general education requirements for students in business or technological majors, many of whom would not have chosen to take any such courses and resent them as increasingly costly obstacles to the most direct path to a high-paying job. Most such students are conservative, not in any intellectual sense, but in the sense (which they admit) of fearfully conforming to the political and economic status quo, to the attitudes that will be expected of them as compliant employees, and to the necessity of looking out for number one in the “Survivor” sweepstakes of the global economy. Such students are not likely to welcome the cognitive dissonance forced on them by humanities courses demanding Socratic self-questioning of their sociopolitical or religious dogmas, and they are wont to express their resentment, if not in complaints to Horowitz, in the course evaluations that have been debased into consumer-satisfaction surveys in which the top-ranked teachers provide the fewest demands and the highest grades.
Now, we might expect both liberal and conservative scholars and other intellectuals to agree, at the least, in opposition to all of these forces that are detrimental to humanistic education. Conservative disciples of Plato, Matthew Arnold, Leo Strauss, and Allan Bloom decry the contamination of both elite education and enlightened government by the ignorant masses and “philistine” (in Arnold’s term) commercial interests. Conservative intellectuals from the early formulators of neoconservatism like Irving Kristol and Nathan Glazer to recent figures like spokespersons for the National Association of Scholars, Lynne Cheney (when she ran the National Endowment for the Humanities), and even Horowitz have positioned themselves as champions of high academic standards, the humanistic traditions of Western Civilization, and Arnoldian disinterestedness -- against the alleged debasement of those principles by academic and cultural leftists. Shouldn’t they be equally outspoken against the debasement of higher education by turning it over to public opinion polls, partisan legislation, job training and other service to corporations or professions, and student-consumer popularity contests, as well as by ever-mounting tuition and declining financial aid restricting access to the wealthy and white (except for varsity athletes, of course)?
To the contrary of the facile equation, by some conservative and left intellectuals alike, of “the Western humanistic tradition,” with political conservatism, we liberal scholars have on our side the central role in that tradition of dissent and resistance to the authority of governments, churches, the wealthy, and majority opinion. We invoke Thomas Jefferson’s Enlightenment skepticism in urging his nephew Peter Carr, “Question with boldness even the existence of a God; because, if there is one, He must more approve of the homage of reason than that of blindfolded fear.” And we cite Jefferson’s model of tax-funded, free, universal public education through the university level, which, if it had been adopted nationally, “would have raised the mass of people to the high ground of moral respectability necessary to their own safety, and to orderly government; and would have completed the great object of qualifying them to select the veritable aristoi, for the trusts of government, to the exclusion of the pseudalists.” (That is, the aristocracy of merit over that of wealth and hereditary power.)
We also invoke Ralph Waldo Emerson’s exhortations for scholars and other intellectuals to “defer never to the popular cry,” to stand up against majority opinion, unjust governmental power (specifically on issues of his time like support for slavery and the Mexican-American War), and corporate plutocracy; in “The American Scholar” he speaks of “the disgust which the principles on which business is managed inspire.” We follow Emerson up with his disciple Henry David Thoreau’s “Life Without Principle” (“There is nothing, not even crime, more opposed to poetry, to philosophy, ay to life itself, than this incessant business”), and “Civil Disobedience”: “Why does [government] not cherish its wise minority?.... Why does it not encourage its citizens to be on the alert to point out its faults, and do better than it would have them? Why does it always crucify Christ, and excommunicate Copernicus and Luther, and pronounce Washington and Franklin rebels?”
This conception of liberal education as a minimal counter-force to the political and economic status quo, as well as to majority opinion, is fraught with difficulties and possible abuses, to be sure. Can we, or should we, avoid revealing our own moral or political sympathies in class? Should we, for example, teach Plato, Jefferson, Emerson, and Thoreau (or Frederick Douglass, Rosa Parks, and Martin Luther King) as inspirations for existential moral choices, or simply as subjects of neutral study, perhaps as representatives of a particular viewpoint or “bias,” always to be balanced against sources on “the other side,” including equal time for defenses of slavery and segregation? Moral judgments are of course less disputable in reference to such past conflicts than to present ones like the war in Iraq or affirmative action; neither conservative nor liberal polemicists have provided a clear road map for how teachers should deal with current moral disputes and public opinion about them.
In broader terms, both conservative and liberal educators have long lamented the political illiteracy of the American public in general and college students in particular. However, amid all the mutual recriminations about this and related issues in academic politics, there has been sadly little constructive discussion of the appropriate time, place, and manner for the fostering of civic literacy in either secondary or college education. My impression is that the exhortations of NAS, ACTA, and other conservative educators for core liberal arts curriculum and more requirements in history -- with which I happen to agree -- fall short of outlining a coherent curriculum and pedagogy for critical citizenship. (On the flip side, many liberal advocates of multiculturalism and diversity have failed to delineate what kind of studies American students of all ethnic, gender, and social-class groups need for minimal common knowledge as citizens.) In such a curriculum and pedagogy, students would not merely be indoctrinated into American chauvinism and simplistic “virtues,” as some on the right advocate, but would be encouraged to think critically about competing ideological or moral viewpoints (in party politics, journalistic and entertainment media, as well as scholarly sources) about American and world history, as well as about the present world.
The pedagogical approach that I personally have developed over the years applies Gerald Graff’s principle of “teaching the conflicts,” in presenting students out front with the current debates on such issues and disclosing my own left-of-liberal viewpoint on them, as exactly that -- one perhaps biased viewpoint among other possible ones, to be understood in relation to opposing ones and studied through the best conservative vs. liberal or leftist research sources that students can find, leaving it up to them to evaluate the opposing arguments, and grading them on their skill in researching and analyzing sources. I do not claim that mine is a foolproof approach, but most of my students have found it a fair one throughout the years, and I have heard few alternatives, especially from conservative educators.
There are daunting problems here in persuading the public, politicians, and students to respect academic expertise, autonomy, and the role of higher education as a Socratic gadfly to the body politic. At the same time, scholars have a responsibility to show consideration and discretion toward public opinion, and toward students who dissent from our opinions. But cannot conservative and liberal scholars at least join in endorsing these general principles, while scrupulously addressing the difficulties in implementing them, through civil dialogue? And shouldn’t some of the foundations, professional organizations, or government agencies that have channeled their resources into partisan battles in the culture wars be willing to sponsor a bipartisan task force pursuing such a dialogue in quest of resolutions to these problems?
Donald Lazere is professor emeritus of English at California Polytechnic University at San Luis Obispo and currently teaches at the University of Tennessee at Knoxville. He is the author of Reading and Writing for Civic Literacy: The Critical Citizen’s Guide to Argumentative Rhetoric (Paradigm Publishers).
As a teacher of writing and literature at Salem State College, I hear a lot of stories. My students, although they may never have ventured more than 20 miles from where they were born, bring hard lessons of endurance to the classroom that seem more profound than any I'd had at their age. For years I've believed that they bring a certain wisdom to the class, a wisdom that doesn't score on the SAT or other standardized tests. The old teaching cliché -- I learn from my students -- feels true, but it is hard to explain. I'm not particularly naïve. I know that life can be difficult. So it is not that my students initiate me into the world of sorrow. It is that they often bring their sorrows, and their struggles, to the material, and when they do, it makes life and literature seem so entwined as to be inseparable.
This past year, for the first time, I taught African American literature: two sections each semester of a yearlong sequence, around 22 students per section. The first semester we began with Phillis Wheatley and ended with the Harlem Renaissance. The second semester we started with Zora Neale Hurston and Richard Wright and ended with Percival Everett's satire, Erasure, published early in the new millennium.
The students in these classes weren't the ones I typically had in my writing classes. About half were white, and the other half were black, Latino, or Asian. They were generally uninterested or inexperienced in reading, simply trying to satisfy the college's literature requirement. One day before spring break I was assigning the class a hundred pages from Toni Morrison's Sula, and one student looked aghast. "We have to read during vacation?" he sputtered. I learned from them the whole year.
In the fall semester, I was teaching W. E. B. Du Bois's The Souls of Black Folk. As classes go, it had been fairly dull. Du Bois's essays didn't have the compelling story line of the slave narratives that we had read earlier in the semester. We had just begun examining Du Bois's idea of "double consciousness." It is a complicated notion that an African American, at least around 1900 when Du Bois was writing, had "no true self-consciousness" because he was "always looking at one's self through the eyes of others ... measuring one's soul by the tape of a world that looks on in amused contempt and pity." In class, I read this definition, paraphrased it, then asked, "Does this make sense to you?"
There was the usual pause after I ask a question and then, from Omar, a large, seemingly lethargic African American, came a soulful, deep-throated "yeah." The word reverberated in the haphazard circle of desks as we registered the depths from which he had spoken. The room's silence after his "yeah" was not the bored silence that had preceded it. The air was charged. Someone had actually meant something he had said. Someone was talking about his own life, even if it was only one word.
I followed up: "So what do you do about this feeling? How do you deal with it?"
Everyone was staring at Omar, but he didn't seem to notice. He looked at me a second, then put his head down and shook it, slowly, as if seeing and thinking were too much for him. "I don't know, man. I don't know."
The rest of the heads in class dropped down, too, and students began reviewing the passage, which was no longer just a bunch of incomprehensible words by some long-dead guy with too many initials.
With every book that we studied after that day, some student would bring up double consciousness, incorporating it smartly into our discussion. Omar had branded the concept into everyone's minds, including mine.
One idea that arises from double consciousness is that, without "true self-consciousness," you risk giving in and accepting society's definitions of yourself, becoming what society tells you that you are. Such a capitulation may be what happens to Bigger Thomas, the protagonist of Richard Wright's Native Son, a novel we read during the second semester. Native Son is a brutal book. Bigger, a poor African American from the Chicago ghetto, shows little regret after he murders two women. His first victim is Mary, the daughter of a wealthy white family for whom Bigger works as a driver. After Bigger carries a drunk, semiconscious Mary up to her room, he accidentally suffocates her with a pillow while trying to keep her quiet so his presence won't be discovered. Realizing what he has done, he hacks up her body and throws it in the furnace. Emboldened rather than horrified, he writes a ransom note to the family and eventually kills his girlfriend, Bessie, whom he drags into the scheme. In the end, he's found out, and, after Chicago is thrown into a hysterical, racist-charged panic, he's caught, brought to trial -- a very long trial that contains a communist lawyer's exhaustive defense of Bigger that is an indictment of capitalism and racism -- and sentenced to death.
Readers, to this day, are not sure what to make of Bigger. Is he to be pitied? Is he a warning? A symbol? A product of American racism?
During the second week of teaching Native Son, I was walking through the college's athletic facility when I heard my name, "Mr. Scrimgeour. Mr. Scrimgeour..."
I turn and it is Keith, an African American from the class. "Hey, I wanted to tell you, I'm sorry."
"Sorry?" He has missed a few classes, but no more than most students. Maybe he hasn't turned in his last response paper.
"Yeah, I'm going to talk in class more." I nod. He looks at me as if I'm not following. "Like Bigger, I don't know.... I don't like it." His white baseball cap casts a shadow over his face so that I can barely see his eyes.
"What don't you like?"
"He's, like," Keith grimaces, as if he isn't sure that he should say what he is about to say. "He's like a stereotype -- he's like what people -- some people -- say about us."
On "us," he points to his chest, takes a step back, and gives a pained half grin, his teeth a bright contrast to his dark, nearly black skin.
"Yeah," I say. "That's understandable. You should bring that up in the next class. We'll see what other people think."
He nods. "And I'm sorry," he says, taking another step back, "It's just that...." He taps his chest again, "I'm shy."
Keith has trouble forming complete sentences when he writes. I don't doubt that my fourth-grade son can write with fewer grammatical errors. Yet he had identified the criticism of Wright's book made by such writers as James Baldwin and David Bradley, whose essays on Native Son we would read after we finished the novel. And he knew something serious was at stake -- his life -- that chest, and what was inside it, that he'd tapped so expressively. Was Bigger what Baldwin identified as the "inverse" of the saccharine Uncle Tom stereotype? Was Wright denying Bigger humanity? And, if so, should we be reading the book?
To begin answering these questions required an understanding of Bigger. For me, such an understanding would come not just from the text, but from my students' own lives.
That Keith apologized for his lack of participation in class is not surprising. My students are generally apologetic. "I'm so ashamed," one student said to me, explaining why she didn't get a phone message I'd left her. "I live in a shelter with my daughter." Many of them feel a sense of guilt for who they are, a sense that whatever went wrong must be their fault. These feelings, while often debilitating, enable my students, even Keith, to understand Bigger, perhaps better than most critics. Keith, who -- at my prompting -- spoke in class about being pulled over by the police, understood the accumulation of guilt that makes you certain that what you are doing, and what you will do, is wrong. Bigger says he knew he was going to murder someone long before he actually does, that it was as if he had already murdered.
Unlike his critics, Richard Wright had an unrelentingly negative upbringing. As he details in his autobiography, Black Boy, Wright was raised in poverty, in the violently racist South, by a family that discouraged books. There was little, if anything, that was sustaining or nurturing. Perhaps a person has to have this sense of worthlessness ground into one's life to conceive of a character like Bigger. Like my students, one must be told often enough that one isn't much, so that it becomes not simply an insult but a seemingly intractable truth.
"I'm sorry," Keith had said. It was something Bigger could never really bring himself to say, and in this sense the Salem State students were much different from Bigger. Their response to society's intimidation isn't Bigger's rebelliousness. Wright documents Bigger's sense of discomfort in most social interactions, particularly when speaking with whites, during which he is rendered virtually mute, stumbling through "yes, sirs" and loathing both himself and the whites while doing so.
Although my students weren't violent, they identified with Bigger's discomfort -- they'd experienced similar, less extreme discomforts talking to teachers, policemen, and other authority figures. As a way into discussing Bigger, I'd asked them to write for a few minutes in class about a time in which they felt uncomfortable and how they had responded to the situation. I joined them in the exercise. Here's what I wrote:
As a teenager, after school, I would go with a few other guys and smoke pot in the parking lot of the local supermarket, then go into the market's foyer and play video games stoned. While I felt uncomfortable about smoking pot in the parking lot, I didn't really do much. I tried to urge the guys I was with to leave the car and go inside and play the video games, but it wouldn't mean the same thing: to just go in and play the games would be childish, uncool, but to do it after smoking pot made it OK -- and once I was in the foyer, it was OK; I wouldn't get in trouble. But mostly I did nothing to stop us. I toked, like everyone else. I got quiet. I didn't really hear the jokes, but forced laughter anyway. I was very attentive to my surroundings -- was that lady walking out with the grocery cart looking at us? Afterward, when we went in and manipulated those electronic pulses of light and laughed at our failures, we weren't just laughing at our failures, we were laughing at what we had gotten away with.
After they had worked in groups, comparing their own experiences to Bigger's, I shared my own writing with the class. Of course, there were smiles, as well as a few looks of astonishment and approbation. I had weighed whether to confess to my "crime," and determined that it might lead to learning, as self-disclosure can sometimes do, and so here I was, hanging my former self out on a laundry line for their inspection.
What came of the discussion was, first of all, how noticeable the differences were between my experience and Bigger's. I was a middle-class white boy who assumed he would be going to college. I believed I had a lot to lose from being caught, while Bigger, trapped in a life of poverty, may not have felt such risks. Also, the discomfort I was feeling was from peer pressure, rather than from the dominant power structure. Indeed, my discomfort arose from the fact that I was breaking the rules, whereas Bigger's arose from trying to follow the rules -- how he was supposed to act around whites.
But there was also a curious similarity between my experience and Bigger's. Playing those video games would have meant something different had we not smoked pot beforehand. The joy of wasting an afternoon dropping quarters into Galaga was about knowing that we had put one over on the authorities; it was about the thrill of getting away with something, of believing, for at least a brief time, that we were immune to society's rules. Like me after I was safely in the supermarket, Bigger, upon seeing that he could get away with killing Mary, felt "a queer sense of power," and believed that he was "living, truly and deeply." In a powerless life, Bigger had finally tasted the possibility of power.
My students know Bigger moderately well. They don't have his violent streak; they don't know his feelings of being an outsider, estranged from family and community despite hanging out with his cronies in the pool hall and being wept over by his mother.
What they understand is his sense of powerlessness. They have never been told that they can be players on the world stage, and, mostly, their lives tell them that they can't, whether it's the boss who (they think) won't give them one night off a semester to go to a poetry reading, or the anonymous authority of the educational bureaucracy that tells them that due to a missed payment, or deadline, they are no longer enrolled. As one student writes in his midterm: "Bigger is an African American man living in a world where who he is and what he does doesn't matter, and in his mind never will."
I went to a talk recently by an elderly man who had worked for the CIA for 30 years, an engineer involved with nuclear submarines who engaged in the cloak-and-dagger of the cold war. The layers of secrecy astonish. How much was going on under the surface! -- the trailing and salvaging of nuclear subs; the alerts in which cities and nations were held over the abyss in the trembling fingers of men as lost as the rest of us, though they generally did not realize it.
During the questions afterward, someone asked about the massive buildup of nuclear arsenals. "Didn't anyone look at these thousands of nuclear warheads we were making and say 'This is crazy?' "
The speaker nodded, his bald freckled head moving slowly. He took a deep breath. "It was crazy, but when you are in the middle of it, it is hard to see. No one said anything."
After the talk, I fell into conversation with the speaker's son, a psychologist in training. I was noting how tremendously distant this world of espionage was from the world of my students, how alien it was. And I said that the stories of near nuclear annihilation frightened me a lot more than they would frighten them. In essence, my students saw their lives like Bigger's: The great world of money and power was uninterested in them and moved in its ways regardless of what they did. Like Bigger, they would never fly the airplanes that he, who had once dreamed of being a pilot, watches passing over the Chicago ghetto.
"It's too bad they feel so disempowered," the son said, and it is. Yet there is something valuable in their psychology, too. It is liberating to let that world -- money and power -- go, to be able to see the outlines of your existence, so that you can begin to observe, and know, and ultimately make an acceptable marriage with your life. Some might say it is the first step to becoming a writer.
After September 11, 2001, a surprising number of students didn't exhibit the depth of horror that I had witnessed others display on television. "I'm sorry if I sound cold," one student said, "but that has nothing to do with me." One of my most talented students even wrote in an essay, "The war has nothing to do with my life. I mean the blood and the death disgusts me, but I'm sorry -- I just don't care."
And then I watched them realize how it did indeed have to do with them. It meant that they lost their jobs at the airport, or they got called up and sent to Afghanistan or Iraq. The world doesn't let you escape that easily. Bigger got the chair.
It has been two months since we finished Native Son. The school year is ending, and I rush to class, a bit late, trying to decide whether to cancel it so that I can have lunch with a job candidate -- we're hiring someone in multicultural literature, and I'm on the search committee. As I make my way over, I feel the tug of obligation -- my students would benefit from a discussion of the ending of Percival Everett's Erasure, even though, or perhaps especially because, almost none of them have read it. Yet it's a fine spring day, a Friday, and they will not be interested in being in class, regardless of what I pull out of my teaching bag of tricks. I weigh the options -- dull class for everyone or the guilt of canceling a class (despite the department chair's suggestion that I cancel it). Before I enter the room, I'm still not quite sure, but I'm leaning toward canceling. I take a deep breath and then breathe out, exhaling my guilt into the tiled hallway.
I open the door; the students are mostly there, sitting in a circle, as usual. Only a few are talking. I walk toward the board, and -- I freeze -- scrawled across it is:
Why are we even here for? You already gave us the final. It's not like you're going to help us answer it.
Looking at it now, I think the underline was a nice touch, but at that moment, for a rage-filled second, I think, "We're going to have class, dammit! Make them suffer." I stand with my back to them, slowing my breath, my options zipping through my mind while sorrow (despair?) and anger bubble in me and pop, pop into the afternoon's clear light.
So much for learning. Were our conversations simply for grades? Was that the real story of this year?
When we discussed Native Son, we talked about how easy it was to transfer feelings of guilt to rage at those who make you feel guilty. Bigger's hatred of whites stems from how they make him feel. He pulls a gun and threatens Mary's boyfriend, Jan, when Jan is trying to help him, because Jan has made him feel he has done wrong. In the book, Wright suggests that white society loathes blacks because they are reminders of the great sin of slavery. Is my rage from guilt -- guilt that we haven't really accomplished much this year, guilt that I was willing to cancel a class because I didn't want to endure 45 minutes of bored faces? Pop ... pop.
I dismiss the class and stroll over to the dining commons to collect my free lunch.
Erasure is a brilliant satire, one that contains an entire novella by the book's protagonist, a frustrated African American writer, Monk Ellison, who has been told one too many times by editors that his writings aren't "black enough." The novel within a novel lifts the plot of Native Son almost completely, and it presents a main character, Van Go Jenkins, as the worst stereotype of African American culture, someone without morals, whose only interests are sex and violence. At one point, Van Go slaps one of his sons around -- he has four children by four different women -- because the mentally handicapped three-year-old spilled juice on Van Go's new shirt.
It's clear that Erasure's narrator, Monk, is appalled by the book he writes, and that he's appalled by Native Son and the attitudes about race and writing the novel has fostered. When we do discuss the book in class, I point to a snippet of dialogue that Monk imagines:
D.W. GRIFFITH: I like your book very much.
RICHARD WRIGHT: Thank you.
"So this is a real question Erasure raises," I say. My pulse quickens. I can sense them listening, waiting. "Is this book right about Richard Wright? Is this book fair to him? To Native Son? Has the creation of Bigger Thomas been a disaster for African Americans? Has it skewered the country's view of race in a harmful way?" I pause, content. Even if no one raises a hand, even if no discussion ensues, -- and certainly some discussion will erupt -- I can see the question worming into their minds, a question that they might even try to answer themselves.
La Sauna, the student who never lets me get away with anything, raises her hand: "What do you think?"
What do I think? I wasn't ready for that. What do I think?
What I think, I realize, has been altered by what they think, and what they have taught me about the book, about the world.
There are no definite answers, but my students had helped identify the questions, and had pointed toward possible replies. After we had finished reading Native Son, I asked the class, "How many of you want Bigger to get away, even after he bashes in Bessie's head?" A good third of the class raised their hands, and, like the class itself, those who wanted this double murderer to escape were a mix of men and women, blacks and whites. There are several ways to interpret this, but I don't think it is a sign of callousness, the residue of playing too much Grand Theft Auto. They wanted Bigger to escape because Wright had gotten into Bigger's consciousness deeply and believably enough that he became real, more than a symbol or a stereotype.
I tell them this, how their response to Bigger has influenced my reading. I don't tell them Gina's story.
Gina was one of the students who read the books. She loved Tea Cake and Sula, was torn between Martin Luther King Jr. and Malcolm X. She even visited me in my office once or twice to seek advice about problems with a roommate, or a professor. An African American student from a rough neighborhood, she ended up leaving the college after the semester ended, unable to afford housing costs.
Sometime in March of that semester, Gina came to my office. She had missed class and wanted to turn in her response paper on Native Son. The class had read the essays by Baldwin and Bradley criticizing the novel, and had been asked to evaluate them. Baldwin, Gina tells me, was difficult, "but he was such a good writer."
Did she agree with Baldwin? I ask. Was Bigger denied humanity by Wright? How does she feel toward Bigger?
"I think he needs help," she says, "but I felt sorry for him. I wanted him to be able to understand his life--" I cut in, offering some teacherish observation about how Bigger shows glimmers of understanding in the last part of the book, but her mind is far ahead of me, just waiting for me to stop. I do.
"The book reminded me of the guy who killed my uncle. You probably saw it -- the trial was all over the TV last week."
I shake my head.
The man and an accomplice had murdered her uncle, a local storeowner, three years earlier, and the previous week had been sentenced to life without parole. The two had been friends of the uncle's family, had played pool with the uncle the night before, planning to rob and kill him the next day.
"When I saw him sitting there, with his head down, looking all sad, I don't know, I felt sorry for him. I wanted to give him a copy of Native Son. I wanted to walk up to him and put it in his lap. It might help him to understand his life.
She looks at me, her brown face just a few shades darker than mine. She's 19. Her hair is pinned back, and some strands float loose. Her eyes are as wide as half dollars, as if she's asking me something. Without thinking, I nod slowly, trying to hold her gaze. On the shelves surrounding us are the papers and books of my profession, the giant horde that will pursue me until I die.
"My family wants him to suffer -- hard. But I want to talk to him. Do you think that's bad? I want to know why he did it, what happened. I wonder how he'd react if he saw me -- what he'd do if I gave him the book."
I imagine Native Son in the man's lap. The glossy, purple, green, and black cover bright against the courtroom's muted wood, the man's trousers. His hand, smooth with youth, holds its spine. His thumb blots out part of the eerie full-lipped face on the front. As the words of the court fall about him, the book rises and falls ever so slightly, as if breathing.
J.D. Scrimgeour coordinates the creative program at Salem State College and is the author of the poetry collection The Last Miles. This essay is part of his new collection, Themes for English B: A Professor's Education In and Out of Class, which is being released today by the University of Georgia Press and is reprinted here with permission.
As an undergraduate at a state university, I read the schedule of classes long before I had to register. I scanned instructors' names first. Next I considered courses, and finally I took the action that would decide my class schedule -- I went to the university bookstore and looked at the textbooks each professor required.
Scanning the stacks, I was overwhelmed by the number of textbooks on the bookshelves. Every two or three books represented a semester's worth of learning. And for 16 weeks, I would be married to that book. I looked at how the textbooks were written, the amount of reading necessary, and the different tools offered to help a student understand a concept. I knew myself. I knew my learning style. And after flipping through a few shelves of textbooks at the university bookstore, I was making choices that would give me a better chance -- not only of passing the courses -- but of actually learning and carrying that knowledge with me into later courses.
When I became an instructor myself, I marveled at the autonomy of the job. To some degree I could make my own hours. As long as I aligned my courses with the course objectives set up by my department, I would receive positive peer evaluations and approval from the administration. At the campuses where I first taught, I chose a textbook from the list provided by the department's textbook committee. At two of those campuses, the department chair told me that textbooks not on the list were often approved by the committee chair quickly enough to be used that semester.
When I moved to teach at a large urban community college, I faced something that looked like too much freedom. For one freshman composition course, I was given a list of 59 textbooks to choose from. At the next level of composition, the list of approved textbooks was 104 titles long. Dazed, I contacted trusted colleagues and skimmed their textbooks.
Finally, I reverted to my old undergraduate habits and visited the college bookstore. This time, however, I was making a bigger decision. I now had to commit to a textbook that would serve three sections of a particular composition class. That meant that 99 students of varying academic abilities would have to live with my decision. And even though I could change the text the next semester if I needed to, there would be 16 long weeks with a book that did not serve our needs as well as it should.
Eventually I would make my choices -- and start the laborious process of ordering desk copies and passing paperwork on to my department chair. It was exhausting, but tremendously rewarding. After all, I was able to choose a text that, for the most part, aligned with my own beliefs. I would be challenged to teach some new material and learn some new teaching techniques with this choice -- and my students would benefit.
In contrast, this week, at the university where I am on contract to teach full-time, my supervisor told a roomful of composition faculty which textbook they will be using for Fall 2007. To stunned silence, he held up three textbooks that he had chosen for what he called a "one-year experiment." One text was to be used for incoming freshmen taking composition; the next semester's instructors would have the choice of one of the remaining two textbooks. Refusing any discussion, he indicated that part of the reason for this change was the administration's edict that freshman students be given a "uniform experience" in our composition courses.
There was not a sound as more than 30 professors left the room. It was not until the next day that I first heard their collective unbridled response. One professor who had worked at this university for over a decade stopped our director in the copy room and said, "So, since you're choosing the textbook, are you going to give us standardized lesson plans, too?" When the director did not respond, the professor made one last attempt to communicate his disappointment: "Hey, why don't you just come in and teach my courses for me?"
"It's just the beginning," another professor told me. "This university has bought into the idea that education is a business." Sighing, he said, "The next step will be classes of a thousand with PowerPoint presentations instead of lessons." At the time I thought he was just being sarcastic and reactionary; yet I later wondered if he was on to something.
"Student as consumer" has become a driving force at many colleges. In the last few decades, a number of provosts, presidents, and chancellors have buckled under pressure from students, local businesspeople, and voting citizens to think of education as a simple equation -- quickly deliver the students information, get money. In some cases, accreditation boards have tried to hold the line; in other cases, they seem to be in collusion with this move toward efficiency at all costs. In any case, the art of teaching has been relegated to a much lower status -- or in some cases, completely disregarded.
Slowly and quietly, the freedoms that made teaching not only enjoyable but effective are being taken away by an administration interested above all in uniformity -- as if education were a drive-through fast-food product. Perhaps they've forgotten that even the drive-through provides choices: a hamburger or cheeseburger, a chicken sandwich, a fish sandwich, chicken fingers, fries, curly fries, a few salad choices, a baked potato -- the list may be too big to fit on one menu board. Yet in something as important as education, some are thinking, "the fewer choices, the better."
I'm not sure if they're really thinking primarily of the students. True -- students would be reading the same material across the board. But perhaps uniformity is simply the answer that draws the least resistance. Perhaps administrators have other motives as well. It would be less work for their secretaries if they only had to order one book. And standardization often makes it easier to assess students' learning. That means increased claims of success -- and a better shot at funding. And, of course, departments would be easier to manage with fewer variables. Whether they are pressed into service or welcome the chance, administrators need to spend time fund raising, informing the public, creating events that will reflect well in press releases, shaking hands at groundbreaking ceremonies, and attending mayoral functions; why not scale back in an area that already causes them concern?
After all, with the massification of education, universities are serving more "consumers." Many feel it is better to get as many students as they can in and out of the educational system quickly and show our culture that we have "produced" the workers we had promised. Yet in the short seven years I've been teaching, every professor I've come in contact with has expressed concerns: 1) that we are stooping to lowered standards; 2) that many students infer that memorizing facts and spilling them back to a proctor is enough; and 3) that certificates and programs are being created not because of student demand, but because of pressure from those who fund the campus. These worries, among others, have made many professors aware that the pressure to provide a quality educational experience falls almost solely to them. And when they have an administration that does not support this goal, it becomes almost impossible to attain.
Last semester, administrators at my university told English composition faculty that they are going to implement a standardized syllabus in the near future; perhaps lesson plans will remain faculty's own -- yet policies previously set by faculty will soon be dictated by the administration. This semester, my department chair dictated the textbooks that faculty will use in 2007. Will the next move be lesson plans created by administrators? Standardized testing? By removing the creativity and style that individual instructors provide, couldn't we, in effect, move to a system simply supervised by proctors? Some administrators will wince at this suggestion; I guarantee a few will actually gaze up at their office ceilings and think about it -- if only for a moment. And even if administrators do not instigate this kind of plan, moving toward overt control (or even elimination) of faculty can be a piecemeal business.
Yet in many disciplines, the use of standardized materials and lesson plans is already problematic. Because the materials were not developed for a particular course or student population, many students feel detached from the material. In liberal arts, especially, faculty must be in place to personalize the learning experience for students -- otherwise students feel as if their input is worthless. Faculty are of infinite value here; taking away their ability to teach well cannot be a recipe for a successful educational experience.
While writing this, I admit that I am feeling reactionary. In a few weeks or months, I may be less angry -- yet in that time, won't these edicts still be in place at my university? Yes. And my ability to teach in the way that I think works best will be curtailed more and more. I will feel less like an instructor and more like the employee of a corporate machine. I am reminded that I spent a decade in Silicon Valley purchasing semiconductors, and another decade in advertising, writing copy for products and services that I didn't care about. In these positions I felt useless -- and at times, degraded.
I moved to higher education because I believed in the service we provided. And I remembered the choices I was allowed to make as an undergraduate -- not only in major, but also in courses within that major. Those courses were represented by a textbook and, most importantly, a faculty "face" that helped me interpret that textbook and often inspired me to go beyond the classroom.
Perhaps I am an anomaly. I did not go to college under pressure from family or even the society that surrounded me. The pressure was from within. For me, college was not a "product" to be bought -- something to ensure my business future with promotions and 50-cent-an-hour raises. It was a challenge that I needed to meet to find out what I was capable of.
And, of course, I was interested in learning. In a way, not much has changed. I am still interested in learning -- not only my students' learning, but my own. And the freedom to choose my own textbook, create my own assignments, and pace my courses is a form of learning. I am constantly evaluating my teaching methods and pressing for improvement. I make notes on my course outline about each lesson: "good, students applied knowledge from last assignment," "met with little discussion -- rewrite or discard," or "connected to writing topic -- keep." I attend conferences in my discipline and read articles and books in and out of my discipline. I talk to colleagues and post to an online discussion board, constantly rooting around to find different ways to teach the objectives that my department has set out for me to achieve. It's my hope that in an age of diminished expectations, campuses will leave the art of teaching to those best suited to perform this challenging and unwieldy task: faculty.
Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.
"I saw a small iridescent sphere of almost unbearable brightness. At first I thought it was spinning; then I realized that the movement was an illusion produced by the dizzying spectacles inside it." --Jorge Luis Borges, "The Aleph"
On December 17, 2005, "Saturday Night Live" ran a skit by Chris Parnell and Andy Samberg called "Lazy Sunday," a rap video about going out on a "lazy Sunday" to see The Chronicles of Narnia and procuring some cupcakes with "bomb frostings" from the Magnolia Bakery in New York City. The rap touches on the logistics of getting to the theater on the Upper West Side: "Let's hit up Yahoo Maps to find the dopest route./ I prefer Mapquest!/ That's a good one too./ Google Maps is the best!/ True that! Double true!/ 68th and Broadway./ Step on it, sucka!"
Parnell and Samberg make it to the Magnolia for their cupcakes, go to a deli for more treats, and hide their junk food in a backpack for smuggling past movie security. They complain about the high movie prices at the box office ("You can call us Aaron Burr from the way we're dropping Hamiltons") and brag about participating in the pre-movie trivia quiz. Doesn't seem like much if you've never seen it, but for pure joie de vivre, and white suburban dorkiness, "Lazy Sunday" just can't be beat. What makes "Lazy Sunday" special, however, is how its original airing coincided with the birth of Internet video-sharing, enabling the two-minute clip to be viewed millions of times on YouTube, a free service that hosts videos posted by users. In fact, the popularity of the clip on YouTube was so great that NBC forced the site to remove it several months later, citing copyright infringement. The prospect of its programming being net-jacked by Internet geeks and magnified through YouTube's powerful interface was just too much for NBC.
I bring up "Lazy Sunday" to foreground my discussion of the pedagogical uses of YouTube because it sums up its spirit and helps us define the genre of video with which YouTube is most associated. Although YouTube is awash in clips from television and film, the sui generis YouTube video is the product of collaborative "lazy Sunday" moments when pals film each other or perform for the camera doing inane things like dancing, lip synching or making bottles of Diet Coke become volcanic after dropping Mentos candies in them.
Parnell and Samberg's references to Internet tools and movie trivia, as well as their parody of rap, perfectly capture a zeitgeist in which all pleasures can be recreated, reinvented and repeated ad nauseam through the magic of the Web. As Sam Anderson describes it in Slate, YouTube is "an incoherent, totally chaotic accretion of amateurism -- pure webcam footage of the collective unconscious." Whatever you're looking for (except porn) can be found in this Borgesian hall of mirrors: videos of puppies, UFO footage, ghosts on film, musical memento mori about recently deceased celebrities, movie and documentary clips, real and faux video diaries, virtuoso guitar picking performances and all kinds of amateur films. In my case, the video that sold me on YouTube was "Where the Hell is Matt Harding Dancing Now?" -- a strangely uplifting video of a guy called Matt Harding who traveled around the world and danced in front of landmarks such as Machu Picchu in Peru, Area 51 in the U.S., the head-shaped monoliths of Easter Island, and the Great Wall of China, among many others.
OK, that's all nice, but what can YouTube do for professors, apart from giving them something to look at during their lunch breaks? Inside Higher Ed has reported on the ways in which YouTube is causing consternation among academics because it is being used by students to stage moments of guerrilla theater in the classroom, record lectures without permission and ridicule their professors. Indeed, a search on YouTube for videos of professors can bring up disquieting clips of faculty behaving strangely in front of their students, like the professor who coolly walks over to a student who answers a ringing cell phone in class, politely asks for the device, and then violently smashes it on the floor before continuing on with his lecture as if nothing had happened. It could be staged (authenticity is more often than not a fiction on YouTube) but it is still disturbing.
But I would like to argue for an altogether different take on YouTube, one centered on the ways in which this medium can enrich the learning experience of college students by providing video realia to accompany their textbooks, in-class documentaries and course lectures. Although I can't speak to the applicability of YouTube to every discipline, in what follows I make a case for how the service can be harnessed by professors in the humanities and social sciences.
As a professor of Latin American literature and culture, I often teach an introductory, third-year course called Latin American Culture and Civilization in which students study history, literature and any other media that the instructor wishes to include in the course, such as music, film, comics and the visual arts. My version of the course emphasizes student engagement with foundational documents and writings that span all periods of Latin American history and that I have annotated for student use. One of the figures we study is President Hugo Chávez of Venezuela, whose outsized political persona has made him a YouTube star. Apart from having my students watch an excerpt of his "Bush as sulfurous devil" speech at the United Nations, I assigned a series of animated cartoons prepared by the Venezuelan state to educate children about the Bolivarian constitution championed by Chávez. These cartoons allow students to see the ways in which the legacy of the 19th-century Venezuelan Liberator, Simón Bolívar, remains alive today.
The textual richness of these cartoons invites students to visually experience Bolivarian nationalism in a way that cannot be otherwise recreated in the classroom. It invites them to think critically about the ways in which icons such as Bolívar are creatively utilized to instill patriotism in children. In a similar vein, a Cuban cartoon about Cuba's founding father, José Martí, depicts how a child is transformed into the future champion of independence and social justice when he witnesses the horrors of slavery (this video has now been removed from YouTube). With regard to the Mexican Revolution, one of the most important units of the class, YouTube offers some fascinating period film of the revolutionary icons Emiliano Zapata and Pancho Villa, and especially their deaths. Although I cannot say that these are visual texts that lend themselves to the kind of rich dialogue provoked by the aforementioned cartoons, they are nonetheless an engaging visual complement to readings, discussions and lectures.
Another course in which YouTube has played a part is my senior-level literature course on the Chilean Nobel Laureate Pablo Neruda. It may seem far-fetched to use Internet video in a poetry class, but in this case, YouTube offers several useful media clips. I have utilized film clips in which Neruda's poetry appears (such as Patch Adams and Truly, Madly, Deeply), as well as music videos of Latin American singers who use lyrics by Neruda. More than anything that I could say in class, these videos illustrate the reach and enduring quality of Neruda's poetry in Latin American and North American culture. This said, there are a surprising number of student-produced videos about Neruda on YouTube that are cringe-worthy, the "Lazy Sunday" versions of the poet and his poetry. These are quite fascinating in and of themselves as instances in which young people use video to interpret and stage Neruda, in ways that might be set into dialogue with more literary and canonical constructions of his legacy, but I confess that I am not yet convinced of their pedagogical value.
In this regard, the case of Neruda is not so different from that of other literary figures, such as Emily Dickinson, Nathaniel Hawthorne and Robert Frost, who are also the subject of interesting home-made YouTube videos. What do we do, for example, with a Claymation film that recreates Frost's "The Road Not Taken"? I would argue that this film is interesting because it captures the banality of a certain canonical image or version of Robert Frost that is associated with self-congratulatory, folksy Hallmark Card moments.
There are all kinds of video with classroom potential on YouTube. Consider, for example, one of YouTube's greatest stars, Geriatric1927, a 79-year-old Englishman whose video diaries document his memories of World War II, as well as of other periods of English history. Then there are the Michel Foucault-Noam Chomsky debates, in which Foucault sketches out, in animated, subtitled conversation, the key arguments of seminal works such as Discipline and Punish. There's an excellent short slide show of period caricatures of Leon Trotsky, newsreels and lectures about the Spanish Civil War, rare footage of Woody Guthrie performing, Malcolm X at the University of Oxford, clips of Chicana activist Dolores Huerta discussing immigration reform and a peculiar musical montage, in reverse, about Che Guevara, beginning with images and reels of his death and ending with footage of him as a child.
Don't let me tell you what you can find; seek and ye shall receive.
YouTube is not necessary for good teaching, in the same way that wheeling a VCR into the classroom is not necessary, or bringing in PowerPoint slide shows with images, or audio recordings. YouTube simply makes more resources available to teachers than ever before, and allows for better classroom management. Rather than use up valuable time in class watching a film or video clips, such media can be assigned to students as homework in the same way that reading is assigned. However, to make it work, faculty should keep in mind that the best way to deliver this content is through a course blog. YouTube provides some simple code that bloggers can use to stream the videos on a blog, so that students need not watch them within the YouTube interface. This can be important because we may not want students to have to deal with advertisements or the obnoxious comments that many YouTube users leave on the more controversial video pages. On my free wordpress.com course blog, I can frame YouTube videos in a way that makes them look more professional and attractive. At this point, course blogging is so easy that even the least technologically minded can learn how to use services like Blogger or WordPress to post syllabi, course notes and Internet media.
There are problems, however, the most glaring of which is the legality of streaming a clip that may infringe on copyright. If I am not responsible for illegally uploading a video of Malcolm X onto the Web, and yet I stream it from my course blog, am I complicit in infringing on someone's copyright? Now that Google has bought YouTube, and a more aggressive purging of copyright-protected works on the service has begun, will content useful for education dwindle over time? I don't have the answers to these urgent questions yet, but even in the worst of cases, we can assume that good, educational material will be made available, legally, on YouTube and other such services in the future, either for free or for a modest fee.
For example, I am confident that soon I will be able to tell my students that, in addition to buying One Hundred Years of Solitude for a class, they will have to purchase a $5 video interview with García Márquez off the Web and watch it at home. And, even as I write this, podcasting technologies are already in place that will allow faculty members to tell their students that most of their lectures will be available for free downloading on iTunes so that class time can be used more productively for interactive learning activities, such as group work and presentations. Unlike more static and limited media, like PowerPoint and the decorative course Web page, video and audio-sharing help professors be more creative and ambitious in the classroom.
In sum, my friends, YouTube is not just for memorializing lazy Sundays when you want to "mack on some cupcakes." It can help your students "mack" on knowledge.
Christopher Conway is associate professor of modern languages and coordinator of the Spanish program at the University of Texas at Arlington, where he teaches Latin American literature and culture.
My undergraduate students can't accurately predict their academic performance or skill levels. Earlier in the semester, a writing assignment on study styles revealed that 14 percent of my undergraduate English composition students considered themselves "overachievers." Not one of those students was receiving an A in my course by midterm. Fifty percent were receiving a C, another third were receiving B's, and the remainder had earned failing grades by midterm. One student wrote, "overachievers like myself began a long time ago." She received a 70 percent on her first paper and a low C at midterm.
A solid 40 percent of my undergraduate English composition students described themselves as "overachieving if they liked the subject." The grades for these students, understandably, were scattered. Twenty-nine percent of my undergraduates described their study styles as "normal." Of these, 36 percent were working at a C level by midterm; another 18 percent were receiving a B, with another 18 percent receiving a D. The remaining 27 percent were failing. One student who described his study style as "normal" confessed that he rarely started assignments when they were first given out, waited until a few days before work was due to get started, and did a lot of his writing over the weekend. At midterm, he was receiving an F.
A whopping 17 percent of my undergraduates confessed to being "underachievers" -- studying at the last minute, not doing the reading, and spending only a few hours on major assignments.
My data -- though tremendously limited in scope -- seem to be supported by Douglas Hacker's findings. In "Test Prediction and Performance in a Classroom Context," an article published in the Journal of Educational Psychology, Hacker and colleagues at the University of Memphis found that overconfidence is especially prevalent among low-performing students. True, Hacker's study was with introductory psychology undergraduates rather than English composition students. But it does give me a great deal of insight into how students predict performance. And although I don't like the idea of considering my students "low-performers," I admit that my state does have a weak high school system, and my university doesn't turn paying students away. Even low-achievers are admitted under a "conditional" admission standard.
I don't think Hacker's findings are unique. Dozens of colleagues have told me that their undergraduates simply do not have the tools to criticize and evaluate their own work -- much less predict how well they will do on assignments. What's behind this great drop in students' ability to assess their own performance?
A colleague of mine believes that primary and secondary schools, overwhelmed with students who were never well prepared for school, students with learning disabilities, addictions, and even severe discipline problems, have found themselves delivering a weakened curriculum. Yet a recent article in American Educator, "Balancing the Educational Agenda," by Jean Johnson et al., indicates that academic standards for secondary schools are rising -- a move supported not only by academics and administrators, but by parents as well. Perhaps this move is recent; those of us in postsecondary positions are, in effect, responding to the academic standards in place a decade ago. Or perhaps regions differ in standards based on student population and the demands of the surrounding community. Another possibility, among others, is that the curriculum shifts whenever administrators attempt to adopt each new trend in education.
Just as an inconsistent curriculum can cause students pain and confusion, the move from high school to college can be a hair-raising leap. High school systems with a weak curriculum (or one that is not consistently applied) can create tremendous problems later in the academic system. At my current university, a large percentage of our undergraduates have brought their high-school expectations with them. Some of them are under the impression that if they now come to their college classes every day, they will pass these courses. Many of these students are stunned when they fail their first major test or receive a D for what they thought was an award-winning essay.
Even when academic advisors warn, "college is not high school," many of these under-prepared students continue to believe that they will receive A's for a token effort. Clear class objectives and strongly worded syllabi are often ignored as students continue to overestimate their capabilities based on past performance. After the first major assessment, many of these students clutch at their professors' arms, lamenting, "But I got A's in high school."
Colleagues often commiserate about this particular student response. After all, it's almost impossible to respond to. Often we can only repeat that our expectations are clearly outlined in the syllabus and course outline, that we would be happy to define these further, and that they may want to drop the course if they cannot afford to dedicate time outside of class for study. One professor friend often tells students that the A's they received in high school are simply a step toward admittance to the local university -- not a guarantee of grades.
Another colleague says that the level of competition has changed from high school to college; until freshmen understand that, they will continue to predict their performance inaccurately. And the vilification of competition has set up many students to believe that they are all doing well -- regardless of outcome. As a friend of mine in teacher education says, "It's the result of the 'feel-good '70s,' when every child was deemed a winner. Competition was considered demoralizing. The result was a continuing trend in the '90s which focused on reward across the board. Today, we have turned out a glut of students who not only can't assess themselves, but who have received awards for every little thing." When they enroll in college, students often still have no idea how they fare when compared with other undergraduates.
A good friend on staff at a university library says that helicopter parenting also contributes to the problem. When he escorts tour groups of grade-school students through his facility for a hands-on learning tour, he often sees parents and grandparents hovering so much that, instead of staying focused on their assignments, the children end up being spectators rather than participants in what should be their chance to "try out" a college experience. The urge to spare children the ego blows of failure, too, often results in parents actually doing homework for their children -- not only in primary and secondary grades, but in college as well. Some parents, perhaps perfectionists, have rationalized that if they "assist" their child, the task will be done in a much shorter time. Unfortunately for these children, their formative years then leave no room for the cycle of effort, failure, increased effort, failure again, and another attempt that finally succeeds. This setup may produce college students who can do only the most superficial work before becoming discouraged.
Another academic friend says that an inability to focus and an overwhelming desire to multi-task make it almost impossible for students to succeed academically. Staff who manage study rooms and carrels often report that students seem to work "in dribs and drabs" while in the library. Backpacks in hand, they often loiter at computers and chat at tables instead of actually working. Dependent on high-tech gadgets, these same students often feel compelled to answer phones while in study groups, and constantly check e-mail or view sites such as Facebook or MySpace during hours they had dedicated to working on assignments or doing research.
One reference desk librarian reported that she would see students "studying for four minutes, goofing off for a half an hour, and then studying for another four minutes." Of course, these students often report to faculty that they've been studying for hours -- which in some ways must seem like an accurate appraisal. After all, they were in the library; therefore, they must have been studying. In the end, a diminished attention span combined with the feeling that doing one thing at a time is a waste of time almost guarantees that they will not be turning in top A-level work to their professors.
As a study, this narrative is admittedly incomplete. I'm sure that sociologists, education specialists and other experts have outlined a long history and a number of interrelated causes behind this decline in students' ability to assess their own performance.
As an instructor of undergraduate core classes, however, I realize that my responsibility does not stop at content. I cannot simply list assessment as a course objective and then feign ignorance when my students show me again and again that they cannot predict their own performance. Strategies -- not only for instruction, but also for exercises and assessment -- are integral in setting my students on the right path for the remainder of their college careers. To accomplish this, I realize that I will need to work much, much harder to help my undergraduates understand assignments and expectations, rubrics and assessments, in-class grades and the prediction of success.
Some of this is already in place. Like many English composition instructors, I include a peer-editing component in my writing courses -- not only to help students view writing as a process, but also to give them some tools and much-needed experience in evaluating student work. I provide instruction in how to apply rubrics to student work and often use past student work as "models." Some students are glad for the transparency of my courses; with a detailed 16-week course outline given out at the first class, they can start relating course objectives to specific assignments throughout the semester. Lessons scaffold one on another; assessment follows thorough instruction. Still, there is much to be done. It's clear that I need to develop more tools to help my students learn to assess their own work and predict their academic performance more accurately.
Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.