Over the last generation, most colleges and universities have experienced considerable grade inflation. Much lamented by traditionalists and explained away or minimized by more permissive faculty, the phenomenon presents itself both as an increase in students’ grade point averages at graduation and as an increase in high grades and a decrease in low grades recorded for individual courses. More prevalent in humanities and social science courses than in science and math, and in elite private institutions than in public ones, grade inflation generates a great deal of heat in discussion, if not always as much light.
While the debate on the moral virtues of any particular form of grade distribution fascinates as cultural artifact, the variability of grading standards has a more practical consequence. As grades increasingly reflect idiosyncratic and locally defined performance levels, their value for outside consumers of university products declines. Who knows what an "A" in American History means? Is the A student one of the top 10 percent in the class or one of the top 50 percent?
Fuzziness in grading reflects a general fuzziness in defining clearly what we teach our students and what we expect of them. When asked to defend our grading practices by external observers -- parents, employers, graduate schools, or professional schools -- our answers tend toward a vague if earnest exposition on the complexity of learning, the motivational differences in evaluation techniques, and the pedagogical value of learning over grading. All of this may well be true in some abstract sense, but our consumers find our explanations unpersuasive and on occasion misleading.
They turn, then, to various forms of standardized testing. When the grades of an undergraduate have an unpredictable relevance to a standard measure of performance, and when high quality institutions that should set the performance standard routinely give large proportions of their students “A” grades, others must look elsewhere for some reliable reference. A 3.95 GPA should reflect the same level of preparation for students from different institutions.
Because it does not, we turn to the GMAT, LSAT, GRE, or MCAT, to take four famous examples. These tests normalize the results from the standards-free zone of American higher education. The students who aspire to law or medical school all have good grades, especially in history or organic chemistry. In some cases, a student’s college grades may prove little more than his or her ability to fulfill requirements and mean considerably less than the results of a standardized test that attempts to identify precisely what the student knows that is relevant to the next level of academic activity.
Although many of us worry that these tests may be biased against various subpopulations, emphasize the wrong kind of knowledge, and encourage students to waste time and money on test prep courses, they have one virtue our grading system does not provide: The tests offer a standardized measure of a specific and clearly defined subset of knowledge deemed useful by those who require them for admission to graduate or professional study.
Measuring State Investment
If the confusion over the value of grades and test scores were not enough, we discover that at least for public institutions, our state accountability systems focus heavily on an attempt to determine whether student performance reflects a reasonable value for taxpayer investment in colleges and universities. This accountability process engages a wide range of measures -- time to degree, graduation rate, student satisfaction, employment, graduate and professional admission, and other indicators of undergraduate performance -- but even with the serious defects in most of these systems, they respond to the same problems as do standardized tests.
Our friends and supporters have little confidence in the self-generated mechanisms we use to specify the achievement of our students. If the legislature believed that students graduating with a 3.0 GPA were all good performers measured against a rigorous national standard applied to reasonably comparable curricula, they would not worry much about accountability. They would just observe whether our students learned enough to earn a nationally normed 3.0 GPA.
Of course, we have no such mechanism to validate the performance of our students. We do not know whether our graduates leave better or worse prepared than the students from other institutions. We too, in recognition of the abdication of our own academic authority as undergraduate institutions, rely on the GRE, MCAT, LSAT, and GMAT to tell us whether the students who apply (including our own graduates) can meet the challenges of advanced study at our own universities.
Partly this follows from another peculiarity of the competitive nature of the American higher education industry. Those institutions we deem most selective enroll students with high SATs on average (recognizing that a high school record is valuable only when validated in some fashion by a standardized test). Moreover, because selective institutions admit smart students who have the ability to perform well, and because these institutions have gone to such trouble to recruit them, elite colleges often feel compelled to fulfill the prophecy of the students’ potential by ensuring that most graduate with GPAs in the A range. After all, they may say, average does not apply to our students because they are all, by definition, above average.
When reliable standards of performance weaken in any significant and highly competitive industry, consumers seek alternative external means of validating the quality of the services provided. The reluctance of colleges and universities, especially the best among us, to define what they expect from their students in any rigorous and comparable way, brings accreditation agencies, athletic organizations, standardized test providers, and state accountability commissions into the conversation, measuring the value of the institution’s results against various nationally consistent expectations of performance.
We academics dislike these intrusions into our academic space because they coerce us to teach to the tests or the accountability systems, but the real enemy is our own unwillingness to adopt rigorous national standards of our own.
In its 1966 declaration on professional ethics, the American Association of University Professors, the professoriate’s representative organization, states:
"Professors, guided by a deep conviction of the worth and dignity of the advancement of knowledge, recognize the special responsibilities placed upon them. ... They hold before them the best scholarly and ethical standards of their discipline. ... They acknowledge significant academic or scholarly assistance from [their students]."
Notwithstanding such pronouncements, higher education recently has provided the public with a series of ethical solecisms, most spectacularly the University of Colorado professor Ward Churchill’s recidivistic plagiarism and duplicitous claim of Native American ancestry, along with his denunciations of 9/11 victims. While plagiarism and fraud presumably remain exceptional, accusations and complaints of such wrongdoing increasingly come to light.
Some examples include Demas v. Levitsky at Cornell, where a doctoral student filed a legal complaint against her adviser’s failure to acknowledge her contribution to a grant proposal; Professor C. William Kauffman’s complaint against the University of Michigan for submitting a grant proposal without acknowledging his authorship; and charges of plagiarism against Louis W. Roberts, the now-retired classics chair at the State University of New York at Albany. Additional plagiarism complaints have been made against Eugene M. Tobin, former president of Hamilton College, and Richard L. Judd, former president of Central Connecticut State University.
In his book Academic Ethics, Neil Hamilton observes that most doctoral programs fail to educate students about academic ethics, so knowledge of it is eroding. Lack of emphasis on ethics in graduate programs leads to skepticism about the necessity of learning about ethics and about how to teach it. Moreover, nihilist philosophies that have gained currency within the academy itself, such as Stanley Fish’s “antifoundationalism,” contribute to the neglect of ethics education. For these reasons, academics generally do not seriously consider how ethics education might be creatively revived. In reaction to the Enron corporate scandal, for instance, some business schools have tacked an ethics course onto an otherwise ethically vacuous M.B.A. program. While a step in the right direction, a single course in a program otherwise uninformed by ethics will do little to change the program’s culture, and may even engender cynicism among students.
Similarly, until recently, ethics education had been lacking throughout the American educational system. In response, ethicists such as Kevin Ryan and Karen Bohlin have advocated a radical renewal of ethics education in elementary schools. They claim that comprehensive ethics education can improve ethical standards. In Building Character in Schools, Ryan and Bohlin compare an elementary school to a polis, or Greek city state, and urge that ethics be fostered everywhere in the educational polis.
Teachers, they say, need to set standards and serve as ethical models for young students in a variety of ways and throughout the school. They find that manipulation and cheating tend to increase where academic achievement is prized but broader ethical values are not. They maintain that many aspects of school life, from the student cafeteria to the faculty lounge, ought to provide opportunities, among other things, to demonstrate concern for others. They also propose the use of vision statements that identify core virtues along with the implementation of this vision through appropriate involvement by staff and students.
We would argue that, like elementary schools, universities have an obligation to ethically nurture undergraduate and graduate students. Although the earliest years of life are most important for the formation of ethical habits, universities can influence ethics as well. Like the Greek polis, universities become ethical when they become communities of virtue that foster and demonstrate ethical excellence. Lack of commitment to teaching, lack of concern for student outcomes, false advertising about job opportunities open to graduates, and diploma-mill teaching practices are examples of institutional practices that corrode rather than nourish ethics on campuses.
Competency-based education, broadly considered, is increasingly of interest in business schools. Under the competency-based approach (advocated, for example, by Rick Boyatzis of Case Western Reserve University, David Whetten of Brigham Young University, and Kim Cameron of the University of Michigan), students are exposed not only to theoretical concepts, but also to specific competencies that apply the theory. They are expected to learn how to apply in their lives the competencies learned in the classroom, for instance those relating to communication and motivating others. Important ethical competencies (or virtues) should be included and fostered alongside such competencies. Indeed, in applied programs such as business, each discipline and subject can readily be linked to ethical virtues. Any applied field, from traffic engineering to finance, can and should include ethical competencies as an integral part of each course.
For example, one of us currently teaches a course on managerial skills, one portion of which focuses on stress management. The stress management portion includes a discussion of personal mission setting, which is interpreted as a form of stress management. The lecture emphasizes how ethics can intersect with practical, real world decision making and how it can relate to competencies such as achievement orientation. In the context of this discussion, which is based on a perspective that originated with Aristotle, a tape is shown of Warren Buffett suggesting to M.B.A. students at the University of North Carolina that virtue is the most important element of personal success.
When giving this lecture, we have found that street-smart undergraduate business students at Brooklyn College and graduate students in the evening Langone program of the Stern School of Business of New York University respond well to Buffett’s testimony, perhaps better than they would to Aristotle’s timeless discussions in the Nicomachean Ethics.
Many academics will probably resist the integration of ethical competencies into their course curricula, and in recent years it has become fashionable to blame economists for such resistance. For example, in his book The Moral Dimension, Amitai Etzioni equates the neoclassical economic paradigm with disregard for ethics. Sumantra Ghoshal’s article “Bad Management Theories Are Destroying Good Management Practices,” in Academy of Management Learning & Education, blames ethical decay on the compensation and management practices that evolved from economic theory’s emphasis on incentives.
We disagree that economics has been all that influential. Instead, the problem is much more fundamental to the humanities and social sciences and has its root in philosophy. True, economics can exhibit nihilism. For example, the efficient markets hypothesis, which has influenced finance, holds that human knowledge is impotent in the face of efficient markets. This would imply that moral choice is impotent because all choice is. But the efficient markets hypothesis is itself a reflection of a deeper and broader philosophical positivism that is now pandemic to the entire academy.
Over the past two centuries the assaults on the rational basis for morals have created an atmosphere that stymies interest in ethical education. In the 18th century, the philosopher David Hume wrote that one cannot derive an “ought” from an “is,” so that morals are emotional and cannot be proven true. Today’s academic luminaries have thoroughly imbibed this “emotivist” perspective. For example, Stanley Fish holds that even though academics do exhibit morality by condemning “cheating, academic fraud and plagiarism,” there is no universal morality beyond this kind of “local practice.”
Whatever its outcome, the debate over the rational derivability of ethical laws from a set of clear and certain axioms that hold universally is of little significance in and of itself. It will not determine whether ethics is more or less important in our lives; nor will it provide a disproof of relativism -- since defenders of relativism can still choose not to accept the validity of the derivation.
Yet ethics must still be lived -- even though the knowledge, competency, skill or talent needed to lead a moral life, a life of virtue, may not be derived from any clear and certain axioms. We need no derivation, for instance, of the need for good interpersonal skills. Rather, civilization depends on competency, skill and talent as much as it depends on practical ethics. Ethical virtue does not require, nor is it sustained by, logical derivation; it becomes most manifest, perhaps, through its absence, as revealed in the anomie and social decline that ensue from its abandonment. Philosophy is beside the point.
Based on much evidence of such a breakdown, ethics education experts such as Thomas Lickona of the SUNY College at Cortland have concluded that to learn to act ethically, human beings need to be exposed to living models of ethical emotion, intention and habit. Far removed from such living models, college students today are incessantly exposed to varying degrees of nihilism: anti-ethical or disembodied, hyper-rational positions that Professor Fish calls “poststructuralist” and “antifoundationalist.” In contrast, there is scant emphasis in universities on ethical virtue as a prerequisite for participation in a civilized world. Academics tend to ignore this ethical prerequisite, preferring to pretend that doing so has no social repercussions.
They are disingenuous -- and wrong.
It is at the least counterintuitive to deny that the growing influence of nihilism within the academy is deeply, and causally, connected to increasing ethical breaches by academics (such as the cases of plagiarism and fraud that we cited earlier). Abstract theorizing about ethics has most assuredly affected academics’ professional behavior.
The academy’s influence on behavior extends, of course, far beyond its walls, for students carry the habits they have learned into society at large. The Enron scandal, for instance, had more roots in the academy than many academics have realized or would care to acknowledge. Kenneth Lay, Enron’s former chairman, holds a Ph.D. in economics from the University of Houston. Jeff Skilling, Enron’s former CEO, is a Harvard M.B.A. who had been a partner at the McKinsey consulting firm, one of the chief employers of top-tier M.B.A. graduates. According to Malcolm Gladwell in The New Yorker, Enron had followed McKinsey’s lead, habitually hiring the brightest M.B.A. graduates from leading business schools, most often from the Wharton School. Compared to most other firms, it had more aggressively placed these graduates in important decision-making posts. Thus, the crimes committed at Enron cannot be divorced from decision-making by the best and brightest of the newly minted M.B.A. graduates of the 1990s.
As we have seen, the 1966 AAUP statement implies the crucial importance of an ethical foundation to academic life. Yet ethics no longer occupies a central place in campus life, and universities are not always run ethically. With news of academic misdeeds (not to mention more spectacular academic scandals, such as the Churchill affair) continuing to unfold, the public rightly grows distrustful of universities.
It is time for the academy to heed the AAUP’s 1915 declaration, which warned that if the professoriate “should prove itself unwilling to purge its ranks of … the unworthy… it is certain that the task will be performed by others.”
Must universities learn the practical value of ethical virtue by having it imposed from without? Or is ethical revival possible from within?
Candace de Russy and Mitchell Langbert
Candace de Russy is a trustee of the State University of New York and a Hudson Institute Adjunct Fellow. Mitchell Langbert is associate professor of business at Brooklyn College of the City University of New York.
"They're just darlings," my co-worker said. "Absolute darlings."
"Uh-huh," I agreed, staring at my grading sheet. She was discussing five athletes at the private, four-year university where we teach. As another part-time foreign-language instructor came in, I overheard their conversation.
"Well, they'll never get through nine chapters," said the "darling" woman.
"Oh," responded my friend, a woman who teaches Spanish.
"I'm going back to Chapter Five," said the first instructor, "I just love teaching these darling, darling boys."
I sat there, stunned. Was I hearing correctly? Was she simply dropping half of the curriculum to cater to a few students who couldn't do the work? Later, when we were alone in the office, I commented, "It's so hard to get them to work, but I keep pushing. I've got to get them through the whole book or they're sunk next semester."
"Oh, well, that's how it is with English comp, I'm sure," said the "darling" woman. "I mean you've got to cover the material."
"How is that different with Spanish?" I finally asked.
"Oh, well, I've got to make sure that they really get it," she responded. Frustrated, I couldn't think of anything else to say. This adjunct had developed a curriculum based on department-approved course objectives. She had turned in copies of her syllabus to the academic dean for approval. Then, frustrated by her students' inability or unwillingness to learn, she had simply chopped off the back end of her course.
Later she had confided that there were a few students who were "getting it," but that they would simply have to review the same materials over and over until the end of the semester because she was catering to the athletes. That morning, as my colleague left for her class, I jotted the note, "curriculum rip-off" in my notebook. Something would come of this, I thought. Something.
At lunch that day, with the provost at the head of the table, I commented that a fellow instructor wasn't teaching the curriculum. "What do you mean?" the provost asked, his voice surprisingly kind for a man in power.
"She said the athletes in her class weren't learning," I paused, unsure if I should go on, "so she cut out the last four chapters of the book."
"You're kidding!" said a physics instructor to my right.
"She'll review them later, right?" the provost asked.
Trembling, I kept my hands in my lap. "I got the impression that she wasn't going to teach the last four chapters at all."
"Really," said the provost. "What's her name?"
"Oh, I really couldn't say," I mumbled, gathering up my half-finished tray.
Face reddening, I made my way to drop off my tray. What had made me speak up? Me, an adjunct? A part-timer with no tenure, no security, no voice. I didn't bring it up again. In the next days, I asked co-workers innocuous questions about their classes. I found it hard to make eye contact with the provost.
What had made me speak up? Anger. A feeling that not only would the next instructors to teach these students be frustrated, their jobs only made that much more difficult, but the students were being ripped off in a wholesale fashion.
According to the students, the less they were taught, the better. But I knew better. And I had been on the receiving end of some of these half-taught students. One of my colleagues at a large community college in California had confessed that he passed any student who would sit through his course. With no work on which to grade them, he simply gave them all C's. He was not the only one, I realized.
When I struggled with a student whose grammar was shockingly poor and who could not form a decent paragraph or essay, I sometimes wondered whether he or she had simply tested well on the eligibility exam or whether an unwitting colleague had passed the student on to me.
And what did the students get out of this? Yes, their semester was easier. Yes, they had less homework. Yes, they could spend more time on sports. But at what cost? Their education was being whittled away by instructors who could not or would not insist on the curriculum. It was a simple matter of trading the long-term goal for short-term ease. Given the choice, I knew, only a small percentage of the students would vote for learning all that they were promised. Yes, some would complain and wheedle, but I must believe that instructors know better.
We are in a position of power and we must not misuse that power by stealing. And when we lop off a part of the curriculum that is too bothersome or too difficult for some students, we are stealing from all of the students. One colleague confessed that she often had to switch lesson plans around to teach what she needed to -- but she always covered the chapters that she had promised.
I'm not sure if she had been burned by a colleague or if she simply knew what the right thing to do was, but I admire her stance. I, too, frequently find that I need to "borrow from Peter to pay Paul" in lesson making, but I always cover the curriculum. Even in the classroom, when I am tempted to cut out a section that once seemed important, I review the materials later in my office and talk to senior instructors who can guide me.
It is dangerous to make impromptu decisions at the chalkboard. More often than not, I am dreaming of new ways to teach something that seems tedious -- a new essay, a new exercise, or examples taken from my own classes. Anything to get them to see the lesson in a new way. My struggle sometimes reminds me of my effort to clip my terrier's nails. After an hour of my struggling and his howling, I finally brought my dog to the local veterinarian and paid the $15. His nails did get clipped. In the same way, I struggle with curriculum, but in the end, it gets taught.
My last concern was a big one -- what about our accreditation? This four-year university already had a poor reputation. Once known as a feeder campus for Stanford University, its price tag now seemed to have no correlation to its rigor or value. What if our accreditors found that we were not teaching the curriculum? What if they somehow found out that we were not achieving the course objectives that they had originally approved? What then?
After working on committees at the large community college in California, I had learned a healthy respect for the powers that be. Whether one was a tenured full-time instructor or an adjunct, we simply did not have the right to make such decisions on our own.
Suddenly I was thankful for those who had mentored me -- even those kind souls who sat at lunch with me. Their opinions, ideas and suggestions were helping to shape me. Every day, every semester. So many teachers, struggling, wrangling, working to be sure that curriculum gets taught. What a blessing to be one of those who hold the line. And those who benefit? We do. Instructors, administrators, and, most importantly, the students.
Shari Wilson is the pseudonym of an adjunct who has taught at many colleges in California. In a column last month, she wrote about the unintended consequences of the "six year rule" on faculty members who are off the tenure track.
Ethical lapses are in the news again. Former CEOs on trial. Journalists receiving secret payments. Congress revising its ethics rules to protect one of its own. In such a troubling environment, we understandably hear calls for colleges and universities to incorporate more ethics into their programs of study. But amid such calls, I see little appreciation of the deeply personal challenge of teaching ethics.
After almost 20 years of teaching ethics, I'm still trying to get my footing. This isn't because of my lack of familiarity with the subject. I've long studied the classics in my field and eagerly devour the latest in journal articles. Nor am I at a loss for ways to bring the subject to my students. I've taught ethics in a variety of settings, from an elite business school to a struggling community college to the selective liberal arts institution that's currently my home. In each academic setting, I've been able to discover a set of pedagogical techniques that fostered lively class discussions.
My struggle in teaching ethics involves something more. It involves, as Parker Palmer states in The Courage to Teach, "the self you bring to the project, your identity and integrity." In the morally shifting and conflicted world in which we live, my identity as an ethicist has always been a precarious enterprise.
This is because my identity as an ethicist is tied to the moral coherence of the culture in which I live. James Boyd White once defined culture as "a set of ways of claiming meaning for experience." Without a common wellspring of values, it's difficult for an ethicist to claim a definitive meaning for the work he does. Yet I teach within a culture that's morally at odds with itself. Contemporary life combines a pervasive skepticism about traditional morality with the strident reemergence of such morality. This moral dissonance can even come to reside in our individual psyches. My guess is that more than a few Christian fundamentalists watch "Desperate Housewives."
Within this divided moral culture, academia offers only limited options. Thus, David Brooks writes that young people seeking moral guidance on college campuses typically encounter two possibilities. There are those mostly on the left "who tell them to renounce commercialism, materialism, and vulgar endeavoring." There are those mostly on the right who counsel "a consciousness of ... original sin" and a commitment to "the fixed truth of natural law" and "the traditions of orthodox faith."
Within such a cultural mix, my choice of identity as an ethicist has always posed a risk. This is because the wholeness that individual integrity presupposes is difficult in a culture so morally divided. Within the dichotomy that Brooks identifies, for example, where does someone like me -- someone who regularly both recycles and prays -- fit?
The cultural awkwardness of my identity as an ethicist poses a distinctive challenge in the classroom. This is because "the self you bring to the project" is central to teaching ethics. Moral education is about more than the information you convey to students or the skills you help them develop. It is ultimately about the persons they become in the process. With a subject as intimately linked to students' development as ethics, the self you bring to a course is crucial to its outcome. Students view all I do and say in the classroom through the lens of who they think I am.
The cultural sensibilities they bring to their assessments of my character also complicate matters. My students' outlooks are often unreceptive to the possibilities of a common moral dialogue. They've grown up in a world in which the country has been carved up into red states and blue states. They log on to blogs that reinforce their own tastes and ignore those of others. They tune in to Fox News with its portrayal of issues from stem cell research to affirmative action as an ongoing morality play between secular humanists and religious conservatives. Thus, if I appear to come at things from either side of a cultural divide, I'll lose at least half the members of the class, even if they are unwilling to tell me exactly why.
Teaching across our cultural divides as an ethicist today requires drawing on an understanding of the moral life that is noticeably absent from our public media. The flaming practices of cyberspace and the shouting matches of talk radio encourage us to see the essence of our moral lives as residing in the views we espouse. Across much of our vast electronic commons, you are, morally speaking, what you believe. Speak favorably, for example, of gay marriage and you become, depending on who's judging, either morally enlightened or morally corrupt.
But away from the public airwaves, a different and deeper understanding of the moral life prevails in the more intimate relations of our daily lives. It is an understanding of the moral life that allows us to continue to talk to our neighbors, swap recipes, borrow drills, and enjoy our kids playing together, even if we voted differently in the last presidential election. This is an understanding of the moral life as depending more on the dispositions you have than the views you hold. The moral life, after all, is primarily something we do rather than something we talk about. It depends on traits deeper than the views we hold of the hot-button moral issues of our time. It centers instead on what Aristotle would have called virtues, our basic dispositions or ways of being in the world. Asked to describe the moral life, we typically include traits such as our capacity for kindness, our aspirations toward integrity, our respect for principle, our desire for worthy accomplishments.
More and more, I am drawing on this deeper understanding of the moral life in my courses. In a morally polarized world, it offers a classroom identity that keeps open the potential for a common moral dialogue. There are daily opportunities. A few minutes spent listening to a student relate his anxieties over an upcoming exam. Stooping down to help a student when she drops her books on the floor. When my actions reveal I care about my students before they discover my views of capital punishment, I take on for them an identity that still has resonance in a morally fragmented world. I'm a person who is trying, however imperfectly, to lead a moral life.
Acknowledging the moral life as a practice we all imperfectly engage in engenders a distinctive understanding of the moral life. As an imperfectly realized practice, the moral life is always richer than our conceptions of it. We are always learning the meaning of kindness as we encounter it in its myriad manifestations. Each time we are able to stand on principle, we deepen our appreciation of integrity.
Recognizing the moral life as richer than our conceptions has a poignant value in the classroom. This is because of the way this recognition can cultivate our respect for our moral differences. In order to be meaningful, this respect must be genuine, not a lazy or indifferent tolerance. It needs to be a respect that compels us to want to learn more about why we disagree with each other.
Lately, I've noticed a recurring reaction I get from my students. They put it in different ways, but its essence is this: "You always treated everything we said as if it had value." In whatever form this reaction takes, it's one of my favorite compliments. For, as is true of us all, everything a student says has value. Not equal value, of course. There are some classroom comments that are arrestingly insightful. There are plenty that are downright silly. But errors, even of the grievous sort, have value, even if for no other reason than that they force us to better articulate the truth. Recognizing this, my students are beginning the kind of genuine moral conversation so many of our public pundits seemingly no longer believe is possible.
Jeffrey Nesteruk is a professor of legal studies at Franklin & Marshall College.
For some time, it has seemed that we in the education world have come to define "quality" mainly in terms of "quantity." We encourage students to add a minor or a second major, assuming more credits equals more learning. We advise students to add co-curricular activities to their growing list of academic credits, trusting that these additional experiences will enrich their lives and build attractive resumes, making a college education even more valuable.
The same emphasis on quantity has marked the evaluation of professors. The more articles published, the more classes taught, the more committees chaired, the more worthwhile the contribution.
Too often today the assumption is the busier the student, the more he is learning, and the busier the professor, the more she is contributing.
The tragedy and irony in this perspective is that when we stop to think for a moment (and, of course, we don’t have time to) we acknowledge -- especially those of us in liberal arts colleges -- that the wisdom we claim to value above all can only come when we have time to reflect. Activity and busyness, the gods of our culture, are demons in the lives of those seeking the mind and the spirit. No matter how good the individual academic or co-curricular experience may be, the cumulative effect of so many experiences is destructive.
So what can be done?
The growing awareness among accrediting agencies that learning is not based on "seat time" -- time spent in a classroom seat -- has opened the door to new, creative ways to make better use of time in institutions of higher education. In a recent meeting with the North Central Association, an association official expressed interest in working with my college, a Christian liberal arts institution in Iowa, to explore alternative ways of doing college: the goal being to encourage more reflection and make room for the kind of learning that will one day blossom into wisdom.
What might "a new way of doing college" look like?
One of the questions we will be exploring at Northwestern is this: Is it possible to organize a student’s four years in a more developmental manner, gradually cultivating a way of life that uses time effectively for lifelong learning -- rather than just lifelong busyness?
Here is one possibility: The freshman year would be much like it is today with a structured academic schedule and opportunities to participate in co-curricular activities. But as students move through their sophomore, junior and senior years, they would be weaned from a structured but busy schedule of many curricular and co-curricular experiences to a less structured schedule with more time for critical reflection and synthesis. The focus would be more on overall learning than on particular activities -- more on growing internal student discipline than on relying on external direction.
As sophomores they might replace typical 200-level general education courses with interdisciplinary seminars that integrate service learning and independent study with traditional classroom content. Significant time would be allocated for individual reflection and small group interaction -- the desire being to nurture a dialogue within the students themselves, and with each other and their professors, on what truly matters in life. Faculty and student life personnel might work together in guiding student learning. Knowledge, experience and personal development would merge to help shape the student’s view of the world as she embarks on courses in her major. The other segment of the sophomore year would introduce the foundational content of various academic majors.
The junior year becomes an in-depth exploration of the world through the lens of one particular academic discipline. This might best be done by taking one course at a time, for at least one of the semesters. It might also include an ongoing seminar throughout the year to stimulate reflection on the moral and spiritual implications of the material being explored.
For the senior year, the goal would be synthesis -- academic, professional and personal. Bridges would be built to the world students will encounter after graduation. A senior project culminating in a personal mission statement, incorporating both career and life goals, would provide an appropriate climax.
Developing an effective means of assessing student learning is, perhaps, the most important and difficult task we face once “seat time” is unseated as a critical measure for academic credit. But this challenging task also provides the opportunity to rethink what it is we want 21st century students to learn and how we ought to teach it. “Seat time” allowed us to avoid the central question of learning: What is important to know? The unspoken answer in our current educational culture is: whatever a professor can cover in three or four class hours per week, plus two hours of reading per class period.
Beyond mastering information and acquiring the skills to communicate it, what is it we desire students to understand about our world, ourselves, God? What is wisdom? How do we introduce it to our students? How can we tell if they are moving toward it?
With time opened up and vision restored, new pedagogical questions arise: How many hours should students spend in class? How many in the library, online, off campus ... in another part of our country or the world? What kinds of experiential and service learning would enhance understanding and excellent performance in our given field of study? In this new expanded world of learning, what is the role of the professor, staff, other students, practitioners off campus, and the individual student? Are there not better ways than traditional letter grades alone to evaluate student learning and assist students in their life choices?
"Doing college in a new way" would also provide faculty with the opportunity to take a fresh look at how they spend their time. Where in the current configuration of faculty loads is time for reflection and the growth of wisdom? Is there room in our definition and practice of scholarship to create wisdom? Is the role of wise mentors to our students cultivated and rewarded?
In recent years many significant changes have come to higher education: interdisciplinary classes, innovative first-year programs, undergraduate research, service and distributive learning, to name a few. But is it not true that undermining even these admirable innovations as well as more traditional educational experiences is the problem of time? All too often, hasn’t change come by adding something new to an already crowded educational menu?
Isn’t it time to revisit time itself?
Bruce G. Murphy is president of Northwestern College, in Orange City, Iowa.
With the return of students to campuses this month comes annual hand-wringing over the lack of diversity in our science and engineering classes. The United States is at a 14-year low in the percentage of women (16.3 percent) and African Americans (7.1 percent) enrolling in engineering programs.
An engineering student body that is composed largely of white males is problematic not only because of its narrow design perspective, but also because failing to recruit from large segments of the population means the number of new engineers we produce falls well short of our potential.
Although this is not a new problem, it is becoming ever more urgent. We are faced with an engineering juggernaut emanating from India and China, with more than 10 Asian engineers graduating for every one in the United States. Educated at great institutions like the Indian Institutes of Technology or Tsinghua University, these engineers are every bit as technically competent as their American counterparts.
So here we sit at the beginning of the 21st century, in the most technologically advanced nation on the planet, with a comparatively small supply of home-grown engineers, facing an explosion of technical mental horsepower overseas.
Why fight the tide? Couldn’t we simply import all the engineering we need? Couldn’t we play the economic advantage and close our expensive colleges of engineering? Do we gain anything by educating engineers in the United States?
I would argue that, with a few exceptions, we really don’t. As they are currently trained, American engineers are at relative parity with their foreign-born counterparts, are more expensive, and offer no competitive advantage. But there is a way out of this predicament, one that would provide a raison d’etre for American engineering programs, and make for the kind of design the planet now so urgently needs.
Faced with the increasingly complex design challenges of the 21st century -- an era where resources of every kind are reaching their limit, human populations are exploding, and global-warming related environmental catastrophe beckons -- engineers need to grow beyond their traditional roles as problem-solvers to become problem-definers.
To catalyze this shift, our engineering curriculum, now packed with technical courses, needs a fresh start. Today’s engineers must be educated to think broadly in fundamental and integrative ways about the basic tenets of engineering. If we define engineering as the application of math and science in service to humanity, these tenets must include study of the human condition, the human experience, the human record.
How do we make room in the crowded undergraduate engineering curriculum for students to explore disciplines outside math and science -- literature and economics, history and music, philosophy and languages -- that are vital if we are to create a competitive new generation of engineering leaders? By scaling back the number of increasingly narrow and quickly outmoded technical courses students are now required to take -- leaving only those that teach them to think like engineers and to gain the knowledge to solve problems. Students need room in their schedules for wide-ranging elective study.
There is a need for advanced engineering training, to be sure, but the place for that is at the graduate level -- in one of the growing number of nine-month master's programs, perhaps.
Teaching engineers to think, in the broadest, cross-disciplinary sense, is critical. Consider two examples of the failures of the old way.
The breach of the levees in New Orleans, which has unleashed a torrent of human suffering, came about not solely because engineers designed for a category 3, rather than a category 4, hurricane. It was caused by decades of engineering and technical hubris, which resulted in loss of wetlands and overbuilding on a grand scale. Would engineers who had studied economics, ecology, anthropology, or history have acted the same?
Or consider Love Canal (or any of a thousand other environmental debacles of the last 50 years). Would designers who had read Thoreau’s Walden, studied Beethoven’s Pastoral Symphony, or admired Monet’s poppies have allowed toxic chemicals to be dumped into the environment so remorselessly?
To prepare our engineers to engage in the major policy decisions we’ll face over the next 25 years -- many of which hinge heavily on the implications of technological design -- we must truly rethink what they need to know when they graduate.
If we do, our progeny stand a fighting chance of having a life worth living. And by giving engineering a larger, more socially relevant framework, expanding it beyond the narrow world of algorithms, the field should prove more attractive to women, minorities, and other underrepresented groups.
Just imagine. A growing and increasingly diverse number of domestically trained engineers -- equipped with the broad insight and critical thinking skills the world needs, which will also give them a competitive advantage over their foreign counterparts.
Overhauling the engineering curriculum would be challenging to be sure, but it’s a design worth building.
Domenico Grasso is dean of engineering and mathematical sciences at the University of Vermont. He was the founding director of the Picker Engineering Program at Smith College and is vice chair of the U.S. Environmental Protection Agency Science Advisory Board.
My nephew always wanted to design things, and as a teenager he seemed to be on the fast track to a good engineering degree. Took a community college calculus course while in high school. Worked for a computer aided design (CAD) company part-time. Got admitted to a top-notch engineering program.
In the middle of his first year, however, he dropped out. Now he’s enrolled in a technical institute, learning CAD skills and only CAD skills. He’s very happy, apart from the fact that he’s had to move back home.
Where did I, his college-professor aunt, go wrong? Or did I?
What upsets me is that he doesn’t see any inherent value in a liberal education, in a college degree rather than a tech-school certificate. But I also wonder whether mine isn’t a narrow attitude -- after all, the kid wants to do CAD, so why should he have to get a degree? What doors would a bachelor’s degree open for him? He knows what kind of work he likes to do, and he can start earning money at it a lot faster with a certificate in “Drafting/CAD.”
I checked out the Web site for the school he’s chosen. The school’s URL is a .com, not a .edu. It boasts that "Our short term curriculums focus only on courses directly related to your field of study, without any fluff. These classes are taught by instructors that have professional experience in the industry."
Besides bringing out my usually-held-in-check pedantry (“the plural of curriculum is not curriculums, dammit” and “instructors WHO, not instructors that!”), the Web site rankles because it’s pitching itself directly against the idea of a liberal arts education or “fluff.”
I worry that I’m being a snob. Why shouldn’t he study a job skill instead of spending his time reading books he doesn’t care about? Why not go to class at a place that teaches auto body repair instead of philosophy? I’m sure they’ll get him a job when he graduates, which is more than I do for my students.
My nephew is a working-class kid. Neither of his parents went to college, and most of his friends won’t. Still, I had assumed that because he got good grades in high school and wanted to be an engineer he’d want to get a degree. The problem is, I think, no one ever told him what a degree was. No one ever talked to him about the difference between higher education and job training. No one ever said why he should want to read books he wouldn’t choose on his own or take a class in chemistry or go to a lecture by a political scientist. And now he probably will never do any of those things.
As a professor I have to mourn that choice. Yet as an aunt who wants to see her nephew happy, I have to agree with my brother that the tech school is probably an OK place for my nephew right now. I don’t mean to imply that he’s doomed. I just worry that he’s severely limited his future choices.
I wonder how it is that higher education is still, in the 21st century, failing to get its message across. Failing to explain the difference between a college degree and a tech-school certificate to the high school student whose parents never went to college. Failing to convince the engineering-school freshman that he should bother to stick around and learn the things that don’t seem immediately applicable to his future career. We have not convinced legislatures that the state has a stake in higher education, that a citizenry trained in critical thinking, writing, and research skills beyond the high-school level is a citizenry better able to make informed decisions.
Both my nephew and I want him to be happy in his work. But he sees short-term, where I am trained to see long-term. He wants a job, very soon, in the computer work he has come to like in his part-time job. His model is his father, a union worker who will spend his whole adult life doing the same well-paid work with good union benefits. But such jobs are fast becoming extinct in the United States, and the jobs that are replacing them, especially information-industry jobs, are nowhere near as secure.
As the sister who didn’t start building up her retirement fund until 15 years after her younger brother (all those years of college and graduate school), I may have limited credibility here. Nevertheless, I believe that a degree would offer my nephew more options in the long run, more opportunities years down the line, in an age in which most people change careers multiple times in their working lives.
The problem is that I cannot be convincing in a larger culture that does not actively promote the value of higher education. Cushioned in a liberal arts college whose mostly middle- and upper-class students enroll (presumably) because they already understand that there’s value in an education that is not job training, I sometimes forget that a college degree can still be a tough sell. That’s why my nephew’s rejection of college came as such a jolt. But it’s reminded me that American anti-intellectualism can have personal consequences. It’s reminded me that I can’t assume the product I’m selling will advertise itself. If I believe that a better educated citizenry would make for a better state or country or world, then I’d better start writing letters and contacting legislators and talking to kids. Guess I should have started with my nephew.
Paula Krebs is professor of English at Wheaton College, in Massachusetts.
A well-known geology professor asks to see the manager of the Media Center at a university in northern California. He is angry because Media Services has dumped the 16mm film collection and inadvertently retired his favorite plate tectonics film from the 1970s. Initially sympathetic, the manager searches several collections and vendors, only to find that the film is out of print. When she tries to show him some newer titles on the topic, he storms out of the library, shaking his head.
At a large community college, students have been filing into the department chair’s office every semester for seven years complaining about a poetry class. Seems that the long-time professor of a particular course has been using the same typed-and-copied handouts for over 20 years. He not only refuses to use a computer to type up handouts, but doesn't see the rationale for updating the information "since the poets were dead." The complaints continue, and each successive chair finds a reason not to bother this in-house scholar.
An established speech professor can't understand why students can’t seem to get assignments in on time -- much less refer to sources properly. Although many students have complained about his WebCT supplemental, he never makes time to update information. The current syllabus, handouts and course documents are all dated 2003. None of the links refer students to information properly -- and frustrated students finally formed a study group to unravel the mess.
One math professor has copied handouts that were originally purple dittos produced in 1974. Her syllabus is two pages long and does not reflect the current course. She assigns rote-memorization exercises from many-times-copied sheets dated 1976. Not only do students avoid her, but her colleagues have also accepted her refusal to participate in committees as a sign that she is no longer a viable member of the teaching team at this university.
When do instructors decide that teaching is "doing time"? When do research and other activities become so important that teaching becomes a chore on which any effort seems wasted? At times I have been shocked that colleagues with so much knowledge seem intent on making that information indecipherable to their students. What is the motivation here? Under pressure, a few colleagues have admitted that they simply don't care. A few others simply seem oblivious to the fact that good teaching requires effort. After watching a senior professor hold committee members hostage to pontificate and grouse, a colleague scratched the words "God-complex" on a tablet and flashed it to me. Could this also be the reason that our older colleague's materials are decades old?
One colleague has confided that he thinks this refusal to update materials, assessment methods and teaching methods basically makes difficult graduate-level courses into "academic cod liver oil." I have to agree.
Yes, some topics are decades old -- does that mean there truly is nothing new on the horizon? Maybe. Maybe not. In graduate school, I took courses from a tenured professor who was well known as a Joycean scholar. True, Ulysses was finally published in 1922 -- yet he made this course exciting. Not only did he continue to publish, he also constantly read current interpretations on this writer’s work. Yes, he was an expert, but he was fresh. He squeezed our brains until we spit out something worthwhile -- and on occasion an original idea surfaced.
Excited that a tiny new thread was developing, he'd showcase it and develop it in the next class. True, he almost never gave handouts, but his quizzes, midterms and final were constantly changing to reflect developments in current theory. His courses were always overflowing with students fighting to get in -- a true reflection of his desire to learn and develop as an instructor after thirty years of teaching.
Why rework old lessons and develop new exercises and experiments to teach subjects that were being taught as early as the 19th century? Several reasons:
1. Students feel valued.
2. After identifying a student population (or even a particular class), targeting their area of need is simply effective teaching. It also gives rise to flexibility in teaching difficult subjects -- not that syllabi are discarded, but lessons may be developed to re-teach an area that students find difficult.
3. Textbooks are updated -- so why not professors’ lessons?
4. It's invigorating to find new ways to teach old materials. By researching and redoing assignments, instructors often feel "refreshed" and able to teach it more effectively to students. This also may encourage instructors to use less passive teaching methods.
5. New examples of old subjects can make an instructor seem reachable -- and interested in his or her students. This often opens the door for communication.
Yes, sometimes the information is as old as the hills, but the way that we teach it does not have to be antiquated. Reaching for new worksheets and developing new handouts and assessment tools is simply good teaching. At the least, the willingness to develop as an instructor can stave off the dreaded replacement when younger instructors are being interviewed at record pace.
I know one woman who, at 72, commands a lively classroom and is constantly being recommended by students. Her trick? She often digs up current information from the media (and at times from her own students) to develop topics. Newspaper clippings circulate around the classroom -- and the use of less traditional teaching techniques keeps her students interested. No longer content to rely on lectures, she has incorporated short audio or video tape clips into readings, has students on debate teams for tricky subjects, and often has students reply either in small groups -- or through index cards -- which gives students the anonymity they need to really express themselves. After hearing her students roar with laughter one day, I sat in on her class and was excited by the energy she generated. Now I have borrowed some of her techniques, and often drop by her office to see what she is developing for her classes. I can only hope to be this interested in teaching at 72! The department chair has warned her that she will never be allowed to retire -- she jokingly says she will "die at the chalkboard." And when she does, it will be quite a loss.
I understand that in some cases, professors have their hands tied. A department-approved syllabus may give an instructor very little "wiggle" room. Those who haven't yet earned tenure, non-tenure track professors, or adjuncts may give in to pressure to simply adapt a syllabus from a template or existing format. A list of "acceptable" texts or only one particular text allowed in a course can be stifling. Still, I know many instructors who have worked with materials, topics, and experiments within the curriculum to pep up what could be a dull subject.
Once asked to teach business management at a small, private university, I was limited to one text and a standardized test bank. I not only worked to choose questions that reflected our in-class work, but also chose current (and often edgy) business news stories to illustrate the less interesting business practices outlined in our text. Students were much more engaged when they could see how the concepts in the book were taking place in the real business world. Instead of just memorizing facts for standardized tests, these students had the opportunity to apply what they were studying to case studies in several papers. Many went through several drafts to ensure that they were drawing the correct inferences. I was thrilled to see a variety of topics with each set of papers -- it was obvious that these were thinking adults with interests of their own.
I’m not sure if it's because I've only been teaching for six years, or because I've never had the security that comes with tenure, but I am constantly developing new handouts and materials for my courses. In fact, to secure the job I have now at a university in the Midwest, I created a sentence structure handout that included several local landmarks. Butt-kissing? Sure. But the students paid attention and were able to identify dependent and independent clauses by the last example.
In addition to this mini-lesson, I also developed a new lesson for them on logical fallacies. Since the students had done a somewhat dry section in the textbook requiring the memorization of approximately 20 forms of ineffective argument, I focused on review. Realizing that Jim Carrey's Liar Liar had a wonderful scene full of faulty arguments, I reviewed a short three-minute scene of the movie many times and typed up a transcript of the dialogue, word for word. I then created a handout with instructions, this dialogue and a quick review of the faulty arguments from the textbook. In class, I put on the video, and played the three-minute scene a few times. Not only were students thrilled to be "watching a movie" in class -- they quickly started to put into practice what they knew and correctly identified line after line of faulty arguments. The discussion was lively.
While on staff at this university, I’ve developed a number of handouts based on in-class discussion and my students' assignments. Last week I asked students to write a well-structured paragraph on a current television show, "My Name is Earl." After following my instructions online, students read several short articles on the show and were instructed to write a solid topic sentence and supporting details -- including three short, direct quotes from the linked articles. They were instructed to e-mail the paragraph directly from their computers to my office e-mail account. In addition to immediately e-mailing them concrete suggestions about what worked and what needed improvement, I then developed a worksheet for use in class the next day.
I chose a paragraph that had a somewhat confused topic sentence, a few good supporting details, some off-topic writing, and wonderful sentence structure. I featured this paragraph at the top of the handout (anonymously, of course), and using MS Word, set up columns for my typed comments. In the columns, I typed in specific comments and questions about each sentence and used short lines with arrows to point to the sentence in question. Below I typed up four other examples of paragraphs -- to be used for group work in class. After going over the first sample paragraph, students felt much more confident in evaluating the remaining paragraphs and reporting back to the class. It was a rousing success, with students able to see their own writing evaluated and use those tools to evaluate classmates’ work (and later their own). I was rewarded with visible improvement in paragraph development in the next stack of essays from this class.
I realize that this development of assignments and handouts takes time. It's true that I don’t always have time and sometimes rely on standard tools that have been in use for years. But when I take the time to look at my students’ abilities and develop something just for that class, I am always greatly rewarded in my effort. And I suspect that the students feel important, too. Students really enjoy seeing their own work in print -- even if they're going to be constructively criticized in class. I am always careful to start out with the positive and gently guide the discussion as we go along.
On occasion, I "mix up" the examples from several sections of the same course so that students don't feel threatened. But my experience is that students love the attention and often brag to friends that they "made the handout." At a campus where I taught in California, my developmental writing students often turned in work with the comment, "Oh, please use mine." They really wanted to improve -- and the reward they got when the class praised them built confidence. Specific comments in the context of a lesson helped them more than a vague C at the top of a page. I’ll admit that at this college, a posse of undergraduates followed me from course to course -- and when they had taken all of my courses, they asked me to recommend "another cool instructor."
An online colleague accuses me of being an "academiwhore." She believes I am pandering to students rather than looking toward the loftier goal of asking students to learn about history while doing English grammar exercises. She may be right. I’ve never been the pompous type. But I have enjoyed teaching these last six years -- and deans seem to want to hire me. So I have no complaints. To me, recent texts, up-to-date syllabi and materials are part of the "teaching solution." Good education, a desire to help people, and the ability to be open-minded seem integral as well. Like many instructors, I rely on in-services where instructors share effective lesson plans, and on industry publications. I am sure I need all of it to avoid becoming disdainful of my students -- and ultimately un-teachable myself.
Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.
Some months ago I started asking friends, colleagues from my teaching days, researchers in higher education, faculty members of various ages and ranks, deans, provosts and presidents, and focus groups of students: “What’s the status of the Big Questions on your campus?” Quite deliberately I avoided defining “Big Questions,” but I gave as examples such questions as “Who am I? Where do I come from? What am I going to do with my life? What are my values? Is there such a thing as evil? What does it mean to be human? How can I understand suffering and death? What obligations do I have to other people? What does it mean to be a citizen in a democracy? What makes work, or a life, meaningful and satisfying?” In other words, I wanted to know what was happening to questions of meaning and value that traditionally have been close to the heart of a liberal education.
Some of what I found puzzled me. People pointed out quite properly that some Big Questions were alive and well in academia today. These included some questions about the origin of the universe, the emergence of life, the nature of consciousness, and others that have been raised by the scientific breakthroughs of the past few decades.
In the humanities and related social sciences the situation was rather different. Some friends reminded me that not all big questions were in eclipse. Over the past generation faculty members have paid great attention to questions of racial, ethnic, gender and sexual identity. Curricular structures, professional patterns, etc. continue to be transformed by this set of questions. Professors, as well as students, care about these questions, and as a result, write, teach and learn with passion about them.
But there was wide agreement that other big questions, the ones about meaning, value, moral and civic responsibility, were in eclipse. To be sure, some individual faculty members addressed them, and when they did, students responded powerfully. In fact, in a recent Teagle-sponsored meeting on a related topic, participants kept using words such as “hungry,” “thirsty,” and “parched” to describe students’ eagerness to find ways in the curriculum, or outside it, to address these questions. But the old curricular structures that put these questions front and center have over the years often faded or been dismantled, including core curricula, great books programs, surveys “from Plato to NATO,” and general education requirements of various sorts. Only rarely have new structures emerged to replace them.
I am puzzled why. To be sure, these Big Questions are hot potatoes. Sensitivities are high. And faculty members always have the excuse that they have other more pressing things to do. Over two years ago, in an article entitled “Aim Low,” Stanley Fish attacked some of the gurus of higher education (notably, Ernest Boyer) and their insistence that college education should “go beyond the developing of intellectual and technical skills and … mastery of a scholarly domain. It should include the competence to act in the world and the judgment to do so wisely” (Chronicle of Higher Education, May 16, 2003). Fish hasn’t been the only one to point out that calls to “fashion” moral and civic-minded citizens, or to “go beyond” academic competency, assume that students now routinely achieve such mastery of intellectual and scholarly skills. We all know that’s far from the case.
Minimalist approaches -- ones that limit teaching to what another friend calls “sectoral knowledge” -- are alluring. But if you are committed to a liberal education, it’s hard just to aim low and leave it at that. The fact that American university students need to develop basic competencies provides an excuse, not a reason, for avoiding the Big Questions. Students also need to be challenged, provoked, and helped to explore the issues they will inevitably face as citizens and as individuals. Why have we been so reluctant to develop the structures, in the curriculum or beyond it, that provide students with the intellectual tools they need to grapple thoughtfully over the course of a lifetime with these questions?
I see four possible reasons:
1. Faculty members are scared away by the straw man Stanley Fish and others have set up. Despite accusations of liberal bias and “brainwashing,” no faculty member I know wants to “mold,” “fashion” or “proselytize” students. But that’s not what exploring the Big Questions is all about. Along with all the paraphernalia college students bring with them these days are Big Questions, often poorly formulated and approached with no clue that anyone in the history of humankind has ever had anything useful to say about any of them. There’s no need to answer those questions for students, or to try to fashion them into noble people or virtuous citizens for the republic. There is, however, every reason to help students develop the vocabularies, the metaphors, the exempla, the historical perspective, the patterns of analysis and argument that let them over time answer them for themselves.
2. A second possible reason is that faculty are put off by the feeling they are not “experts” in these matters. In a culture that quite properly values professional expertise, forays beyond one’s field of competence are understandably suspect. But one does not have to be a moral philosopher to raise the Big Questions and show some of the ways smart people in the past have struggled with them. I won’t pontificate about other fields, but in my own field -- classics and ancient history -- the Big Questions come bubbling up between the floor boards of any text I have ever taught. I don’t have to be a specialist in philosophy or political science to see that Thucydides has something to say about power and morality, or the Odyssey about being a father and a husband. A classicist’s job, as I see it, is to challenge students to think about what’s implicit in a text, help them make it explicit and use that understanding to think with.
3. Or is it that engaging with these “Big Questions” or anything resembling them is the third rail of a professional career? Senior colleagues don’t encourage it; professional journals don’t publish it; deans don’t reward it, and a half dozen disgruntled students might sink your tenure case with their teaching evaluations. You learn early on in an academic career not to touch the third rail. If this is right, do we need to rewire the whole reward system of academia?
4. Or, is a former student of mine, now teaching at a fine women’s college, correct when she says that on her campus “It tends to be that … those who talk about morality and the big questions come from such an entrenched far right position … that the rest of us … run for cover.”
Some of the above? All of the above? None of the above? You tell me, but let’s not shrug our shoulders and walk away from the topic until we’ve dealt with one more issue: What happens if, for whatever reason, faculty members run for the hills when the Big Questions, including the ones about morality and civic responsibility, arise? Is this not to lose focus on what matters most in an education intended to last for a lifetime? In running away, do we not then leave the field to ideologues and others we cannot trust, and create a vacuum that may be filled by proselytizers, propagandists, or the unspoken but powerful manipulations of consumer culture? Does this not sever one of the roots that has over the centuries kept liberal education alive and flourishing? But, most serious of all, will we at each Commencement say farewell to another class of students knowing that for all they have learned, they are ill equipped to lead an examined life? And if we do, can we claim to be surprised and without responsibility if a few decades later these same graduates abuse the positions of power and trust in our corporate and civic life to which they have ascended?
W. Robert Connor
W. Robert Connor is president of the Teagle Foundation, which is dedicated to strengthening liberal education. More on the foundation's “Big Questions” project may be found on its Web site. This essay is based on remarks Connor recently made at a meeting of the Middle Atlantic Chapters of Phi Beta Kappa, at the University of Pennsylvania.
I just finished grading a hefty stack of final examinations for my introductory-level U.S. history survey course. The results were baleful.
On one section of the exam, for example, I supplied some identification terms of events and personages covered in class, asking students to supply a definition, date, and significance for each term. In response to “Scopes Monkey Trial,” one student supplied the following:
"The scopes monkey trial was a case in the supreme court that debated teaching evolution in the schools. It happened in 1925. Mr. Scope a teacher in a school wanted to teach about God and did not want to teach about evolution. The ACLU brought in lawyers to help with the case of Mr. Scopes. In the end Mr. Scopes side did not have the people's opinion. Evolution won. It is significant because now you have to teach evolution in school, you can't teach about God."
This answer might be considered a nearly perfect piece of evidence against intelligent design of the universe, since it gets just about everything (apart from the date) wrong: punctuation, spelling, grammar, and historical fact.
For those needing a refresher, Tennessee high school biology teacher John T. Scopes assigned a textbook informed by evolutionary theory, a subject prohibited by the state legislature. The court ruled against Scopes, who had, obviously, broken the law. But the defense won in the court of public opinion, especially after the ACLU’s lawyer, Clarence Darrow, tore apart William Jennings Bryan, the former Democratic presidential candidate, witness for the prosecution, and Biblical fundamentalist. The press dubbed it the "Scopes Monkey Trial" (inaccurately, since the theory of human evolution centered upon apes) and pilloried Bryan. As Will Rogers put it, "I see you can't say that man descended from the ape. At least that's the law in Tennessee. But do they have a law to keep a man from making a jackass of himself?"
An outside observer might ascribe my student’s mistakes to the political culture of this Midwestern city, where barely a day goes by without a letter to the editor in the local paper from some self-appointed foot soldier of the religious right.
That, however, wouldn’t explain another student who thought the 1898 war between the United States and Spain, fought heavily in Cuba, was about communism (not introduced into Cuba until after the 1959 revolution). Nor would it explain a third student who thought that the Scopes verdict condoned Jim Crow racial segregation.
A minority of students performed admirably, receiving grades in the range of A, while hewing, of course, to varied interpretations. Their success proved the exam was based upon reasonable expectations. However, the median exam grade was a C -- the lowest I’ve yet recorded, and fairly devastating for a generation of students who typically aspire to a B.
I was wondering what to make of this dispiriting but solitary data set when I read about the Education Department study released late last week that shows that the average literacy of college-educated Americans declined precipitously between 1992 and 2003. Just 25 percent of college graduates scored high enough on the tests to be deemed “proficient” in literacy.
By this measure, literacy does not denote the mere ability to read and write, but comprehension, analysis, assessment, and reflection. While “proficiency” in such attributes ranks above “basic” or “intermediate,” it hardly denotes rocket science. It simply measures such tasks as comparing the viewpoints in two contrasting editorials.
The error-ridden response I received about the Scopes Monkey Trial speaks less to the ideological clash of science and faith than to a rather more elemental matter. As students in the 1960s used to say, the issue is not the issue. The issue is the declining ability to learn. The problem we face, in all but the most privileged institutions, is a pronounced and increasing deficiency of student readiness, knowledge, and capacity.
Neither right nor left has yet come to terms with the crisis of literacy and its impact on higher education. The higher education program of liberals revolves around access and diversity, laudable aims that do not speak to intellectual standards. Conservatives, for their part, are prone to wild fantasies about totalitarian leftist domination of the campuses. They cannot imagine a failure even more troubling than indoctrination -- the inability of students to assimilate information at all, whether delivered from a perspective of the left or the right.
It would be facile to blame the universities for the literacy crisis, since it pervades our entire culture at every level. The Education Department’s statistics found a 10-year decline in the ability to read and analyze prose in high school, college, and graduate students alike.
However, the crisis affects the university profoundly, and not only at open-enrollment institutions like the regional campus on which I teach. Under economic pressure from declining government funding and faced with market competition from low-bar institutions, many universities have increasingly felt compelled to take on students whose preparation, despite their possession of a high-school degree, is wholly inadequate. This shores up tuition revenue, but the core project of the higher learning is increasingly threatened by the ubiquitousness of semi-literacy.
How can human thought, sustained for generations through the culture of the book, be preserved in the epoch of television and the computer? How can a university system dedicated to the public trust and now badly eroded by market forces carry out its civic and intellectual mission without compromising its integrity?
These questions cry out for answer if we are to stem a tide of semi-literacy that imports nothing less than the erosion of the American mind.
Christopher Phelps is associate professor of history at Ohio State University at Mansfield.