You are the best teacher in the world and you’ve just turned in your grades for the best class you’ve ever taught. If you are a college professor you know what comes next: the barrage of complaints about the low grade, the litany of excuses for why this or that missed assignment was due to health reasons, the pleading that the B+ be raised to an A- or medical school plans will be foiled and a life ruined, the thinly veiled threat that changing a grade is easier than dealing with a student judiciary complaint (or an irate parent). It's the most demoralizing part of being an educator today.
And here’s the paradox: If our students weren’t all tireless grade-grinders, we educators would have failed them. Yes, you read that right. They were well-taught and learned well the lesson implicit in our society that what matters is not the process or the learning but the end result, the grade. A typical college freshman today has been through 10 years of No Child Left Behind educational philosophy where "success" has been reduced to a score on a test given at the end of the course. For a decade, they have had the message that a good teacher is someone whose students succeed on those end-of-grade standardized tests. Teacher salaries can be docked in some states, and whole schools can be closed or privatized in others, if students score too poorly. The message we're giving our students today is that all that really counts is the final score. No wonder they fight for a good one!
Conversely, for all that colleges say about not being solely concerned with test scores, almost all boast their average score, and that score helps colleges with their own rankings in U.S. News & World Report and more serious collegiate ranking and accreditation systems. And, to go one step higher, aggregated scores on those tests are what make the world educational rankings of the 34-member Organization for Economic Cooperation and Development (OECD) -- you know, the ones where our students humiliate us each year by coming in 14th in reading, 17th in sciences, and 25th in math.
It's not like an examiner is standing there really probing to see what each child in the world does or does not know, what they remember, or how well they apply their knowledge. All those rankings reduce the skills and content one learns in a subject to how well one does on a standardized test that, research shows, might actually cover only about 20 percent of the content of a course, demotivates actual learning, and can be "scammed" through intensive cram sessions, pre-test tutoring in the form of the test, or enormous amounts of class time dedicated to "teaching to the test." None of those is good educational philosophy, but in a world where the final score is what counts, those methods get the end result you want — not more learning but a higher score that opens doorways.
So don’t blame the next 18-year-old who calls, knocks on your door, or e-mails to boost that B- to an A-. He's been taught his whole life how to get the good final score that equals educational success. Why should he forget that lesson just because it's a seminar and the grades are based on essays requiring eloquence, persuasive rhetoric, critical thinking, and analytical skills? If he has absorbed the educational philosophy of our nation that grade achievement constitutes educational success, then whining for an A- makes him ... what? Well, eloquent, rhetorically persuasive, and a fine critical and analytic thinker. Right? Doesn’t he now have the grade on his transcript to prove it?
I wish I were being simply ironic and flippant here, but I think this is very serious. I know just how serious when I talk to corporate recruiters about the current crop of students and they tell me that, whereas it used to take six months for a great student to become a great coworker, it now takes a good year to two years. This generation of students is still waiting for the final grade, for the test score that shows they've aced a subject, not for some demonstrable achievement of mastery or — the most crucial workplace skill — an ability to survey one’s skills and knowledge, understand where one might be lacking, and then find someone to fill in that gap through a collaborative effort or to find some way, typically online, to learn the skill one needs in order to make up for previous educational losses.
It takes nearly two years because they’ve been educated in a system where the grade is all but have to live adult lives in a world where self-awareness, diagnosis of a problem, an ability to solve a problem by applying previous knowledge, and collaborative skills all count — along with eloquence, persuasive skills, critical thinking, and analytical skills.
Here’s the punch line for college profs out there: We will not eliminate the grade-grubbing until we change our current educational system. Until then, we will have to put up with a lot of whining by students who have mastered the system that educators and policy makers have created for them.
Here's the punch line for college students out there: Until you educate yourself beyond the assumptions of the system we’ve foisted off on you, you’ll be depriving yourself of the real skills and knowledge that constitute the only educational test worth anything: the test of how well your formal education prepares you for success in everything else. Cherish the great seminar teacher, even if she gives you a B-. It’s what went on inside that classroom — not the grade at the end of it — that truly constitutes achievement in the world beyond school.
As an editorial postscript, I should mention that I almost never have grade-grubbing and whining, but, for over a decade, I've been using peer grading, contract grading, and other forms of participatory learning (such as the class writing its own standards and constitution at the beginning). I write about a lot of that in Now You See It.
And, if you never have a chance to take a class with a truly inspiring seminar teacher, you could do worse than to master the "rules for students and teachers" offered by the great avant-garde composer John Cage. You'll notice he never says anything about test scores, grades, teaching to the test, or OECD rankings. The test he wants you to pass is the big one: success in the rest of your life.
Cathy Davidson is co-founder of HASTAC (Humanities, Arts, Science and Technology Advanced Collaboratory) and co-director of the Ph.D. Lab in Digital Knowledge at Duke University. This essay first appeared on her HASTAC blog.
In Wisconsin, the average faculty member at the state's technical college system earned more in 2011-12 than the average faculty member at the state's university system, according to an analysis by Gannett Wisconsin Media. The reason is "overages," pay that faculty members in the state receive for teaching more than the required number of courses. Overage pay averaged $12,000 per technical college faculty member, compared to $1,400 per University of Wisconsin professor. And 67 technical college instructors earned more than $50,000 in overage pay.
Proposed rules issued by the Internal Revenue Service note concerns among some colleges about how to calculate when adjunct faculty members should be considered to be working close enough to full-time to be entitled to employee health insurance under the new health-care legislation. Some colleges -- worried about being required to provide health insurance -- have been cutting adjunct hours so the institutions can be sure that the adjuncts wouldn't fall under the new law. Faculty advocates have said that these moves are unfair and represent an over-reaction to the situation. (Most faculty leaders say that colleges should be paying the health insurance for these adjuncts anyway.)
The IRS proposed rules explain that "some commenters noted that educational organizations generally do not track the full hours of service of adjunct faculty, but instead compensate adjunct faculty on the basis of credit hours taught. Some comments suggested that hours of service for adjunct faculty should be determined by crediting three hours of service per week for each course credit taught. Others explained that some educational organizations determine whether an adjunct faculty member will be treated as a full-time employee by comparing the number of course credit hours taught by the adjunct faculty member to the number of credit hours taught by typical non-adjunct faculty members working in the same or a similar discipline who are considered full-time employees."
The proposed rules don't take a stand on how best to determine the hours actually worked by those who are not full-timers, and suggest that more guidance will be coming. However, the IRS does state that colleges need to use "reasonable" methods for counting hours. It would "not be a reasonable method of crediting hours ... in the case of an instructor, such as an adjunct faculty member, to take into account only classroom or other instruction time and not other hours that are necessary to perform the employee’s duties, such as class preparation time," the document says.
The first distinguished speaker at the recent forum on "Justifying the Humanities" followed a recent trend by asserting that the humanities were invented in the American university of the 1930s as an organizational convenience. The second distinguished speaker explained that in their current "somewhat dated" form the humanities are a product of the Cold War, developed in the 1950s through courses in the Great Books and Western Civilization. By the time the final distinguished speaker began his remarks, I feared that we would be told the humanities were invented yesterday in a sudden meta-post-Postmodernist fabrication.
First, the good news. It is true that the familiar triadic American curricular structure of liberal education (natural science, social science and the humanities) is relatively recent. Hence, the form of humanistic studies is not chiseled in ancient marble, but has changed and can and should continue to change in response to new circumstances.
The bad news is that recent history is only a small part of the story. The foreshortening perspective on the humanities comes at a price. It’s not just that it overlooks a tradition that reaches back to the Stoic philosophers of ancient Greece, Cicero in ancient Rome, Petrarch and Boccaccio in Italy and the amazing scholars of the Renaissance. Nor is it just that we deprive ourselves of the benefits of breakthroughs in contemporary scholarship. It’s that we risk losing sight of what motivated the great era of humanism.
Renaissance humanists, such as Joseph Justus Scaliger, Marsilio Ficino and Lorenzo Valla, applied immense energy and learning to establishing reliable texts of ancient authors, commenting on them, making them accessible through translations, and teaching them in a way that created an understanding of human beings and moral agency not restricted by the dictates of medieval theology. Philosophy, literature, history and the visual arts were transformed by such humanism. Soon universities were transformed as well.
When I asked Paul Grendler, a professor of history emeritus at the University of Toronto and an expert on education in the Renaissance, about this transition, he reminded me that this change was revolutionary. "A group of 15th-century Italian scholars decided that the best way to train men (and a few women) to be learned, eloquent, and morally responsible leaders of society was to introduce them to the great authors and texts of ancient Greece and Rome.… They coined the phrase studia humanitatis (humanistic studies) for this new, revolutionary school curriculum." This transformative sense of purpose accounts, I believe, for the energy and enduring excitement of their work.
At the university level great changes began around 1425, when humanists began teaching in Italian universities such as Bologna, Florence and Padua. They taught rhetoric, poetry and what they sometimes called humanitas, meaning more or less what Cicero had meant by it, "the knowledge of how to live as a cultivated, educated member of society," as Grendler phrased it. In general these humanists connected this goal to the studia humanitatis – we would say classical studies broadly conceived. That terminology spread from Italy to the British Isles where, for example, the Scotstarvit chair of humanity was established at the University of St. Andrews in 1620. By 1800 literae humaniores were part of examinations at Oxford. The pattern was revised in the mid-19th century into the famous "Greats" program, which later provided the model for "Modern Greats," that is, Oxford’s degree program in Philosophy, Politics and Economics. Humanism, it turns out, is not only adaptable to modern circumstances; it can be infectious.
The term "humanities" did not, then, drop out of the sky into the unknowing laps of American academic bureaucrats. Leaders of colleges and universities in the early 20th century consciously and deliberately evoked the tradition of Renaissance humanism in an effort to develop some equivalent amid mass education in the modern world. We may argue about how successful they were, but they saw the challenge.
It's still the challenge today, almost a century later. In responding to it, we can still learn from those Renaissance scholars. If we neglect them, we overlook an important part of the background to contemporary humanistic studies, but we also risk replicating, validating, and promulgating one of the gravest failings of the humanities as currently practiced – "presentism," that is, an exclusionary focus on the most highly modernized societies of the contemporary world, and the uncritical judging of the past by today’s interests and standards. In so doing one severs contact with what so motivated and energized these great humanist scholars and with the perspective on human life and conduct that they opened up.
If this root of the humanities is severed by ignorance, neglect or hostility, it will not be surprising if humane learning begins to look a little withered, and if students find what they have learned soon wilts and leaves them without the perspective and depth of understanding that a rigorous and wide-ranging education in the humanities should provide.
W. Robert Connor is senior advisor at the Teagle Foundation.
Gerda Lerner, considered one of the pioneers of women's history, died Wednesday at the age of 92. An obituary in The New York Times detailed her career, much of which was spent at the University of Wisconsin at Madison. She focused on the history of women in the United States when such a focus was highly unusual among historians. In 1972, when Lerner was teaching at Sarah Lawrence College, she created a master's degree in women's history -- the first graduate degree in the field.
Academics ask all kinds of questions and make all kinds of judgments about parts of colleagues' or potential colleagues' lives that are irrelevant to their jobs, writes Nate Kreuter. He says it's time to stop.