You are the best teacher in the world and you’ve just turned in your grades for the best class you’ve ever taught. If you are a college professor you know what comes next: the barrage of complaints about the low grade, the litany of excuses for why this or that missed assignment was due to health reasons, the pleading that the B+ be raised to an A- or medical school plans will be foiled and a life ruined, the thinly veiled threat that changing a grade is easier than dealing with a student judiciary complaint (or an irate parent). It's the most demoralizing part of being an educator today.
And here’s the paradox: If our students weren’t all tireless grade-grinders, we educators would have failed them. Yes, you read that right. They were well-taught and learned well the lesson implicit in our society that what matters is not the process or the learning but the end result, the grade. A typical college freshman today has been through 10 years of No Child Left Behind educational philosophy, in which "success" has been reduced to a score on a test given at the end of the course. For a decade, these students have heard the message that a good teacher is someone whose students succeed on those end-of-grade standardized tests. Teacher salaries can be docked in some states, whole schools can be closed or privatized in others, if students score too poorly. The message we're giving our students today is that all that really counts is the final score. No wonder they fight for a good one!
Similarly, for all that colleges say about not being solely concerned with test scores, almost all boast about their students' average scores, and those scores help colleges with their own rankings in U.S. News & World Report and in more serious collegiate ranking and accreditation systems. And, to go one step higher, aggregated scores on those tests are what determine the world educational rankings of the 34 member countries of the Organization for Economic Cooperation and Development (OECD) -- you know, the rankings where our students humiliate us each year by coming in 14th in reading, 17th in science, and 25th in math.
It's not as if an examiner is standing there really probing to see what each child in the world does or does not know, what they remember, or how well they apply their knowledge. All those rankings reduce the skills and content one learns in a subject to how well one does on a standardized test that research shows may cover only about 20 percent of the actual content of a course, demotivates real learning, and can be "scammed" through intensive cram sessions, pre-test tutoring keyed to the format of the test, or enormous amounts of class time dedicated to "teaching to the test." None of those is good educational philosophy, but in a world where the final score is what counts, those methods get the end result you want: not more learning but a higher score that opens doorways.
So don’t blame the next 18-year-old who calls, knocks on your door, or e-mails to plead that a B- be boosted to an A-. He's been taught his whole life how to get the good final score that equals educational success. Why should he set aside that lesson just because it's a seminar and the grades are based on essays requiring eloquence, persuasive rhetoric, critical thinking, and analytical skills? If he has absorbed the educational philosophy of our nation that grade achievement constitutes educational success, then whining for an A- makes him ... what? Well, eloquent, rhetorically persuasive, and a fine critical and analytical thinker. Right? Doesn’t he now have the grade on his transcript to prove it?
I wish I were being simply ironic and flippant here, but I think this is very serious. I know just how serious when I talk to corporate recruiters about the current crop of students and they tell me that, whereas it used to take six months for a great student to become a great coworker, it now takes a good year to two years. This generation of students is still waiting for the final grade, for the test score that shows they've aced a subject, not for some demonstrable achievement of mastery or, the most crucial workplace skill, the ability to survey one’s own skills and knowledge, understand where one might be lacking, and then either find someone to fill that gap through collaboration or find some way, typically online, to learn the skill one needs to make up for earlier educational gaps.
It takes nearly two years because they’ve been educated in a system where the grade is all, but they have to live adult lives in a world where self-awareness, the ability to diagnose a problem, the ability to solve it by applying previous knowledge, and collaborative skills all count, along with eloquence, persuasive skills, critical thinking, and analytical skills.
Here’s the punch line for college profs out there: We will not eliminate the grade-grubbing until we change our current educational system. Until then, we will have to put up with a lot of whining from students who have mastered the system that educators and policy makers have created for them.
Here's the punch line for college students out there: Until you educate yourself beyond the assumptions of the system we’ve foisted off on you, you’ll be depriving yourself of the real skills and knowledge that constitute the only educational test worth anything: the test of how well your formal education prepares you for success in everything else. Cherish the great seminar teacher, even if she gives you a B-. It’s what went on inside that classroom — not the grade at the end of it — that truly constitutes achievement in the world beyond school.
As an editorial postscript, I should mention that I almost never encounter grade-grubbing and whining myself, but, for over a decade, I've been using peer grading, contract grading, and other forms of participatory learning (such as having the class write its own standards and constitution at the beginning of the term). I write about a lot of that in Now You See It.
And, if you never have a chance to take a class with a truly inspiring seminar teacher, you could do worse than to master the "rules for students and teachers" offered by the great avant-garde composer John Cage. You'll notice he never says anything about test scores, grades, teaching to the test, or OECD rankings. The test he wants you to pass is the big one: success in the rest of your life.
Cathy Davidson is co-founder of HASTAC (Humanities, Arts, Science and Technology Advanced Collaboratory) and co-director of the Ph.D. Lab in Digital Knowledge at Duke University. This essay first appeared on her HASTAC blog.
The first distinguished speaker at the recent forum on "Justifying the Humanities" followed a recent trend by asserting that the humanities were invented in the American university of the 1930s as an organizational convenience. The second distinguished speaker explained that in their current "somewhat dated" form the humanities are a product of the Cold War, developed in the 1950s through courses in the Great Books and Western Civilization. By the time the final distinguished speaker began his remarks I feared that we would be told the humanities were invented yesterday in a sudden meta-post-Postmodernist fabrication.
First, the good news. It is true that the familiar triadic American curricular structure of liberal education (natural science, social science and the humanities) is relatively recent. Hence, the form of humanistic studies is not chiseled in ancient marble, but has changed and can and should continue to change in response to new circumstances.
The bad news is that recent history is only a small part of the story. This foreshortened perspective on the humanities comes at a price. It’s not just that it overlooks a tradition that reaches back to the Stoic philosophers of ancient Greece, Cicero in ancient Rome, Petrarch and Boccaccio in Italy and the amazing scholars of the Renaissance. Nor is it just that we deprive ourselves of the benefits of breakthroughs in contemporary scholarship. It’s that we risk losing sight of what motivated the great era of humanism.
Renaissance humanists, such as Joseph Justus Scaliger, Marsilio Ficino and Lorenzo Valla, applied immense energy and learning to establishing reliable texts of ancient authors, commenting on them, making them accessible through translations, and teaching them in a way that created an understanding of human beings and moral agency not restricted by the dictates of medieval theology. Philosophy, literature, history and the visual arts were transformed by such humanism. Soon universities were transformed as well.
When I asked Paul Grendler, a professor of history emeritus at the University of Toronto and an expert on education in the Renaissance, about this transition, he reminded me that this change was revolutionary. "A group of 15th-century Italian scholars decided that the best way to train men (and a few women) to be learned, eloquent, and morally responsible leaders of society was to introduce them to the great authors and texts of ancient Greece and Rome.… They coined the phrase studia humanitatis (humanistic studies) for this new, revolutionary school curriculum." This transformative sense of purpose accounts, I believe, for the energy and enduring excitement of their work.
At the university level great changes began around 1425 when humanists began teaching in Italian universities such as Bologna, Florence and Padua. They taught rhetoric, poetry and what they sometimes called humanitas, meaning more or less what Cicero had meant by it, "the knowledge of how to live as a cultivated, educated member of society," as Grendler phrases it. In general these humanists connected this goal to the studia humanitatis – we would say classical studies broadly conceived. That terminology spread from Italy to the British Isles where, for example, the Scotstarvit chair of humanity was established at the University of St. Andrews in 1620. By 1800 literae humaniores were part of examinations at Oxford. The pattern was revised in the mid-19th century into the famous "Greats" program, which later provided the model for "Modern Greats," that is, Oxford’s degree program in Philosophy, Politics and Economics. Humanism, it turns out, is not only adaptable to modern circumstances; it can be infectious.
The term "humanities" did not, then, drop out of the sky into the unknowing laps of American academic bureaucrats. Leaders of colleges and universities in the early 20th century consciously and deliberately evoked the tradition of Renaissance humanism in an effort to develop some equivalent amid mass education in the modern world. We may argue about how successful they were, but they saw the challenge.
It's still the challenge today, almost a century later. In responding to it, we can still learn from those Renaissance scholars. If we neglect them, we overlook an important part of the background to contemporary humanistic studies, but we also risk replicating, validating, and promulgating one of the gravest failings of the humanities as currently practiced – "presentism," that is, an exclusionary focus on the most highly modernized societies of the contemporary world, and the uncritical judging of the past by today’s interests and standards. In so doing one severs contact with what so motivated and energized these great humanist scholars and with the perspective on human life and conduct that they opened up.
If this root of the humanities is severed by ignorance, neglect or hostility, it will not be surprising if humane learning begins to look a little withered, and if students find what they have learned soon wilts and leaves them without the perspective and depth of understanding that a rigorous and wide-ranging education in the humanities should provide.
W. Robert Connor is senior advisor at the Teagle Foundation.
A scholar committed to the digital humanities once summed up his long-term strategy for winning their acceptance with a terse, sardonic comment. “We will advance,” he said, “funeral by funeral.” It's the kind of sentiment that's often felt, but seldom so well expressed -- or so brutally.
But assuming that time is on digital culture’s side also tempts fate. The humanities include bodies of knowledge that have developed over periods ranging from a decade to a couple of millennia and more. Digital technologies can emerge and eclipse one another in the time it takes to write a single monograph. The wisdom of reorganizing one around the other is at least questionable.
A paper in the December issue of Literary & Linguistic Computing called “Toward Modeling the Social Edition: An Approach to Understanding the Electronic Scholarly Edition in the Context of New and Emerging Social Media” manages to be forward-looking but not triumphalistic. It also poses the interesting question of whether the turnover in the stock of digital tools might actually have a productive relationship with long-established ways that scholarly communities engage with their primary sources.
The list of its authors is headed up by Ray Siemens and Meagan Timney, both of the Electronic Textual Cultures Laboratory at the University of Victoria, in British Columbia. It includes, as appendices, a couple of substantial bibliographical essays that were posted online a couple of months before the paper itself was published. Siemens et al. have been venturing the concept of the “social edition” for at least a couple of years. At this point, it still refers to something potential or emergent, rather than fully realized: a speculation more than a blueprint.
But the paper offers a logical extrapolation from existing trends -- a plausible glimpse of the shape of things to come. Siemens and his colleagues point out that there is a gap between how electronic editions of texts are prepared, on the one hand, and how scholars use the available technology, on the other. “The types of electronic scholarly editions we see prominently today,” they write, “were largely developed before the ubiquity of the web that we now enjoy and do not accurately reflect the full range of useful possibilities present for academic engagement and interaction around the textual materials that are our focus.”
At the same time, gaining legitimacy for electronic editions has for a long time meant adhering fairly closely to established formats for definitive editions of texts. Siemens and his coauthors sketch a typology that begins with material prepared more or less along the lines of a scholarly edition in print, with its features made available in slightly different form. The reader of such a “dynamic text” could click around to find annotations, variant readings, cross-references, and so on.
Subsequent formats for scholarly e-texts incorporated links to pertinent primary and secondary sources -- whether as part of the edition itself or elsewhere online. This meant, in effect, grafting a good research library onto the text. The edition would reflect the state of the existing scholarship – or the state of the editors’ scholarship while preparing it, in any event.
Just when the later species of “hypertextual” and “dynamic” scholarly e-editions arrived on the scene is not indicated, but probably not much later than the early ’00s, to go by the authors’ descriptions. In the meantime we’ve had the arrival, for good and for ill, of social media, which have insinuated themselves into academic communication so extensively that it’s easy to overlook their ubiquity.
Hence the emerging potential for the “social edition” -- which, if I’m following the argument correctly, is not some newfangled travesty of established protocols for preparing important texts. It doesn’t mean tweeting Being and Time, though someone is bound to do so, sooner or later.
Rather, the social edition would offer the same features available from earlier scholarly editions of e-texts (glosses, links to appropriate material, etc.) while also acknowledging the ongoing nature of serious engagement with the material so preserved and annotated. The participants in preparing a social edition would generate commentary and analysis; help compile and update the bibliography; and create “folksonomic” tags (as when you use Delicious to store and categorize the link for an article you want to cite later).
“The initial, primary editor,” Siemens and company write, would serve “as facilitator, rather than progenitor, of textual knowledge creation…. Relying on dynamic knowledge building and privileging process over end result, [the social edition’s] expansive structure offers new scholarly workflows and hermeneutical methods that build, well, on what we already do.”
That last point is particularly significant. For one thing, scholars are already using social media – group bookmarks, blogs, etc. -- to share references and ideas. (The paper and its appendices identify an enormous array of them.) But more importantly, such tools are increasingly experienced by those using them “as natural extensions of the way in which they had always carried out their work.”
Novelty, then, is not the issue. “The core of activities traditionally involved in humanities scholarship,” the authors say, “have altered very little since the professionalization of academic study during the nineteenth century.” And those basic activities (finding texts, comparing and analyzing them, circulating them, etc.) are finally collaborative, or at least dialogical. A social edition will presumably foreground that reality, assuming one wriggles up on shore sometime soon, breathing air and able to find funding.