I want to believe that when I was taking my favorite professors’ classes, those great men and women were at their peak.
I was a little disconcerted, then, when an older friend recently told me about how good my hero and mentor, the critic Marvin Mudrick, had been 20 years before I had taken him. “But … but I was there at the end,” I whined to myself. For eight years (until he died in 1986), as an undergraduate and graduate student at the University of California at Santa Barbara, I took his classes or sat in on them.
Even so, I knew there were quarters and classes during that time when he was better than at others. But I wanted him to have been at his best during those years, and I guess he fed that conceit himself. He would make fun of some of his own old views about books, writers, and teaching — so I believed I was taking him at his peak. He seemed to think he was at his peak.
This past week, with my 11-year-old daughter sitting in on a few of my classes during her school break, I was perhaps at my worst. She was looking at me with expectation, attentively — an encouraging, demanding student. I watched my language and I hoped the students would watch theirs, not that she hasn’t heard everything. The next day she was supposed to come again to my classes, but she stayed back to hang out with another professor’s daughter at a campus closer to home, and I was free, and I was in the classroom, happy to be free, aware, by this point in the semester, how far I could push the students and hoping to keep a couple of them engaged. I was funnier than I had been in a long while — telling tangential stories that then led into better conversations than we would’ve had otherwise.
“We tell you everything — what about you?” teased one student.
And that day I was not old Bob — that is, paternal, avuncular Bob — I was young Bob, the one I’ve been missing, and I was willing to tell them things about my life that I wouldn’t have told them if my daughter had been in the room.
I was younger without my daughter than with her — I was free again, and teaching like that, by my wits rather than by my deliberate, this-is-for-your-own-good friendliness and deadpan, I was better for a day.
I was better.
I’ve told a few young teachers and young adjunct professors this: I was happier teaching as an adjunct than as a full-time professor. It’s something like the difference I felt as a student writing for one professor rather than another, or occasionally feel now, writing for one publication instead of another. In one I’m loose, myself, giddy; in the other I’m responsible and sober. I’m better unsober. I don’t drink alcohol, but I’m better and smarter when I’m funny, when my funniness loosens up the class and makes my developmental students, so very self-conscious, so very cautious, lean out a little for a look, go out on a limb.
Don’t things change for us as teachers? Don’t we have to deal with that damned aging in a way that our friends in non-teaching professions don’t? We get older, but the students stay the same age. Mr. Mudrick used to tell us, his students, that he talked about love and sex a lot in the classroom because it was the only thing we all shared an interest in. I’m not so daring or funny as he was, so I don’t go very far that way, except … sometimes.
Every semester I can still get the recent immigrants and 18-year-olds hot under the collar about William Carlos Williams’s “The Knife of the Times,” a story he wrote in the early 1930s about a potential affair between two women, girlhood friends who are now middle-aged and married with families. Some of my students unashamedly express prejudices about homosexuality, and outrage as well about such affairs — and yet not one in five of the parental marriages in the room is intact.
Fictional characters are somehow supposed to behave! Better than real people! Most of my non-literary students hate conflicted people, people struggling to make romantic decisions that will cripple them. Political decisions, social decisions, those are too easy, in my opinion. But dare to tell someone you love that you love her? In my experience that’s the biggest drama. Call me a Jane Austenite. But also call me old.
Aging athletes, like aging professors, also like to say that they’re better now than they used to be. But people who really pay attention know that sometimes the young superstar is best when young; that he doesn’t just get better and better as some artists do; he hits his physical peak, and, lacking steroids — are there steroids for artists or professors? — he deteriorates and becomes a coach.
As I’ve proceeded as a teacher of developmental English I’ve become, to my thinking, more like a coach, an encourager, a butt-slapper (but because we’re not on a field, I do so only metaphorically). But I was better, I think, as a young professor, someone in between — someone questioning what we as a classroom of friendly strangers were doing, the guide who occasionally stops and wonders out loud, “Where are we going and why?”
No, I’m older and so aware of time passing that I get as anxious as a sheepdog and herd them along.
Mr. Mudrick may have stayed younger by growing ever more aware of the constraints on him as provost of a large college program and as a professor — that is, by deliberately refusing to let himself hold back. Because he was my unorthodox model, perhaps it’s inevitable that I slide toward conventionality rather than away from it. And perhaps this has come to mind because last week I started listening again to old tape recordings of his classes.
He was so much himself in those classes, so happy to be there, so interested in us, in our reactions to what we read, in our reactions to what he provokingly and amusingly said, that those hundreds of hours in his classes continue to make me happy. But having had those stirring experiences, on the teaching days when my students and I have slogged through something that my college or department or my own old-age practicality has decided is necessary, I despair!
And then I have a good day, and I’m reminded of my old self, and I know I’ve lost something.
But like an athlete on the long decline, I stick around because I really still do like the game. I grimace when I miss a good teaching moment! Like a batter missing a fat pitch, I wince, “Oh, I should’ve nailed that!” A while ago, back in the day, I would’ve! So I’m slower, more watchful and deliberate, and because I can’t afford to miss as much as I used to, I’ve become more likely to take advantage of the little things that come up and go my way.
Bob Blaisdell is a professor of English at City University of New York’s Kingsborough Community College.
Since 2000, I've been the host of the Wimba Distinguished Lecture Series, shouting from the rooftops (well, desktops) about how to use modern educational technologies to teach effectively online. But now, after evangelizing for the last decade, I'm switching sides. I am teaching creative writing online as an adjunct professor for Holmes Community College, in Goodman, Mississippi. How the tables have turned.
I've probably led more webcasts than anyone on the planet. Seriously. I've hosted webcasts at least once a week for 10 years and I've also given thousands of other online presentations. From presentations about educational technologies and policies to effective instructional techniques, I've done it. But now I'm tasked with teaching – online – creative writing, a topic that traditionally uses a workshop format, a format that is quite difficult to replicate in a virtual environment. Yet it's not the format that worries me.
You see, this is my first time teaching a college course. Though I've led writing workshops, collaborated with writers and journalists here in New York, contributed to numerous publications, and even penned my own book, I now fretfully ready myself to formally – and virtually – mold young (and a few moldier) minds at a college more than 1,000 miles away from my life here in New York. But I can’t wait. I can't wait to familiarize my students with exemplary works of poetry, fiction and nonfiction. I can't wait to answer my students' questions and hear their insights. I can't wait for my students to learn from me and for me to learn from them. I'm nervous. But I'm ready. I think. So in the immortal inquiry asked by David Byrne: Well, how did I get here?
Let’s start by looking at the Ed Tech industry first.
When wearing my Wimba hat, I often remind my audience that it’s only been about a decade since the modern format of online courses was put into place. The current configuration – combining course management systems, web conferencing, instant messaging, message boards, etc. to teach a class to students in a classroom and/or their pajamas – barely existed in the 20th century. When one stops to consider that collegiate courses had been taught (more or less) in the exact same manner since ancient Egypt, Greece, and Mesopotamia, it’s quite startling to see how quickly this transformation has transpired.
Obviously this format of modern courses is still being tweaked, but it certainly appears that much of the technological and pedagogical foundation is firmly in place. As of today, the dawn of the ‘10s, tens of thousands of postsecondary faculty, either because of or in spite of their ability and/or willingness, have already taken the plunge and incorporated technologies into their courses – often with a great deal of success.
I’ve written numerous research documents touting both the tangible and intangible benefits of technology-enabled courses. These documents are filled with examples of institutions around the globe that have seen benefits such as increased student retention, increased enrollments, improved graduation rates, and dollars saved on time and travel, all thanks to technology in the classroom. In fact, I’ve seen so many positive examples of technology-enabled education over the years that I now have an extremely difficult time understanding why any institution wouldn’t beef up its current online offerings. The downside is just so negligible while the upside is so great.
But I digress. After all, I’ve now got my own class to worry about.
A couple of months ago I left my comfy big-city confines and headed south to tiny Goodman for an on-site orientation for new faculty. I didn’t really know what to expect. I knew I’d have a big leg-up in terms of my knowledge of online course technologies, but I also knew I’d have a big leg-down in terms of my knowledge of classroom instruction. Turns out I was dead-on.
My two Holmes Community College trainers that day explained the ins and outs of being an online instructor to me and the approximately 10 others in the room, all of whom had collegiate teaching experience. At least my tech savviness made up for the in-front-of-a-class savviness I lacked. But even though I was already familiar with the Blackboards and SunGards of the world, I didn’t realize how much about them I didn’t know. As my girlfriend always says, it’s hard to know what you don’t know.
My HCC trainers spent hours teaching me about Bb’s enrollment tools, grading and assessment functions, and how to withdraw students who need to drop out. Despite being around instructors for so much of my life, I guess I never truly grasped how much of teaching is actually administering. After a full day of technology training I left the campus very excited, but also very nervous. I kept picturing myself pushing the wrong button and accidentally unenrolling an eager student and then having to sheepishly write an email to the Holmes IT staffers informing them of my blunder.
But on the flip side, my nervousness also translated to eagerness. As I learned more about my prospective students – fervent 18- to 21-year-olds as well as working adults from around the country – I plotted the numerous ways in which I could engage them online. While driving from my orientation back to the Jackson airport I thought of at least 20 assignments that would combine best practices of teaching creative writing face-to-face with best practices of teaching online. In fact, by the time I reached the rental car return desk I could envision the thank-you letters I hoped to receive from happy students who had learned a few tips and tricks about writing with some flair.
Which brings me to today.
My lesson plans are done. My syllabus is up. My books are in the bookstore. But my mind still bursts with uncertainties (after all, I am a writer).
How well can they write? What do they already know? What don’t they know? From what kinds of experiences will they draw when they put pen to paper? Have they been to William Faulkner’s house up the road in Oxford? Will they mind if I occasionally swear? Will I understand them if they speak with thick drawls? Will their writing be better than mine?
The waiting is the hardest part. I wish I could invent time travel and get the first class over with.
The funny part is that I’m never this nervous when preparing and/or waiting to give presentations for Wimba, but I guess that’s because of my experience at the company. Hopefully I’ll read this op-ed a few years from now and laugh at how nervous I was. Man, I can’t wait to be a veteran writing teacher brimming with the confidence only gained from years of experience! Oh, how worn will the elbow pads of my tweed jackets be. Some day.
I discussed my trepidations with my family over the holidays, and my dad, drawing upon his 30 years of teaching experience, asked, “Do you have your opening speech ready?” I told him I did, but I lied – I guess because I don’t really need one. And that already demonstrates a difference between online and face-to-face teaching.
When my class is ready to begin, someone from HCC’s technology department will simply hit a button in Blackboard, and then, in an instant, the class will be active. It won’t be the same as the first class of a face-to-face course. I won’t write my name in big letters on the chalkboard and won’t give a big dramatic speech about the wonders of writing creatively. Instead, my students will receive a message in their inboxes notifying them to watch the archive of a lecture I’ll record later this week. Sure, they’ll still see my talking head and hear the inflection of my nervous-yet-excited voice, but the impact might not be as great as watching me forcefully pace back and forth in front of a full lecture hall. Then again, perhaps the impact will be even greater because they’ll be equally nervous as they embark on a new class in a new medium.
Stay tuned for more as I tell my tales from the other side….
Matt Wasowski is senior director of customer programs at Wimba.
At a recent gathering of junior faculty, convened by the Teagle Foundation to discuss the future of liberal education, a remarkable fact appeared so clearly that it went unremarked. Discussions about the value and purpose of higher education had lost the acrimonious and partisan tone that defined the culture wars of the '80s and '90s. To be sure, those present (myself included) were no doubt fairly homogeneous in our political and academic backgrounds. And we were a self-selecting group, as all had expressed interest in the value of liberal education, even if we did not agree on (or even know for sure) what exactly it was. It was nonetheless an encouraging sign – no doubt prepared by such reasoned criticisms of the academy as those offered by Derek Bok – that liberal education no longer appeared as a minefield of partisanship, but rather as the site of constructive and rational debate.
One reason, I suspect, for this development may be that some of the institutions most committed to liberal education have transformed the way in which it is taught. At Chicago, Harvard, and Stanford, for instance, freshmen are still required to take a version of a "core curriculum." But unlike Columbia’s venerable core, these newer versions all allow students to make their own choices from a selection of classes. At Chicago, students compose a three-course meal from offerings in the humanities, civilization studies, and the arts. Stanford’s “Introduction to the Humanities” (IHUM) program presents students with a slightly leaner diet: they choose from a collection of starters chosen to “demonstrate the [...] productive intellectual tensions generated by different approaches,” before tucking into a two-quarter entrée that “promote[s] depth of study in a single department or discipline.” Finally, Harvard just introduced last fall a "Program in General Education" that is more buffet style: students select courses from eight different groups, roughly half of which satisfy humanities requirements.
While in no way revolutionary, these curricular developments, I argue here, may justly be regarded as harbingers of a third way in liberal education. This new way bypasses the old battleground of the culture wars — the canon — by recognizing the privileged place that certain works and events occupy in past and present societies, without dictating which of these must absolutely pass before every student’s eyes. As opposed to the more common "general education requirements," moreover, the courses in this model also provide students with an intellectual meta-narrative, that is, a synoptic perspective linking different periods, cultures, and even (ideally) disciplines. Finally, this model can offer scholars, administrators and policy makers a new language with which to define the goals and ideals of liberal education, and to help define criteria for their evaluation.
The language currently employed to discuss liberal education has itself proven remarkably apt for avoiding partisan flare-ups. Who can object to a pedagogical program designed to improve thinking, moral reasoning, and civic awareness? Glaringly absent from such skills-oriented definitions is, of course, curricular content. While this strategy of omission has conciliatory advantages, it also carries risks: Discussions about liberal education can end up sounding terribly formalist, as though students were destined to perform ghostly mental operations in a vacuum (“practice citizenship!”). The very idea of liberal education can suffer from such excessive formalism, since, emptied of content, it risks becoming little more than a talking point or sales pitch.
This approach also ignores a penetrating criticism, made with particular (if somewhat hysterical) emphasis by Allan Bloom in The Closing of the American Mind. In the absence of any overarching curricular structure, students can easily end up losing themselves in a labyrinth of unrelated courses. These courses may individually belong to disciplines traditionally associated with liberal education, and may each, in their own way, contribute to the development of important and worthy skills. But they may also leave puzzled students wondering how, say, their knowledge of Russian history relates to their classes on French literature. Of course, there are not always clear bridges between disparate subjects. And finding your way from one point to another can itself be an intrinsic part of education. At the same time, teaching students how to integrate knowledge from different fields is a valuable skill, one which we would be rather perverse to withhold from them, particularly when it is requested.
Beneath the geographical metaphors proliferating in the above paragraph lurks, of course, the familiar fault line of curricular content. But this is precisely where the reforms of core curriculum courses at the universities listed above can provide a less contentious framework for discussion. Indeed, the dominant feature of these courses is that they combine requirement and choice; students are obliged to choose from a selection of courses. This means that a) there is a degree of personal tailoring: for instance, hardcore “techies” at Stanford can take a course on the history of science and technology; and b) the emphasis is shifted from a debate over which exact texts every student should read – inevitably a source of heated disagreement – to a debate over which different sets of texts (or historical events, or works of art, etc.) form a coherent and meaningful syllabus.
The advantages of this system are numerous, but I would like to emphasize two ways in which it offers a valuable framework for liberal education. First, in addition to the benefits gained from studying individual texts or topics, these courses provide students with an overarching narrative. It is not necessarily a teleological or master-narrative, nor need it even be a story of progress with a happy end. But it is a narrative that allows students to perceive how events or ideas transform over a considerable stretch of time and space. The IHUM course that my department offers, for example, takes the students from the Mesopotamia of Gilgamesh to the Caribbean of Maryse Condé’s Crossing the Mangrove. Our syllabus is primarily literary, but the lectures draw heavily on each text’s historical, religious, cultural, and philosophical context. In this way, such narratives also illustrate how frontiers between humanistic disciplines are not closed borders, but can be freely crossed.
Ironically, the narratives transmitted in these classes are ultimately destined to fade away, or at least be significantly transformed, over the course of a student’s education and life. Their purpose is primarily structural: to borrow a hallowed metaphor, they allow students to attach the ideas they will later acquire onto different, yet connected branches of a single tree of learning. But this metaphor is somewhat misleading, since narratives are far less rigid than wooden frames. Subsequent coursework will complicate or contradict episodes of the story students began with; and at the end of their college education, they will ideally have written their own narrative with the knowledge they have gained. But even if the initial story they were told disappears in the process, it will have served its purpose, and taught the students a valuable lesson along the way – namely, that to be persuasive citizens and scholars, we need to know how to tell a compelling narrative. The ability to piece disparate facts and ideas into a coherent whole is a critical part of liberal education. We are always putting Humpty Dumpty together again.
Second, an important criterion for composing the syllabus of these courses is that their contents be sufficiently authoritative. Here we brush up again against the touchy subject of the canon, which cannot be completely avoided, even if the model under discussion does not advocate including specific books at all costs. But the inclusion of “authoritative” works or events – and I choose this word deliberately – does strike me as a necessary part of liberal education. This is not because some works contain The Truth and others only pale reflections of it. This argument of Bloom’s, and of his predecessor at the University of Chicago, Robert Maynard Hutchins, is more likely to puzzle than to offend today (how do you teach Homer as "the truth"?). But as John Guillory pointed out in Cultural Capital, certain texts simply have (or had) greater authority in our societies: not to engage with at least some of them leaves students at a social disadvantage.
I would also argue that understanding these authoritative texts is key for achieving what Montaigne identified as the ultimate goal of education – the ability to challenge existing authorities, an ability we would today call critical thinking. If students are to challenge authorities, they must begin by knowing who those authorities are and what they argued. Only in this fashion can the students acquire both a better understanding of how and why our societies came to be the way they are, and the ability to counter authoritative accounts in a knowledgeable and evidence-based manner.
It is to be hoped that liberal education will always remain a fertile topic of discussion, and the model that the universities discussed here have adopted – with a number of differences, to be sure, which I did not address – is certainly not the only solution. Indeed, I hope that other colleges will experiment with different models, so that our collection of experience continues to grow. But the promise of the current model is that it does offer a way past the opposing camps of the canon wars, and in this regard, may come to be regarded as a third way in liberal education.
Dan Edelstein is assistant professor of French at Stanford University.
When the two of us were students at Georgetown University more than 20 years ago, we noticed that our friends in American studies didn’t study the same way we did. They gathered in clusters of three or four and their energetic conversations — whether about Puritanism or the 1893 Chicago World’s Fair or 1960s counterculture — shuttled between focused inquiry and playful digressions. As they memorized facts, reviewed texts and wrestled with ideas, they seemed to be having more intellectually vibrant exam prep sessions — and more fun — than we were.
The habits of those American Studies majors were, we think, shaped not just by the interdisciplinary nature of their field but by the kind of exams they were preparing for: collaborative oral exams.
Many years later, we both find ourselves making such oral exams a part of the classes we teach. We’re doing so in different fields (English and political science), at different kinds of institutions (one large and public, one small and private), using different tactics. But we’ve both discovered ways to make them work, even within overcommitted schedules, and we’ve stayed with them, even though they interrupt traditional academic rhythms, because we see how small-group, in-person exams encourage a spirit of inquiry and collaboration among our students.
In higher education, oral exams are rare but not extinct. They are often required as senior thesis defenses and sometimes woven into foreign language courses. Occasionally a case is made for them in the teaching journals of various disciplines and at academic conferences. Still, most faculty think of them, if they think of them at all, as the province of graduate school (comprehensive exams, dissertation defenses).
In-class oral reports and group presentations are common enough, but we are talking about something different. Our approach involves sorting students into small groups, giving them a set of challenging questions or texts (along with strong incentives for studying together), and evaluating how they perform in compressed, collaborative, interactive exam settings.
The common perception is that such exams are great in theory but simply too time-consuming, subjective, and tricky to grade. The prospect of having to schedule, in the midst of a crowded semester, a series of oral exams rather than a single sit-down test is, on its own, enough to send most faculty running. Moreover, technology is steering us in the other direction: more and more we are asking students to complete tests in the pixelated spaces of course management software, which makes the prospect of holding collaborative exams in faculty offices seem quaint and unrealistic.
Still, we want to argue for the value of collaborative oral exams as a complement to (not a replacement of) more typical forms of assessment. Our two approaches share core values but differ in some significant ways.
In his literature courses, which typically enroll about 35 students, Tom schedules a collaborative oral exam one month into the semester, followed later by a paper and two traditional exams. In a Shakespeare class, for example, the oral exam focuses on the sonnets, which are covered during the first few weeks, and the exam is mainly about gauging how well students can enact the strategies for close, critical reading being modeled in class. Unlike Jamie, who frames his oral exams as the culminating experience for students in a small seminar, Tom schedules his orals at the front end of the semester, both to test foundational skills and get to know students in a relatively large class a little better.
A few weeks before the exam, Tom sorts students into groups of three, asks group members to sit together in class, and has them work together on occasional low-stakes in-class activities. This warm-up period helps the groups gel (and allows for some reshuffling of groups — the chronically absent get clustered together); the trios also make for smaller hubs of engagement in the larger class.
All along students are urged to annotate the sonnets in their books and told that they will be able to use those notes because the exam will be open-book (this is often the nudge they need to start annotating). A week before the exam students are given a list of 20 sonnets, any one of which could be chosen for their exam.
Students are encouraged, though not required, to study together. They’re told that while they will be graded individually, exam teams that study together and play to one another’s strengths typically perform better, individually and collectively. They are also assured that the exams shouldn’t be competitions for airtime; supporting, extending, qualifying and challenging each other’s interpretations are valued more than showing off.
In the days leading up to the exam, nearly all the groups take up the recommendation that they study together outside of class. Most teams meet just once, but some meet several times or blend online and in-person communication.
Exams are 15 minutes but scheduled at 20-minute intervals to allow for quick grading between sessions. Each trio is informed of its sonnet 20 minutes before its exam (right before the group in front of them is going in), which allows a final chance for them to gather in a nearby lounge to review and plan.
The exams tend to be a hybrid of testing, talking and teaching. First we walk through the poem, books open, each student taking the lead on explicating one section. After that, Tom poses questions that students have been told to expect, such as What are the key tensions and oppositions in play? Can you analyze how one or two aspects of form generate meaning, complexity or pleasure? Which strategies from class can you apply to this poem? He challenges them on vague responses but also points them in productive directions when they get stuck.
When filing out of the office, students generally express relief but many also remark on having learned a lot both while studying together and during the exam.
This approach clearly takes some careful planning. Eleven or twelve exam sessions tally up to about four hours. Even with one class canceled for the exam, this means that most students take the exam outside of regular class time, which is fine for those living on campus but often a problem for commuters (though commuters get first shot at the exam slots scheduled during the canceled class).
The grading is done immediately after each exam on a simple scale (excellent/good/fair/wanting/failing) and with no comments except the notes Tom took during the exam. And the grading is easier than you might think: after 15 minutes of exchange, the stronger and weaker parts of their interpretations have been hashed out. It’s pretty clear to everyone in the room how each has performed.
In end-of-semester teaching evaluations, most students report that they found the oral exam valuable. More telling is how the last Shakespeare class voted with its feet: for the second exam, when given the option of taking it as a traditional sit-down or a collaborative oral (the same questions would be asked in either case), two-thirds opted to go with the collaborative oral format.
In a first-year course designed to acculturate students to a liberal arts education, Jamie has used oral exams as the course final. About 10 days before the final, he assigns students to groups of three (groups of two are better than groups of four for resolving rosters that don’t divide neatly by three) and distributes nine open-ended, essay-type questions. When students are asked to discuss a question like “What is the relationship between creativity and certainty?” they are encouraged to apply ideas encountered in the course materials and take ownership of their own arguments.
This is the first oral exam for most students, and they are generally eager for advice on how to approach it. While being careful to remind them that each member of the team must contribute to each answer, Jamie suggests that the team designate a point person, one student who bears primary responsibility for developing specific strategies for each question. Studying should be done together, with discussions of each question and collaborative decisions about the arguments to be made if the question comes up in the exam.
The students often find these study sessions the most valuable part of the experience. One student from the first-year course remarked in a course evaluation, “I really liked the oral exam at the end of the semester … because I was able to discuss the questions with my group. I found that the discussion that my group had the night before the exam was a lot of fun and really insightful. It was almost better than the actual exam because we were not nervous and we were just talking and discussing the questions and arguing, a little.”
Whereas Tom’s students may get different individual grades despite being in the same exam group, Jamie’s students, if in the same trio, all get the same group grade. Because that system explicitly rewards teamwork (along with the ability to use specific ideas from specific texts and explore them in depth), students collaborate and really listen to each other’s interpretations and perspectives. They don’t need to agree, but they do need to fit divergent ideas into a single framework that answers the question.
Exams take place in Jamie’s office and run 25 minutes, which provides enough time to get through two or three of the questions from the list. While the question’s point person may initiate an answer, every member of the group must contribute to the conversation on each question. Ideally, the first words of the answer should be “A text from the course that best helps us deal with this question is …”, but students know that discussion should not be fully scripted. The best answers meander through references, examples and follow-up questions until the conversational thread has run out, at which time another question is examined.
The unstructured nature of this process can be intimidating for those used to exams as tests of packets of information. Because conversations can flow unpredictably and groups often steer their answers toward themes and examples they know best, group oral exams are best at measuring how students work together to connect texts, ideas and experiences. This quality makes them particularly well-suited to seminars, where a group oral exam feels like a natural extension of the shared responsibility for the health of the discussions throughout the semester. As another student wrote in her course evaluation, the oral exam "seems to fit with the class better."
When the group oral exam process works, students build on each other’s insights and create answers that none of them would have come up with in isolation. They work together to make connections and play with the material in a way that they don’t for written exams or prepared presentations. The dual challenge is to prepare well and to adjust to the flows of conversation, both of which demand dexterity with the texts and themes of the course.
We tend to forget that before the 20th century, oral modes of learning and assessment (recitation, declamation, oratory, debate) were dominant in the American college. The role of oral performance ebbed with the spread of the research university ideal, the rapid expansion of higher education, and the wider use of testing and writing. Many courses carry on the oral tradition by valuing classroom discussion, but perhaps it is time for us to clear more space for spirited talk, even in our exams.
Tom Deans and Jamie Frueh
Tom Deans is associate professor of English at the University of Connecticut. Jamie Frueh is associate professor of history and political science at Bridgewater College.
I just finished my last final exam. But for the first time in 25 years I wasn’t grading those papers; I was writing them. At age 58 I just completed my first semester at New York Law School. And while I may not be the oldest One-L in America, I’m certainly one of the very few to take on the challenge after two-plus decades as an adjunct professor of marketing and management at several graduate programs.
Law school was an accident. While I had considered it some 35 years ago when graduating from college, it was my recent jury duty service, on a major trial, that triggered my renewed interest. Soon after the verdict, I was at a dinner party and told another guest about an article I had just written about the trial. She was intrigued and asked whether I was a lawyer. No, I admitted, but I then speculated aloud about the coming terror of becoming an empty-nester, as my younger son was about to go off to college. Before my second glass of wine, I had a tentative admission: unbeknownst to me, the woman I had been talking to was a dean at the law school.
My new perspective from the other side of the lectern is also colored by now having two sons in college. To say that education is wasted on the young – apologies to George Bernard Shaw – is a temptation. But I’ll hold back with that broad brush, and instead proclaim that as a result of going back to school, my own teaching will change pretty radically in the future.
First, an admission: I’m working harder in law school than I ever worked as an undergraduate or graduate student. Moreover, the workload is heavier, and the expectations tougher, than at any of the programs I taught in. As a night student, I go to class four evenings a week – two classes each night, totaling three hours. In addition to an hour of review before each class, my preparation on weekends is never less than 12 hours, and often more.
Interestingly, this time commitment is somewhat less than that of most of my colleagues. (Perhaps the one advantage of age is that I suspect I work more efficiently than most young people do. Unfortunately, it is often offset by middle-age memory loss.) Which leads me to my first observation: I have a new respect for night students.
Evening Students -- Many of my classmates are incredibly hard-working, mature, and dedicated. This shouldn’t surprise me -- all of my graduate students as an adjunct at New York University and Fordham University were night students -- but it does. As a teacher, I saw my students one night a week for a total of just under three hours. What I never really appreciated was the bigger picture of their commitment: how much time and effort their overall program demanded. When I grumbled about their lack of preparation, I had no sense of their competing demands.
The Bigger Picture -- As an adjunct, I don’t think I was ever invited to a department-wide or program-wide meeting. I met with my dean to review – cursorily – my syllabus, and I knew what students were required to take in order to get a degree. I didn’t know – and really didn’t care -- if another professor had assigned a paper or a group project. I was in my own little world. When I teach again, it will be with the proviso that my dean better integrate the adjuncts into the overall program.
Mature But Still Clueless – Before contracts class one night, I was talking with a classmate whom, based on her class participation, I knew to be pretty sharp. Her chit-chat was no less articulate. So, relevant to our conversation, I thought she would find an anecdote about Theodore H. White interesting. (I had met White years ago.) I told the story and got a blank stare in response. I asked her if she knew who Teddy White was, or if she had ever heard of The Making of the President books. No clue. And she was not alone in what I had assumed was basic “popular cultural literacy.” I got similar blank looks from more than a few classmates when I referred to North by Northwest, Paul Newman’s favorite role in The Verdict, Dr. Kildare, or our former-haberdasher president. In the future, I’ll take more care in making cultural references.
Stress – From the very first week of law school, assorted deans stressed that our job prospects upon graduation would be directly related to our first-year grades. This is particularly salient inasmuch as we attend a “second tier” law school. Our grades are almost entirely dependent on the four-hour, closed-book final exam. (A tiny number of classes include a midterm exam that counts for about 25 percent of one’s grade.) And class participation can affect a student’s grade only marginally.
So, coupled with a seriously demanding workload, a largely new “language,” and the need to learn how to “think like a lawyer,” is the regular reminder that grades really count. Not surprisingly, the stress level among first-year law students is scarily high.
The Curve – Last year our law school changed its grading curve. Where the previous curve allowed only 6 percent of grades to be an A or A+, this year there is an 8 percent target -- with a 12 percent maximum. (I told some classmates the old joke about the two law students confronted by an angry bear. “We’re not fast enough to outrun a bear,” said one. “You’re right,” said the other. “But I can outrun you.” Very few found it funny.) I like the curve. As an adjunct, I was under pressure – from both students and my program director – to inflate grades. When I next teach, I’ll push for a published curve.
Competition vs. Cooperation -- Among night students, there is a healthy amount of cooperation and very little (outward) competition. We study together, respond to midnight e-mails about complicated cases, and are generally quite supportive. That does not seem to be the case among the (younger) day students. Many of us have heard stories of day students hiding research materials from their classmates. Whether apocryphal or not, such stories reinforce our desire to see our fellow night students do well. As a teacher, I’ve often assigned group projects. I’ll continue to do so in the future, but with more sensitivity toward scheduling problems.
Kill Computers in the Classroom – I am utterly shocked by the number of students who spend the entire class on their BlackBerrys or Facebook accounts. I find it both stupid and rude. Some surfers actually have the chutzpah to tell the professor that they need their computers during class because their handwriting/note-taking is so poor. The professors aren’t oblivious, but only once did a prof tell a student to put away her BlackBerry. I wanted to climb over the desk and dope-slap my classmate.
A young friend of mine -- a very smart, thoughtful, respectful Stanford Law School grad -- argued that before computers, students would do crossword puzzles in class. And, he argued, it is up to the professor to keep people engaged. I agree: it is my job to make class time interesting and productive. But part of my job is also to elicit the observations and ideas of students, to help them learn from each other. If they are not listening, they cannot contribute, and that is a detriment to all.
My niece, a graduate student at a top program, admitted she surfs the Web during her classes. Seeing I was appalled, she tried to argue that it is valuable to have instant access to “factual” information (i.e. Wikipedia) in order to challenge her professors’ assertions. I concede that point – in the abstract. So in the future, I’ll institute a compromise: I will keep one computer available in the classroom for students to access -- after they’ve raised an objection. But otherwise I plan to ban all electronic devices from class.
Don’t Teach to the Stragglers – I had only one “unsuccessful” course this semester. It was a large lecture with weekly written homework assignments. The classes were largely a waste of time; a not-very-good review of material we read in the text. And the homework assignments were checked off for submission but never graded. The professor was a very knowledgeable guy, but seemed to be going through the motions during class. Twice during the semester I approached him and made suggestions – first, to post the answers to the homework questions online – and then perhaps to assign more challenging research problems in class which we would solve and discuss. His answer surprised me. He couldn’t do either because he felt he had to teach to those in the bottom 20 percent of the class who were struggling with the basic material. I was shocked: the 80 percent of us who understood the material were being penalized by the few who did not. Perhaps I’m too Darwinian, but in my classes it is going to be sink or swim. I’m available -- as are TAs -- for extra help. But the pace is not going to be dictated by the stragglers.
TAs Can’t Grade – In an unusual practice for law school, one of our professors gave us weekly writing assignments. They were graded by teaching assistants, but the grades didn’t count. Midway through the semester, concerned that I was getting pretty mediocre grades – but pretty sure that I was understanding the material – I talked with the professor. He suggested I send him a copy of the next paper. I did, and sent a copy to the TA as well. Not surprisingly, I received my usual C from the TA; an A from the professor. The writing exercises were enormously useful. But if I use TAs in the future, the real challenge will be to establish clear rubrics for grading and ensure that I closely monitor the TA’s actual critiques as well as the grading.
On-Call vs. Called-On – Most of my professors have referenced The Paper Chase, the 1973 movie about Harvard Law School where John Houseman won a best supporting actor Oscar as the terrifying Professor Kingsfield. (Almost none of my classmates had ever heard of the film, though several have now watched it.) Very few law professors terrorize students as the fictional Kingsfield did. Most now go around the room calling on students in order, and if a student is unprepared he says, “Pass.” One very good professor gives five students two days' notice that they will be “on call” for an entire class, and drills them (fairly gently) on their assigned cases.
Both approaches work reasonably well. But I did have one mini-course where the old-fashioned Kingsfield approach was used. I never worked harder preparing for that class, and my learning curve soared. As one tenured colleague reminded me, there is no need to embarrass students by refusing to let them off the hook when they choose to pass. (Though he notes in his grade book who is not prepared.) But the fear of embarrassing oneself by not being prepared for every case really did motivate me to work harder. As a teacher, I’m inclined to find my inner Kingsfield.
Drinking – I am not a teetotaler. But I am amazed by how much people in their 20s drink, and how often they drink to excess. Getting sick from alcohol is neither a badge of honor, as it is among underage drinkers, nor a stigma. But for me it is disturbing. I just don’t get it. It is one area of my law school experience where I can’t find common ground with my classmates. Maybe I’ll just have to reinstitute sherry hour as a means to encourage moderation.
As I finish writing this piece, it is now just after Christmas break, and we start classes again next week. Along with my classmates, I’ve spent much of the past two weeks checking the law school Web site every few hours to see if final grades have been posted. We are not quite obsessive about it; wait, I take that back, yes we are. Finally, grades are in, and I’ve done really well. Except in legal writing. Despite having written six books – including three best-sellers – and numerous award-winning articles, I just can’t get the hang of the repetitive, structured format required by the class. My professor tells me not to worry -- too much. Now I have to brag to my kids that the old man has standing to nag them on to better grades.
Steve Cohen is a founding director of iCollegeBound.org and is ambassador-at-large for IndividualU.
It was hot and bone dry. It was the desert and over 100 degrees. We kept going — over slickrock, across sand, gulping water and quietly contemplating whether we weren’t just a wee bit crazy. The hike was not a long one, but we’d been on so many during our time in Utah that we wondered if this more “touristy” one was really worth it. But we soldiered on.
And then, as we snaked around an exposed rock cliff, there it was — Delicate Arch! Nothing on the trek had prepared us for the emotional and aesthetic splendor of Utah’s signature natural wonder. Was it the contrast of the barren desert hike to the striking elegance of the red rock arching against the sky that struck us so profoundly? Yes and no. It is at once a tough and a delicate question. Of course it was the beauty of the rock, the perfection of the sky. Or was it perhaps more? And what in the world does all this have to do with education?
Many psychologists believe that a child’s capacity to delay gratification is an indicator that the child might someday grow to be a reasonably well-adjusted, content, and mature adult. What is it about the ability to delay gratification that makes it vital? It is the necessary precursor to innovation, development, change, and sagacity. The ability to wait for a reward is at the basis of hard work, scientific inquiry, artistic creation, and intellectual achievement. Because we as a society seem in danger of forgetting this fact, many of our young people are not learning to wait: rather, they have come to expect instant results. And that we are abetting their impulse is nowhere more evident than in our ongoing attempts to reform our educational system.
Plans to reform the American system of education have been largely ineffectual for several reasons. Perhaps most important is that the conversation regarding what education ought to be in this country has shifted radically, substantively, and, we believe, wrong-headedly, from a concern about what our young citizens ought to be learning to how quickly we can rush them through. Book after book, study after study, monograph after monograph bemoans the fact that not enough students graduate and that those who do don’t graduate quickly enough.
Private foundations tantalize politicians and academics alike with millions of dollars to entice them to find ways to speed up the process of education: do away with the “wasteful” senior year in high school (why don’t we just make it “useful and valuable”?); go to a three-year college curriculum (because?); move students into college after the 10th grade (huh?); increase online offerings — they’re quicker and more efficient (and?).
The Bush administration gave us No Child Left Behind; Obama’s given us the Race to the Top. Different administration; same message: “Let’s get them graduated as quickly as possible!” We are clearly in panic mode! Clifford Adelman refers to these advocates of speed over process as the “get it over with and get it over with fast” school (The Bologna Process for U.S. Eyes: Re-learning Higher Education in the Age of Convergence). No wonder the world might suspect that we Americans seem less interested in the quality of the diploma than we are in the quantity of those awarded. And why do we expect that our young people would think differently?
For earlier generations, the means of the educational process, that is, what a student actually gained by going to college and navigating its processes (bureaucratic as well as academic), was fundamentally more important than the mere end, the degree, which simply certified that the student had completed a course of study. The diploma was affirmation that the graduate had mastered a particular fund of knowledge to carry along when entering the larger social arena.
The emphasis in recent years, however, has been on rapid credential attainment and “seamless transitions” rather than on actual learning. For evidence of this, simply review some of the marketing claims — even from prestigious colleges and universities — which hawk convenience and speed of completion rather than the actual process of education.
In this age in which we glamorize easy money, easy fame, easy everything, we have lost sight of the truth that for an experience to be truly worth something — and truly educational — it must be consciously lived and grappled with. Study after study has shown that delaying gratification allows space for experience and learning and leads to a psychologically healthier, more mature, more sophisticated individual and, by extension, to a psychologically healthier, more mature and sophisticated society.
As educators, parents, and adult role models we have a vital role to play. You may ask why, especially in a capitalist and entrepreneurial society, the promise of more money or fame should not be reward enough. It is because, as a species, humans have always asked “Why?” and then sought the answer. In the processes of answering that question, thousands of discoveries have been made, books written, paintings painted, and yes, beauty discovered and wisdom gained.
Americans have always felt pride in their Americanism, in the “can do” spirit driven by hard work, honor, and creativity. We love the images of the cowboy and the pioneer, the early astronauts landing on the moon, the “greatest generation” storming the beaches at Normandy. But as we have become richer, more reliant on technology, and more populist in our relationship to intellect, we have rushed to produce more graduates and forgotten to help those graduates learn to honestly value their achievement. One often-unacknowledged irony regarding this trend is that many of its exponents — that is, those same policy experts who push for speedier graduation and more degrees — seem to have little hesitation in spending upwards of $50,000 per year to send their own children to America’s elite institutions, where traditional educational values remain a significant element of the core mission.
No other species has as long a developmental stage as do humans. Our complex prefrontal cortex takes more than 20 years to fully develop. We are not able to survive on our own until far into our teens. Why is that? Because the process of becoming human is a complex one. We have a lot to learn and we need time to learn from our social and familial elders. It is we as adults who have, then, the responsibility to teach in such a way that our young will mature and develop the requisite abilities to run the complex human society that we all rely on. We believe that this responsibility has been eroding.
Instead we seem to have surrendered to a youthful impetuosity, creating an educational environment whose values sometimes seem predicated on impatience and expediency. Students are neither customers nor clients to whom we must guarantee instant delivery of knowledge and wisdom. Rather, they are charges whom we have a moral responsibility to help grow into people capable of making rational decisions about the world they, and we, inhabit.
We are in danger of failing them, of failing to provide models for our youth to emulate. We are in danger of failing to model for them that learning often has value precisely because it takes time to acquire. Instead, we idolize instant reality stars and often pay scant attention to Nobel Prize winners who have toiled for a lifetime.
When someone decides to climb Mt. Everest, they do so for the sense of achievement, accomplishment, and specialness that the trek will afford. It is not an easy thing to do and they understand they are not the first to climb the mountain. But once they reach the top they know it has been a thing worth doing.
Nancy Rosenbach and Peter Katopes
Nancy Rosenbach is a clinical psychologist. Peter Katopes is vice president for academic affairs at LaGuardia Community College.
It's May again. The flowers are growing, the birds are singing, and I’m getting ready to comment on my last stack of student papers of the term. When I finish, I’ll assign my students their grades. I’d love to be able to skip that last task and wish them all good luck, so it was with great interest that I read about Professor Cathy Davidson’s bold experiment with having her students grade one another. Let me say first that I'm all for the experimentation and the creative study of learning that Davidson is doing at Duke University, and I’ve long been interested in innovative teaching by Davidson’s former colleague Jane Tompkins (who also tried student self-grading) and research by educators like Alfie Kohn, who argues that competition interferes with the learning process. I admire Davidson’s scholarship, and I’ll look forward to her findings.
But Davidson, Kohn, and others can’t increase the number of spots available at medical schools, and they can’t allot a company more job openings than its revenue allows. Those entities depend on professors for our judgment of students, and until we can come up with a different way to apportion limited resources, we have to work within the system we have.
Grading certainly has its problems, and I’ve never met a teacher who enjoyed it. But just as Winston Churchill described democracy as "the worst form of government" except for all the others, so too with grading.
Let me put it more directly. I think avoiding grading (or some comparable form of rigorous evaluation by the instructor) shirks necessary responsibility, avoids necessary comparison, and puts the humanities at even greater risk of being branded "soft" than they already face.
It doesn’t surprise me that 15 of Davidson’s 16 students signed off on others' work, eventually entitling them to As. Such an outcome brings to mind Garrison Keillor’s description of Lake Wobegon as a community where "all the children are above average."
The bottom line question is this: if everyone gets As, does that mean that Yale Law School will simply accept them all?
If an average class grade is an A, then graduate and professional schools will have to look elsewhere to find out how applicants differ. If I were an admissions officer, the first place I’d look would be to other courses with wider grade distributions, where the instructors rank and compare. Those other courses would weigh more heavily, and the professors who teach them would gain disproportionate influence in the decision process. Put simply, Professor Davidson’s colleagues who grade their students would be helping them more than she would.
Perhaps Davidson plans to make distinctions in the recommendations that she’ll write for the students when they apply for professional schools and jobs. But isn't that the grading that she was supposed to be avoiding in the first place, now done in secret? Davidson’s practice also fuels grade inflation, which disproportionately harms a college’s best students by devaluing their high marks. We need to be wary of such trends, and many colleges already are. Harvard recently moved to limit the percentage of its students who graduate with honors, which had swollen to a watery seventy-plus percent. Columbia University includes on a student’s transcript the percentage of students who got As in each class that the student took. Dartmouth and McGill are two universities that also contextualize their students’ grades. These elite institutions want to create a basis for discernment.
That discernment is personal, and it starts in each classroom. We need to be able to say to students in effect, "You did good work, but not the best in the class." It’s a way to be fair to the students and allow them to gain from their achievements.
The goal is not, of course, to make the classroom red in tooth and claw. I work harder at creating learning communities for my undergraduate and graduate students than at anything else I do, and it’s been well worth my effort over the years. I know that I have to keep seeking new ways to do this, because I agree with Davidson, Kohn, and others that students learn better when they can share the enterprise with each other.
There’s plenty of value to Davidson’s collaborative experiment, then — but grading is still part of her job, and mine, and all professors’. If we stop doing it, colleges and universities will eventually lose the esteem of the society that funds us. The humanities, already at risk, will be the chin that absorbs the direct hit.
Parents know that our children respect us when we save our highest praise for the achievements that merit it. I’m a big fan of Cathy Davidson’s work, and I’ve taught it to my own students. But abstaining from giving grades to students isn’t one of her better ideas. I say this with all due respect — and discernment. And that’s the same respect and discernment that we owe to the work of our students.
Leonard Cassuto is a professor of English at Fordham University, where he was named Graduate Teacher of the Year in 2009.
Another huge stack of papers to grade. So as good teachers of writing we bundle them up in our arms and take them home, make a big pot of coffee (or brew some tea), and spend countless hours commenting and grading, alone. What are we doing? And why? More importantly, what are the students doing (or not doing)? And why?
A recent Inside Higher Ed article discussed the experimental work of Duke University’s Cathy Davidson, involving students grading themselves. According to Davidson, when students are held responsible for assessing their own — and their peers’ — writing performances and products, they learn to take more responsibility for their own learning, and consequently apply themselves much more energetically to their work. In response, Leonard Cassuto of Fordham University points to the fact that at least 15 of Davidson’s 16 students in this experiment earned As for the course. Cassuto sees that as a problem and argues that professors need to be the ones saying “You did good work, but not the best in the class.”
I think I may have something of a compromise when it comes to assessing student written work. I was in the same situation as many writing instructors for years. Students write, write, write. Then I would spend about five minutes per page supplying written commentary individually on each of their papers. But about a year ago I started doing things differently. And I don’t plan on going back any time soon.
First, students in all my courses form groups of three during the first week of the term. I allow students to form their own groups initially, but sometimes I have to make adjustments as the term progresses. They exchange contact information. These "home groups" become the basis of their peer review and response writing groups (as well as other collaborative activities). All the writing for the course also goes into an online file-sharing space. Students post first drafts of their papers into a file. Then they peer-review each other’s papers. Next they discuss and consider their partners’ commentary. (This usually occurs in class, since we are fortunate enough to be in wired, computer-equipped classrooms. There I am also able to circulate among the groups, contribute commentary, and answer questions.) Then students rewrite their papers and resubmit to a second-draft file. Then I go in for my commentary.
But not alone -- not anymore.
Since I scaffold my sequences of writing assignments so that smaller papers build up to larger papers, I do not comment on the smaller ones myself, although they do undergo the peer review process described above and are included in the midterm and final collected-works portfolios. (This is a method advocated for all teachers of writing by John Bean in his influential book Engaging Ideas.) Instead, once students have written a larger paper (there are usually three per term), I meet with each writing group in person. We all sit around the computer screen, read each person’s paper, and supply interactive commentary. Usually I write in the notes and commentary for the students. But sometimes, depending, I’ll have students write the notes and commentary themselves. I end up spending about the same amount of time per paper as I would commenting in isolation. (I save all of my grading until the midterm portfolio, then again at the end with the final portfolio. One of the key elements of the portfolio is substantial critical self-reflection by students on their writing strengths and weaknesses. Like Davidson, I do ask students what grade they feel they’ve earned for the course. But I do not depend solely on students’ self-assessments.)
What I’ve found over the past year is that this method brings together everything I value most about the teaching and learning of writing. Students learn to become better readers of their own papers through this collaborative, iterative and dynamically recursive process. The movement of group commentary from the classroom, to my office, back to the classroom, and into the students’ papers mimics the social construction of knowledge made popular by teacher-scholars like John Dewey and Kenneth Bruffee and educational and learning researchers like Albert Bandura, Jerome Bruner and Lev Vygotsky (and practiced every day in writing centers and other peer tutoring programs across the country). It also makes my job more interesting. It takes what I had started to consider the somewhat dreary act of grading countless papers and turns it into a synergistic, multi-vocal, live conversation.
And I am starting to share and hone this method with my graduate teaching assistants and fellow instructors. A former TA of mine, Stephanie Serenita, comments on a particularly successful experience with this method: "I found that throughout the group tutoring session, the three students were offering more help and insight than I could try to muster, in between their vigorous comments and thoughts on the paper at hand. Although I felt as though I wasn’t participating as a teacher ‘should’ during this tutoring session, I couldn’t help but be astounded and proud that these three students were teaching each other, and in turn, themselves. It wasn’t all about me and my intellectual ability and what I thought could help their papers. Rather, it was about the students — what they know, how they can help their papers grow and, in turn, how they were growing as writers and co-learners of the craft."
As a group we can see each other’s facial expressions, hear the tone and quality of our words, qualify our statements, and answer questions, concerns or confusions immediately.
I’ve had students repeatedly comment on how much they appreciate this method. One student recently said, "I really like this way of getting feedback because a lot of times I don’t know what the teacher means in their comments on my paper."
Granted, this can be a physically intensive method, just as one-to-one conferences and tutorials are. It also requires a bit of scheduling and organizing that can sometimes be tricky. And, of course, like any other teaching-and-learning activity, sometimes students won’t want to play happily along like the students Serenita speaks of above.
Commenting on what she felt was a less-successful session, Serenita observes, "In the second group session I held that day, none of the three students came prepared. The first student who we focused on had his first draft instead of his second, which we had just looked at as a class during our last session. Already having revised this paper three days prior, how productive could his session be? The second student only had half of his first draft written, with little to no focus in his paper. Is it then the instructor’s job to point him in the direction he needs? My knee-jerk reaction would be ‘yes,’ but why should a teacher give a student answers and a direction if he came without any questions prepared concerning his paper and especially with the lack of work he had put in? The second student didn’t seem to mind that he was wandering in the dark, which eventually led the entire group to amble with his half-hearted paper. The third student, like the second, came in with under half of what was supposed to be the completed second draft. At this point I started to wonder if this group was ever going to find their way out of the shadows. It is at this point that I pointed out how unproductive this session had been. Their final draft was due soon, and with little to no changes or improvement in anyone’s paper, I didn’t really see the helpfulness of this peer tutoring session."
But even in this seemingly bleak situation Serenita came to see a glimmer of hope. She continues, "It was at that point that one of the group members started to speak about what he hopes to accomplish in his paper and how he means to get there. This jump-started a productive conversation between the group members about their papers and where they wanted to take them next. In the end, the maturity level of the group rose, I believe, leaving them understanding what went wrong in this group session and how it could be more productive the next time."
Yes, sometimes sessions don’t go as swimmingly as we might hope. But by and large my students and I are so happy with this method that I wouldn’t dream of going back to my old ways. And, granted, this method might not sound appealing to everyone. Many writing instructors, for example, teach several courses at several different colleges at once. These teachers may find it quite difficult to arrange and conduct face-to-face group tutorials.
Maybe some teachers like, or feel they need, to give written feedback (or even assign grades to every paper) the way they do. Perhaps they’ve made peace with it and developed feedback strategies that work well for them and their students. And, sure, “grading” papers will always be part of our jobs. But, personally and professionally, I would rather spend my physical and mental energies now experimenting with ways to make this method work better for me and my students, together.
Steven J. Corbett
Steven J. Corbett is assistant professor of English and co-coordinator of the composition program at Southern Connecticut State University. More tips on using student peer review may be found here.
The American university, like the nation’s other major social institutions — government, banks, the media, health care — was created for an industrial society. Buffeted by dramatic changes in demography, the economy, technology, and globalization, all these institutions function less well than they once did. In today’s international information economy, they appear to be broken and must be refitted for a world transformed.
At the university, the clash between old and new is manifest in profound differences between institutions of higher education and the students they enroll. Today’s traditional undergraduates, aged 18 to 25, are digital natives. They grew up in a world of computers, Internet, cell phones, MP3 players, and social networking.
They differ from their colleges on matters as fundamental as how they conceive of and utilize physical plant and time. For the most part, universities operate in fixed locales (campuses) and on fixed calendars (semesters and quarters), with classes typically set for 50 minutes, three times per week. In contrast, digital natives live in an anytime/anyplace world, operating 24 hours a day, seven days a week, unbounded by physical location.
There is also a mismatch between institutions of higher education and digital natives on the goals and dynamics of education. Universities focus on teaching, the process of education, exposing students to instruction for specific periods of time, typically a semester for a course and four years of instruction for a bachelor’s degree; digital natives are more concerned with the outcomes of education: learning and the mastery of content, achieved in the manner of games. That is why an online game pro will never boast about how long she stayed at a certain level, but will talk about the level she has reached.
Higher education and digital natives also favor different methods of instruction. Universities have historically emphasized passive means of instruction — lectures and books — while digital natives tend to be more active learners, preferring interactive, hands-on methods of learning such as case studies, field study and simulations. The institution gives preference to the most traditional medium, print, while the students favor new media — the Internet and its associated applications.
This is mirrored in a split between professors and students, who approach knowledge in very different ways. Traditional faculty might be described as hunters who search for and generate knowledge to answer questions. Digital natives by contrast are gatherers, who wade through a sea of data available to them online to find the answers to their questions. Faculty are rooted in the disciplines and depth of knowledge, while students think in increasingly interdisciplinary or a-disciplinary ways, with a focus on breadth.
Universities and their students also now view the learner in polar fashion. Higher education focuses on the individual, an ideal captured in 1871 by future president James Garfield, who famously described the ideal college as Mark Hopkins, the 19th-century president of Williams College, at one end of a log and a student on the other. Today’s digital natives are oriented more toward group learning, multiple “teachers” or learning resources, and social networking characterized by collaboration and sharing of content. This approach poses an ethical challenge for universities, which under certain circumstances view collaboration as cheating and content sharing as plagiarism.
These are substantial gaps, complicated by the disparities in the way colleges and digital learners see their roles in education. Higher education is provider-driven in belief and practice. That is, the university, through its faculty, determines the curriculum, the content, the instructional methods, the study materials, and the class schedule. Digital natives tend to be consumer-driven, preferring to choose, if not the curriculum and content they wish to study, then the instructional method by which they learn best, the materials they use to learn, and the schedule by which they choose to study.
So what should be done? First, we need to recognize that this is not the first time colleges and their students have been out of step. In the early 19th century, as the industrial revolution gathered momentum, colleges in the main clung stubbornly to their classical curriculums, rooted in the ancient trivium and quadrivium, and to outmoded methods of instruction. College enrollments actually declined, and numerous institutions closed their doors. Bold colleges like Union, in Schenectady, New York — among the earliest adopters of modern language, science and engineering instruction — boomed in enrollment, topping Yale and Harvard combined.
Today, with college essential in obtaining most well-paying jobs, we will not see higher education enrollments drop. However, tardiness in acting will give impetus to the growth and expansion of alternative higher education — for-profit and nontraditional educational institutions that have been more successful in offering programs better geared to digital learners and their older counterparts.
Second, it is important to ask how much colleges and universities need to change. In 1828, facing industrialization and a Connecticut legislature that disapproved of Yale’s classical curriculum, the Yale faculty responded with a report which asked, in part, whether the college needed to change a lot or a little. This, Yale’s faculty said, was the wrong question. The question to be asked, they argued, was: What is the purpose of a college? This remains the right question today.
What is certain is that higher education needs to change, because students won’t, and the digital revolution is not a passing fad. To be sure, the purposes of the university have not changed. They remain the preservation and advancement of knowledge and the education of our students for humane, productive and satisfying lives in the world in which they will live. The activities of universities will continue to be teaching, research and service.
What must change, however, is the means by which we educate the digital natives who are and will be sitting in our classrooms — employing calendars, locations, pedagogies, and learning materials consistent with the ways our students learn most effectively. It means that the curriculum must meet our students where they are, not where we hope they might be or where we are. All education is essentially remedial, teaching students what they do not know. This, for example, is a generation that is stronger in gathering skills than in hunting skills. So let the curriculum begin with breadth and move to depth. Cheating and plagiarism violate the cardinal values of the academy, so let’s make it crystal clear to our students how and why they differ from sharing and collaboration.
It no longer makes sense to tie education to a common process; a uniform amount of seat time exposed to teaching on a fixed clock is outdated. We all learn at different rates, and each of us even learns different subjects at different rates. As a consequence, higher education must in the years ahead move its emphasis from teaching to learning, and its focus from common processes to common outcomes. With this shift will come the possibility of offering students a variety of ways to achieve those outcomes, rooted in the ways they learn best, an approach Alverno College in Milwaukee embraced four decades ago.
This needed transformation of the American university is merely the task of taking a healthy institution and maintaining its vitality. In an information economy, there is no more important social institution than the university in its capacity to fuel our economy, our society and our minds. To accomplish these ends, the university must be rooted simultaneously in our past and our present, with its vision directed toward the future.
"It’s not honors English. It’s honorable English," said Mr. McCann of La Jolla High School in 1979. Three thousand miles away and 30 years later, this principle is still true. So true that Mr. McCann’s wisdom has become something of a motto for Macaulay Honors College. Beyond just honors classes or programs, the concept of honorable behavior is one that is essential for all students -- but too often relegated to a page in the student handbook or a mandated paragraph on a syllabus forbidding plagiarism.
What is missing from such notifications is a comprehensive, ethical, and honorable approach to teaching and learning, especially when technology is involved and is as crucial to a program as it is to ours. This is something we learned the hard way.
All Macaulay students are provided with laptops and digital cameras as part of their honors scholarships. But we don’t just give out tech gifts and run. Our core belief is that, like scholars and explorers throughout history, students should make use of the latest, most innovative, productive tools of their age and understand that tools by themselves are not value-free. Although a student's laptop is not a tool on the order of magnitude of an atomic bomb, the principle is the same: With power, greater or lesser, comes responsibility. So we work with students from the moment they are handed their laptops to train them and to challenge them to understand the power they hold.
Of course, in the digital age, "tool" is an increasingly amorphous concept. Wikis, blogs, and social networking -- these days it’s the rare student who is not connected in these ways, ways unheard- and unthought-of just a few years ago. Perhaps precisely because students take it all for granted, our responsibility is to help them become thoughtful and self-critical. For while they're learning and researching and presenting their academic work through these tools, they're also doing a great deal more – with real potential for harm.
Recently, we had two instances where things went wrong, which gave us a chance to consider how to make them go right. The first incident involved a student collaborative Web site project. In one of our four New York City-focused seminars, students create neighborhood Web sites with audio, video, photographs, text, survey data, interview transcripts, and all the products of their research into historical immigration and present-day communities. In order to make these Web sites a truly collaborative product, students use wikis to gather the material, arrange it, and present it online. Unfortunately, because the wiki is open to editing, malicious vandalism is always a potential problem. Last year, one group of students found to their dismay that their hard work had been erased and replaced with the random rantings of pranksters from another campus.
There were immediate and practical remedies and responses. Because it’s the nature of a wiki that all changes can be rolled back and the previous state restored, no student work was ultimately lost. Still, we realized the need to lock down the wikis more securely so that while they can be as public as the students or instructors desire, only registered users can make changes. Even more importantly, we realized that student training needs to address the ethical "why" as well as the pragmatic "how." So our doctoral student instructional technology fellows, who work directly with our undergraduates in class, in the honors lounges, and via e-mail, came up with strategies to bring ethics to the students’ attention: an attempt to head off problems before they arise.
The second incident from last year, involving a different group of students, raised even more directly the need for attention to ethics. A student who was unimpressed with the work of students from different campuses posted a negative review on his Facebook page. When one of the critiqued students responded, also on Facebook, the original poster escalated his comments into an attack on the students and their college phrased in racial and sexual terms. Things he probably would never have said in person were said electronically and disseminated widely. Because all of this happened on Facebook, it was outside the traditional channels of college communication and interaction, and thus outside our institutional control. However, because Facebook is also a world open to all, we were able to see the offense when the students who were attacked called on us to address it.
A number of consequences resulted, for the offender, for his victims, and for the rest of the community. First, over the course of a series of conversations, a skilled student affairs professional led the offending student to understand that his Facebook interactions, far from being innocuous or private, had real effects, real impact, on real people – his classmates and peers. Second, when he accepted his responsibility and demonstrated empathy toward his victims, they in turn chose not to push for a public apology or formal sanctions. They were satisfied with knowing that the perpetrator had achieved an emotional connection to those he had hurt. In other words, all parties, including those not directly involved, learned from this incident, in ways important and somewhat unexpected. We all now recognize that while Facebook and other such sites may seem like the Internet’s Wild West, without law, regulation, or consequences, in fact there are people out there, people to whom responsibility and respect are owed.
Most importantly, students learned that honors takes place within a community. And that they could rely on the support of their friendly neighborhood sheriff: the college administration. So another consequence: we didn’t just ride into town to save the day, then ride out again in an e-version of Shane, but rather we insisted that everyone be part of our community, respect the standards of that community, and participate in enacting and enforcing them.
As part of this process of developing community awareness, we decided to develop a digital ethics code parallel to our existing honors code (see page 4 of this link). All members of the community, from students to instructional technology fellows to faculty to staff, are working together to develop this code. Importantly, it will go beyond a mere bulleted list of rules to include case studies, discussion questions, and practical exercises, as well as links to university policies, legal and copyright resources, and current news. Because the code will include open-ended and unresolved questions, it will naturally continue to evolve and serve as a place for further interaction and true inquiry. (The draft version of the new code, along with the open-ended case studies, is online here.) Beyond Macaulay, we hope this code will serve as a springboard for discussion in the wider honors and academic communities.
The academic and the e-community have this in common: both necessarily go beyond the classroom and beyond direct, physical interactions. So Facebook, for instance, is necessarily part of our honors community. It's not just "out there," it’s also "in here" – whether we invite it in or not. If we truly believe in technology as integral to teaching and learning (as we do), we must remain open to these tools – even when they’re misused. We don't, we won't, forbid their use (as if we could); rather, we promote their use – but in ethical ways.
It's ironic that the violation of feelings and ethical standards led to a new awareness of responsibility, human contact, principles of behavior: of the common decency that a civil society runs on but often takes for granted. Because the new tools gave students the opportunity to do harm, they also gave them the opportunity to see that harm and develop new standards which they could follow and reinforce in thoughtful and intentional ways.
As we teach students to use these new tools, it’s incumbent upon us to teach them how to use them not just in the practical sense but also in the ethical sense: not just how can they be used, but also when and where and why they should be used. While there are no guarantees against bad behavior, we want students to think before they post, before they e-mail, before they edit a wiki, before they blog, and so forth. As always, it's the thinking, not the tool, on which all education, including honorable education, rests.
Sylvia Tomasch and Joseph Ugoretz
Sylvia Tomasch is associate university dean of academic affairs and Joseph Ugoretz is director of technology and learning at Macaulay Honors College at the City University of New York.