
Just recently I got a set of teaching evaluations for a course that I taught in the fall of 2008 -- and another set for a course I taught in 2006.

This lag wasn't the fault of campus mail (it can be slow, but not that slow). Instead, the evaluations were part of a small experiment with long-delayed course assessments: surveys that ask students to reflect on classes they took a year or two or three earlier.

I've been considering such evaluations ever since I went through the tenure process a second time: the first was at a liberal arts college, the second two years later, after I moved to a research university. Both institutions valued teaching but took markedly different approaches to student course evaluations. The research university relied almost exclusively on the summary scores of bubble-sheet course evaluations, while the liberal arts college didn't even allow candidates to include end-of-semester forms in tenure files. Instead, the college contacted former students, including alumni, and asked them to write letters.

In my post-tenure debriefing at the liberal arts college, the provost shared excerpts from the letters. Some sounded similar to comments I would typically see in my end-of-semester course evaluations; others, especially those by alumni, resonated more deeply. They let me know what in my assignments and teaching had staying power.

But how to get that kind of longitudinal feedback at a big, public university?

My first try has been a brief online survey sent to a selection of my former students. Using SurveyMonkey, I cooked up a six-item questionnaire. I'm only mildly tech-savvy and this was my first time creating an online survey, but the software escorted me through the process quickly and easily. I finished in half an hour.

Using my university's online student administration system, I downloaded two course rosters -- one from a year ago, one from three years ago. I copied the e-mail address columns and pasted them into the survey. Eight clicks of the mouse later, I was ready to send.
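
For anyone who would rather script that copy-and-paste step than click through it, here is a minimal sketch of pulling the e-mail addresses out of exported rosters. It assumes the student administration system can export each roster as a CSV file with an "Email" column; the file names and column header below are hypothetical and will vary by campus.

import csv

def collect_emails(roster_paths, email_column="Email"):
    """Gather e-mail addresses from one or more exported roster CSV files."""
    emails = []
    for path in roster_paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Skip rows with a blank or missing address.
                address = (row.get(email_column) or "").strip()
                if address:
                    emails.append(address)
    return emails

if __name__ == "__main__":
    # Hypothetical roster exports, one per section surveyed.
    addresses = collect_emails(["roster_fall2006.csv", "roster_fall2008.csv"])
    # One address per line, ready to paste into the survey tool's invitation box.
    print("\n".join(addresses))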

I sent the invitation to two sections of a small freshman honors English seminar I teach every other year. This course meets the first-year composition requirement, and I teach it with a focus on the ways that writing can work as social action, both inside and outside the academy. During the first half of the semester students engage with a range of readings -- studies of literacy, theories of social change, articles from scholarly journals in composition studies, short stories and poems keyed to questions of social justice, essays from Harper's and The New York Times Magazine, papers written by my former students -- and they write four essays, all revised across drafts. During the latter part of the semester students work in teams on service-learning projects, first researching their local community partner organizations and then doing writing projects that I have worked out in advance of the semester with those organizations.

I taught the course pretty much the same in fall 2008 as I did in fall 2006, except that in 2008 I introduced a portfolio approach to assessment that deferred much of the final paper grading until the end of the course.

Through my online survey I wanted to know what stuck -- which readings (if any) continued to rattle around in their heads, whether all the drafting and revising we did proved relevant (or not) to their writing in other courses, and how the service experience shaped (or didn't) any future community engagement.

My small sample size -- only 28 (originally 30, but 2 students from the original rosters had left or graduated) -- certainly would not pass muster with the psychometricians. But the yield of 18 completed surveys, a response rate of over 60 percent, was encouraging.

I kept the survey short -- just six questions -- and promised students that it would take five to ten minutes of their winter break and that their responses would remain anonymous.

The first item asked them to signal when they had taken the course, in 2006 or 2008. The next two were open-ended: "Have any particular readings, concepts, experiences, etc. from Honors English 1 stayed with you? If so, which ones? Are there any ways that the course shaped how you think and/or write? If so, how?" and "Given your classwork and experiences since taking Honors English 1, what do you wish would have been covered in that course but wasn't?" These were followed by two multiple-choice questions: one about their involvement in community outreach (I wanted to get a rough sense of whether the service-learning component of the course had or hadn't influenced future community engagement); and another that queried whether they would recommend the course to an incoming student. I concluded with an open invitation to comment.

As might be expected from a small, interactive honors seminar, most who responded had favorable memories of the course. But more interesting to me were the specifics: they singled out particular books, stories, and assignments. Several of those I was planning to keep in the course anyway; a few I had been considering replacing (each semester I fiddle with my reading list), and the student comments rescued them.

I also attended to what was not said. The readings and assignments that none of the 18 mentioned will be my prime candidates for cutting from the syllabus.

Without prompting, a few students from the 2008 section singled out the portfolio system as encouraging them to take risks in their writing, which affirms that approach. Students from both sections mentioned the value of the collaborative writing assignments (I'm always struggling with the proportion of individual versus collaborative assignments). Several surprised me by wishing that we had spent more time on prose style.

I also learned that while more than half of the respondents continued to be involved in some kind of community outreach (not a big surprise, since they had self-selected a service-learning course), only one continued to work with the same community partner from the course. That suggests I need to be more deliberate about encouraging such continuity.

In all, the responses didn't trigger a seismic shift in how I'll next teach the course, but they did help me revise with greater confidence and tinker with greater precision.

I am not suggesting that delayed online surveys should replace the traditional captive-audience, end-of-semester evaluations. Delayed surveys likely undercount students who are unmotivated or who had a bad experience in the course and miss entirely those who dropped or transferred out of the institution (and we need feedback from such students). Yet my small experiment suggests that time-tempered evaluations are worth the hour it takes to create and administer the survey.

Next January, another round, and this time with larger, non-honors courses.
