Essay calling on faculty members to learn their students' names

As summer ends, professors across the country are gearing up for a new academic year: refurbishing old syllabuses, reviewing alternative readings, perhaps adding service learning or a new assessment tool to their courses. I’m designing one entirely new seminar and working with colleagues to rethink our team-taught intro class. It all takes time and energy, and it has to be done. But the best thing I do to improve students’ work in my courses is far simpler.

I will learn and use their names. It’s easy, and it works.

Using those names in class is uniquely powerful. As Dale Carnegie said, “Remember that a man’s [sic] name is to him the sweetest and most important sound in the English language.” (Of course we know today that this is true for a woman too.) A student who hears his name suddenly becomes completely alert; one who hears herself quoted (“As Hannah said, Machiavelli was just trying to be realistic”) will be replaying those words in her head, over and over, for at least a week.

I used to learn names by taking the class list and scribbling descriptions, and for a time I would videotape students actually speaking their names, then review the tape every morning over my Cheerios. My current technique, at least for larger classes, is flashcards. The first day I line up the students alphabetically (they’ll already be smiling at each other, with a nice excuse for meeting), then take their pictures one by one, bantering like a novice fashion photographer  (“Excellent!”  “You look sharp,”  “Nice t-shirt,”  “Great smile,” and so on).

After being photographed, the students write their preferred first and last name, with phonetic guides if needed, on a pressure-sensitive file label, a sheet of which lies on the desk. At the end of the day, I deliver the pictures to a one-hour development kiosk, and by morning have a full deck of photos, each with a name stuck on the back.  Before each class meeting I spend a few minutes going through the deck again, memorizing the names. Whenever I pick up a new tidbit about a student I’ll write it on the back: “Plays lacrosse,” “Civil War buff,” “always wears these glasses,” “from Vermont.” The names take maybe four class meetings to learn; last fall, when I had 82 students in two courses, it required about two weeks in total. 

And the technique, or at least its principle of individualized recognition, is scalable. With smaller classes (say, 29 students or fewer), you can make up nameplates – just a folded paper card will work, with names on the front. Within a few days not only will you know their names, the students will also know everyone else’s – a nice side benefit, and very helpful in seminars. With larger classes, learning the names certainly takes more work – although a dean of students I once knew was famous for knowing and using the names of all 700 or so students at his college, from the day they matriculated. It’s impressive if you do learn so many; even if you can’t, your teaching assistants can learn students’ names in their sections. Or even without knowing any names, a lecturer who pays attention can spot a puzzled student and say, “Do you have a question?” It is possible to connect well even with a large class.

Why is knowing someone’s name or acknowledging them individually so important? Any person’s name is emotionally loaded to that person, and has the power to pull him or her into whatever is going on. Naming puts that person at the center of attention; it takes only a moment from you, but for him or her it is deeply affecting, and it lasts.

But more than that, calling a student by name opens the door to a more personal connection, inviting the student to see the professor (and professors generally) as a human being, maybe a role model or even a kind of friend. In the 10-year longitudinal study that Chris Takacs and I did of a cohort of students moving through college (for our book How College Works), students who found congenial advisers, or even full-fledged mentors, were more likely to stay in school, to learn more, and to enjoy the entire experience.

Several years ago I saw Jon Stewart, the television show host, deliver a marvelous 74-minute stand-up comedy routine for an audience of 5,000 people, apparently with no notes whatsoever. Stewart worked the crowd, picking up on what we liked, playing off of a few local references, sensing groups in the audience who responded differently, asking questions, riding the laughs but knowing when to quiet our responses.  He connected with us; he made us part of the show. It was exciting and memorable.

I’m no Jon Stewart, nor a match for that dean of students. But once, about 20 years ago, I had a social psychology class of 144 students. Armed with the freshman facebook (small “f,” remember that?) photos and some scribbled hints, I worked on their names for a couple of weeks. Then one day I came into class and started pointing at each student, slowly speaking his or her name. Some were easy, others took a moment; still others I skipped, to return to when I remembered or had eliminated possibilities. As I progressed around the room, students became increasingly focused on what I was doing, smiling and laughing at who was remembered, and who took a minute. Eventually I got to the last few, the people at the outer edge of my mnemonic ability. When I declared that last name – correctly – the entire class hesitated, and then erupted in a long, sustained round of applause. Some cheers were thrown in.

And the course went well.


Daniel F. Chambliss is Eugene M. Tobin Distinguished Professor of Sociology at Hamilton College. He is the author, with Christopher G. Takacs, of How College Works (Harvard University Press).



U. of Saskatchewan Ends Presidential Veto on Tenure

As part of a deal with its faculty union, the University of Saskatchewan has agreed to end the right of the president to veto tenure decisions, The Star Phoenix reported. Faculty at the university see the veto as antithetical to academic freedom. The agreement comes in the wake of numerous disputes over the relative power of administrators and faculty members at the Canadian university.



Survey suggests colleges are passing ACA-related cost increases on to employees


Unsure about how insurance costs will fare when the Affordable Care Act is fully in place, institutions are passing on anticipated cost increases to employees, a CUPA-HR survey suggests.

Academic Minute: An HIV-Resistant Flavor Enhancer

In today's Academic Minute, Stefan Sarafianos, a professor of microbiology and immunology at the University of Missouri, discusses his research on how compounds present in soy may be effective in helping resist HIV.



California tells insurance providers that they can't cut abortion coverage


California tells insurers that they can't alter the plans they provide to Catholic colleges that wanted to drop abortion coverage.

Essay on working 40 hours a week as an academic

It's possible to be a successful academic without working more than 40 hours a week, writes Trish Roberts-Miller.


Essay on technology issues facing students and faculty members

Regular readers of the higher education press have had occasion to learn a great deal about digital developments and online initiatives in higher education. We have heard both about and from those for whom this world is still terra relatively incognita. And, increasingly, we are hearing both about and from those commonly considered to be “digital natives” – the term “native” conveying the idea of their either having been born to the culture in question or being so adapted to it that they might as well have been.

When we think of digital natives, we tend to think of students. But lest we think that things are easy for them, let us bear in mind their problems. Notably, they share the general difficulty of reputation management, or what we might consider the adverse consequences of throwing privacy away with both hands when communicating on the internet. More to the point in the world of higher education, many suffer from the unequal distribution of the online skills most relevant to academic success – yet another factor in the extreme socioeconomic inequality that afflicts our nation’s system of higher education.

But let us turn our attention to the faculty, and first to those relatively unschooled in new information technologies. At the extreme, there are those who view the whole business with fear and loathing. We must find ways to persuade them that such an attitude is unworthy of anyone who has chosen education as a vocation and that they would do well to investigate this new world with an explorer’s eye – not uncritically, to be sure, given the hype surrounding it – in order to reach informed positions about both the virtues and the limitations of new information technologies.

Others are more receptive, but also rather lost. They are fine with what Jose Bowen calls “teaching naked” (i.e., keeping technology out of the classroom itself), since they have been doing it all their working lives, but are unable to manage the other major part of the program (that is, selecting items to hang in a virtual closet for their students to try on and wear to good effect, so that they come to class well-prepared to make the most of the time together with one another and their instructor). What these faculty members need is the right kind of support: relevant, well-timed, and pedagogically effective – something far less widely available than it should be.

Digitally adept faculty have challenges of their own, some of which are old problems in new forms. There is, for example, the question of how available to be to their students, which has taken on a new dimension in an age in which channels of communication proliferate and constant connectedness is expected.

And then there is the question of how much of themselves faculty members should reveal to students. How much of their non-academic activities or thoughts should they share by not blocking access online or perhaps even by adding students to some groups otherwise composed of friends?

Many of us have worked with students on civic or political projects – though not, one hopes, simply imposing our own views upon them. Many of us have already extended our relationship into more personal areas when students have come to us with problems or crises of one sort or another and we have played the role of caring, older adviser. We have enjoyed relatively casual lunches, dinners, kaffeeklatsches with them that have included discussion of a variety of topics, from tastes in food to anecdotes about beloved pets. The question for digital natives goes beyond these kinds of interaction: To what extent should students be allowed in on the channels and kinds of communications that are regularly – in some cases, relentlessly and obsessively – shared with friends?

Not all of this, to be sure, is under a faculty member’s control. Possibilities for what sociologists call “role segregation” hinge on an ability to keep the audiences for different roles apart from one another – hardly something to be counted on in these digital times. But leaving aside the question of how much online information can be kept from students, how much of it should be kept from them?

Will students be better served, as some faculty members seem to believe, if they see ongoing evidence that their teachers are people with full lives aside from their faculty roles? Should students be recipients of the kinds of texts and tweets that faculty members may be in the habit of sending to friends about movies, shopping, etc.? Given how distracting and boring some of this may be even to friends, one might well wonder. Some students will perhaps get a thrill out of being in a professor’s “loop” on such matters, but do we need to further clutter their lives with trivia? This is an area in which they hardly need additional help.

To put this issue in a wider context: In her 1970 book Culture and Commitment, anthropologist Margaret Mead drew a distinction among three different types of culture: “postfigurative,” in which the young learn from those who have come before; “cofigurative,” in which both adults and children learn a significant amount from their peers; and “prefigurative,” in which adults are in the position of needing to learn much from their children. Not surprisingly, Mead saw us as heading in a clearly prefigurative direction – and that years before the era of parents and grandparents sitting helplessly in front of computer screens waiting for a little child to lead them.

Without adopting Mead’s specific views on these cultural types, we can find her categories an invitation to thinking about the teaching and learning relationship among the generations. For example, should we just happily leap into prefigurativeness? 

Or, to put it in old colonialist terms, should we “go native”? Colonial types saw this as a danger, a giving up of the responsibilities of civilization – not unlike the way the internet-phobic see embracing the online world. The repentant colonizers who did decide to “go native,” motivated either by escapism or by a profound love and respect for those they lived and worked with, sometimes ended up with views as limited by their adopted culture (what is called “secondary ethnocentrism”) as they had been by their original one. This is aside from the fact that attempts to go native are not always successful and may even seem ridiculous to the real folks.

Perhaps it is helpful to think of ourselves first as anthropologists. We certainly need to understand the world in which we ply our trade, not only so that we can do our work, but also because we are generally possessed of intellectual curiosity and have chosen our vocation because we like working in a community. We believe that we have much to learn from the people we study and, at the same time, know that we can see at least some things more clearly because we have the eyes of outsiders.

But we are also missionaries, since we feel we have something of value to share – to share, to be sure, not simply to impose. What might that something be?

In the most basic sense, it is the ability to focus, to pay attention, to take time to learn, looking back at least as often as looking forward. Most of our students live in a noisy world of ongoing virtual connectedness, relentless activity, nonstop polytasking (how tired are we of the word “multitasking”?). Like the rest of us, they suffer from the fact that too much information is the equivalent of too little. Like the rest of us, they live in a world in which innovation is not simply admired, but fetishized.

So, even as we avail ourselves of the educational benefits of new information technologies, we might think of complementing this with a Slow Teaching movement, not unlike the Slow Food movement founded by Carlo Petrini in 1986 with the goal of preserving all that was delicious and nutritious in traditional cuisine.  We have such traditions to share with our students even as we become more knowledgeable about the world in which they move.

Our students and junior colleagues don’t need us to be them; they need us to be us. Or, as Oscar Wilde so engagingly put it: “Be yourself; everyone else is already taken.”

Judith Shapiro is president of the Teagle Foundation and former president of Barnard College.


Group wants to help professors fight against sexual assault on campus


Some faculty members want to play a bigger role in the fight against campus sexual assault. A new national advocacy group aims to help them do that.

Essay on using or ignoring teaching innovations


Just because a teaching idea is hot doesn't mean you need to embrace it, writes Rob Weir.


Review of David R. Shumway, "Rock Star: The Making of Musical Icons from Elvis to Springsteen"

Most readers’ first response to David Shumway’s Rock Star: The Making of Musical Icons from Elvis to Springsteen (Johns Hopkins University Press) will be to scan its table of contents and index with perplexity at the performers left out, or barely mentioned. Speaking on behalf of (among others) Lou Reed, Joe Strummer, and Sly and the Family Stone fans everywhere, let me say: There will be unhappiness.

For that matter, just listing the featured artists may do the trick. Besides the names given in the subtitle, we find James Brown, Bob Dylan, the Rolling Stones, the Grateful Dead, and Joni Mitchell – something like the lineup for an hour of programming at a classic rock station. Shumway, a professor of English at Carnegie Mellon University, makes no claim to be writing the history of rock, much less formulating a canon. The choice of artists is expressly a matter of his own tastes, although he avoids the sort of critical impressionism (see: Lester Bangs) that often prevails in rock writing. The author is a fan, meaning he has a history with the music. But his attention extends wider and deeper than that, and it moves in directions that should be of interest to any reader who can get past “Why isn’t _____ here?”

More than a set of commentaries on individuals and groups, Rock Star is a critical study of a cultural category – and a reflection on its conditions of existence. Conditions which are now, arguably, far along the way to disappearing.

The name of the first rock song or performer is a matter for debate, but not the identity of the first rock star. Elvis had not only the hits but the pervasive, multimedia presence that Shumway regards as definitive. Concurring with scholars who have traced the metamorphoses of fame across the ages (from the glory of heroic warriors to the nuisance of inexplicable celebrities), Shumway regards the movie industry as the birthplace of “the star” as a 20th-century phenomenon: a performer whose talent, personality, and erotic appeal might be cultivated and projected in a very profitable way for everyone involved.

The audience enjoyed what the star did on screen, of course, but was also fascinated by the “real” person behind those characters. The scare quotes are necessary given that the background and private life presented to the public were often somewhat fictionalized and stage-managed. Fans were not always oblivious to the workings of the fame machine. But that only heightened the desire for an authentic knowledge of the star.

Elvis could never have set out to be a rock star, of course – and by the time Hollywood came around to cast him in dozens of films, he was already an icon thanks to recordings and television appearances. But his fame was of a newer and more symbolically charged kind than that of earlier teen idols.

Elvis was performing African-American musical styles and dance steps on network television just a few years after Brown v. Board of Education – but that wasn’t all. “The terms in which Elvis’s performance was discussed,” Shumway writes, “are ones usually applied to striptease: for example, ‘bumping and grinding.’” He dressed like a juvenile delinquent (the object of great public concern at the time) while being attentive to his appearance, in particular his hair, to a degree that newspaper writers considered feminine.

The indignation Elvis generated rolled up a number of moral panics into one, and the fans loved him for it. That he was committing all these outrages while being a soft-spoken, polite young man – one willing to wear a coat and tails to sing “Hound Dog” to a basset hound on “The Steve Allen Show” (and later to put on Army fatigues, when Uncle Sam insisted) – only made the star power more intense: those not outraged by him could imagine him as a friend.

Elvis was the prototype, but he wasn’t a template. Shumway’s other examples of the rock star share a penchant for capturing and expressing social issues and cultural conflicts, both in their songs and in how they present themselves, onstage and off. But they do this in very different ways – in the cases of James Brown and Bob Dylan, changing across the length of their careers, gaining and losing sections of their audience with each new phase. These self-reinventions were very public and sometimes overtly political (James Brown’s support for Richard Nixon being one example), but they were also reflected in stylistic and musical shifts. In their day, such changes were sometimes not just reactions to the news but part of it, and part of the conversations people had about the world.

Besides the size of the audience, what distinguishes the rock star from other performers is the length of the career, or so goes Shumway’s interpretation of the phenomenon. But rewarding as the book can be – it put songs or albums I’ve heard a thousand times into an interesting new context – some of the omissions are odd. In particular (and keeping within the timespan Shumway covers), the absence of Jimi Hendrix, Janis Joplin, and Jim Morrison seems problematic. I say that not as a fan disappointed not to find them, but simply on the grounds that each one played an enormous role in constituting what people mean by the term “rock star.” (That includes other rock stars. Patti Smith elevated Morrison to mythological status in her own work, while the fact that all three died at 27 was on Kurt Cobain’s mind when he killed himself at the same age.)

I wrote to Shumway to ask about that. (Also to express relief that he left out Alice Cooper, my own rock-history obsession. Publishers offering six-figure advances for a work of cultural criticism should make their bids by email.)

“My choices are to some extent arbitrary,” he wrote back. “One bias that shaped them is my preference for less theatrical performers as opposed to people such as David Bowie (who I have written about, but chose not to include here) or Alice Cooper.” But leaving out the three who died at 27 “was more than a product of bias. Since I wanted to explore rock stars’ personas, I believed that it was more interesting to write about people who didn’t seem to be playing characters on stage or record. I agree with you about the great influence of Jim Morrison, Janis Joplin, and Jimi Hendrix, but I don’t think their personas have the complexity of the ones I did write about. And, they didn’t figure politically to the degree that my seven did. The main point, however, is that there is lots of work to be done here, and I hope that other critics will examine the personas of the many other rock stars I did not include.”

The other thing that struck me while reading Rock Star was the sense that it portrayed a world now lost, or at least fading into memory. Rock is now so splintered, and the "technology of celebrity" so pervasive, that the kind of public presence Shumway describes may no longer be possible.

“The cause is less the prevalence of celebrity,” he replied, “than the decline of the mass media. Stars are never made by just one medium, but by the interaction of several. Earlier stars depended on newspapers and magazines to keep them alive in their fans’ hearts and minds between performances. Radio and TV intensified these effects. And of course, movie studios and record companies had a great deal of control over what the public got to see and hear. The result was that very many people saw and heard the same performances and read the same gossip or interviews. With the fragmentation of the media into increasingly smaller niches, that is no longer the case. The role of the internet in music distribution has had an especially devastating effect on rock stardom by reducing record companies’ income and the listeners’ need for albums. The companies aren’t investing as much in making stars and listeners are buying songs they like regardless of who sings them.”

That's not a bad thing, as such, but it makes for a more compartmentalized culture, while the beautiful thing with rock 'n' roll is when it blows the doors off their hinges.




