Regular readers of the higher education press have had occasion to learn a great deal about digital developments and online initiatives in higher education. We have heard both about and from those for whom this world is still terra relatively incognita. And, increasingly, we are hearing both about and from those commonly considered to be “digital natives” – the term “native” conveying the idea of their either having been born to the culture in question or being so adapted to it that they might as well have been.
When we think of digital natives, we tend to think of students. But lest we think that things are easy for them, let us bear in mind their problems. Notably, they share the general difficulty of reputation management, or what we might consider the adverse consequences of throwing privacy away with both hands when communicating on the internet. More to the point in the world of higher education, many suffer from the unequal distribution of online skills most relevant to academic success – yet another factor in the extreme socioeconomic inequality that afflicts our nation’s system of higher education.
But let us turn our attention to the faculty, and first to those relatively unschooled in new information technologies. At the extreme, there are those who view the whole business with fear and loathing. We must find ways to persuade them that such an attitude is unworthy of anyone who has chosen education as a vocation and that they would do well to investigate this new world with an explorer’s eye – not uncritically, to be sure, given the hype surrounding it – in order to reach informed positions about both the virtues and the limitations of new information technologies.
Others are more receptive, but also rather lost. They are fine with what José Bowen calls “teaching naked” (i.e., keeping technology out of the classroom itself), since they have been doing it all their working lives, but are unable to manage the other major part of the program (that is, selecting items to hang in a virtual closet for their students to try on and wear to good effect, so that they come to class well-prepared to make the most of the time together with one another and their instructor). What these faculty members need is the right kind of support: relevant, well-timed, and pedagogically effective – something far less widely available than it should be.
Digitally adept faculty have challenges of their own, some of which are old problems in new forms. There is, for example, the question of how available to be to their students, which has taken on a new dimension in an age in which channels of communication proliferate and constant connectedness is expected.
And then there is the question of how much of themselves faculty members should reveal to students. How much of their non-academic activities or thoughts should they share by not blocking access online or perhaps even by adding students to some groups otherwise composed of friends?
Many of us have worked with students on civic or political projects – though not, one hopes, simply imposing our own views upon them. Many of us have already extended our relationship into more personal areas when students have come to us with problems or crises of one sort or another and we have played the role of caring, older adviser. We have enjoyed relatively casual lunches, dinners, kaffeeklatsches with them that have included discussion of a variety of topics, from tastes in food to anecdotes about beloved pets. The question for digital natives goes beyond these kinds of interaction: To what extent should students be allowed in on the channels and kinds of communications that are regularly – in some cases, relentlessly and obsessively – shared with friends?
Not all of this, to be sure, is under a faculty member’s control. Possibilities for what sociologists call “role segregation” hinge on an ability to keep the audiences for different roles apart from one another – hardly something to be counted on in these digital times. But leaving aside the question of how much online information can be kept from students, how much of it should be kept from them?
Will students be better-served, as some faculty members seem to believe, if they see ongoing evidence that their teachers are people with full lives aside from their faculty roles? Should students be recipients of the kinds of texts and tweets that faculty members may be in the habit of sending to friends about movies, shopping, etc.? Given how distracting and boring some of this may be even to friends, one might well wonder. Some students will perhaps get a thrill out of being in a professor’s “loop” on such matters, but do we need to further clutter their lives with trivia? This is an area in which they hardly need additional help.
To put this issue in a wider context: In her 1970 book Culture and Commitment, anthropologist Margaret Mead drew a distinction among three different types of culture: “postfigurative”, in which the young learn from those who have come before; “cofigurative”, in which both adults and children learn a significant amount from their peers; and “prefigurative”, in which adults are in the position of needing to learn much from their children. Not surprisingly, Mead saw us as heading in a clearly prefigurative direction – and that years before the era of parents and grandparents sitting helplessly in front of computer screens waiting for a little child to lead them.
Without adopting Mead’s specific views on these cultural types, we can find her categories an invitation to thinking about the teaching and learning relationship among the generations. For example, should we just happily leap into prefigurativeness?
Or, to put it in old colonialist terms, should we “go native”? Colonial types saw this as a danger, a giving up of the responsibilities of civilization – not unlike the way the Internet-phobic see embracing the online world. The repentant colonizers who did decide to “go native”, motivated either by escapism or by a profound love and respect for those they lived and worked with, sometimes ended up with views as limited by their adopted culture (what is called “secondary ethnocentrism”) as they had been by their original one. This is aside from the fact that attempts to go native are not always successful and may even seem ridiculous to the real folks.
Perhaps it is helpful to think of ourselves first as anthropologists. We certainly need to understand the world in which we ply our trade, not only so that we can do our work, but also because we are generally possessed of intellectual curiosity and have chosen our vocation because we like working in a community. We believe that we have much to learn from the people we study and, at the same time, know that we can see at least some things more clearly because we have the eyes of outsiders.
But we are also missionaries, since we feel we have something of value to share – to share, to be sure, not simply to impose. What might that something be?
In the most basic sense, it is the ability to focus, to pay attention, to take time to learn, looking back at least as often as looking forward. Most of our students live in a noisy world of ongoing virtual connectedness, relentless activity, nonstop polytasking (how tired are we of the word “multitasking”?). Like the rest of us, they suffer from the fact that too much information is the equivalent of too little. Like the rest of us, they live in a world in which innovation is not simply admired, but fetishized.
So, even as we avail ourselves of the educational benefits of new information technologies, we might think of complementing this with a Slow Teaching movement, not unlike the Slow Food movement founded by Carlo Petrini in 1986 with the goal of preserving all that was delicious and nutritious in traditional cuisine. We have such traditions to share with our students even as we become more knowledgeable about the world in which they move.
Our students and junior colleagues don’t need us to be them; they need us to be us. Or, as Oscar Wilde so engagingly put it: Be yourself; everyone else is already taken.
Judith Shapiro is president of the Teagle Foundation and former president of Barnard College.
Most readers’ first response to David Shumway’s Rock Star: The Making of Musical Icons from Elvis to Springsteen (Johns Hopkins University Press) will be to scan its table of contents and index with perplexity at the performers left out, or barely mentioned. Speaking on behalf of (among others) Lou Reed, Joe Strummer, and Sly and the Family Stone fans everywhere, let me say: There will be unhappiness.
For that matter, just listing the featured artists may do the trick. Besides the names given in the subtitle, we find James Brown, Bob Dylan, the Rolling Stones, the Grateful Dead, and Joni Mitchell – something like the lineup for an hour of programming at a classic rock station. Shumway, a professor of English at Carnegie Mellon University, makes no claim to be writing the history of rock, much less formulating a canon. The choice of artists is expressly a matter of his own tastes, although he avoids the sort of critical impressionism (see: Lester Bangs) that often prevails in rock writing. The author is a fan, meaning he has a history with the music. But his attention extends wider and deeper than that, and it moves in directions that should be of interest to any reader who can get past “Why isn’t _____ here?”
More than a set of commentaries on individuals and groups, Rock Star is a critical study of a cultural category – and a reflection on its conditions of existence, conditions that are now, arguably, far along the way to disappearing.
The name of the first rock song or performer is a matter for debate, but not the identity of the first rock star. Elvis had not only the hits but the pervasive, multimedia presence that Shumway regards as definitive. Concurring with scholars who have traced the metamorphoses of fame across the ages (from the glory of heroic warriors to the nuisance of inexplicable celebrities), Shumway regards the movie industry as the birthplace of “the star” as a 20th-century phenomenon: a performer whose talent, personality, and erotic appeal might be cultivated and projected in a very profitable way for everyone involved.
The audience enjoyed what the star did on screen, of course, but was also fascinated by the “real” person behind those characters. The scare quotes are necessary given that the background and private life presented to the public were often somewhat fictionalized and stage-managed. Fans were not always oblivious to the workings of the fame machine. But that only heightened the desire for an authentic knowledge of the star.
Elvis could never have set out to be a rock star, of course – and by the time Hollywood came around to cast him in dozens of films, he was already an icon thanks to recordings and television appearances. But his fame was of a newer and more symbolically charged kind than that of earlier teen idols.
Elvis was performing African-American musical styles and dance steps on network television just a few years after Brown v. Board of Education – but that wasn’t all. “The terms in which Elvis’s performance was discussed,” Shumway writes, “are ones usually applied to striptease: for example, ‘bumping and grinding.’ ” He dressed like a juvenile delinquent (the object of great public concern at the time) while being attentive to his appearance, in particular his hair, to a degree that newspaper writers considered feminine.
The indignation Elvis generated rolled up a number of moral panics into one, and the fans loved him for it. That he was committing all these outrages while being a soft-spoken, polite young man – one willing to wear a coat and tails to sing “Hound Dog” to a basset hound on "The Milton Berle Show" (and later to put on Army fatigues, when Uncle Sam insisted) – only made the star power more intense: those not outraged by him could imagine him as a friend.
Elvis was the prototype, but he wasn’t a template. Shumway’s other examples of the rock star share a penchant for capturing and expressing social issues and cultural conflicts in both their songs and how they present themselves, onstage and off. But they do this in very different ways – in the cases of James Brown and Bob Dylan, changing across the length of their careers, gaining and losing sections of their audience with each new phase. The shifts and self-reinventions were very public and sometimes overtly political (with James Brown's support for Richard Nixon being one example) but also reflected in stylistic and musical shifts. In their day, such changes were sometimes not just reactions to the news but part of it, and part of the conversations people had about the world.
Besides the size of the audience, what distinguishes the rock star from other performers is the length of the career, or so goes Shumway’s interpretation of the phenomenon. But rewarding as the book can be – it put songs or albums I’ve heard a thousand times into an interesting new context – some of the omissions are odd. In particular (and keeping within the timespan Shumway covers) the absence of Jimi Hendrix, Janis Joplin, and Jim Morrison seems problematic. I say that not as a fan disappointed not to find them, but simply on the grounds that each one played an enormous role in constituting what people mean by the term “rock star.” (That includes other rock stars. Patti Smith elevated Morrison to mythological status in her own work, while the fact that all three died at 27 was on Kurt Cobain’s mind when he killed himself at the same age.)
I wrote to Shumway to ask about that. (Also to express relief that he left out Alice Cooper, my own rock-history obsession. Publishers offering six-figure advances for a work of cultural criticism should make their bids by email.)
“My choices are to some extent arbitrary,” he wrote back. “One bias that shaped them is my preference for less theatrical performers as opposed to people such as David Bowie (who I have written about, but chose not to include here) or Alice Cooper.” But leaving out the three who died at 27 “was more than a product of bias. Since I wanted to explore rock stars’ personas, I believed that it was more interesting to write about people who didn’t seem to be playing characters on stage or record. I agree with you about the great influence of Jim Morrison, Janis Joplin, and Jimi Hendrix, but I don’t think their personas have the complexity of the ones I did write about. And, they didn’t figure politically to the degree that my seven did. The main point, however, is that there is lots of work to be done here, and I hope that other critics will examine the personas of the many other rock stars I did not include.”
The other thing that struck me while reading Rock Star was the sense that it portrayed a world now lost, or at least fading into memory. Rock is so splintered now, and the "technology of celebrity" so pervasive, that the kind of public presence Shumway describes might not be possible now.
“The cause is less the prevalence of celebrity,” he replied, “than the decline of the mass media. Stars are never made by just one medium, but by the interaction of several. Earlier stars depended on newspapers and magazines to keep them alive in their fans’ hearts and minds between performances. Radio and TV intensified these effects. And of course, movie studios and record companies had a great deal of control over what the public got to see and hear. The result was that very many people saw and heard the same performances and read the same gossip or interviews. With the fragmentation of the media into increasingly smaller niches, that is no longer the case. The role of the internet in music distribution has had an especially devastating effect on rock stardom by reducing record companies’ income and the listeners’ need for albums. The companies aren’t investing as much in making stars, and listeners are buying songs they like regardless of who sings them.”
That's not a bad thing, as such, but it makes for a more compartmentalized culture, while the beautiful thing with rock 'n' roll is when it blows the doors off their hinges.